Optimization, SELECT speed, and PHP code


Following the reasoning of this question, I wrote the method below, which picks one user at random. The query inside the function scans the whole table just to return a single row, and this is overloading my server; I have more than 2,000 registered accounts.

Here is my code:

public function random($username) {
    // Table name must use backticks (or no quotes), not string quotes
    if ($this->vip($username)) {
        $sql = "SELECT * FROM `viperusers` WHERE vip < 1 ORDER BY RAND() LIMIT 1";
    } else {
        $sql = "SELECT * FROM `viperusers` WHERE vip = 0 ORDER BY RAND() LIMIT 1";
    }

    $query = $this->_db->select($sql);

    if (empty($query[0]->username)) {
        return $this->random($username);
    }

    return $query;
}

I cannot find another way to do this without using RAND() in my SELECT. Note that I also have another table, called vipersends, which stores my ds_user_id and the ds_user_id of the other user. It is overloading the server.

    
asked by anonymous 08.12.2017 / 20:38

2 answers


You can fetch the number of accounts and then use PHP's rand().

public function random($username) {
    // Count the rows once, then pick a random id in PHP instead of ORDER BY RAND()
    $sql = "SELECT COUNT(idx_user) AS total FROM `viperusers`";
    $query = $this->_db->select($sql);
    $total_de_contas = $query[0]->total;

    // Assumes idx_user starts at 1 and is roughly sequential
    $rr = rand(1, $total_de_contas);

    $sql = "SELECT * FROM `viperusers` WHERE idx_user = $rr";
    $query = $this->_db->select($sql);

    if (empty($query[0]->username)) {
        return $this->random($username);
    }

    return $query;
}
    
08.12.2017 / 20:51

This can have several causes, including how the database itself is organized, or even the hardware. Running a test on a table with 239,289,977 rows and more than 46 GB, SELECT * FROM tabela ORDER BY RAND() LIMIT 1 took exactly 00:02:07. That is obviously a long time, but that row count is far above your 2,000; with your amount I believe it would be about as fast as a conventional SELECT.

ORDER BY RAND() can contribute to slowness because it creates a temporary table. Temporary tables may be written to disk (SSD/HDD), and that destroys performance. (I do not know if this has changed in recent versions of MySQL!) You can force MySQL to keep the table in RAM; as long as you have enough RAM, you can use:

SELECT SQL_SMALL_RESULT * FROM viperusers WHERE vip = 0 ORDER BY RAND() LIMIT 1

This is also limited by tmp_table_size, set in my.cnf. In general, this will not change the situation much.
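For reference, the in-memory limit is configured in my.cnf; MySQL uses the smaller of the two variables below (the 64M values are placeholders, not recommendations):

```ini
[mysqld]
# A temporary table stays in memory only while it is below BOTH limits
tmp_table_size      = 64M
max_heap_table_size = 64M
```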

Indexes are ignored in the temporary table, so no index on the original table will speed up the ORDER BY RAND() sort.

A better solution requires sequential data: all 2,000 ids should run from 0 to 1999, with no gaps, and all of them selectable (which is not your case).

Then it would be enough to do:

$maxsql = mysqli_query($conn, 'SELECT id FROM tabela ORDER BY id DESC LIMIT 1');
list($max) = mysqli_fetch_row($maxsql);

$random = random_int(0, $max);
$sql = 'SELECT * FROM tabela WHERE vip = 0 AND id >= ' . $random . ' LIMIT 1';
//...

However, in your case you have both vip < 1 and vip = 0, which makes this unfeasible, because the distribution becomes non-uniform. Suppose id=999 is a vip row and cannot be selected: then id=1000 would be returned for both $random=999 and $random=1000, making it twice as likely as the other ids.
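The skew is easy to demonstrate. The sketch below uses a hypothetical table holding ids 1, 2, 4 and 5 (id 3 is a gap, e.g. a vip row filtered out of the query) and enumerates every possible draw of the `id >= $random` strategy:

```php
<?php
// Hypothetical ids: 1, 2, 4, 5 -- id 3 is a gap (a filtered-out vip row).
// Drawing $random uniformly from 1..5 and taking the first id >= $random
// makes id 4 twice as likely as the others.
$ids = [1, 2, 4, 5];

$hits = array_fill_keys($ids, 0);
foreach (range(1, 5) as $random) {          // every possible draw
    foreach ($ids as $id) {                 // first id >= $random wins
        if ($id >= $random) {
            $hits[$id]++;
            break;
        }
    }
}

print_r($hits);
// id 4 absorbs both $random = 3 and $random = 4, so $hits[4] is 2
// while every other id is hit exactly once.
```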

This can be solved by creating two distinct tables, a tabela_vip and a tabela_normal.

In addition, you can pre-compute data that is costly to obtain. If you use LIMIT 1 but run this query many times, it would be better to fetch, say, LIMIT 50 once and store the results. The cost of a LIMIT 50 is only slightly higher than that of a LIMIT 1, so reading previously fetched data is much faster and drastically reduces the number of queries. Again, this is in the architecture itself, how you built it.
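A minimal sketch of that pre-fetch idea (the class, method names, and the batch source are illustrative, not from the question): the expensive query runs once per 50 calls, and the rest are served from a cached pool.

```php
<?php
// Sketch: cache a batch of random users and serve them one at a time,
// refilling only when the pool runs dry.
class RandomUserPool
{
    /** @var array<int, mixed> */
    private array $pool = [];
    private int $refills = 0;

    // $fetchBatch stands in for the costly query, e.g.
    // fn() => $db->select("SELECT * FROM `viperusers` WHERE vip = 0 ORDER BY RAND() LIMIT 50")
    public function __construct(private \Closure $fetchBatch) {}

    public function next(): mixed
    {
        if ($this->pool === []) {
            // One expensive query refills the pool for the next N calls
            $this->pool = ($this->fetchBatch)();
            $this->refills++;
        }
        return array_pop($this->pool);
    }

    public function refills(): int { return $this->refills; }
}

// Stand-in batch of 5 "users" instead of a real database call:
$pool = new RandomUserPool(fn() => ['ana', 'bia', 'caio', 'davi', 'eva']);
for ($i = 0; $i < 12; $i++) {
    $pool->next();
}
echo $pool->refills();   // 12 calls with batches of 5 -> 3 refills
```

With a batch size of 50, the same 12 calls would trigger a single query instead of 12.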

    
09.12.2017 / 17:12