Problems with performance using UPDATE

2

I'm building an import application that reads the lines of a txt file and updates two tables. It works, but the files usually have more than 5,000 lines, and with my current approach I can't get through the update of all the tables. How could I optimize this code?

foreach ($file as $line):
    $row = explode(';', $line);

    $prodcode = $row[0];
    $prodcurrentinv = $row[3];
    $prodprice = number_format(intval($row[5]), 2, '.', '') . '00';
    $prodretailprice = $prodprice;

    $query = $pdo->prepare('UPDATE isc_products SET prodcurrentinv = :prodcurrentinv, prodretailprice = :prodretailprice, prodprice = :prodprice WHERE prodcode = :prodcode LIMIT 1');
    $query->bindParam(':prodcurrentinv', $prodcurrentinv, PDO::PARAM_STR, 11);
    $query->bindParam(':prodretailprice', $prodretailprice, PDO::PARAM_STR, 11);
    $query->bindParam(':prodprice', $prodprice, PDO::PARAM_STR, 11);
    $query->bindParam(':prodcode', $prodcode, PDO::PARAM_STR, 12);
    $query->execute();

    $query = $pdo->prepare('UPDATE isc_product_variation_combinations SET vcstock = :vcstock WHERE vcsku = :vcsku LIMIT 1');
    $query->bindParam(':vcstock', $prodcurrentinv, PDO::PARAM_STR, 11);
    $query->bindParam(':vcsku', $prodcode, PDO::PARAM_STR, 12);
    $query->execute();
endforeach;
    
asked by anonymous 14.03.2014 / 17:12

3 answers

4

You should consider bulk operations, as discussed in this answer in English.

Basically, you create a temporary table and populate it via a bulk load. Once you have done that, you update your table with a join against this temporary table.

You should now research how to do bulk operations in MySQL.

Bulk operations should always be considered when the data mass is too large. It is often this way that companies integrate systems that generate huge amounts of data.

Usually it works like this: you build a text file with the data, where the columns are separated by, say, a tab. Then you put this file somewhere the database server can read it. With specific commands you can get the database to load this file and insert the data into a table. Since the exact commands are vendor-specific I can't help you more than that; you need to see how it works in MySQL.

In any case, the concept is general. It exists in Oracle, SQL Server, MySQL, etc.
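A minimal sketch of that flow in MySQL, assuming a semicolon-separated file like the one in the question (the staging table name, the file path, and the column types are assumptions; isc_products and its columns come from the question; LOAD DATA LOCAL INFILE may need to be enabled on both client and server):

```sql
-- 1) Staging table with just the columns we need
CREATE TEMPORARY TABLE import_stage (
    prodcode       VARCHAR(12) NOT NULL PRIMARY KEY,
    prodcurrentinv INT NOT NULL,
    prodprice      DECIMAL(10,2) NOT NULL
);

-- 2) Bulk-load the file; @c1, @c2 and @c4 just skip unused columns
LOAD DATA LOCAL INFILE '/tmp/products.txt'
INTO TABLE import_stage
FIELDS TERMINATED BY ';'
(@c0, @c1, @c2, @c3, @c4, @c5)
SET prodcode = @c0, prodcurrentinv = @c3, prodprice = @c5;

-- 3) One set-based UPDATE instead of thousands of single-row ones
UPDATE isc_products p
JOIN import_stage s ON s.prodcode = p.prodcode
SET p.prodcurrentinv  = s.prodcurrentinv,
    p.prodprice       = s.prodprice,
    p.prodretailprice = s.prodprice;
```

The same join pattern covers the second table (isc_product_variation_combinations) by matching vcsku against the staging table's prodcode.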

    
14.03.2014 / 18:30
2

You can try to update multiple records per query using CASE.

For example, instead of:

UPDATE tabela SET valor = 123 WHERE indice = 1;
UPDATE tabela SET valor = 456 WHERE indice = 2;
UPDATE tabela SET valor = 789 WHERE indice = 3;

You can use:

UPDATE tabela SET valor = CASE indice
    WHEN 1 THEN 123
    WHEN 2 THEN 456
    WHEN 3 THEN 789
END
WHERE indice IN (1, 2, 3);

Following this format, you can set up your script to do the following:

// prepare the data
foreach ($file as &$line) {
    $row = explode(';', $line);
    $line = array(
        $row[0],
        $row[3],
        number_format(intval($row[5]), 2, '.', '') . '00'
    );
}
unset($line); // break the reference left over from the loop

// collect the codes once; the WHERE ... IN keeps the CASE from
// setting rows that are not in the file to NULL
$codes = array();
foreach ($file as $line) {
    $codes[] = "'" . $line[0] . "'";
}
$in = '(' . implode(',', $codes) . ')';

// build the query
$query = 'UPDATE `isc_products` SET ';

$query .= '`prodcurrentinv` = CASE `prodcode` ';
foreach ($file as $line) {
    $query .= " WHEN '" . $line[0] . "' THEN '" . $line[1] . "'";
}
$query .= ' END';

$query .= ', `prodprice` = CASE `prodcode` ';
foreach ($file as $line) {
    $query .= " WHEN '" . $line[0] . "' THEN '" . $line[2] . "'";
}
$query .= ' END';

// MySQL evaluates SET assignments left to right,
// so this copies the freshly updated prodprice
$query .= ', `prodretailprice` = `prodprice`';
$query .= ' WHERE `prodcode` IN ' . $in;

// run it
$pdo->exec($query);

// build the other query
$query = 'UPDATE `isc_product_variation_combinations` SET `vcstock` = CASE `vcsku` ';
foreach ($file as $line) {
    $query .= " WHEN '" . $line[0] . "' THEN '" . $line[1] . "'";
}
$query .= ' END';
$query .= ' WHERE `vcsku` IN ' . $in;

// run it
$pdo->exec($query);

Note that it is impractical to use prepared statements in this case, so be sure to sanitize the values interpolated into the query (for example with $pdo->quote()) to avoid SQL injection.

    
14.03.2014 / 19:17
1

You should have the database do all of this in a single query.

You are currently using a foreach that executes one query per row. Depending on the number of executions, this can get slow and end up in a timeout.

A general tip: do everything you can in a single query and avoid running multiple queries; the more of them you run, the worse your system's performance will be.

But if you still believe these 5,000 queries are necessary for things to work, take a moment to reflect: there may be a problem in your system's logic, so try to improve it.

You can also build all your queries into a variable inside the foreach and execute them only at the end, even though that is not the ideal solution.

LIMIT in foreach

Try to impose a limit inside your foreach.

$i = 0;
foreach ($file as $line) {
    if ($i >= 100) {
        break; // stop after 100 rows
    }
    // run the query here
    $i++;
}

LIMIT

Try using LIMIT so you update, for example, 1,000 records at a time, and run the statement again until all records matching the condition have been updated.
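As an illustration only (the WHERE condition here is contrived; in practice you need some condition that stops matching once a row has been processed, otherwise the same 1,000 rows are updated forever):

```sql
-- Run repeatedly until the statement reports 0 rows affected
UPDATE isc_product_variation_combinations
SET vcstock = 0
WHERE vcstock <> 0
LIMIT 1000;
```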

set_time_limit()

You can also try changing the script's execution time limit with set_time_limit(). The default execution time for a script is 30 seconds; call set_time_limit(0) so there is no time limit.

14.03.2014 / 17:34