PHP code refactoring to serve an image at a specific size


The code below aims to serve an image at specific dimensions in order to minimize its size and thus speed up website loading.

Problem

The image takes more than 1 second to render, making it a heavy load on the initial page load of the website.

The downloaded size is 76.5 KB, but the server wait time is alarming: between 800 ms and 900 ms.

Original image
If the original image is served directly, it takes about 430 ms for its 160.7 KB.

PHP code

The code below receives the width and height of the screen, resizes the image to the requested dimensions, and returns it to the browser:

ob_start("ob_gzhandler");

$file = "bg_body.jpg";

if (is_file($file)) {    
    $source_image = imagecreatefromjpeg($file);
    $source_imagex = $dest_imagex = imagesx($source_image);
    $source_imagey = $dest_imagey = imagesy($source_image);

    if (isset($_GET["w"]) && ctype_digit($_GET["w"])) {
        $dest_imagex = $_GET["w"];
    }

    if (isset($_GET["h"]) && ctype_digit($_GET["h"])) {
        $dest_imagey = $_GET["h"];
    }

    $dest_image = imagecreatetruecolor($dest_imagex, $dest_imagey);
    imagecopyresampled($dest_image, $source_image, 0, 0, 0, 0, $dest_imagex, $dest_imagey, $source_imagex, $source_imagey);
    header("Content-Type: image/jpeg");
    imagejpeg($dest_image,NULL,70);

} else {
    echo "Image file not found!";
}

ob_end_flush();

Question

How can I optimize this code so that it takes less time to generate the image that is sent to the browser?

asked by anonymous 12.03.2014 / 19:55

1 answer


Although there may be some specific optimization that can be done in the image resizing algorithm, I believe the right approach for your problem is to cache images at different sizes at upload time rather than at viewing or download time.
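
As a rough sketch of this upload-time variant, something along these lines could run right after the file is received. The width list, the cache/ directory, and the file naming are assumptions for illustration, not part of your current code:

// Hypothetical upload hook: pre-generate a few fixed widths with GD.
// The widths and the cache/bg_body_<w>.jpg naming are assumptions.
function pregenerate_sizes($original, $widths = array(480, 800, 1200, 1920)) {
    $source = imagecreatefromjpeg($original);
    $srcW = imagesx($source);
    $srcH = imagesy($source);

    foreach ($widths as $w) {
        $h = (int) round($srcH * ($w / $srcW)); // keep the aspect ratio
        $resized = imagecreatetruecolor($w, $h);
        imagecopyresampled($resized, $source, 0, 0, 0, 0, $w, $h, $srcW, $srcH);
        imagejpeg($resized, "cache/bg_body_{$w}.jpg", 70);
        imagedestroy($resized);
    }
    imagedestroy($source);
}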

I've been reading about how Facebook stores billions of photos. Basically, they store each image in 4 different sizes and make sure reads are done quickly.

In your case, one possibility is to create a few size variations and then send the one closest to the size requested by the user; the final resizing becomes a fine adjustment done via width and height in HTML or CSS.
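
A minimal sketch of that selection step, assuming the variations above already exist under the same naming scheme:

// Pick the smallest pre-generated width that covers the requested width,
// falling back to the largest one available. Widths and default are assumptions.
$available = array(480, 800, 1200, 1920);
$requested = (isset($_GET["w"]) && ctype_digit($_GET["w"])) ? (int) $_GET["w"] : 1920;

$chosen = end($available);
foreach ($available as $w) {
    if ($w >= $requested) {
        $chosen = $w;
        break;
    }
}

header("Content-Type: image/jpeg");
readfile("cache/bg_body_{$chosen}.jpg"); // no resizing at request time

The browser then handles the last few pixels of adjustment via width and height in HTML or CSS.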

Another technique would be to store the resized images after generating them and, if they already exist, read the file directly. This makes the first access slower, since in addition to resizing the image it must be written to disk, but it pays off when the number of views is large: only the first access is impacted, and subsequent ones are served without further processing.
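
A sketch of this second technique, reusing the GD calls from your code; the cache/ directory, the file naming, and the default dimensions are again assumptions:

// Serve the resized file from disk if it was generated before;
// otherwise resize once, save the result, and serve it.
$w = (isset($_GET["w"]) && ctype_digit($_GET["w"])) ? (int) $_GET["w"] : 1920;
$h = (isset($_GET["h"]) && ctype_digit($_GET["h"])) ? (int) $_GET["h"] : 1080;
$cached = "cache/bg_body_{$w}x{$h}.jpg";

if (!is_file($cached)) {
    $source = imagecreatefromjpeg("bg_body.jpg");
    $dest = imagecreatetruecolor($w, $h);
    imagecopyresampled($dest, $source, 0, 0, 0, 0, $w, $h, imagesx($source), imagesy($source));
    imagejpeg($dest, $cached, 70); // only the first access pays for this write
    imagedestroy($source);
    imagedestroy($dest);
}

header("Content-Type: image/jpeg");
readfile($cached);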

In this second scenario, you can also use ranges of values so you do not end up generating images of very similar sizes. For example, the 1000-pixel-wide image is used if the user requests anything in the range of 801 to 1000 pixels, while the 800-pixel version is used for requests of 601 to 800 pixels.
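
That range logic can be a small adjustment applied before the cache lookup above; the 200-pixel step is an assumption:

// Snap the requested width up to the next multiple of 200, so requests
// for 801..1000 pixels all reuse the same 1000-pixel cached file.
$step = 200;
$w = (int) (ceil($w / $step) * $step);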

In short, the two proposals are to create caches of images (1) at upload time or (2) at the first request.

Obviously, the two have different effects on performance and storage requirements. Creating images on demand slows the first access but saves disk.

answered 12.03.2014 / 20:28