Subdomain for files, Site Optimization

5

Well, I've heard that to get more parallel downloads the recommendation is to serve files from a subdomain, and also that the fact that cookies are not sent with those requests would make it faster.

So, truth or myth: how much can this practice actually help? Does it also apply to JS and CSS files?

    
asked by anonymous 09.06.2015 / 02:34

4 answers

6

It depends on several factors:

1 - If the subdomain stays on the same server as the website, I don't think you will get any optimization.

2 - If you use a service to protect your website, such as Cloudflare, it can either slow your downloads down or speed them up; it all depends on where the user is located and their Internet speed.

3 - I do not think cookies delay downloads; I see no reason why they would.

Basically, if you want a site that serves files to users quickly, you can create a subdomain that points to another server of yours, dedicated only to downloading and uploading files. That way you can have a superb website where downloads are very fast. For example, downloads1.seuwebsite.com can point to one of your servers and downloads2.seuwebsite.com can point to another. This technique is usually used when the main machine runs out of space, or when you have another server with a very fast connection for larger files.
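A minimal sketch of that idea in TypeScript, just as an illustration: the two hostnames come from the example above, and `pickHost` / `assetUrl` are made-up helper names.

```typescript
// Sketch: spread file URLs across the download subdomains mentioned above.
const downloadHosts = [
  "https://downloads1.seuwebsite.com",
  "https://downloads2.seuwebsite.com",
];

// Simple deterministic hash so the same file always maps to the same host
// (keeps browser caching effective across page views).
function pickHost(path: string): string {
  let hash = 0;
  for (const ch of path) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return downloadHosts[hash % downloadHosts.length];
}

function assetUrl(path: string): string {
  return `${pickHost(path)}/${path.replace(/^\/+/, "")}`;
}

// Example usage:
console.log(assetUrl("videos/intro.mp4")); // -> always the same one of the two hosts
console.log(assetUrl("files/manual.pdf"));
```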

I hope I have helped.

    
22.07.2015 / 11:39
5

If the subdomain is on the same server, it will not help at all, as it will continue to download from the same location!

A better option for you would be to hire a Content Delivery Network (CDN), which detects the visitor's location and serves your files from the nearest server, without any manual work!

The CDN can be CloudFlare, MaxCDN, or one of several other companies, such as Amazon, that offer CDN products or services with very similar functions!

Minifying your JS and CSS can also help with speed!
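As one possible illustration (not part of the original answer), here is a small Node/TypeScript sketch that minifies a script with the terser package; the file names are invented:

```typescript
// Sketch: minifying a JS bundle with the terser package (npm install terser).
// "bundle.js" / "bundle.min.js" are made-up file names for the example.
import { readFile, writeFile } from "node:fs/promises";
import { minify } from "terser";

async function main(): Promise<void> {
  const source = await readFile("bundle.js", "utf8");
  const result = await minify(source, { compress: true, mangle: true });
  await writeFile("bundle.min.js", result.code ?? source);
  console.log(`minified: ${source.length} -> ${(result.code ?? source).length} bytes`);
}

main().catch(console.error);
```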

This practice can help Google index you in a higher position, attract more people to your site because it is fast, and make it quicker when you roll out changes!

    
22.07.2015 / 23:13
1

Through the YSlow plugin for Firefox (it also depends on Firebug) I came to this page: link . It is from 2007, so things may have changed quite a bit since publication, but it presents a study of exactly what was asked.

In summary: it starts from the principle that increasing the number of domains should improve the time to load all the requested elements, because the HTTP/1.1 specification suggests that browsers download two components in parallel per hostname (pointed out by Henry above). This limit can be changed via configuration in both IE and Firefox (there is no information about other browsers), but whoever manages the server cannot configure the clients' browsers, and the configuration is not very user-friendly.

But his test results were not exactly what was expected. There are two tests with 20 images each, one with small images and one with medium-sized ones. Both improved when going from 1 to 2 hosts, but the small-image test saw no significant impact from adding more hosts, and the medium-image test was actually hurt: with 3 hosts it was already worse than with 1. At the end he concludes that the best choice is 2 to 4 hosts.

The CDN that Vinicius mentioned also helps a lot, since it tries to shorten the distance between the client's browser and the source of the requested content.

Another reason to use more hosts would be to have one or more hostnames for dynamic content and others for static content, sending a content-expiration header so the static content gets cached; but this can also be done on a single hostname through server configuration, with the Apache module mod_expires, for example.
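The paragraph above mentions Apache's mod_expires; as a rough sketch of the same expiration-header idea in a different stack (Node/TypeScript rather than Apache), assuming an arbitrary /static/ path prefix and a one-month lifetime:

```typescript
// Sketch: serving static responses with a far-future cache header,
// the same idea mod_expires implements for Apache. Values are arbitrary.
import http from "node:http";

const ONE_MONTH_SECONDS = 30 * 24 * 60 * 60;

const server = http.createServer((req, res) => {
  if (req.url?.startsWith("/static/")) {
    // Static content: let browsers and proxies cache it for a month.
    res.setHeader("Cache-Control", `public, max-age=${ONE_MONTH_SECONDS}`);
    res.setHeader("Expires", new Date(Date.now() + ONE_MONTH_SECONDS * 1000).toUTCString());
    res.end("/* static asset body would go here */");
  } else {
    // Dynamic content: do not cache.
    res.setHeader("Cache-Control", "no-store");
    res.end("dynamic page");
  }
});

server.listen(8080);
```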

If the intention is to find out what to improve on a page to make it faster or lighter, Firefox with Firebug and YSlow can help a lot. I don't know a similar tool for other browsers, but there should be one.

If you have it available for testing, Safari (at least on OS X) has a very good feature that times how long a page takes to load, breaking it down by each request. I'm not sure, but I think I've seen this in Chrome on Windows too.

    
29.07.2015 / 03:27
0

This is a problem of HTTP 1.1, the protocol we currently use, which is sequential. This means that once we open a connection we can make only one request at a time: the request goes out, the response (hopefully) arrives, and only then can we send another request.

To reduce the negative impact of this behavior, browsers speaking HTTP 1.1 open more than one connection at a time. Nowadays this is usually 4 to 8 simultaneous connections per hostname.
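To make that per-hostname limit concrete, here is a small sketch using Node's http.Agent, whose maxSockets option mimics the browser cap; the host example.com and the limit of 6 are just examples:

```typescript
// Sketch: limiting simultaneous connections per host, the way HTTP/1.1
// browsers do (typically 4-8). The host and the limit of 6 are examples.
import http from "node:http";

const agent = new http.Agent({ keepAlive: true, maxSockets: 6 });

// Fire 20 requests; the agent queues them so at most 6 run at once.
for (let i = 0; i < 20; i++) {
  http
    .get({ host: "example.com", path: `/image-${i}.png`, agent }, (res) => {
      res.resume(); // discard the body; we only care about the connection behavior
      console.log(`request ${i}: status ${res.statusCode}`);
    })
    .on("error", console.error);
}
```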

HTTP 2.0 will have multiplexing, which avoids these subdomain workarounds; but you can already use SPDY, a protocol created by Google that already offers this feature, although to enable it you must have HTTPS set up on your server.
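A minimal sketch of an HTTP/2 server using Node's built-in http2 module; as noted above, TLS is required in practice, so the server.key / server.crt file names below are assumptions:

```typescript
// Sketch: an HTTP/2 server with Node's http2 module. Multiplexing means
// many requests share one connection, so domain sharding is unnecessary.
// "server.key" / "server.crt" are assumed to exist (browsers require TLS for HTTP/2).
import { createSecureServer } from "node:http2";
import { readFileSync } from "node:fs";

const server = createSecureServer({
  key: readFileSync("server.key"),
  cert: readFileSync("server.crt"),
});

server.on("stream", (stream, headers) => {
  // Each request arrives as a stream multiplexed over the same connection.
  stream.respond({ ":status": 200, "content-type": "text/plain" });
  stream.end(`you asked for ${headers[":path"]}`);
});

server.listen(8443);
```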

And putting your CSS and JS into a single file each also helps, since the browser will then make only one request per file.
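For example, a tiny sketch that concatenates several scripts into one bundle so the page needs a single request for all of them; the file names are made up:

```typescript
// Sketch: concatenating several JS files into one so the page needs a
// single request for all of them. The file names are made up.
import { readFileSync, writeFileSync } from "node:fs";

const parts = ["menu.js", "carousel.js", "forms.js"];

const bundle = parts
  .map((file) => `/* ${file} */\n${readFileSync(file, "utf8")}`)
  .join("\n;\n"); // the stray semicolon guards against files that omit one

writeFileSync("site.bundle.js", bundle);
console.log(`wrote site.bundle.js from ${parts.length} files`);
```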

    
28.07.2015 / 13:51