This happens simply because you are downloading external content. Every time the code runs, it connects to http://blog.exal.com.br/feed: it resolves the address to an IP, opens the connection, waits for the website's response, downloads the data, and only then runs the rest of the code based on what was obtained; as simple as that.
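You can see this blocking cost directly from PHP as well; here is a minimal sketch (the microtime() timing wrapper is my addition for illustration, not part of the original code):
<?php
// Time how long the blocking external fetch takes from PHP's point of view.
$start = microtime(true);
$feed = file_get_contents('http://blog.exal.com.br/feed');
$elapsed = microtime(true) - $start;
echo 'Fetch took ' . round($elapsed, 3) . " seconds\n";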
If you run:
curl "http://blog.exal.com.br/feed/" -o /dev/null -w "Time to connect: %{time_connect}\nTime to start transfer: %{time_starttransfer}\nTotal time: %{time_total}"
On Windows, use -o NUL to avoid the "Failed writing body" error, since /dev/null does not exist on Windows. :S
This will tell you how long it took your server (or any other machine) to connect to the other website and retrieve the data.
In my tests:
Time to connect: 0.154
Time to start transfer: 1.501
Total time: 1.878
Time to connect: 0.031
Time to start transfer: 1.406
Total time: 1.406
Time to connect: 0.137
Time to start transfer: 1.184
Total time: 1.557
...
- Time to connect (time_connect): the time curl takes to create the TCP connection and connect successfully to the server (or proxy).
- Time to start transfer (time_starttransfer): the time until curl receives the first byte of the response, i.e., how long it took curl to start receiving any data.
- Total time (time_total): the total time it took to get the complete result.
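If you want a finer breakdown of where the time goes, curl exposes more timing variables in the same -w format; for example, %{time_namelookup} shows the DNS resolution time (this variant is just my extension of the command above):
curl "http://blog.exal.com.br/feed/" -o /dev/null -w "DNS lookup: %{time_namelookup}\nConnect: %{time_connect}\nFirst byte: %{time_starttransfer}\nTotal: %{time_total}\n"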
Where is the problem?
The main problem is the website that generates the feed (http://blog.exal.com.br/feed): it takes about 1350 milliseconds on average just to start returning the information, so your website has to wait that long before it can display it.
How to solve?
It depends; there is not enough information in the question. The actual question was just "I wonder if this is due to my code snippet or the RSS feed itself.", and that is answered above.
However, there are some solutions you can apply ON YOUR SIDE.
Use a cache:
Why do you need to connect to the other site on every request? How often does it actually update?
One option is to create a cronjob that saves the feed to a file such as /cache/feed and loads that file every time.
Create an atualizaFeed.php script:
<?php
// Download the external feed and save a local copy of it.
$getFeed = file_get_contents('http://blog.exal.com.br/feed');
file_put_contents('alguma/pasta/feed.xml', $getFeed);
Then create a cronjob, for example: * * * * * php atualizaFeed.php (you can use crontab -e to edit it via vi).
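If the external site is occasionally slow or down, a slightly more defensive version of the script can help; this is just a sketch (the five-second timeout and the success check are my assumptions, not part of the original answer):
<?php
// Give up after 5 seconds so a cron run never hangs on a slow remote server.
$context = stream_context_create(['http' => ['timeout' => 5]]);
$getFeed = file_get_contents('http://blog.exal.com.br/feed', false, $context);

// Only overwrite the cached copy if the download actually succeeded.
if ($getFeed !== false) {
    file_put_contents('alguma/pasta/feed.xml', $getFeed);
}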
You can also skip PHP entirely and use curl directly: * * * * * curl "http://blog.exal.com.br/feed/" -o /alguma/pasta/feed.xml ;)
Then just load the information from that file, which is updated every minute:
// Read the cached copy from the local disk instead of the remote site.
$feed = file_get_contents('alguma/pasta/feed.xml');
$rss = new SimpleXMLElement($feed);
//...
The alguma/pasta/feed.xml file will always be updated every minute, so when a user accesses the page, the file already on your server is read. This is infinitely faster than waiting for the external server to respond on every connection.
You can also use a database if you want, updating it whenever there are new publications; in short, there are several options...
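For example, if your host does not allow cronjobs, you could refresh the cache on demand instead; a sketch, where the 60-second freshness window and the filemtime() check are my assumptions:
<?php
$cacheFile = 'alguma/pasta/feed.xml';

// Refresh the cache at most once per minute, on demand, without cron.
if (!file_exists($cacheFile) || time() - filemtime($cacheFile) > 60) {
    $getFeed = file_get_contents('http://blog.exal.com.br/feed');
    if ($getFeed !== false) {
        file_put_contents($cacheFile, $getFeed);
    }
}

$rss = new SimpleXMLElement(file_get_contents($cacheFile));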