Get content from a list on a site and save it to a MySQL database via PHP


I want to visit the site below, save all of its HTML in a variable, clean up the HTML, and save the content I want into a MySQL database via PHP 7.


The site is: link

At first, I saved the page's HTML into a variable with the code below:

$mundo = 'ferobra';
$url = 'http://guildstats.eu/bosses?monsterName=&world=' . $mundo . '&rook=0';

function curl_get_contents($url)
{
  $ch = curl_init();
  curl_setopt($ch, CURLOPT_URL, $url);
  // Return the response as a string instead of printing it
  curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
  // Follow any HTTP redirects
  curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
  $data = curl_exec($ch);
  curl_close($ch);
  return $data;
}

$pagina = curl_get_contents($url);

My problem now is to clean the HTML, store the pieces I want in arrays, and then populate the database.

Can anyone help me?

asked by anonymous 02.03.2018 / 13:47

2 answers


From what I understand, you want to save only part of the HTML to the database, right? You can try using explode(), passing the beginning of the HTML you want as the delimiter, and then take the resulting index and store it in the database:

$html = explode("começo_do_html_a_salvar", $pagina);

Then explode again on the end of the HTML, so you keep only the fragment you want (note that the content after the start delimiter is at index 1):

$htmlToSave = explode("final_do_html_A_salvar", $html[1]);

Then you just save $htmlToSave[0] to the database.
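A runnable sketch of the steps above. The marker strings and the `bosses` table with its `conteudo` column are illustrative assumptions, and SQLite in memory stands in for MySQL so the example runs anywhere; swap the DSN for your MySQL credentials.

```php
<?php
// Sample page; the markers delimit the fragment we want to keep.
$pagina = '<header>menu</header>INICIO_MARCADOR<p>conteudo desejado</p>FIM_MARCADOR<footer>fim</footer>';

// Everything after the start marker ends up at index 1.
$html = explode('INICIO_MARCADOR', $pagina);

// Everything before the end marker ends up at index 0.
$htmlToSave = explode('FIM_MARCADOR', $html[1]);

// Save the fragment via a prepared statement (avoids SQL injection
// when storing scraped strings). SQLite in memory for portability.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE bosses (id INTEGER PRIMARY KEY, conteudo TEXT)');

$stmt = $pdo->prepare('INSERT INTO bosses (conteudo) VALUES (:conteudo)');
$stmt->execute([':conteudo' => $htmlToSave[0]]);

echo $htmlToSave[0]; // <p>conteudo desejado</p>
```

Keep in mind that explode() is fragile: if the site changes its markup, the markers stop matching and $html[1] will not exist.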

02.03.2018 / 13:56

There is a technique for exactly what you want to do, and it is an interesting subject: it is called web scraping. Searching for that name will turn up many explanations of how to do it. I would particularly recommend using

preg_match_all();

It works together with regular expressions (regex).

02.03.2018 / 14:07