If everything is on the same site, or MySQL is accessible between the sites
You can simply create a function like this:
global.php:
<?php
function exibirDados($query)
{
    // Avoid connecting multiple times
    static $conexao;
    // Response cache
    static $cache;
    if (isset($cache[$query])) {
        echo $cache[$query];
        return;
    }
    if (!$conexao) {
        $conexao = mysqli_connect("127.0.0.1", "my_user", "my_password", "my_db");
    }
    if (!$conexao) {
        echo "Error connecting to MySQL:" . PHP_EOL;
        echo "number: " . mysqli_connect_errno() . PHP_EOL;
        echo "error: " . mysqli_connect_error() . PHP_EOL;
        exit;
    }
    $sql = mysqli_query($conexao, $query);
    $resultado = '';
    while ($linha = mysqli_fetch_assoc($sql)) {
        $resultado .= '..... format here';
    }
    $cache[$query] = $resultado;
    echo $resultado;
}
And then in your file you can do something like this:
<?php
require_once 'global.php';
?>
bla bla bla bla <?php exibirDados('SELECT * ....'); ?>
bla bla bla bla <?php exibirDados('SELECT * ....'); ?>
bla bla bla bla <?php exibirDados('SELECT * ....'); ?>
bla bla bla bla <?php exibirDados('SELECT * ....'); ?>
If they are different websites
Note that this approach requires extra HTTP downloads, which can slow the page down; the problem is not the code itself but the extra round trips.
If you prefer, you can use preg_replace_callback
with cURL,
and on the domain that has the data, create a page like:
resultado.php
<?php
if (empty($_GET['pagina'])) {
    exit('[pagina not set]');
}
$pagina = intval($_GET['pagina']);
if ($pagina < 1) {
    exit('[invalid pagina]');
}
$conexao = mysqli_connect("localhost", "root", "123456", "bancoteste");
$result = mysqli_query($conexao, "select id, coluna1, coluna2 from tabela LIMIT ..., ...");
while ($row = mysqli_fetch_assoc($result)) {
    echo ...;
}
mysqli_free_result($result);
Then in the domain that will receive the data you can create something like:
global.php:
<?php
function downloadData($url)
{
    // Data cache, to avoid downloading the same URL repeatedly
    static $dados;
    if (isset($dados[$url])) {
        return $dados[$url];
    }
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    // Return the response as a string instead of printing it directly
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $resposta = curl_exec($ch);
    if (!$resposta) {
        $dados[$url] = '[data unavailable]';
    } else {
        $dados[$url] = $resposta;
    }
    curl_close($ch);
    return $dados[$url];
}

function buscaDados($entrada)
{
    $replace = function ($match) {
        return downloadData($match[1]);
    };
    return preg_replace_callback('#\[(http:\/\/[a-zA-Z0-9\.\/%\-_]+?)\]#', $replace, $entrada);
}
Then include global.php in the pages that will use it; it should look something like this:
<?php
require_once 'global.php';
echo buscaDados('bla bla bla [http://site1/15/resultado.php] bla bla bla [http://site2/14/busca.php]');
Of course, it would be preferable to use a format like JSON to handle the data properly, and perhaps add an authentication method, but that is another story.
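A minimal sketch of that JSON idea: resultado.php would emit the rows with json_encode, and the consuming site would decode the downloaded string before formatting it. The formatarJson helper below is hypothetical, and the formatting line is just a placeholder; the column names follow the earlier example.

```php
<?php
// On the consuming site: decode a JSON response (e.g. as fetched by
// downloadData()) and format the rows. The formatting is a placeholder.
function formatarJson($json)
{
    $linhas = json_decode($json, true);
    if (!is_array($linhas)) {
        return '[data unavailable]';
    }
    $saida = '';
    foreach ($linhas as $linha) {
        $saida .= $linha['coluna1'] . ' - ' . $linha['coluna2'] . "\n";
    }
    return $saida;
}

// resultado.php, in turn, would emit the rows with something like:
// echo json_encode(mysqli_fetch_all($result, MYSQLI_ASSOC));
```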
Important: the pages you include should return only the relevant parts, and preferably you should control every included website; if you have to link to a site you do not control, you will need to extract only the required part from the response.
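For that last case, one simple way to keep only the required part is to cut the response between two marker strings. This is just a sketch; the extrairParte function name and the marker values are placeholders, not part of the original code.

```php
<?php
// Sketch: keep only the fragment between two marker strings in a page
// you do not control. The marker values are placeholders.
function extrairParte($html, $inicio, $fim)
{
    $a = strpos($html, $inicio);
    if ($a === false) {
        return '';
    }
    $a += strlen($inicio);
    $b = strpos($html, $fim, $a);
    if ($b === false) {
        return '';
    }
    return substr($html, $a, $b - $a);
}
```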
Note: these are perhaps not the best approaches. I recommend considering an API, or an existing project organization structure such as MVC; if you know how to use it, use it, but adopting it without understanding that its purpose is organization will also cause problems.
Sometimes reinventing the wheel is fine, but only if the result is well designed and given enough testing time to make sure it is fit for production.