Delete repeated records in PHP

1

Good morning, I have a pretty big question. I've been at this for a whole day and still haven't been able to resolve the situation. I have a SQL Server database and I need to display a button for each record's description from a table, but some records in this table are repeated: in my case "restaurant" appears 4 times because the restaurant has more than one table. I'll show the printout of the result and leave the code here in case you can help me.

<head>
  <link rel="stylesheet" type="text/css" href="css/style.css">
</head>

<?php

	 $sql="SELECT P.id as Id, 
		   P.pointOfSale as [Description], 
		   P.printerExtrat1 as PrinterExtrat1, 
		   P.printerExtrat1FonteTicket as PrinterExtrat1Font, 
		   P.printerExtrat2 as PrinterExtrat2, 
		   P.printerExtrat2FonteTicket as PrinterExtrat2Font, 
		   P.printerExtrat3 as PrinterExtrat3, 
		   P.printerExtrat3FonteTicket as PrinterExtrat3Font, 
		   T.id as IdTable, 
		   T.tableDescription as TableDescription 
		   FROM PointOfSale P 
		   LEFT JOIN PointOfSalesTables T ON P.id=T.idPointOfSale AND T.isActive=1 
		   WHERE P.isActive=1 
		   ORDER BY P.pointOfSale";
		$result = sqlsrv_query($conn, $sql);
		
	while ($row1 = sqlsrv_fetch_array( $result, SQLSRV_FETCH_NUMERIC)) {
		$resultado1 = $row1[1];
		?>
		
		<body>
			<form method="post">
				<a class="banquetes"><?php echo $resultado1; ?></a>
			</form>
		</body>
		<?php
		}
?>

I cannot show more code because the database connection details are confidential.

    
asked by anonymous 22.02.2018 / 12:15

3 answers

5

One possibility is to add this line to your query:

GROUP BY P.id

Pay attention to the position: in SQL, the clauses have a specific order:

...
WHERE P.isActive=1 
GROUP BY P.id               -- or T.idPointOfSale, of course
ORDER BY P.pointOfSale";

The GROUP BY clause causes the results to be (as the name says) grouped into a single row for each value of the selected field (in this case, the unique ID of each point of sale).

Adjust the field above if necessary. I did not suggest grouping by the name because you could have two "restaurant" entries that refer to different things, so I preferred the ID. It may be that in your case the name is more appropriate; the important thing is to understand how it works.
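As a sketch of how this could look applied to the query from the question (note that in SQL Server every selected column must either appear in the GROUP BY or be wrapped in an aggregate, so only the columns actually needed are kept here; table and column names are taken from the question):

```
SELECT P.id          AS Id,
       P.pointOfSale AS [Description],
       MIN(T.id)     AS IdTable   -- one arbitrary table per point of sale
FROM PointOfSale P
LEFT JOIN PointOfSalesTables T
       ON P.id = T.idPointOfSale AND T.isActive = 1
WHERE P.isActive = 1
GROUP BY P.id, P.pointOfSale
ORDER BY P.pointOfSale;
```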

Note that this does not apply to your case, but GROUP BY does more than limit the output: it is essential when you need to count, sum, or find the largest or smallest record in your query.

Example:

SELECT tipo, COUNT(*), SUM( valor ) FROM tabela GROUP BY tipo

This shows only one row per tipo, along with the count of how many rows have that tipo and the sum of valor for that tipo only.

Alternatively, you could study the use of DISTINCT, but only after learning more about its implementation and purpose (although, internally, many databases implement DISTINCT as a special case of GROUP BY, often producing the same execution plan):

 SELECT  DISTINCT P.id     AS Id, 
         P.pointOfSale     AS [Description], 
         P.printerExtrat1  AS PrinterExtrat1, 
    
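Note that DISTINCT deduplicates entire rows, so it only helps if the per-table columns (such as T.id) are dropped from the SELECT. A minimal sketch for just the button labels, using the names from the question:

```
SELECT DISTINCT P.pointOfSale AS [Description]
FROM PointOfSale P
WHERE P.isActive = 1
ORDER BY P.pointOfSale;
```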
22.02.2018 / 12:24
0

It is possible to solve this in two ways: via the query or via PHP.

With the query, you can GROUP BY that column, so it will not return repeated data.

In PHP you can do something like this (I do not advise it; it only works because the query orders by pointOfSale, so repeated descriptions arrive in adjacent rows):

 $resultado1 = $row1[1];
 if ($lastWord == $resultado1) {
     // do NOT output the HTML (same description as the previous row)
 } else {
     // output the HTML
 }
 $lastWord = $resultado1;
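A more robust variation of the same idea is to remember every description already printed, so duplicates are skipped even if the rows were not sorted. This is a sketch: `$rows` stands in for the fetched result set (illustrative data, not the real query output), and the field index 1 matches `$row1[1]` from the question.

```php
<?php
// Skip duplicates regardless of row order by remembering
// the descriptions that were already printed.
$rows = [
    [1, 'Restaurante'],
    [2, 'Restaurante'],
    [3, 'Bar'],
    [4, 'Restaurante'],
];

$seen = [];
foreach ($rows as $row) {
    $descricao = $row[1];
    if (isset($seen[$descricao])) {
        continue; // already printed this description
    }
    $seen[$descricao] = true;
    echo '<a class="banquetes">' . htmlspecialchars($descricao) . "</a>\n";
}
```

With the sample data above, only one link is printed per distinct description.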
    
22.02.2018 / 12:24
-1

Melissa, from what I understood of the script (I could not open the image due to firewall restrictions), the attempts with GROUP BY and/or DISTINCT will not work as-is, because the result will be something like:

Id - Description - .... - IdTable ...
1  - Restaurante1 - .... - 1
2  - Restaurante1 - .... - 3

I believe the columns Id, IdTable, TableDescription, etc. will differ between rows. I suggest removing from the query the columns you will not use, since with $row1[1] you only display the [Description], and then adding DISTINCT as already mentioned.
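A sketch of the slimmed-down query this answer suggests, keeping only the column that is actually displayed so DISTINCT can collapse the duplicates (names taken from the question):

```
SELECT DISTINCT P.pointOfSale AS [Description]
FROM PointOfSale P
LEFT JOIN PointOfSalesTables T
       ON P.id = T.idPointOfSale AND T.isActive = 1
WHERE P.isActive = 1
ORDER BY P.pointOfSale;
```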

    
22.02.2018 / 13:33