What is the best practice for making a large table query?


I have a system composed of several tables. One of them, called Publications, has 15 assorted fields; the three below are the ones my question is about:

 1. titulo - varchar(100)
 2. subtitulo - varchar(200)
 3. texto - text

Within a short time (roughly 18 months) this table will hold more than 1 million records, and we are concerned about the performance of record searches.

What is the best way to run keyword searches entered by users in a form: LIKE, FULLTEXT, or something else?

We are concerned about searches for compound terms such as "house music", and above all about speed and performance.

Since I'm just starting out, I'm used to simpler, less demanding queries, and I'm not sure how to handle all the variables involved.

asked by anonymous 29.08.2017 / 19:51

1 answer


"Best practice" is to create a development environment that simulates the situation you will encounter and create solutions to see what is best suited.

The simplest solution is LIKE; try it and see whether the results are satisfactory. If they are not, move on to a dedicated, properly configured text search setup.
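A hedged sketch of how the two options might look on the Publications table from the question (assuming MySQL; 'house' stands in for the user's keyword and would be a bound parameter in real code; the `id` column is assumed):

    -- Option 1: LIKE. Simple, but the leading % wildcard defeats ordinary
    -- indexes, so MySQL scans every row; cost grows with table size.
    SELECT id, titulo
    FROM Publications
    WHERE titulo    LIKE '%house%'
       OR subtitulo LIKE '%house%'
       OR texto     LIKE '%house%';

    -- Option 2: FULLTEXT. Create the index once (InnoDB needs MySQL 5.6+)...
    ALTER TABLE Publications
        ADD FULLTEXT INDEX ft_publications (titulo, subtitulo, texto);

    -- ...then search with MATCH/AGAINST, which walks the inverted index
    -- and ranks rows by relevance instead of scanning the whole table.
    SELECT id, titulo,
           MATCH(titulo, subtitulo, texto) AGAINST ('house') AS score
    FROM Publications
    WHERE MATCH(titulo, subtitulo, texto) AGAINST ('house')
    ORDER BY score DESC;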

It seems to me that performance should not be a problem on reasonable hardware. You would have to check whether you have heavy concurrent access or only light, local use. Developing the application correctly also counts: the right architecture may matter more than choosing this or that database feature.

Depending on the kind of results you want, you would need a specialized text search engine.
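Before reaching for an external engine, note (my addition, assuming MySQL FULLTEXT as above) that boolean mode already covers the compound-term case from the question: double quotes match an exact phrase.

    -- Phrase search in boolean mode: matches rows containing the exact
    -- phrase "house music", not just the two words separately.
    SELECT id, titulo
    FROM Publications
    WHERE MATCH(titulo, subtitulo, texto)
          AGAINST ('"house music"' IN BOOLEAN MODE);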

29.08.2017 / 19:57