Version control of a MySQL database in a production environment


For a good part of my project I worked with a Git repository on a local server (Debian 7). This week I migrated to GitLab, with no complications in the change; I only switched for portability, since I am always working either on the local network or outside it.

During this migration, I ran into a problem with version control of my database, which is already in production.

The problem is this: previously, since the repository was on the same server as the database, I had a script that read the database every 24 hours and dumped the tables that had changed into a folder structure sorted by date (done in PHP, a very simple thing).

Something like this:

bd_bckps/
        /01-01-2017/..
                   /customers.sql
                   /logs.sql
        /02-01-2017/..
                   /logs.sql
                   /internal_users.sql
        /[continues for each day]
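Roughly, the dump routine does the equivalent of this shell sketch (the database name, credentials, and paths are placeholders, and the detection of which tables actually changed is omitted; my original script is in PHP):

#!/bin/sh
# Rough shell equivalent of the daily per-table dump (placeholders throughout).
DB_NAME="mydb"
DB_USER="backup_user"
DB_PASS="secret"
DAY_DIR="bd_bckps/$(date +%d-%m-%Y)"

mkdir -p "$DAY_DIR"

# One .sql file per table; filtering for tables with changes
# (e.g. via information_schema) is left out of this sketch.
for TABLE in $(mysql -u "$DB_USER" -p"$DB_PASS" -N -e "SHOW TABLES" "$DB_NAME"); do
    mysqldump -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" "$TABLE" > "$DAY_DIR/$TABLE.sql"
done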

After moving to GitLab, the database backup continues to work, but it is now outside the versioning structure and outside the general backup, and future changes to the development database that need to go upstream to production will be out of control.

Well, I believe this form of database control is not very safe, and it is horrible when I need to restore something, since I have to figure out almost manually what happened.

The question is:

Is there any way to automate the git push at a set time? (A shell script?)

Or:

What database version control method would you recommend for a project with a massive amount of data?

PS: I had never worked with a database that was going to get this big, and I do not have much experience with MySQL.

asked by anonymous 07.08.2017 / 20:57

1 answer


You could use crontab: for example, every day at midnight, change into the backup folder/directory, commit with the current date, and push:

0 0 * * * cd /path/to/bd_bckps && git add . && git commit -m "backup $(date +\%d-\%m-\%Y)" && git push origin master >/dev/null 2>&1
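Note that `%` is special in crontab and must be escaped as `\%`. A slightly more robust variant is to put the commands in a small script and call that from cron, which sidesteps the escaping and skips days with no changes (a sketch; the repository path and script name are assumptions):

#!/bin/sh
# db_push.sh -- hypothetical wrapper called by cron once a day.
cd /srv/bd_bckps || exit 1
git add .
# Commit only if something was actually staged, so the history stays clean.
git diff --cached --quiet || git commit -m "backup $(date +%d-%m-%Y)"
git push origin master

And the crontab entry becomes:

0 0 * * * /usr/local/bin/db_push.sh >/dev/null 2>&1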
08.08.2017 / 07:45