Here is the situation: I have some concerns about the database being used in a project. The project consists of developing software that joins the respective data from two solar spectrum sensors and stores it in a database, along with other information held in a "master" table. The idea is that for every row in the master table there will be thousands of rows in the spectrum table.
SUMMARY
We have the following diagram:
The problem I'm facing is that rows will be added to the master table one per minute, 24/7, while rows will be inserted into the esp table in batches of more than a thousand at a time, but only 5 times a day, over a range of about one hour. A rough row-count sketch is shown below.
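For reference, this is roughly how I have been estimating the number of rows per week / month / year, using Python just as a calculator. The batch size of 1,000 rows for the esp table is an assumption on my part ("more than a thousand at a time"):

```python
# Rough row-count estimate; the batch size (1,000 rows) and batch count
# (5 per day) are assumptions based on the description above.
MASTER_ROWS_PER_DAY = 60 * 24                 # one row per minute, 24/7 -> 1,440/day
ESP_BATCH_SIZE = 1_000                        # assumed ~1,000 rows per batch
ESP_BATCHES_PER_DAY = 5
ESP_ROWS_PER_DAY = ESP_BATCH_SIZE * ESP_BATCHES_PER_DAY   # 5,000/day

for label, days in [("week", 7), ("month", 30), ("year", 365)]:
    print(f"per {label}: master ~ {MASTER_ROWS_PER_DAY * days:,} rows, "
          f"esp ~ {ESP_ROWS_PER_DAY * days:,} rows")
```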
PROBLEM
This may be a bit confusing, but my biggest concern is not the diagram itself.
What I want to do is calculate the average volume of data that will go into the database per year / month / week. Is there a formula or a method to estimate how much space I will need to store this data?
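My rough idea so far (and I am not sure it is correct) is something like rows per period × average row size in bytes, plus some factor for indexes and other overhead. A minimal sketch of that idea is below; the average row sizes (200 bytes for master, 100 bytes for esp) and the 30% overhead factor are pure guesses on my part, not measured values:

```python
# Very rough storage estimate: rows x average row size x overhead factor.
# Average row sizes and the 1.3 overhead factor (indexes, padding, metadata)
# are guesses; measuring a real sample of data would give better numbers.
MASTER_ROWS_PER_YEAR = 60 * 24 * 365          # one row per minute, 24/7
ESP_ROWS_PER_YEAR = 1_000 * 5 * 365           # ~1,000 rows per batch, 5 batches/day

AVG_MASTER_ROW_BYTES = 200                    # assumed
AVG_ESP_ROW_BYTES = 100                       # assumed
OVERHEAD = 1.3                                # assumed index/metadata overhead

total_bytes = (MASTER_ROWS_PER_YEAR * AVG_MASTER_ROW_BYTES
               + ESP_ROWS_PER_YEAR * AVG_ESP_ROW_BYTES) * OVERHEAD
print(f"~{total_bytes / 1024**2:.1f} MiB per year")
```

Is this the right way to think about it, or is there a more reliable way to estimate the space the database will actually use?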
Thank you!