If your database is PostgreSQL
You can put a PL/Python function right inside the database itself - the documentation is here:
link
I don't know whether Python inside PostgreSQL has restrictions (such as accessing the filesystem, or importing third-party modules unrelated to Postgres), but it probably has none: it is very hard to create effective restrictions of that kind in Python. Rather than having something that merely "looks" secure, it is better to document that these Python functions can do everything, which also widens your range of options.
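As a sketch of what that looks like, here is a hypothetical PL/Python function (the table and file names are made up; `plpython3u` is the "untrusted" Python 3 language that ships with PostgreSQL, and the "u" is exactly why there is no sandbox - ordinary imports and filesystem access work):

```sql
CREATE EXTENSION IF NOT EXISTS plpython3u;

-- hypothetical function: the body is plain Python running on the server
CREATE FUNCTION log_event(msg text) RETURNS void AS $$
    import datetime  -- any module on the server's Python path can be imported
    with open("/tmp/db_events.log", "a") as f:
        f.write("%s %s\n" % (datetime.datetime.now().isoformat(), msg))
$$ LANGUAGE plpython3u;
```

Because the language is untrusted, only superusers can create such functions - which is the "document that it can do everything" trade-off in practice.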
If the system that inserts into the database is in Python
This brings us to something else: if the system that inserts into the database is written in Python, then the best Python ORMs (like SQLAlchemy) have an event system - you can register your own Python code as a handler for an SQLAlchemy event.
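A minimal sketch of that event system, using an in-memory SQLite database for demonstration (the `Payment` model and `payee` column are hypothetical; the relevant part is `@event.listens_for(..., "after_insert")`, which SQLAlchemy calls for every new row of that model):

```python
from sqlalchemy import create_engine, Column, Integer, String, event
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

class Payment(Base):
    __tablename__ = "payments"
    id = Column(Integer, primary_key=True)
    payee = Column(String)

seen = []

@event.listens_for(Payment, "after_insert")
def on_payment_insert(mapper, connection, target):
    # runs inside the flush, once per newly inserted Payment row
    seen.append(target.payee)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Payment(payee="alice"))
    session.commit()

print(seen)  # -> ['alice']
```

The handler runs in the same process that does the insert, so there is no networking or trigger configuration at all - which is why this is the easiest of the three scenarios.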
If it is neither PostgreSQL nor a Python insert:
You will need to send a signal out of the database somehow, from scripts used as triggers inside the database in other languages. This would require a lot of specific configuration, and could be problematic when the database is on a machine separate from your Python script (suppose you set up a trigger that makes a socket call to activate your Python script: in production, with the network between the database and your application locked down, getting that call from the database back to the server running the Python script would be a headache).
So the idea of having a Python script that actively polls the database at short intervals is not bad at all. What you can do, instead of running a long query that scans the entire table, is:
- create a simple table just to record whether there was a change or not - it could be a table with a single row and a single boolean column, "changed"
- put a trigger on the database that sets "changed = true" in this table each time one of the changes you are interested in happens
- have your script look at the database every 1 or 2 seconds, checking only this simple table - and run the longer queries only when there was a change
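The three steps above can be sketched end to end. This is a self-contained version using SQLite from the standard library (the `events` and `change_flag` table names are hypothetical; on another database the trigger would be written in that database's trigger language, but the Python polling side is the same):

```python
import sqlite3

# Hypothetical schema: "events" is the real table we care about,
# "change_flag" is the one-row boolean table described above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT);
    CREATE TABLE change_flag (changed INTEGER NOT NULL);
    INSERT INTO change_flag VALUES (0);

    -- step 2: the trigger flips the flag on every insert we care about
    CREATE TRIGGER mark_changed AFTER INSERT ON events
    BEGIN
        UPDATE change_flag SET changed = 1;
    END;
""")

def poll_once(conn):
    """Cheap check of the flag table; run the expensive query only if set."""
    (changed,) = conn.execute("SELECT changed FROM change_flag").fetchone()
    if not changed:
        return None
    rows = conn.execute("SELECT id, payload FROM events").fetchall()
    conn.execute("UPDATE change_flag SET changed = 0")  # reset for next poll
    conn.commit()
    return rows

print(poll_once(conn))  # -> None (nothing changed yet)
conn.execute("INSERT INTO events (payload) VALUES ('hello')")
print(poll_once(conn))  # -> [(1, 'hello')]
print(poll_once(conn))  # -> None (flag was reset)
```

In production, `poll_once` would go inside a `while True:` loop with a `time.sleep(1)` between iterations - the point is that each idle iteration costs only a one-row read, not a scan of the whole table.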