I have a database in Firebird and I'm migrating the data to PostgreSQL, so I wrote an application that reads the Firebird data and inserts it into PostgreSQL.
The database belongs to a supermarket but hasn't been in use for long; the Firebird database is just over 1 GB, so it's not tiny, but it's far from large either.
In the application I mentioned above, I create the database in PostgreSQL and right after that I run the import load. It's basically plain inserts, around 3 million records, nothing absurd.
However, after the import, if I try to delete a product that already has sales, the database accepts it; the FK is simply not checked. But if I drop the FK and recreate it, everything works perfectly, and if I do a backup/restore, everything also goes back to working exactly as it should.
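Just to make the workaround concrete, the drop/recreate looks roughly like this (the table, column and constraint names below are only placeholders, not my real schema, and the query on pg_constraint is just a way to confirm the FK really exists on the table):

```
// Example only: table, column and constraint names are placeholders, not my real schema.
using System;
using Npgsql;

class FkWorkaroundSketch
{
    static void Main()
    {
        var connString = "Host=localhost;Username=postgres;Password=secret;Database=mercado";

        using (var conn = new NpgsqlConnection(connString))
        {
            conn.Open();

            // Confirm the FK really exists on the referencing table.
            using (var cmd = new NpgsqlCommand(
                "SELECT conname, convalidated FROM pg_constraint " +
                "WHERE contype = 'f' AND conrelid = 'venda_itens'::regclass", conn))
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine($"FK: {reader[0]}, validated: {reader[1]}");
            }

            // The workaround: drop the FK and recreate it; after this, deleting a
            // product that already has sales is rejected as expected.
            var recreate =
                "ALTER TABLE venda_itens DROP CONSTRAINT IF EXISTS fk_venda_itens_produto; " +
                "ALTER TABLE venda_itens ADD CONSTRAINT fk_venda_itens_produto " +
                "FOREIGN KEY (produto_id) REFERENCES produtos (id);";

            using (var cmd = new NpgsqlCommand(recreate, conn))
                cmd.ExecuteNonQuery();
        }
    }
}
```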
I suspect the database is getting corrupted, or something along those lines, during the import. The application is written in C# using Npgsql, and the inserts are done through functions that perform the CRUD, roughly as sketched below.
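The insert side of the loader looks more or less like this (connection string, the function name fn_crud_produto and its parameters are only illustrative, not the real names):

```
// Rough sketch of the import inserts with Npgsql.
// The function fn_crud_produto and its parameters are illustrative only.
using Npgsql;

class ImportSketch
{
    static void Main()
    {
        var connString = "Host=localhost;Username=postgres;Password=secret;Database=mercado";

        using (var conn = new NpgsqlConnection(connString))
        {
            conn.Open();

            using (var tx = conn.BeginTransaction())
            {
                // In the real loader this runs inside the loop over the
                // ~3 million rows read from Firebird.
                using (var cmd = new NpgsqlCommand(
                    "SELECT fn_crud_produto(@codigo, @descricao, @preco)", conn, tx))
                {
                    cmd.Parameters.AddWithValue("codigo", 1001);
                    cmd.Parameters.AddWithValue("descricao", "Arroz 5kg");
                    cmd.Parameters.AddWithValue("preco", 19.90m);
                    cmd.ExecuteNonQuery();
                }

                tx.Commit();
            }
        }
    }
}
```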
The question is: is this a PostgreSQL problem? I'm honestly close to giving up on it, despite all the work I've already put into hand-writing the migration of a database with almost 150 tables. I just don't feel safe with it, and mind you, I'm coming from Firebird...