Yes, it is possible, but I wonder whether it is really useful to simply store the values in a text file: in general, retrieving the desired data from the database and processing it directly is better than duplicating the same data in a text file, where you have no indexing, no search facilities, etc.
It puzzles me that you're getting a memory error with records as simple as these: even millions of such records should not exhaust the memory of a system with RAM on the order of gigabytes.
Now, back to what you asked specifically: the Python database connectors all implement, in addition to fetchall, a fetchmany method, which returns only the requested number of records.
So, one way to transfer a query result to a local file (which I don't think is useful, unless that is all you want to do with the data) would be:
import csv

c = conn.cursor()
c.execute("SELECT latitude, longitude, gid FROM pontos")
with open("arquivo_destino.csv", "wt", newline="") as file_:
    writer = csv.writer(file_)
    # header must match the three selected columns
    writer.writerow(("latitude", "longitude", "gid"))
    records = True
    while records:
        # fetch up to 100 rows at a time; an empty list ends the loop
        records = c.fetchmany(100)
        writer.writerows(records)
The connection cursor itself can also be used as an iterator, returning one result row at a time. This style is best if you are going to consume the data as you go, instead of simply writing it to a local file:
c = conn.cursor()
c.execute("SELECT latitude, longitude, gid FROM pontos")
for record in c:
    # do things with each row of the result
    ...
(But I've seen buggy database driver implementations where iterating over the cursor this way was very slow; if that is your case, it's better to combine iteration with fetchmany as well.)
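That combination can be sketched as a small generator that fetches rows in batches with fetchmany but still lets you consume them one at a time. This is a minimal, self-contained example using sqlite3 and an in-memory table with the same columns as in the question; the function name iter_rows is my own, not part of any DB-API:

```python
import sqlite3

def iter_rows(cursor, size=100):
    """Yield rows one at a time, fetching them from the cursor
    in batches of `size` via fetchmany (avoids per-row driver overhead)."""
    while True:
        batch = cursor.fetchmany(size)
        if not batch:
            break
        for row in batch:
            yield row

# Demo setup: an in-memory table shaped like the "pontos" table above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pontos (latitude REAL, longitude REAL, gid INTEGER)")
conn.executemany("INSERT INTO pontos VALUES (?, ?, ?)",
                 [(float(i), float(-i), i) for i in range(250)])

c = conn.cursor()
c.execute("SELECT latitude, longitude, gid FROM pontos")
total = sum(1 for _ in iter_rows(c, 100))
print(total)  # 250
```

The memory profile is the same as the plain fetchmany loop (at most one batch of rows alive at a time), but the calling code keeps the simple for-loop shape of the iterator version.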