I cannot import data from a CSV into PostgreSQL with dbWriteTable from R

I am trying to import a CSV into a PostgreSQL database from R. On the PostgreSQL server, I created an empty database called "Dados".

    library(RPostgreSQL)
    library(sqldf)

    drv <- dbDriver("PostgreSQL")

    con <- dbConnect(drv,
                     dbname = "Dados",
                     port = 1704,
                     host = "localhost",
                     user = "postgres",
                     password = "dados")

The connection succeeds:

    dbListConnections(drv)
    # [[1]]
    # An object of class "PostgreSQLConnection"
    # Slot "Id":
    # [1] 8652    0

Then I create a data.frame and save it to a file:

    tabela <- data.frame(var1 = c(1,2,3,NA),
                         var2 = c(2,3,4,1))

    write.table(tabela,"tabela.csv", sep = "\t")

But when I run dbWriteTable, an error occurs:

    dbWriteTable(conn = con, name = "tabela1", value = paste0(getwd(),"/tabela.csv"))
    # Error in postgresqlExecStatement(conn, statement, ...) : 
    #   RS-DBI driver: (could not Retrieve the result : ERROR: could not 
    #   open file "c:/users/rogerio/desktop/tabela.csv" for reading: 
    #   Permission denied )
    # [1] FALSE
    # Warning message:
    # In postgresqlImportFile(conn, name, value, ...) :
    #   could not load data into table

"Table1" is actually created in the DataBase, in PostGreSQL, but no data is imported.

The same procedure works fine with an SQLite connection ...
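
For what it's worth, the ERROR line above seems to come from the PostgreSQL server itself rather than from R, which points at a cause: when value is a file path, RPostgreSQL apparently hands the import off to a server-side COPY, so it is the postgres server process, not the R session, that has to open the file (SQLite runs in-process, which would explain why the same call works there). A rough, illustrative sketch of what the server ends up being asked to do; the statement below is an assumption, not lifted from the driver's source:

    # Illustrative only (assumption): a server-side COPY like this makes the
    # postgres service account open the file, so it fails with "Permission
    # denied" unless that account can read it.
    dbSendQuery(con, "COPY tabela1 FROM 'c:/users/rogerio/desktop/tabela.csv'")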

Does anyone have any tips?

asked by anonymous 09.10.2014 / 23:44

1 answer

I have found that one way of doing it is the following:

    library(data.table)
    postgresqlWriteTable(con = con, name = "tabela1",
                         value = fread("tabela.csv"))
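
This appears to work because the data is sent over the client connection as a data.frame/data.table, instead of the server being asked to read the file. If so, the generic dbWriteTable should behave the same once the file is read into R first. A minimal sketch (overwrite = TRUE is only there because the failed import already created an empty "tabela1"):

    # Client-side read, then send the rows over the connection
    tabela_lida <- read.table("tabela.csv", sep = "\t")
    dbWriteTable(conn = con, name = "tabela1", value = tabela_lida,
                 overwrite = TRUE, row.names = FALSE)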

But I still do not know whether this is a good alternative: either way, the whole file is loaded into RAM as a data.frame/data.table, and only then is it passed to the SQL server.

If the database is very large, this can cause trouble ...
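
If memory becomes a problem, one possibility is to read and append the file in chunks. Just a sketch: it assumes the file was written with write.table(..., sep = "\t", row.names = FALSE), so that every line has the same number of fields, and the chunk size is arbitrary:

    library(RPostgreSQL)

    chunk_size <- 10000

    # Read the column names once from the header row
    header <- names(read.table("tabela.csv", sep = "\t",
                               header = TRUE, nrows = 1))

    csv <- file("tabela.csv", open = "r")
    header_row <- readLines(csv, n = 1)  # skip the header row

    repeat {
      lines <- readLines(csv, n = chunk_size)
      if (length(lines) == 0) break
      chunk <- read.table(text = lines, sep = "\t", col.names = header)
      dbWriteTable(conn = con, name = "tabela1", value = chunk,
                   append = TRUE, row.names = FALSE)
    }
    close(csv)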

answered 10.10.2014 / 00:04