PostgreSQL

The main difference, which helps drive the design decision, is the embedded mode: H2 supports this mode, while PostgreSQL does not. Furthermore, H2 is platform independent (it runs on the Java VM) and is mostly used for small applications, whereas PostgreSQL only supports certain operating systems.

What is the most important difference between H2 and PostgreSQL?

During software implementation it is often difficult to decide which technologies to use in the stack, especially the database. So what is the main difference between H2 and PostgreSQL?

Using the RPostgreSQL package

1. The package must be installed and loaded into the workspace
install.packages("RPostgreSQL")
library(RPostgreSQL)

2. A driver must be loaded
drv <- dbDriver("PostgreSQL")

3. Create a connection to a database
con <- dbConnect(drv, host="localhost",port="5432", user="postgres", dbname="test", password="blub")

4. Submit a statement (this can also be an INSERT, DROP or UPDATE SQL statement)
rs <- dbSendQuery(con, "select * from Test")

5. Fetch all elements from the result set
fetch(rs,n=-1)

6. Close the connection
dbDisconnect(con)

7. Free all resources on the driver
dbUnloadDriver(drv)
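
Putting these steps together, a minimal sketch of a full round trip (storing an R data frame and querying it back) could look like this; the table name r_results and the example data are made up:

library(RPostgreSQL)

drv <- dbDriver("PostgreSQL")
con <- dbConnect(drv, host="localhost", port="5432",
                 user="postgres", dbname="test", password="blub")

## Store a data frame as a table (the table name "r_results" is just an example)
results <- data.frame(id = 1:3, value = c(0.1, 0.2, 0.3))
dbWriteTable(con, "r_results", results, overwrite = TRUE)

## Query the stored data back into a data frame
reloaded <- dbGetQuery(con, "select * from r_results")
print(reloaded)

dbDisconnect(con)
dbUnloadDriver(drv)

dbGetQuery() sends the statement and fetches the result in one call, so it can replace the dbSendQuery()/fetch() pair from steps 4 and 5.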

Connect R to a Postgresql database

I have data in R which should be stored in a database. Querying and reloading the data should also be possible.

Solution for "Connect to a Postgresql database and select or edit data through R"

    1. Download and install the ODBC driver for the PostgreSQL database on your client from http://www.postgresql.org/ftp/odbc/versions/. The msi directory is for Windows users and the src directory for *nix users.
    2. Set up an ODBC DSN entry on the client computer. Here is a link for Windows users: http://support.microsoft.com/kb/305599. And here is a link for Ubuntu users: https://help.ubuntu.com/community/ODBC
    3. Install RODBC for R. RODBC is a library that provides the necessary functions to connect to a database and modify data.

    install.packages("RODBC");
    4. Connecting to an ODBC DSN entry is quite simple:

    channel <- odbcConnect("odbc_dns_entry_name");
    5. Inserting data with an SQL statement:

    odbcQuery(channel,"insert ...");
    6. Selecting data from a table:

    sqlFetch(channel, "table_name");

Further information on the RODBC functions can be found here.
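
Putting the steps together, a minimal sketch of the whole workflow (select data, calculate in R, write the results back) could look as follows; the DSN name, the table names and the column value are placeholders:

library(RODBC)

channel <- odbcConnect("odbc_dsn_entry_name")

## Select data for the statistical calculation (table and column names are made up)
data <- sqlQuery(channel, "select * from measurements")

## Some calculation in R
result <- data.frame(mean_value = mean(data$value))

## Save the result back into the database as a new table
sqlSave(channel, result, tablename = "calculation_results", rownames = FALSE)

odbcClose(channel)

sqlSave() creates the target table if it does not already exist, so plain SQL statements are only needed for the select part.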

Connect to a Postgresql database and select or edit data through R

I am an R user and want to connect to my PostgreSQL database in order to select data for statistical calculations. Afterwards I would also like to save the results into the database. Plain SQL statements would suffice. I am using R 2.12 and PostgreSQL 9.1.

Solution for: Enable SSL TCP Connections for Postgresql 9.1

    1. Set ssl to on in postgresql.conf.
    2. Create a self-signed certificate:

    openssl req -new -text -out server.req
    openssl rsa -in privkey.pem -out server.key
    rm privkey.pem
    openssl req -x509 -in server.req -text -key server.key -out server.crt
    chmod og-rwx server.key

    This creates a private key (server.key) that does not require a passphrase, so the server can start up without prompting for one.
    3. Copy server.crt and server.key to the server's data directory.

Source: developer.postgresql.org/pgdocs/postgres/ssl-tcp.html
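
From the R client side, the connection can then be forced to use SSL. The sketch below assumes the RPostgreSQL driver is built on top of libpq, which honors the PGSSLMODE environment variable; the connection parameters are the same placeholders as in the examples above:

library(RPostgreSQL)

## Ask libpq for an SSL-encrypted connection before connecting
## (assumption: the driver uses libpq, which reads PGSSLMODE)
Sys.setenv(PGSSLMODE = "require")

drv <- dbDriver("PostgreSQL")
con <- dbConnect(drv, host="localhost", port="5432",
                 user="postgres", dbname="test", password="blub")

Note that sslmode=require only encrypts the connection; it does not verify the server certificate.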

Enable SSL TCP Connections for Postgresql 9.1

How can I enable secure SSL connections for a PostgreSQL 9.1 server?

File System Level Backup

The solution we found was file system level backups; they are faster by a factor of 5 to 10.

These have two restrictions:

  • The database server must be shut down while performing the backup. This is not an issue, as the database is only needed during office hours: the server can be shut down at 22:00, the backup performed, and the server restarted afterwards.
  • Only full backups and restores are possible; partially restoring data does not work. This is also acceptable, as there is little daily activity except documents being added. The files themselves are very important (health insurance data), but if half a day is lost, the files can be added again the next day and nothing important is lost.

This method can be extended by creating incremental backups (rsync is your friend) or by creating snapshots, if the operating system permits it (ZFS has been reported to work well).

PostgreSQL Database Backup Takes Too Long

While giving a training course on PostgreSQL, the problem arose that a pg_dump of their database would take too long. The database contains more or less nothing else than BLOBs (PDF files). Currently only 500MB, but there are 45GB of archived files (from the last 10 years or so) and new ones will be added on a daily basis. So at least 50GB (older files could be removed) will be stored in the database. Doing a test dump with pg_dump takes approximately 3 minutes for 500MB (on the given system) - which means 50GB will take somewhere around 5 hours. That is definitely too long, as the backup will need to run every night and additional file system backups (with IBM Tivoli) need to be performed as well.