I am looking to import a SAS table (sas7bdat format) into PostgreSQL. Or do I need to convert the table into a CSV and then import it? Thank you!
- If you have the product "SAS/ACCESS Interface to PostgreSQL" licensed you can use `Proc COPY`. I am unaware of a PostgreSQL routine that can read SAS data sets natively. – Richard Apr 15 '18 at 20:22
- Thanks!! I guess I have to do it the long way. :P – jingjing liu Apr 16 '18 at 02:19
2 Answers
If you already have both SAS and Postgres at your organization, then you might already have SAS/ACCESS Interface to PostgreSQL. You can run `proc setinit; run;` to check whether that interface is licensed (a quick sketch follows). If you do have SAS/ACCESS, you can use the libname method as shown in the example after that.
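One way to run that check (the site's licensed products are written to the SAS log):
/* look for "SAS/ACCESS Interface to PostgreSQL" in the log output */
proc setinit;
run;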
/* define libname for Postgres */
libname A1 postgres server=mysrv1 port=5432 user=myusr1 password='mypwd1' database=mydb1;
/* define libname for the SAS table */
libname mydata '/folders/myfolders/';
/* then use a data step or SQL to create your Postgres table */
data A1.yourtable;
set mydata.yourtable;
run;
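The comment in the step above also mentions SQL as an alternative to the data step; a minimal proc sql sketch of the same copy, using the same placeholder library and table names:
/* equivalent copy with proc sql instead of a data step */
proc sql;
    create table A1.yourtable as
    select * from mydata.yourtable;
quit;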
If you do not have SAS/ACCESS to Postgres, then you may have to do it in two steps (but check whether any ETL tools are available at your company); a sketch of the export step follows the links below.
First, use proc export to write the table to a CSV file. See the link below:
Efficiently convert a SAS dataset into a CSV
Then load the CSV into Postgres:
How to import CSV file data into a PostgreSQL table?
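A minimal sketch of the export step, assuming the same mydata library as above and a hypothetical output path under the same /folders/myfolders/ location:
/* step 1: write the SAS data set out as a CSV file */
proc export data=mydata.yourtable
    outfile='/folders/myfolders/yourtable.csv'
    dbms=csv
    replace;
run;
For step 2, once a table with matching columns exists in Postgres, psql's `\copy yourtable from '/folders/myfolders/yourtable.csv' csv header` (or the server-side COPY command) loads the file, as covered in the linked question.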
- Thank you so much! But sadly, neither my company nor my SAS server has the SAS/ACCESS interface to Postgres. – jingjing liu Apr 16 '18 at 02:17
I found another solution, which uses Python (pandas and SQLAlchemy) to import a sas7bdat file into Postgres.
import pandas as pd
from sqlalchemy import create_engine

# connect to the target Postgres database
engine = create_engine("postgresql://user:password@localhost:5432/databasename", echo=False)
# read the SAS data set into a DataFrame
df = pd.read_sas('sas7bdat file location')
# write the DataFrame to Postgres, replacing the table if it already exists
df.to_sql('tablename', con=engine, if_exists='replace')