Importing multiple CSV files from local directory and doing an inner join

  merge, pandas, python, python-3.x

I have around 10 CSV files in a folder. I want to do a 1:1 merge on one common column, called "ACCESS_ID".

After changing into the working directory, I create an empty DataFrame to collect all the local CSV files:

import os
import pandas as pd

os.chdir('C:/Users/xx/Downloads/merge/')
csvs_all = pd.DataFrame()

I can read in all the .csv files individually with the pd.read_csv() function and store the resulting dataframes in a list, but I am trying to automate the process as much as possible.

import glob

for f in glob.glob('C:/Users/xx/Downloads/merge/*.csv'):
    df = pd.read_csv(f)
    csvs_all = pd.concat([csvs_all, df])

This code does not work as intended: it stacks the files on top of each other, creating duplicate entries, and some columns end up with NA values. Is there a way to use pd.merge on multiple datasets/CSV files? I would like to do something like the following:

csvs_all = pd.merge([all my csv files], on='ACCESS_ID')
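pd.merge itself only joins two frames at a time, but the pairwise merge can be chained across a list with functools.reduce. A sketch along those lines, assuming every file shares the "ACCESS_ID" column (the helper name merge_all and the key default are illustrative, not from pandas):

```python
from functools import reduce

import pandas as pd

def merge_all(frames, key='ACCESS_ID'):
    """Inner-join a list of DataFrames on a shared key column,
    merging them pairwise: ((df1 ⋈ df2) ⋈ df3) ⋈ ..."""
    return reduce(lambda left, right: pd.merge(left, right, on=key), frames)

# Usage with the files from the question (asker's path):
# import glob
# dfs = [pd.read_csv(f) for f in glob.glob('C:/Users/xx/Downloads/merge/*.csv')]
# csvs_all = merge_all(dfs)
```

Because pd.merge defaults to how='inner', any ACCESS_ID missing from one file drops out of the final result; pass how='outer' inside the lambda to keep all IDs instead.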

Source: Python Questions
