I have a working script that converts an Excel file to JSON. The problem is that I have more than 100 Excel files, each with multiple sheets, so I have to edit the file name in the script and re-run it for every single file.
My question is: is it possible to run the script once and convert every Excel file inside the folder to .json, with each JSON file named after its Excel file?
This is my script:
import json
import pandas as pd

# Read every sheet (sheet_name=None) and concatenate them into one DataFrame
df = pd.concat(pd.read_excel('../Excel/Data1.xls', sheet_name=None, header=None))
# set custom column names
df.columns = ["Menu", "Sub-Menu", "Sub-Menu-1"]
json_list = json.loads(df.to_json(orient='records', force_ascii=False))

def clean_nones(value):
    """
    Recursively remove all None values from dictionaries and lists, and
    return the result as a new dictionary or list.
    """
    if isinstance(value, list):
        return [clean_nones(x) for x in value if x is not None]
    elif isinstance(value, dict):
        return {
            key: clean_nones(val)
            for key, val in value.items()
            if val is not None
        }
    else:
        return value

with open('../Json/Data1.json', 'w') as json_file:
    json.dump(clean_nones(json_list), json_file, ensure_ascii=False, indent=4)
Folder structure:
|_Excel
| |_Data1.xls
| |_Data2.xls
| |_Data3.xls
|_Json
|_Data1.json
|_Data2.json
|_Data3.json
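What I have in mind is something like the untested sketch below: loop over the Excel folder with pathlib instead of hard-coding one file name. The helper json_path_for is just something I made up here to map each Excel name to its JSON name, and the column names are the same hard-coded ones from my script above:

```python
import json
from pathlib import Path

import pandas as pd


def json_path_for(xls_path: Path, json_dir: Path) -> Path:
    """Map e.g. ../Excel/Data1.xls to ../Json/Data1.json (helper I made up)."""
    return json_dir / (xls_path.stem + '.json')


def clean_nones(value):
    """Recursively remove None values from dicts and lists."""
    if isinstance(value, list):
        return [clean_nones(x) for x in value if x is not None]
    if isinstance(value, dict):
        return {k: clean_nones(v) for k, v in value.items() if v is not None}
    return value


def convert_all(excel_dir: Path, json_dir: Path) -> None:
    """Convert every .xls file in excel_dir to a .json file in json_dir."""
    for xls_path in sorted(excel_dir.glob('*.xls')):
        # Read all sheets of this workbook and stack them into one DataFrame
        df = pd.concat(pd.read_excel(xls_path, sheet_name=None, header=None))
        df.columns = ["Menu", "Sub-Menu", "Sub-Menu-1"]
        records = json.loads(df.to_json(orient='records', force_ascii=False))
        with open(json_path_for(xls_path, json_dir), 'w') as f:
            json.dump(clean_nones(records), f, ensure_ascii=False, indent=4)


# Usage (with the folder structure shown above):
# convert_all(Path('../Excel'), Path('../Json'))
```

Would something like this work, or is there a better/standard way to batch the conversion?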