
SachinNandanwar
Super User

Let's say you have one or more CSV files and you want to convert them to Parquet format and upload them to a Lakehouse table. In the Fabric environment, the available options for this are a notebook or a data pipeline; there aren't any pre-built, out-of-the-box solutions.

 

For instance, you might have an application that generates CSV files, and you want to upload the CSV data to the Lakehouse as soon as they are produced. One approach could be to have the application store the CSV files in ADLS Gen2 and use an event-based pipeline, triggered by storage events, to load the data into the Lakehouse. However, what if storing the files in cloud storage isn't an option and the files will always reside on on-premises storage?

Article was originally published here

Read more...