
How to configure Azure Data Factory to handle larger files?


I have an Azure Data Factory pipeline that reads a 100 MB CSV from Blob Storage and inserts it into an Azure SQL DB that has plenty of space (200 GB). I also have a 1 MB version of the same file, with most of the data removed, as a test version.

When I run the pipeline with the full file, I get the error: DF-Executor-InternalServerError. It gives me no other useful information.

However, when I run the short version, it runs just fine. Based on a few other discussions of this error, I think the problem is simply that I need to increase execution/compute resources and/or the timeout. I'm just not sure where in the chain I should be doing that.

Has anyone run into anything like this?
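For anyone looking for where these settings live: the compute size for a mapping data flow is set on the Execute Data Flow activity (or on the Azure Integration Runtime it uses), and the activity timeout is set in the activity's policy. Below is a minimal sketch of the relevant pieces of the activity JSON; the activity and data flow names are hypothetical, and the exact values you need will depend on your workload.

```json
{
  "name": "RunCsvLoad",
  "type": "ExecuteDataFlow",
  "typeProperties": {
    "dataflow": {
      "referenceName": "MyCsvToSqlDataFlow",
      "type": "DataFlowReference"
    },
    "compute": {
      "computeType": "MemoryOptimized",
      "coreCount": 16
    }
  },
  "policy": {
    "timeout": "0.02:00:00",
    "retry": 1
  }
}
```

In the portal UI these correspond to the "Compute type" / "Core count" settings on the Execute Data Flow activity's Settings tab and the "Timeout" field on its General tab. Bumping core count (and, if runs are failing partway, memory-optimized compute) is the usual first step for larger files.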

Everything has a timeout. How long does it run until it fails?
30 minutes, so it feels to me like it's on the resource side, where it just can't handle processing the 100 MB file. I just don't know where on the Azure side I can increase compute.