Quick Import fails when importing a large number of rows.

+1 vote

When using "Quick import" for a large number of spreadsheet rows (over 10,000 or so), we're getting "Internal Server Error" and only the first 10k rows (give or take; the exact amount varies) get loaded.

Other related errors are:

  • Request timed out.
  • Import Failed while Saving Data, Row #10574, Thread was being aborted.
  • Process Error, Thread was being aborted.

Is there a setting we can adjust to allow for more records to be uploaded per batch?

in How To by (7.0k points)

1 Answer

0 votes
Best answer

There are two options to increase the number of rows that dbFront's Quick-Import functionality can handle.

Option 1: Increase the thread execution timeout.

By default, the request execution timeout is set to 110 seconds for .NET applications.
You can raise it by merging the following into the file [c:\inetpub\dbFront\web.config].

    <httpRuntime executionTimeout="180" />
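For context, in an ASP.NET web.config the httpRuntime element belongs inside the system.web section. A minimal merged result might look like the sketch below (assuming a default install; your actual file will contain additional elements that should be left in place):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.web>
    <!-- Raise the request execution timeout from the 110-second
         default to 180 seconds so longer Quick-Import batches
         have time to finish. -->
    <httpRuntime executionTimeout="180" />
  </system.web>
</configuration>
```

If your web.config already has a system.web section, just add or update the httpRuntime element inside it rather than adding a second section.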

Option 2: Allocate more resources.

Because dbFront is running in a VM here, one of the easiest ways to boost performance is to allocate more memory or additional processors.

Can you verify that you have at least two CPUs and 8 GB of memory allocated to this VM?

Newer versions of dbFront include that information in the server statistics, but your version is a bit older.

This affects how much dbFront can process in the allocated time.

by (64.5k points)