# Trying to resolve a 12031 error



## Tristaan

Okay, without going into detail that might violate my company's non-disclosure agreement...

...we're developing an application that sends an XML message to a server-side ASP page. That ASP page then sends the XML to a server-side DLL for processing.

When the XML file is very large (> 2 MB), we're experiencing a 12031 error on the client side. Research has shown this is ERROR_INTERNET_CONNECTION_RESET, a WinInet connection-reset error.

Both the server-side DLL and the client-side application are written in Delphi 7 and, as far as I can tell, all three "stages" in this process use IXMLHTTPRequest to transmit and process the XML.
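(For what it's worth, plain IXMLHTTPRequest doesn't expose a timeout control, but its server-safe sibling, ServerXMLHTTP, does via setTimeouts. A minimal Delphi 7 sketch of raising the client-side timeouts for a long-running POST; the URL, the ProgID version, and the content type are assumptions to adjust for your environment:)

```
uses SysUtils, ComObj;

procedure SendLargeXml(const Url, XmlText: string);
var
  Http: OleVariant;
begin
  // ServerXMLHTTP (unlike plain XMLHTTP) supports setTimeouts, so the
  // receive timeout can be raised to cover a long server-side run.
  Http := CreateOleObject('MSXML2.ServerXMLHTTP.3.0');
  // resolve, connect, send, receive timeouts, all in milliseconds;
  // 7200000 ms = 2 hours, sized here to outlast the worst-case job.
  Http.setTimeouts(30000, 30000, 7200000, 7200000);
  Http.open('POST', Url, False);                   // synchronous POST
  Http.setRequestHeader('Content-Type', 'text/xml');
  Http.send(XmlText);
  if Http.status <> 200 then
    raise Exception.CreateFmt('Upload failed: HTTP %d',
      [Integer(Http.status)]);
end;
```

Note this only stops the *client* from giving up early; the server-side timeouts discussed below still apply independently.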

Further research on my part suggests that the 12031 error occurs because the MSXML request is timing out while the server processes the large file.

I know the detail is rather sketchy, but does anyone have any suggestions on resolving this problem?


----------



## Rockn

You may want to look at your IIS server's script-timeout settings.


----------



## Tristaan

Assume I'm a total tyro who doesn't know where to find those, and you'd be right (I work in the QA department: I know how to break software, but not necessarily every configuration step needed to set it up).

Also, I did check the settings on the particular web site and noticed that the connection timeout was set to 120 seconds. This task most definitely takes more than 2 minutes and, from what I can tell, we're not sending keep-alive messages across the connection.

I've upped that setting to 360 seconds to see if it has any impact. But any further suggestions are welcome (including directions to the script-timeout settings).


----------



## Rockn

Script timeout settings are located in the IIS snap-in of the Microsoft Management Console.

http://support.microsoft.com/default.aspx?scid=kb;en-us;268364
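(On IIS 5/6, both timeouts can also be read and set from the command line with the stock adsutil.vbs metabase script; the script path and the site number `1` (the default web site) are assumptions for your install, and values are in seconds:)

```bat
:: Inspect the server-wide ASP script timeout
cscript %SystemDrive%\Inetpub\AdminScripts\adsutil.vbs GET W3SVC/AspScriptTimeout

:: Inspect, then raise, the connection timeout on site #1
cscript %SystemDrive%\Inetpub\AdminScripts\adsutil.vbs GET W3SVC/1/ConnectionTimeout
cscript %SystemDrive%\Inetpub\AdminScripts\adsutil.vbs SET W3SVC/1/ConnectionTimeout 360
```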


----------



## Tristaan

Actually, it turns out that it was the Connection Timeout on the web site within IIS. It was set to 2 minutes. The XML transmission and the server-side processing were taking in excess of an hour, and no keep-alive messages were being sent; therefore, after two minutes, the connection reset, generating a 12031.

So, it was a timeout setting... but it was the Connection Timeout.

Thanks, RockN, for confirming our suspicions.

Next question along the same lines... obviously, we may not want the connection timeout for the entire web site set so high. Does anyone have a programmatic solution to get past this timeout or, perhaps, to disable it for the specific task until it completes?
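(One sketch we've been considering, untested: IIS's connection timeout measures *idle* time, so a long-running ASP page can keep the connection alive itself by disabling buffering and trickling bytes back while the DLL works, and it can raise the script timeout for just that one page via Server.ScriptTimeout. The `MyCompany.XmlProcessor` ProgID and its methods are made-up placeholders for whatever interface the real DLL exposes:)

```
<%
' Per-page overrides: these affect this page only, not the whole site.
Server.ScriptTimeout = 14400     ' seconds of script execution allowed
Response.Buffer = False          ' send output immediately, not at the end

' Hypothetical wrapper object around the processing DLL.
Set proc = Server.CreateObject("MyCompany.XmlProcessor")
proc.LoadXml Request.BinaryRead(Request.TotalBytes)

Do While Not proc.Done
    proc.ProcessNextBatch        ' do a bounded chunk of SQL work
    Response.Write " "           ' trickle a byte so the connection never idles out
Loop

Response.Write "OK"
%>
```

Worth verifying on your IIS version that periodic writes actually reset the connection timer before relying on it.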


----------



## Rockn

Why in God's name would you even want any kind of scripting that takes that long to process? Is this a client-side thing that initiates the whole process? I would be mad after two minutes had elapsed. You might need to investigate an updated DLL or something to speed up the process.


----------



## Tristaan

We are examining some redesign work on our DLL to try to speed up the process.

However, the task being performed is essentially a web-admin task, not a general-user task. The idea is that an XML message containing commands to update the web site's SQL database is transmitted; the XML is loaded into memory, and a process is started that applies the updates to the database.

The XML file transmitted for the REALLY large data set is approximately 40 MB (it takes many hours to process, even on a fast server-class machine). Standard data size, however, is approximately 1-2 MB of XML, which updates within the regular 2-minute window, and data that size will be transmitted VERY infrequently; a standard change transmission consists of maybe 30-40 KB of XML.

The problem is that the user we are developing this process for has, in a single SQL table, 91,000 records to be transmitted and inserted. This is in addition to the several hundred records in other tables.

For now, we've bumped the connection timeout up to 6 minutes and documented for the user that, until we can figure out a better design for the process, they should limit the size of their uploads. There are workarounds to get the data up there and, since the 91,000-record job is roughly a once-a-year upload, they are easily done.
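(For the eventual redesign, one sketch that sidesteps long-lived connections entirely is to split the command stream client-side and POST it in bounded batches, so each request finishes well inside the normal timeout. Everything here is hypothetical: the URL, the batch size, the `<batch>` wrapper element, and the assumption that one record per string is a meaningful unit of work for the server:)

```
uses Classes, SysUtils, ComObj;

const
  BatchSize = 2000; // records per request; tune so each POST stays under ~2 minutes

procedure UploadInBatches(const Url: string; Records: TStringList);
var
  Http: OleVariant;
  First, Last, I: Integer;
  Batch: TStringList;
begin
  Batch := TStringList.Create;
  try
    First := 0;
    while First < Records.Count do
    begin
      Last := First + BatchSize - 1;
      if Last >= Records.Count then
        Last := Records.Count - 1;

      // Collect one bounded slice of the record stream.
      Batch.Clear;
      for I := First to Last do
        Batch.Add(Records[I]);

      Http := CreateOleObject('MSXML2.ServerXMLHTTP.3.0');
      Http.open('POST', Url, False);
      Http.setRequestHeader('Content-Type', 'text/xml');
      Http.send('<batch>' + Batch.Text + '</batch>'); // wrap slice in a root element
      if Http.status <> 200 then
        raise Exception.CreateFmt('Batch %d-%d failed: HTTP %d',
          [First, Last, Integer(Http.status)]);

      First := Last + 1;
    end;
  finally
    Batch.Free;
  end;
end;
```

The server side would need to apply each batch independently (or stage them into a holding table and commit at the end), but no single request would ever approach the connection timeout.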


----------

