LoadRecordSet timing out within ArrayPush > Ubound Array #74
Comments
Hi. Good catch.
@rcdmk I am also trying to (help) find a solution. There is some good testing at: https://stackoverflow.com/questions/4605270/add-item-to-array-in-vbscript
Due to the nature of arrays in VBScript, I may have to use another data structure for this.
You could consider using the GetRows method; it gives you an array with very high performance. That way you could do some fairly simple native array manipulations to load the data in. I've found that ADODB's communication overhead when iterating over large recordsets (especially ones with many columns) imposes a significant performance hit when using MoveNext. I believe ADODB fetches records one by one from the database rather than fetching the whole set up front. My rule of thumb: if you've written your query properly and you're not going to exit early, there is no need for ADODB's row-by-row overhead.
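A minimal sketch of the GetRows pattern described above (`GetRows` and the `Fields` collection are standard ADO; the connection object `conn` and the query are hypothetical placeholders):

```vbscript
Dim rs, data, fieldNames, col, row
Set rs = conn.Execute("SELECT * FROM someTable") ' hypothetical query

' Capture the field names once, before fetching.
ReDim fieldNames(rs.Fields.Count - 1)
For col = 0 To rs.Fields.Count - 1
    fieldNames(col) = rs.Fields(col).Name
Next

' Fetch the whole result set in one call: a 2D array
' indexed as data(column, row).
data = rs.GetRows()
rs.Close

For row = 0 To UBound(data, 2)
    For col = 0 To UBound(data, 1)
        ' data(col, row) holds the value for fieldNames(col)
    Next
Next
```

This avoids a MoveNext round trip per record; iteration happens over a plain in-memory array.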
That is a good point, and indeed I always use GetRows in my own apps. Thinking about it a bit more, I could have gotten the field names from the recordset and then used GetRows to fetch the results and iterate them by index. I'll try that in the next release. Thanks.
@rcdmk Have you had any success with the GetRows() implementation? I'm trying to read a RecordSet from the results of a DB query and convert it into JSON. It has approximately 4000 rows, so GetRows() would be the fastest way to do that, I think. I'm trying to do it in JavaScript. Please let me know if you have a working solution. Thank you, john
Hey folks. I've managed to make the loading less painful. With about 100 rows and 4 columns, the load time drops to less than a second; I'm looking at half a second to get the entire recordset loaded into the JSON object. I did this by applying many of the suggestions above to the code. I've also added two new functions, ArrayConcat and addMultiple, which let us pass in arrays and do one ReDim rather than several.

However, the problem I'm now running up against is serializing the JSON. I did some research, and it turns out VBScript string concatenation is really slow. I saw two methods for dealing with this: one is to use an array and Join, and the other is to use an ADODB.Stream. I opted for the second. The serialization is still taking longer than I'd like; I've gotten it down to a little under two seconds by using the ADODB.Stream. Anyone have suggestions on how to make it faster? Obligatory code to review: https://github.com/RonaldZielaznicki/aspJSON/blob/Showcase/jsonObject.class.asp
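For readers unfamiliar with the ADODB.Stream trick mentioned above, here is a minimal sketch (ADODB.Stream and its Open/WriteText/ReadText members are standard ADO; the JSON fragment written is just a placeholder):

```vbscript
Dim buffer, json
Set buffer = Server.CreateObject("ADODB.Stream")
buffer.Type = 2           ' adTypeText: treat the stream as text
buffer.Charset = "utf-8"
buffer.Open

' Append pieces without rebuilding a VBScript string each time.
buffer.WriteText "{"
buffer.WriteText """name"":""value"""
buffer.WriteText "}"

' Rewind and read the accumulated result once at the end.
buffer.Position = 0
json = buffer.ReadText
buffer.Close
Set buffer = Nothing
```

The point of the pattern is that each `WriteText` appends into the stream's own buffer, avoiding the quadratic cost of `s = s & piece` in a loop.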
Hello all. I didn't have a lot of time to work on this, but I'm back and trying a few things. I've made a few optimizations on the current development branch (4999ef7), if you would like to test that out. As for concatenation, VBScript is very limited in that regard, but I'll give it another try with a custom string builder. In the past I tried using arrays and joining the result and got no performance improvement (it was worse in some cases), but I'll see if the new optimizations help with that too. Best,
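One way such a custom string builder could look: append into a pre-sized array, grow it geometrically, and Join once at the end. This is a hypothetical sketch, not code from the library; the class and member names are illustrative:

```vbscript
' Hypothetical string builder: amortizes growth by doubling
' capacity instead of resizing on every append.
Class StringBuilder
    Private parts, count

    Private Sub Class_Initialize()
        ReDim parts(15)   ' initial capacity of 16 slots
        count = 0
    End Sub

    Public Sub Append(s)
        If count > UBound(parts) Then
            ReDim Preserve parts(UBound(parts) * 2 + 1) ' double capacity
        End If
        parts(count) = s
        count = count + 1
    End Sub

    Public Function ToString()
        If count = 0 Then
            ToString = ""
            Exit Function
        End If
        ReDim Preserve parts(count - 1) ' trim unused slots before joining
        ToString = Join(parts, "")
    End Function
End Class
```

Because `ReDim Preserve` only runs O(log n) times, total copying stays linear in the number of appended characters, unlike naive `&` concatenation in a loop.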
I have been running into some very slow results and even timeouts when calling LoadRecordSet.
Here are some details:
Total number of records: 100 rows
Total number of columns per row: 60
Some testing shows I was able to narrow down the bottleneck to:
Public Function ArrayPush(), which rebuilds the array on each call.
Any thoughts on getting around this bottleneck?
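For context, a sketch of why a per-push resize becomes quadratic, and the usual workaround of sizing the array once when the final count is known up front (illustrative code only, not the library's actual ArrayPush):

```vbscript
' Quadratic pattern: ReDim Preserve copies the whole array on
' every single push, so n pushes cost O(n^2) in total.
Dim arr(), i
ReDim arr(-1)  ' empty dynamic array (UBound = -1)
For i = 0 To 99
    ReDim Preserve arr(UBound(arr) + 1) ' O(n) copy each iteration
    arr(UBound(arr)) = i
Next

' Linear alternative: when the record count is known in advance
' (e.g. from UBound of a GetRows result), size the array once.
Dim fixed(), n
n = 100
ReDim fixed(n - 1)
For i = 0 To n - 1
    fixed(i) = i
Next
```

With 100 rows of 60 columns, the per-cell resize means thousands of full-array copies, which matches the slowdown described above.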