
LoadRecordSet timing out within ArrayPush > Ubound Array #74

Open
RaviRamDhali opened this issue May 17, 2018 · 8 comments
Labels
enhancement fix-on-development-branch A fix for this issue was implemented on current development branch.

Comments

@RaviRamDhali

I have been running into some very slow results, and even timeouts, when calling LoadRecordSet.

Here are some details:
Total number of records: 100 rows
Total number of columns per row: 60

Some testing shows (100 rows in every case):

| Columns per row | Time |
| --- | --- |
| 10 | 1 sec |
| 30 | 3 sec |
| 50 | 8-12 sec **** |
| 60 | 20 sec or timeout **** |

I was able to narrow the bottleneck down to `Public Function ArrayPush()`, which rebuilds the array on every call.

Any thoughts on getting around this bottleneck?

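For context, a per-call rebuild in VBScript typically looks something like this (a sketch of the general pattern, not the library's exact code):

```vb
' Naive push: ReDim Preserve allocates a new array and copies every
' existing element on each call, so n pushes cost O(n^2) copies total.
Public Function ArrayPush(ByRef arr, ByVal value)
    ReDim Preserve arr(UBound(arr) + 1)
    arr(UBound(arr)) = value
    ArrayPush = arr
End Function
```

With 100 rows of 60 columns that is 6,000 pushes, each one copying everything pushed before it, which matches the superlinear slowdown in the timings above.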

@rcdmk
Owner

rcdmk commented May 17, 2018

Hi. Good catch.
I can improve that for sure. Just have to use a different approach for this case.

@RaviRamDhali
Author

@rcdmk I am also trying to help find a solution. There is some good testing at: https://stackoverflow.com/questions/4605270/add-item-to-array-in-vbscript

@rcdmk
Owner

rcdmk commented Jun 10, 2018

Due to the nature of arrays in VBScript, I'll probably have to use another data structure for this.
I'm not really happy with the overhead of the current method, but it's the best way I've found to do it natively.
I'm still researching solutions. The link you provided gave me some ideas, but I have no real solution yet.

@SeanLMcCullough

You could consider using the GetRows method; it gives you a plain array with very high performance. From there you can do some fairly simple native array manipulation to load the data in.

I've found that the communication overhead of ADODB when iterating over large recordsets (especially ones with many columns) imposes a significant performance cost when using MoveNext. I believe ADODB fetches each record one by one from the database rather than fetching the lot up front. My rule of thumb: if you've written your query properly and you're not going to exit early, there is no need for ADODB's row-by-row overhead.
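A minimal sketch of the GetRows approach (assuming an open ADODB connection `conn` and a hypothetical query; note that the result is a 2-D variant array indexed as (column, row)):

```vb
Dim rs, data, row, col
Set rs = conn.Execute("SELECT id, name, email FROM users") ' hypothetical query

data = rs.GetRows() ' fetches all remaining rows in a single operation
rs.Close

' GetRows returns data(columnIndex, rowIndex)
For row = 0 To UBound(data, 2)
    For col = 0 To UBound(data, 1)
        ' process data(col, row) here
    Next
Next
```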

@rcdmk
Owner

rcdmk commented Jun 12, 2018

That is a good point, and indeed I always use GetRows in my apps.
In this particular case, I needed the column names for the properties, and those are not available in the GetRows result array.

Thinking about it a bit now, I could get the field names from the recordset first and then use GetRows to fetch the results and iterate them by index.
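That idea might look roughly like this (a sketch, assuming an open recordset `rs`):

```vb
Dim fieldNames(), data, i

' Capture the column names from the Fields collection first...
ReDim fieldNames(rs.Fields.Count - 1)
For i = 0 To rs.Fields.Count - 1
    fieldNames(i) = rs.Fields(i).Name
Next

' ...then pull all the data in one shot and iterate by index.
data = rs.GetRows()
rs.Close

' fieldNames(col) now pairs with data(col, row) for any row.
```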

I'll try that in the next release.

Thanks.

@JohnSTRO-pi

@rcdmk Have you had any success with the GetRows() implementation? I'm trying to read a RecordSet from the results of a DB query and convert it into JSON. It has approximately 4000 rows, so getRows() would be the fastest way to do that, I think. I'm trying to do it in Javascript. Please let me know if you have a working solution.

Thank you | john

@RonaldZielaznicki

RonaldZielaznicki commented Aug 3, 2021

Hey folks. I've managed to make the loading less painful. With about 100 rows and 4 columns, the load time drops to less than a second; I'm looking at half a second to load the entire recordset into the JSON object.

I did this by applying many of the suggestions above to the code. I've also added two new functions, ArrayConcat and addMultiple, which let us pass in whole arrays and do a single ReDim rather than several.

However, the problem I'm running up against is serializing the JSON. I did some research, and it turns out VBScript string concatenation is really slow. I saw two methods for dealing with this: one is to use an array plus Join, and the other is to use an ADODB.Stream. I opted for the second, but the serialization is still taking longer than I'd like.

I've gotten it down to a little under two seconds to serialize the recordset by using the ADODB.Stream. Anyone have any suggestions on how to get it faster?

Obligatory code to review: https://github.com/RonaldZielaznicki/aspJSON/blob/Showcase/jsonObject.class.asp
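For reference, the ADODB.Stream string-builder pattern described above looks roughly like this (a sketch; the fragment contents are placeholders):

```vb
' Build a large string by appending to an in-memory text stream,
' then read it back once at the end, avoiding repeated reallocation.
Dim sb, json
Set sb = CreateObject("ADODB.Stream")
sb.Type = 2 ' adTypeText
sb.Open

sb.WriteText "{""rows"":["
' ... WriteText each serialized row here ...
sb.WriteText "]}"

sb.Position = 0
json = sb.ReadText ' single read of the accumulated text
sb.Close
```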

@rcdmk
Owner

rcdmk commented Oct 3, 2022

Hello all,

Didn't have a lot of time to work on this, but I'm back and trying a few things.

I've made a few optimizations on the current development branch (4999ef7), if you would like to test that out.
ArrayPush is no longer used; instead, I've implemented buffering logic to avoid resizing the array on every push.

Now, regarding concatenation: VBScript is very limited there, but I'll give a custom string builder another try. In the past, I tried using arrays and joining the result and saw no performance improvement (it was worse in some cases), but I'll see if the new optimizations help with that too.

Best,
Ricardo
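The buffering idea described above can be sketched like this (assumed details, not necessarily the exact code in the development branch):

```vb
Dim buffer(), itemCount
ReDim buffer(15) ' start with spare capacity
itemCount = 0

Sub BufferedPush(ByVal value)
    If itemCount > UBound(buffer) Then
        ' Double the capacity so ReDim Preserve runs O(log n) times
        ' overall instead of once per push.
        ReDim Preserve buffer(2 * (UBound(buffer) + 1) - 1)
    End If
    buffer(itemCount) = value
    itemCount = itemCount + 1
End Sub

' When finished, trim the buffer to the actual item count:
' ReDim Preserve buffer(itemCount - 1)
```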

@rcdmk rcdmk added the fix-on-development-branch A fix for this issue was implemented on current development branch. label Oct 3, 2022