Performance improvement advice for batch API call #395
Unanswered
LinhChay00 asked this question in Q&A
Replies: 1 comment
-
Thanks for opening this, and sorry about the slow reply here. In general the Time Machine endpoint is pretty slow (see this comment for more info), and a recent change to it means that, depending on your plan, the number of requests is limited. I'll ping @alexander0042, or you can try emailing [email protected] to see if there's anything you can do to speed up response times on your end.
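One thing that may help on your side is handling the limit gracefully rather than letting the whole batch fail. Here is a rough sketch of a retry-with-backoff wrapper; it assumes over-limit requests come back as HTTP 429, which you'd want to verify for your plan, and the URL is just a placeholder:

```python
# Sketch: retry a single Time Machine request with exponential backoff.
# Assumes over-limit responses use HTTP 429 -- verify this for your plan.
import time
import requests

def get_with_backoff(url, max_retries=5, base_delay=2.0):
    """GET `url`, retrying on HTTP 429 with exponential backoff; returns parsed JSON."""
    for attempt in range(max_retries):
        resp = requests.get(url, timeout=60)
        if resp.status_code == 429:
            # Back off 2s, 4s, 8s, ... before retrying.
            time.sleep(base_delay * (2 ** attempt))
            continue
        resp.raise_for_status()
        return resp.json()
    raise RuntimeError(f"Still rate-limited after {max_retries} attempts: {url}")
```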
-
Hi everyone,
I am trying to scrape weather data (including temperature, wind, and summary) for a long period (every day over roughly 10 years) across about 95 places. I wrote a script to fetch the data in batches, because Pirate Weather only allows one API call at a time, yet the script runs really slowly even when I test it with just 2 days in 2 different places.
Does anyone have advice on writing a more efficient batch-fetching script? Or do you know of any sample batch-fetching script I could refer to? Please kindly advise.
I tried writing a simple loop script and also asked ChatGPT for a version using asyncio and aiohttp (the script is attached), but neither seems to improve performance.
script_text.txt
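For context, here is a minimal sketch of the kind of asyncio/aiohttp batching I am attempting (not the exact attached script; the API key, the Time Machine URL format, and the example coordinates and dates are placeholders):

```python
# Sketch: fetch several (place, day) combinations concurrently, with a semaphore
# capping in-flight requests so the plan's limit isn't exceeded.
import asyncio
from datetime import date, datetime, timezone

import aiohttp

API_KEY = "YOUR_API_KEY"                                      # placeholder
BASE_URL = "https://timemachine.pirateweather.net/forecast"   # assumed URL format; check the docs
MAX_CONCURRENT = 5   # tune to what your plan tolerates; set to 1 for strictly sequential calls

async def fetch_day(session, sem, lat, lon, day):
    """Fetch one place/day; returns the parsed JSON, or None on failure."""
    ts = int(datetime(day.year, day.month, day.day, tzinfo=timezone.utc).timestamp())
    url = f"{BASE_URL}/{API_KEY}/{lat},{lon},{ts}"
    async with sem:  # limit the number of simultaneous requests
        try:
            async with session.get(url, timeout=aiohttp.ClientTimeout(total=60)) as resp:
                resp.raise_for_status()
                return await resp.json()
        except (aiohttp.ClientError, asyncio.TimeoutError) as exc:
            print(f"Failed {lat},{lon} {day}: {exc}")
            return None

async def fetch_all(places, days):
    sem = asyncio.Semaphore(MAX_CONCURRENT)
    async with aiohttp.ClientSession() as session:
        tasks = [fetch_day(session, sem, lat, lon, day)
                 for lat, lon in places
                 for day in days]
        return await asyncio.gather(*tasks)

if __name__ == "__main__":
    places = [(40.7128, -74.0060), (51.5074, -0.1278)]  # example coordinates
    days = [date(2024, 1, 1), date(2024, 1, 2)]         # example dates
    results = asyncio.run(fetch_all(places, days))
    print(f"Got {sum(r is not None for r in results)} of {len(results)} responses")
```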
Any suggestions or advice would be highly appreciated. Thank you all in advance for your help.