cannot crawl the same URL twice #2134
Closed · creative-ae started this conversation in General
Replies: 3 comments · 3 replies
-
+1 for this. I think the visited URLs are stored in memory, and that memory needs to be purged, but I can't figure out how to do that :(
0 replies
-
Found the answer: #2026
1 reply
-
There are several ways to go about this:
2 replies
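One relevant detail behind the linked answer: Crawlee's request queue deduplicates requests by a `uniqueKey`, which is derived from the URL by default, so the same URL is enqueued only once; supplying a distinct `uniqueKey` lets it be enqueued again. The queue below is a plain-JS simulation of that behavior, not Crawlee's real `RequestQueue`.

```javascript
// Simulation of uniqueKey-based request deduplication (illustrative only,
// not Crawlee's actual RequestQueue implementation).
class SimulatedRequestQueue {
  constructor() {
    this.keys = new Set(); // uniqueKeys already enqueued
    this.pending = [];     // URLs accepted for crawling
  }

  // Returns true if enqueued, false if deduplicated.
  addRequest({ url, uniqueKey }) {
    const key = uniqueKey ?? url; // default: dedup by URL
    if (this.keys.has(key)) return false;
    this.keys.add(key);
    this.pending.push(url);
    return true;
  }
}

const queue = new SimulatedRequestQueue();
queue.addRequest({ url: 'https://example.com' }); // enqueued
queue.addRequest({ url: 'https://example.com' }); // deduplicated
queue.addRequest({
  url: 'https://example.com',                     // same URL, but a
  uniqueKey: 'https://example.com#run-2',         // distinct key: enqueued
});
console.log(queue.pending.length); // 2
```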
-
Which package is this bug report for? If unsure which one to select, leave blank
@crawlee/puppeteer (PuppeteerCrawler)
Issue description
The crawler actually crawls each URL only once; when I try to crawl the same URL again, I get the cached data.
Code sample
Package version
3.5.4
Node.js version
18
Operating system
macOS
Apify platform
I have tested this on the next release
No response
Other context
No response