drop(integration): Always fetch max number of pages #83984
base: master
Conversation
It is very easy to call this code without passing the parameter, in which case we don't get the whole set of repositories. This change will make the calls slower for customers with thousands of repositories, but the results will be more accurate. We still have `self.page_number_limit` to stop the requests after 50 page fetches (5,000 repositories).
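For context, here is a minimal sketch of how such a pagination helper could behave. The name `get_with_pagination`, the `response_key` and `page_number_limit` parameters, and the endpoint come from the diff below, but the body is a hypothetical illustration (it assumes the client exposes a `self.get(path, params=...)` HTTP helper), not the actual integration client code.

```python
class GitHubClientSketch:
    # Hypothetical sketch of a paginated fetch; the real client in the
    # GitHub integration may differ in its details.
    page_number_limit = 50  # 50 pages * 100 items per page ~= 5,000 repositories

    def get_with_pagination(self, path, response_key, page_number_limit=None):
        if page_number_limit is None:
            # Default to the maximum number of pages so that callers
            # cannot accidentally fetch only the first page.
            page_number_limit = self.page_number_limit

        items = []
        for page in range(1, page_number_limit + 1):
            # Assumed HTTP helper; returns the decoded JSON body as a dict.
            resp = self.get(path, params={"per_page": 100, "page": page})
            batch = resp.get(response_key, [])
            items.extend(batch)
            if len(batch) < 100:
                # A short page means there is nothing left to fetch.
                break
        return items
```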
Codecov Report: ✅ All tests successful. No failed tests found.

Additional details and impacted files:

@@             Coverage Diff              @@
##            master    #83984       +/-  ##
============================================
+ Coverage    33.07%    87.56%    +54.49%
============================================
  Files         8079      9545      +1466
  Lines       450123    541175     +91052
  Branches     21290     21274        -16
============================================
+ Hits        148861    473870    +325009
+ Misses      300907     66950    -233957
  Partials       355       355
repos = self.get_with_pagination(
    "/installation/repositories",
    response_key="repositories",
    page_number_limit=self.page_number_limit if fetch_max_pages else 1,
`self.page_number_limit` is now the default, so we don't need to pass it anymore.
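Assuming the default is applied inside `get_with_pagination` as described, the call site could then shrink to something like this sketch:

```python
# Sketch: with page_number_limit defaulting to self.page_number_limit,
# the conditional argument can simply be dropped at the call site.
repos = self.get_with_pagination(
    "/installation/repositories",
    response_key="repositories",
)
```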
    response_key="repositories",
    page_number_limit=self.page_number_limit if fetch_max_pages else 1,
)
return [repo for repo in repos if not repo.get("archived")]
This is now handled by the callers:
for i in [repo for repo in repos if not repo.get("archived")]