Pre-filing checks

- [x] I have searched for open issues that report the same problem
- [x] I have checked that the bug affects the latest version of the library
Related to discussion in #1279.

The `get_equipment` method could benefit from a more descriptive name. Currently the method de-duplicates the list while preserving its order, which isn't exactly equipment-related.
Some potential new names:

- `deduplicate_ordered_list`
- `deduplicate_preserve_order` (I'm partial to this one if the functionality remains the same)
- `serialize_list_to_tags`
I could find 3 instances where maintaining the order is required (`argiro`, `beyondfrosting` & `glutenonashoestring`). This could easily be implemented directly in each individual scraper, or the functionality could remain on the 29 current scrapers that contain the function as a "safety" feature for instances where untested pages contain duplicate items in their list.
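For reference, here's a minimal sketch of the behavior being discussed, written under the candidate name `deduplicate_preserve_order`; the name is one of the proposals above, not an existing function in the library:

```python
def deduplicate_preserve_order(items: list[str]) -> list[str]:
    """Remove duplicate entries while keeping first-seen order."""
    seen = set()
    ordered = []
    for item in items:
        if item not in seen:
            seen.add(item)
            ordered.append(item)
    return ordered


# Example: an untested page that repeats an equipment item in its list
print(deduplicate_preserve_order(["stand mixer", "whisk", "stand mixer"]))
# -> ['stand mixer', 'whisk']
```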
jknndy changed the title from "Consider Renaming and Refactoring get_equipment Method for Clarity and Efficiency" to "Consider Renaming get_equipment Method for Clarity" on Oct 22, 2024