Updated some entries in the file COUNTER_Robots_list and added 1636 n… #62
Updated the robots list with 1636 new robots.
Most of the new entries were gathered using https://github.com/monperrus/crawler-user-agents/blob/master/crawler-user-agents.json as a guide and by detecting new bots in our user-agents dataset.
This list is designed to be used as a regex pattern to identify crawlers/bots in our user-agent entries and exclude them from our metrics when detected. A usage sketch follows below.
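As an illustration, here is a minimal sketch of how the JSON list could be compiled into a single regex and applied to user-agent strings. The `pattern` field name and the file path are assumptions based on the common COUNTER-Robots JSON layout, not something confirmed by this PR:

```python
import json
import re

# Assumption: each entry in COUNTER_Robots_list.json has a "pattern" field
# containing a regular-expression fragment (standard COUNTER-Robots layout).
with open("COUNTER_Robots_list.json") as fh:
    entries = json.load(fh)

# Combine the individual patterns into one case-insensitive regex.
robots_re = re.compile("|".join(entry["pattern"] for entry in entries), re.IGNORECASE)

def is_robot(user_agent: str) -> bool:
    """Return True if the user-agent string matches any robot pattern."""
    return robots_re.search(user_agent) is not None

if __name__ == "__main__":
    # Hits matching the list would be excluded from metrics.
    print(is_robot("Mozilla/5.0 (compatible; Googlebot/2.1)"))      # likely True (a "bot" pattern)
    print(is_robot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))    # likely False
```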
The following files have been modified:
CHANGES.md
COUNTER_Robots_list.json
convert_to_txt
generated/COUNTER_Robots_list.txt