Add an example of a custom classifier #4
I'd like to see an example of a custom classifier that is proven to work with custom data. The reason for the request is the headache I've had trying to write my own; my efforts simply do not work. My code (and patterns) work perfectly in online Grok debuggers, but they do not work in AWS. I do not get any errors in the logs either; my data simply does not get classified, and table schemas are not created.

So, the classifier example should include a custom file to classify, maybe a log file of some sort. The file itself should include various types of information so that the example demonstrates various pattern matches. The example should then present the classifier rule, and maybe even include a custom pattern keyword to demonstrate that usage too. A deliberate mistake should also be demoed (in both the input data and the patterns), along with how to debug that situation in AWS.

Thanks in advance!

Comments
Here is one of mine. For log lines like this:

I set up a Glue custom classifier with:

Grok pattern: %{OURLOGWITHJSON}

Custom patterns: (Note that Logstash works with GREEDYJSON ({.*}), but Glue's Grok parser rejects that.)

and I get rows with four fields:

The Grok patterns are a bit more complicated than the minimum needed to match that.
I updated the text above, so the backslashes are now correctly shown in the GREEDYJSON pattern. (The original text elided the backslashes in front of the braces in the GREEDYJSON pattern; I needed to add those in order for Glue's Grok parser to accept the pattern.)
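Putting the pieces of this comment together, the "Custom patterns" box presumably looked something like the sketch below. Only the GREEDYJSON definition (with the backslashes) and the OURLOGWITHJSON name are taken from this thread; the first three fields are placeholders standing in for the elided log format:

```
# Sketch of the "Custom patterns" box. GREEDYJSON and OURLOGWITHJSON come
# from the comments above; timestamp/level/message are assumed placeholders
# for the elided log format. Note the backslashes before the braces, which
# Glue's Grok parser requires.
GREEDYJSON (\{.*\})
OURLOGWITHJSON %{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{DATA:message} %{GREEDYJSON:json}
```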
I got the same issue as @vatjujar.
We are also experiencing the same issue while trying to parse Apache-style log lines: everything works perfectly in online Grok debuggers, but manually running a crawler shows nothing. A more detailed example would be greatly appreciated!
I have given it many tries, but it's not working: all my Grok patterns work well in the Grok debugger, but not in AWS Glue.
I tried writing a pattern for a single-quoted, semi-JSON data file, and it works in the debugger, but not in Glue. Any help is much appreciated!
As shown above, I had to include backslashes before the brace characters (see "GREEDYJSON") to get it to match the JSON part of my log lines, captured into a string field named json, which I later unbox in a Glue script like this:

The backslashes weren't necessary in the online Grok debugger or in Logstash, but they were necessary in Glue's Grok patterns. I don't know if that's your issue or not, but you might try adding some backslashes to see if it helps!
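The actual script wasn't captured in this thread; a minimal sketch of the unboxing step with the awsglue Unbox transform might look like the following. Only the field name json comes from the comment above; the database and table names are hypothetical.

```python
# Minimal sketch: unbox the "json" string field into a nested struct.
# Database/table names are hypothetical; only the field name "json"
# comes from the comment above.
from awsglue.context import GlueContext
from awsglue.transforms import Unbox
from pyspark.context import SparkContext

glue_context = GlueContext(SparkContext.getOrCreate())

# Load the table the crawler created from the classified log files.
frame = glue_context.create_dynamic_frame.from_catalog(
    database="logs_db", table_name="app_logs")

# Unbox parses the JSON text in the "json" field into real columns.
unboxed = Unbox.apply(frame=frame, path="json", format="json")
unboxed.printSchema()
```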
I just ran into this same issue. The problem was that in order to test an updated classifier, you need to create a whole new crawler. Simply updating the classifier and re-running the crawler will NOT result in the updated classifier being used. This is not intuitive at all and lacks documentation in the relevant places. The only place this is explicitly mentioned (that I found) is https://docs.aws.amazon.com/glue/latest/dg/add-classifier.html: "To reclassify data to correct an incorrect classifier, create a new crawler with the updated classifier." This nugget of information needs to be added to every other place custom classifiers are documented, in bold capital letters.
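For anyone scripting this, a hedged boto3 sketch of that workaround follows; every name, ARN, and path in it is hypothetical:

```python
# Hedged sketch: after editing a custom classifier, create a NEW crawler
# that references it; re-running the old crawler keeps the old results.
# Every name, ARN, and path below is hypothetical.
import boto3

glue = boto3.client("glue")

glue.create_crawler(
    Name="my-logs-crawler-v2",  # a fresh crawler, not the original
    Role="arn:aws:iam::123456789012:role/MyGlueCrawlerRole",
    DatabaseName="logs_db",
    Targets={"S3Targets": [{"Path": "s3://my-bucket/logs/"}]},
    Classifiers=["my-updated-grok-classifier"],  # the edited classifier
)
glue.start_crawler(Name="my-logs-crawler-v2")
```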
I have a lot of text files in our S3 buckets, under different folders, with columnar data sections. The automatic crawler does not recognize the schema in those files. How do we set up a custom crawler for text files with column data?
How do we set up a crawler on S3 buckets with "ini" file formats?
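As a starting point, a custom Grok classifier can be registered programmatically and attached to a crawler as shown earlier in this thread. A hedged boto3 sketch for simple key = value lines follows; the classifier name and classification string are made up, and real ini files would also need pattern work for [section] headers and comments:

```python
# Hedged sketch: register a custom Grok classifier for "key = value" lines.
# Name and classification are hypothetical; section headers ([...]) and
# ";" comments are NOT handled and would need additional pattern work.
import boto3

glue = boto3.client("glue")

glue.create_classifier(
    GrokClassifier={
        "Name": "ini-key-value-classifier",
        "Classification": "ini",
        "GrokPattern": "%{WORD:key}\\s*=\\s*%{GREEDYDATA:value}",
    }
)
```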
Thank you @vkubushyn, you saved me some time. I faced the same issue here.
In addition: