feat(aci): registry for dual writing conditions #82021
Conversation
event_value = get_nested_value(data, data_filter)

def evaluate_value(
    value: GroupEvent, comparison: Any, condition: str, **kwargs: Any
) -> DetectorPriorityLevel | int | float | bool | None:
my local mypy was complaining about this
`DataConditionResult` is probs the best one to use from `sentry.workflow_engine.types` (it's the same as what you have)
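For reference, a minimal sketch of the suggested signature; `DetectorPriorityLevel` below is a stand-in enum, and the real `DataConditionResult` alias lives in `sentry.workflow_engine.types`:

```python
# Hedged sketch only; the real DetectorPriorityLevel and DataConditionResult
# are defined in the Sentry codebase (sentry.workflow_engine.types).
from enum import IntEnum
from typing import Any


class DetectorPriorityLevel(IntEnum):
    # Stand-in members, not the actual enum values.
    OK = 0
    HIGH = 75


# Per the review comment, the alias is equivalent to the union spelled out in the diff.
DataConditionResult = DetectorPriorityLevel | int | float | bool | None


def evaluate_value(value: Any, comparison: Any, condition: str, **kwargs: Any) -> DataConditionResult:
    # Placeholder body; the real handler resolves `condition` against the GroupEvent.
    return None
```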
Codecov Report
All modified and coverable lines are covered by tests ✅
✅ All tests successful. No failed tests found.

Additional details and impacted files:

@@            Coverage Diff            @@
##           master   #82021     +/-   ##
=========================================
  Coverage   80.35%   80.36%
=========================================
  Files        7279     7280        +1
  Lines      321319   321381       +62
  Branches    20958    20958
=========================================
+ Hits       258206   258275       +69
+ Misses      62699    62692        -7
  Partials      414      414
    data: dict[str, Any], dcg: DataConditionGroup
) -> DataCondition:
    return DataCondition.objects.create(
        type=Condition.GROUP_EVENT_ATTR_COMPARISON,
my worry with using the same type for different conditions is that they'll need to be serialized differently, but i think we can figure it out by making an enum for the possible condition values here. we also have conditions you can no longer select in the UI but we'll still have to show because we don't delete them
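Roughly what that enum could look like; the class and member names here are hypothetical, not actual Sentry conditions:

```python
# Hypothetical illustration of "an enum for the possible condition values";
# names are invented for the example.
from enum import Enum


class GroupEventAttrCondition(str, Enum):
    # Each member would be the condition string stored on a DataCondition of
    # type GROUP_EVENT_ATTR_COMPARISON, so serializers can branch on it.
    EVERY_EVENT = "every_event"
    REGRESSION = "regression"
    REAPPEARED = "reappeared"
```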
if it's not too troublesome.. mind splitting the changes to the platform and the changes for the migration into 2 PRs? i think that'd help me understand what code is being added for each task.
🤔 i think this module name might be a little confusing - it seems like this is creating instances of conditions for the issue rules? Maybe we can make a `migrations/rule.py` or `migrations/rule/*` if you want a full module. colleen has a similar pr that also has a bunch of migration helpers, and i feel like all that code could be co-located, as they can probs be removed after the rollout is completed as part of the "Cleanup" milestone.
migrations is a good name
 elif op is not None:
-    result = op(cast(Any, value), self.comparison)
+    if len(signature(op).parameters) == 1:
+        result = op(cast(Any, value)) == self.comparison
can you explain this a bit? i'm wondering if we can use one of the existing operators rather than introducing this if else nesting for the 1 operator.
`operator.truth` and some other operators like `abs` take in 1 argument, the other operators take in 2. if you want to compare the ones with a single argument to the comparison then we need special logic
yeah, but do we need to add the truth operator? could we just say `Value === True` instead of adding the truth operator? we also don't have a use case for `abs`; since the goal of data conditions is to compare a dynamic value (value) to a static value (comparison), it feels a little weird to have a single evaluation.
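A self-contained illustration of the arity check under discussion, using only the standard library; `apply_op` is a stand-in for the condition evaluation in the diff above, not the actual method:

```python
# Stand-in for the evaluation logic: unary operators (operator.truth, abs)
# are applied to the value and the result is compared to `comparison`;
# binary operators take value and comparison directly.
import operator
from inspect import signature
from typing import Any, Callable


def apply_op(op: Callable[..., Any], value: Any, comparison: Any) -> Any:
    if len(signature(op).parameters) == 1:
        return op(value) == comparison
    return op(value, comparison)


# operator.truth takes one argument, so the truthiness result is compared to True.
assert apply_op(operator.truth, {"some": "event"}, True) is True
# operator.gt takes two arguments and compares value to comparison directly.
assert apply_op(operator.gt, 10, 5) is True
```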
@@ -42,7 +42,7 @@ def bulk_get_query_object(data_sources) -> dict[int, T | None]:

 class DataConditionHandler(Generic[T]):
     @staticmethod
-    def evaluate_value(value: T, comparison: Any, condition: str) -> DataConditionResult:
+    def evaluate_value(value: T, comparison: Any, condition: str, **kwargs) -> DataConditionResult:
can you explain how these kwargs are being used?
we discussed this before, we can pass the other parts of `PostProcessJob` that aren't the `GroupEvent` as `kwargs` so they can be evaluated in workflow conditions, and keep `process_workflows` general for other group types that don't have kwargs. there is an `EventState` object we check for RegressionCondition and ReappearedEventCondition today, for example
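A hedged sketch of that flow; `FakeEventState` and the `group_state` kwarg name are invented stand-ins for the real `PostProcessJob` / `EventState` pieces:

```python
# Illustration only: the GroupEvent is the `value`, and anything else from
# post-processing (e.g. the event state) is forwarded via kwargs.
from dataclasses import dataclass
from typing import Any


@dataclass
class FakeEventState:
    is_regression: bool
    is_new: bool


def evaluate_value(value: Any, comparison: Any, condition: str, **kwargs: Any) -> bool:
    # Simplified: look the condition up on the state object passed in kwargs.
    state = kwargs.get("group_state")
    return getattr(state, condition, None) == comparison


state = FakeEventState(is_regression=True, is_new=False)
assert evaluate_value(object(), True, "is_regression", group_state=state)
```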
okay cool, wanted to confirm that was the intent -- at this level, i don't think we should be referring to them as kwargs though, instead we should be evaluating them as the value (whichever kwarg you want to evaluate).
this also seems like it's only adding the passing of that stuff in a limited scope, instead we should pull this out as a separate PR and look at that feature as a whole. then we can start to build off of it, otherwise there's a lot of noise and it's hard to follow everything.
event_value = get_nested_value(value, condition)
if event_value is None:
    event_value = get_nested_value(
        kwargs, condition
    )  # TODO: remove when GroupEvent contains EventState
do we need to split here? could we just invoke the condition with the kwargs as the value? (if it's a different input type, we can add another method here with a different type and reuse the `get_nested_value` method in that handler)
sure i can do that
i put this here in the case that we actually do decide to move these out of a separate kwarg and into the `GroupEvent` itself
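For reference, an assumed version of what a `get_nested_value`-style helper does (walk a dotted path over attributes or dict keys); this is a guess at the behavior, not the Sentry implementation:

```python
# Assumed behavior only: resolve "a.b.c" against nested dicts/objects,
# returning a default when any step is missing.
from typing import Any


def get_nested_value(obj: Any, path: str, default: Any = None) -> Any:
    current = obj
    for part in path.split("."):
        if current is None:
            return default
        if isinstance(current, dict):
            current = current.get(part, default)
        else:
            current = getattr(current, part, default)
    return current


assert get_nested_value({"group_state": {"is_regression": True}}, "group_state.is_regression") is True
assert get_nested_value({}, "group_state.is_regression") is None
```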
@saponifi3d i feel like it's helpful to know why we are creating data conditions a certain way if you can see how the fields are being used in the handler, lmk if you feel differently
i can split it
too many merge conflicts. replaced by #82511
(and migrate every event, regression, reappeared conditions)

Add a registry for dual writing a condition on a `Rule` to a `DataCondition` based on the condition `id`.

EveryEventCondition is an example where we can replace the condition with a Python operator (`operator.truth`), since we are evaluating on a `GroupEvent`, which is always truthy.

Regression and ReappearedCondition are simple conditions that directly evaluate on `GroupState`, which will be passed in `kwargs` to `DataCondition.evaluate_value`.
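A minimal sketch of what a dual-write registry along these lines could look like; the decorator name, the rule condition id string, and the translator signature are illustrative assumptions, not the code from this PR:

```python
# Hypothetical registry: map a Rule condition id to a translator that builds
# the equivalent DataCondition. All names here are invented for illustration.
from typing import Any, Callable

CONDITION_TRANSLATORS: dict[str, Callable[..., Any]] = {}


def translate_condition(rule_condition_id: str) -> Callable[[Callable[..., Any]], Callable[..., Any]]:
    def decorator(func: Callable[..., Any]) -> Callable[..., Any]:
        CONDITION_TRANSLATORS[rule_condition_id] = func
        return func

    return decorator


@translate_condition("sentry.rules.conditions.every_event.EveryEventCondition")  # id string assumed
def create_every_event_condition(data: dict[str, Any], dcg: Any) -> Any:
    # The real translator would create a DataCondition row here; per the PR
    # description, EveryEventCondition can be expressed with operator.truth
    # because a GroupEvent is always truthy.
    ...
```

During dual write, the migration would look up the Rule condition's id in the registry and skip any condition that has no registered translator yet.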