use vectorized parameter where possible #2

Open. Wants to merge 2 commits into base: 240521_refine_ut_framework

1 change: 1 addition & 0 deletions docs/additional-functionality/advanced_configs.md
@@ -207,6 +207,7 @@ Name | SQL Function(s) | Description | Default Value | Notes
<a name="sql.expression.BitwiseNot"></a>spark.rapids.sql.expression.BitwiseNot|`~`|Returns the bitwise NOT of the operands|true|None|
<a name="sql.expression.BitwiseOr"></a>spark.rapids.sql.expression.BitwiseOr|`\|`|Returns the bitwise OR of the operands|true|None|
<a name="sql.expression.BitwiseXor"></a>spark.rapids.sql.expression.BitwiseXor|`^`|Returns the bitwise XOR of the operands|true|None|
<a name="sql.expression.BoundReference"></a>spark.rapids.sql.expression.BoundReference| |Reference to a bound variable|true|None|
<a name="sql.expression.CaseWhen"></a>spark.rapids.sql.expression.CaseWhen|`when`|CASE WHEN expression|true|None|
<a name="sql.expression.Cast"></a>spark.rapids.sql.expression.Cast|`bigint`, `binary`, `boolean`, `cast`, `date`, `decimal`, `double`, `float`, `int`, `smallint`, `string`, `timestamp`, `tinyint`|Convert a column of one type of data into another type|true|None|
<a name="sql.expression.Cbrt"></a>spark.rapids.sql.expression.Cbrt|`cbrt`|Cube root|true|None|

48 changes: 48 additions & 0 deletions docs/supported_ops.md
@@ -4112,6 +4112,54 @@ are limited.
<td> </td>
</tr>
<tr>
<td rowSpan="2">BoundReference</td>
<td rowSpan="2"> </td>
<td rowSpan="2">Reference to a bound variable</td>
<td rowSpan="2">None</td>
<td rowSpan="1">project</td>
<td>result</td>
<td>S</td>
<td>S</td>
<td>S</td>
<td>S</td>
<td>S</td>
<td>S</td>
<td>S</td>
<td>S</td>
<td><em>PS<br/>UTC is only supported TZ for TIMESTAMP</em></td>
<td>S</td>
<td>S</td>
<td>S</td>
<td>S</td>
<td><b>NS</b></td>
<td><em>PS<br/>UTC is only supported TZ for child TIMESTAMP;<br/>unsupported child types CALENDAR, UDT</em></td>
<td><em>PS<br/>UTC is only supported TZ for child TIMESTAMP;<br/>unsupported child types CALENDAR, UDT</em></td>
<td><em>PS<br/>UTC is only supported TZ for child TIMESTAMP;<br/>unsupported child types CALENDAR, UDT</em></td>
<td><b>NS</b></td>
</tr>
<tr>
<td rowSpan="1">AST</td>
<td>result</td>
<td>S</td>
<td>S</td>
<td>S</td>
<td>S</td>
<td>S</td>
<td>S</td>
<td>S</td>
<td>S</td>
<td><em>PS<br/>UTC is only supported TZ for TIMESTAMP</em></td>
<td>S</td>
<td><b>NS</b></td>
<td><b>NS</b></td>
<td><b>NS</b></td>
<td><b>NS</b></td>
<td><b>NS</b></td>
<td><b>NS</b></td>
<td><b>NS</b></td>
<td><b>NS</b></td>
</tr>
<tr>
<td rowSpan="3">CaseWhen</td>
<td rowSpan="3">`when`</td>
<td rowSpan="3">CASE WHEN expression</td>

@@ -945,6 +945,19 @@ object GpuOverrides extends Logging {
override def convertToGpu(child: Expression): GpuExpression =
GpuAlias(child, a.name)(a.exprId, a.qualifier, a.explicitMetadata)
}),
expr[BoundReference](
"Reference to a bound variable",
ExprChecks.projectAndAst(
TypeSig.astTypes + GpuTypeShims.additionalCommonOperatorSupportedTypes,
(TypeSig.commonCudfTypes + TypeSig.NULL + TypeSig.MAP + TypeSig.ARRAY + TypeSig.STRUCT +
TypeSig.DECIMAL_128 + TypeSig.BINARY +
GpuTypeShims.additionalCommonOperatorSupportedTypes).nested(),
TypeSig.all),
(currentRow, conf, p, r) => new ExprMeta[BoundReference](currentRow, conf, p, r) {
override def convertToGpu(): GpuExpression = GpuBoundReference(
currentRow.ordinal, currentRow.dataType, currentRow.nullable)(
NamedExpression.newExprId, "")
}),
expr[AttributeReference](
"References an input column",
ExprChecks.projectAndAst(

@@ -121,6 +121,7 @@ object RapidsSQLTestsBaseTrait {
"org.apache.spark.sql.rapids.ExecutionPlanCaptureCallback")
.set("spark.sql.warehouse.dir", warehouse)
.set("spark.sql.cache.serializer", "com.nvidia.spark.ParquetCachedBatchSerializer")
// TODO: remove hard coded UTC https://github.com/NVIDIA/spark-rapids/issues/10874
.set("spark.sql.session.timeZone", "UTC")
.set("spark.rapids.sql.explain", "ALL")
// uncomment below config to run `strict mode`, where fallback to CPU is treated as fail

@@ -22,13 +22,14 @@ package org.apache.spark.sql.rapids.utils
import java.io.File
import java.util.TimeZone

import com.nvidia.spark.rapids.{GpuProjectExec, TestStats}
import com.nvidia.spark.rapids.{ExprChecksImpl, GpuOverrides, GpuProjectExec, ProjectExprContext, TestStats, TypeEnum, TypeSig}
import org.apache.commons.io.{FileUtils => fu}
import org.apache.commons.math3.util.Precision
import org.scalactic.TripleEqualsSupport.Spread
import scala.collection.mutable
import scala.collection.mutable.ArrayBuffer

import org.apache.spark.sql.{Column, Row, SparkSession}
import org.apache.spark.sql.{Column, DataFrame, Row, SparkSession}
import org.apache.spark.sql.catalyst.{CatalystTypeConverters, InternalRow}
import org.apache.spark.sql.catalyst.analysis.ResolveTimeZone
import org.apache.spark.sql.catalyst.expressions._
@@ -37,7 +38,7 @@ import org.apache.spark.sql.catalyst.util.{ArrayData, GenericArrayData, MapData,
import org.apache.spark.sql.internal.SQLConf
import org.apache.spark.sql.rapids.utils.RapidsQueryTestUtil.isNaNOrInf
import org.apache.spark.sql.types._

import org.apache.spark.unsafe.types.UTF8String

trait RapidsTestsTrait extends RapidsTestsCommonTrait {

@@ -101,6 +102,7 @@ trait RapidsTestsTrait extends RapidsTestsCommonTrait {
.config("spark.sql.queryExecutionListeners",
"org.apache.spark.sql.rapids.ExecutionPlanCaptureCallback")
.config("spark.sql.warehouse.dir", warehouse)
// TODO: remove hard coded UTC https://github.com/NVIDIA/spark-rapids/issues/10874
.config("spark.sql.session.timeZone","UTC")
.config("spark.rapids.sql.explain", "ALL")
.config("spark.rapids.sql.test.isFoldableNonLitAllowed", "true")
@@ -221,17 +223,98 @@ trait RapidsTestsTrait extends RapidsTestsCommonTrait {
RapidsTestConstants.SUPPORTED_DATA_TYPE.acceptsType(expr.dataType)
}

/**
* Many of the expressions in RAPIDS do not support vectorized parameters (e.g. regexp_replace),
* so we need to check whether the expression being evaluated qualifies for vectorized parameters.
*
* If yes, we pass the parameters of the expression as vectors (Vectorized Parameters).
*
* If no, we replace all the parameters with literals (Scalar Parameters) and evaluate
* the expression. We are actually evaluating a constant expression tree in that case,
* but it is fine for testing purposes. Note that Constant Folding must be disabled for
* this to work.
*
* We always prefer Vectorized Parameters when evaluating expressions, because Scalar
* Parameters may hide bugs. For example, an expression `some_expr(NULL)` may correctly
* return NULL only because NullPropagation is working; if we evaluate the same expression
* with a vector containing NULL, it might fail.
*
* @param e the expression being evaluated
* @return true if the expression qualifies for vectorized parameters
*/
def isQualifiedForVectorizedParams(e: Expression): Boolean = {
val map = GpuOverrides.expressions
e.foreachUp(expr => {
logDebug(s"Checking expr $expr :\n")
if (!map.contains(expr.getClass)) {
logDebug(s"Check failed because ${expr.getClass} not found in GpuOverrides.expressions\n")
return false
}
map(expr.getClass).getChecks.foreach(check => {
if (check.isInstanceOf[ExprChecksImpl]) {
val exprChecksImpl = check.asInstanceOf[ExprChecksImpl]
if (!exprChecksImpl.contexts.contains(ProjectExprContext)) {
logDebug(s"Check failed because $exprChecksImpl does not contain ProjectExprContext\n")
return false
}
val context = exprChecksImpl.contexts(ProjectExprContext)
(context.paramCheck.map(_.cudf) ++ context.repeatingParamCheck.map(_.cudf))
.foreach(sig => {
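// litOnlyTypes is the set of parameter types the GPU implementation accepts only as
// literals (for example, the regex pattern of regexp_replace), so any literal-only
// parameter disqualifies the expression from vectorized parameters.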
// use reflection to get the private field litOnlyTypes
import scala.reflect.runtime.universe._
val mirror = runtimeMirror(sig.getClass.getClassLoader)
val privateFieldSymbol = typeOf[TypeSig].decl(TermName("litOnlyTypes")).asTerm
val privateFieldMirror =
mirror.reflect(sig).reflectField(privateFieldSymbol)
val litOnlyTypes = privateFieldMirror.get.asInstanceOf[TypeEnum.ValueSet]
if (litOnlyTypes.nonEmpty) {
logDebug(s"Check failed because non empty litOnlyTypes: $litOnlyTypes \n")
return false
}
})
} else {
logDebug(s"Check continues by skipping ${check.getClass}")
}
})
})
logDebug(s"Check succeed")
true
}
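
// A minimal illustrative sketch (not part of this change; the method name is made up)
// of the two strategies above, for UPPER(col0) over a single input row containing "abc".
// It only uses the Catalyst classes already imported in this file.
private def vectorizedVsScalarParamSketch(): Unit = {
// Vectorized Parameter: keep the BoundReference so the expression is evaluated against
// a real (one-row) input column built from the input row.
val vectorized = Upper(BoundReference(0, StringType, nullable = true))
// Scalar Parameter: fold the input row into the tree as literals, yielding the constant
// expression UPPER('abc'); Constant Folding must stay disabled so it still gets evaluated.
val inputRow = InternalRow(UTF8String.fromString("abc"))
val scalar = vectorized.transformUp {
case BoundReference(ordinal, dataType, _) =>
Literal(inputRow.get(ordinal, dataType), dataType)
}
logDebug(s"vectorized form: $vectorized, scalar form: $scalar")
}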

def rapidsCheckExpression(origExpr: Expression, expected: Any, inputRow: InternalRow): Unit = {
var result: Array[Row] = null
var resultDF: DataFrame = null
var expression = origExpr

if (!isQualifiedForVectorizedParams(origExpr)) {
logInfo(s"$origExpr is being evaluated with Scalar Parameter")
println(s"$origExpr is being evaluated with Scalar Parameter")
expression = origExpr.transformUp {
case BoundReference(ordinal, dataType, _) =>
Literal(inputRow.asInstanceOf[GenericInternalRow].get(ordinal, dataType), dataType)
}
resultDF = _spark.range(0, 1).select(Column(expression))
result = resultDF.collect()
} else {
logInfo(s"$expression is being evaluated with Vectorized Parameter")
println(s"$expression is being evaluated with Vectorized Parameter")
val typeHintForOrdinal: Map[Int, DataType] = expression.collect {
// Spark unit tests typically define bound references with explicit types,
// e.g. `val s = 's.string.at(0)` for a string column at ordinal 0.
case b: BoundReference => b.ordinal -> b.dataType
}.toMap
val df = if (inputRow != EmptyRow && inputRow != InternalRow.empty) {
convertInternalRowToDataFrame(inputRow, typeHintForOrdinal)
} else {
// create a placeholder one-row DataFrame for expressions that need no meaningful input row
val schema = StructType(StructField("a", IntegerType, nullable = true) :: Nil)
val empData = Seq(Row(1))
_spark.createDataFrame(_spark.sparkContext.parallelize(empData), schema)
}
resultDF = df.select(Column(expression))
result = resultDF.collect()
}

TestStats.testUnitNumber = TestStats.testUnitNumber + 1
if (
checkDataTypeSupported(expression) &&
@@ -293,4 +376,54 @@ trait RapidsTestsTrait extends RapidsTestsCommonTrait {
}
true
}

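/**
* Builds a one-row DataFrame from the given InternalRow so that an expression can be
* evaluated with Vectorized Parameters. Column types come from `typeHintForOrdinal`
* (collected from the BoundReferences of the expression) when present; otherwise they are
* inferred from the boxed runtime value, and a null value without a hint maps to NullType.
*/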
def convertInternalRowToDataFrame(
inputRow: InternalRow, typeHintForOrdinal: Map[Int, DataType]) : DataFrame = {
val structFileSeq = new ArrayBuffer[StructField]()
val values = inputRow match {
case genericInternalRow: GenericInternalRow =>
genericInternalRow.values
case _ => throw new UnsupportedOperationException("Unsupported InternalRow.")
}
values.zipWithIndex.foreach { pair => {
if (typeHintForOrdinal.contains(pair._2)) {
structFileSeq.append(
StructField(s"col${pair._2}", typeHintForOrdinal(pair._2), pair._1 == null))
} else {
pair._1 match {
case boolean: java.lang.Boolean =>
structFileSeq.append(StructField(s"col${pair._2}", BooleanType, boolean == null))
case short: java.lang.Short =>
structFileSeq.append(StructField(s"col${pair._2}", ShortType, short == null))
case byte: java.lang.Byte =>
structFileSeq.append(StructField(s"col${pair._2}", ByteType, byte == null))
case integer: java.lang.Integer =>
structFileSeq.append(StructField(s"col${pair._2}", IntegerType, integer == null))
case long: java.lang.Long =>
structFileSeq.append(StructField(s"col${pair._2}", LongType, long == null))
case float: java.lang.Float =>
structFileSeq.append(StructField(s"col${pair._2}", FloatType, float == null))
case double: java.lang.Double =>
structFileSeq.append(StructField(s"col${pair._2}", DoubleType, double == null))
case utf8String: UTF8String =>
structFileSeq.append(StructField(s"col${pair._2}", StringType, utf8String == null))
case byteArr: Array[Byte] =>
structFileSeq.append(StructField(s"col${pair._2}", BinaryType, byteArr == null))
case decimal: Decimal =>
structFileSeq.append(
StructField(s"col${pair._2}", DecimalType(decimal.precision, decimal.scale),
decimal == null))
case null =>
structFileSeq.append(StructField(s"col${pair._2}", NullType, nullable = true))
case unsupported =>
throw new UnsupportedOperationException(s"Unsupported type: ${unsupported.getClass}")
}
}
}
}
val fields = structFileSeq.toSeq
_spark.internalCreateDataFrame(
_spark.sparkContext.parallelize(Seq(inputRow)),
StructType(fields))
}
}
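
For reference, a minimal standalone sketch (hypothetical values, not part of the diff) of the schema the new `convertInternalRowToDataFrame` helper derives when one ordinal has a type hint and the other is inferred from its boxed value:

```scala
import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}
import org.apache.spark.unsafe.types.UTF8String

// ordinal 0: no hint, inferred from the boxed Integer; ordinal 1: hinted as StringType
val inputRow = InternalRow(Integer.valueOf(1), UTF8String.fromString("abc"))
val typeHintForOrdinal = Map(1 -> StringType)

// Expected schema of the resulting one-row DataFrame
// (nullable is false because neither value is null):
val expectedSchema = StructType(Seq(
  StructField("col0", IntegerType, nullable = false),
  StructField("col1", StringType, nullable = false)))
```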
1 change: 1 addition & 0 deletions tools/generated_files/311/operatorsScore.csv
@@ -72,6 +72,7 @@ BitwiseAnd,4
BitwiseNot,4
BitwiseOr,4
BitwiseXor,4
BoundReference,4
CaseWhen,4
Cbrt,4
Ceil,4

2 changes: 2 additions & 0 deletions tools/generated_files/311/supportedExprs.csv
@@ -112,6 +112,8 @@ BitwiseXor,S,`^`,None,project,result,NA,S,S,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BitwiseXor,S,`^`,None,AST,lhs,NA,NS,NS,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BitwiseXor,S,`^`,None,AST,rhs,NA,NS,NS,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BitwiseXor,S,`^`,None,AST,result,NA,NS,NS,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BoundReference,S, ,None,project,result,S,S,S,S,S,S,S,S,PS,S,S,S,S,NS,PS,PS,PS,NS
BoundReference,S, ,None,AST,result,S,S,S,S,S,S,S,S,PS,S,NS,NS,NS,NS,NS,NS,NS,NS
CaseWhen,S,`when`,None,project,predicate,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
CaseWhen,S,`when`,None,project,value,S,S,S,S,S,S,S,S,PS,S,S,S,S,NS,PS,PS,PS,NS
CaseWhen,S,`when`,None,project,result,S,S,S,S,S,S,S,S,PS,S,S,S,S,NS,PS,PS,PS,NS

1 change: 1 addition & 0 deletions tools/generated_files/312/operatorsScore.csv
@@ -72,6 +72,7 @@ BitwiseAnd,4
BitwiseNot,4
BitwiseOr,4
BitwiseXor,4
BoundReference,4
CaseWhen,4
Cbrt,4
Ceil,4

2 changes: 2 additions & 0 deletions tools/generated_files/312/supportedExprs.csv
@@ -112,6 +112,8 @@ BitwiseXor,S,`^`,None,project,result,NA,S,S,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BitwiseXor,S,`^`,None,AST,lhs,NA,NS,NS,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BitwiseXor,S,`^`,None,AST,rhs,NA,NS,NS,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BitwiseXor,S,`^`,None,AST,result,NA,NS,NS,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BoundReference,S, ,None,project,result,S,S,S,S,S,S,S,S,PS,S,S,S,S,NS,PS,PS,PS,NS
BoundReference,S, ,None,AST,result,S,S,S,S,S,S,S,S,PS,S,NS,NS,NS,NS,NS,NS,NS,NS
CaseWhen,S,`when`,None,project,predicate,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
CaseWhen,S,`when`,None,project,value,S,S,S,S,S,S,S,S,PS,S,S,S,S,NS,PS,PS,PS,NS
CaseWhen,S,`when`,None,project,result,S,S,S,S,S,S,S,S,PS,S,S,S,S,NS,PS,PS,PS,NS

1 change: 1 addition & 0 deletions tools/generated_files/313/operatorsScore.csv
@@ -72,6 +72,7 @@ BitwiseAnd,4
BitwiseNot,4
BitwiseOr,4
BitwiseXor,4
BoundReference,4
CaseWhen,4
Cbrt,4
Ceil,4

2 changes: 2 additions & 0 deletions tools/generated_files/313/supportedExprs.csv
@@ -112,6 +112,8 @@ BitwiseXor,S,`^`,None,project,result,NA,S,S,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BitwiseXor,S,`^`,None,AST,lhs,NA,NS,NS,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BitwiseXor,S,`^`,None,AST,rhs,NA,NS,NS,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BitwiseXor,S,`^`,None,AST,result,NA,NS,NS,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BoundReference,S, ,None,project,result,S,S,S,S,S,S,S,S,PS,S,S,S,S,NS,PS,PS,PS,NS
BoundReference,S, ,None,AST,result,S,S,S,S,S,S,S,S,PS,S,NS,NS,NS,NS,NS,NS,NS,NS
CaseWhen,S,`when`,None,project,predicate,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
CaseWhen,S,`when`,None,project,value,S,S,S,S,S,S,S,S,PS,S,S,S,S,NS,PS,PS,PS,NS
CaseWhen,S,`when`,None,project,result,S,S,S,S,S,S,S,S,PS,S,S,S,S,NS,PS,PS,PS,NS

1 change: 1 addition & 0 deletions tools/generated_files/320/operatorsScore.csv
@@ -76,6 +76,7 @@ BitwiseAnd,4
BitwiseNot,4
BitwiseOr,4
BitwiseXor,4
BoundReference,4
CaseWhen,4
Cbrt,4
Ceil,4

2 changes: 2 additions & 0 deletions tools/generated_files/320/supportedExprs.csv
@@ -112,6 +112,8 @@ BitwiseXor,S,`^`,None,project,result,NA,S,S,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BitwiseXor,S,`^`,None,AST,lhs,NA,NS,NS,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BitwiseXor,S,`^`,None,AST,rhs,NA,NS,NS,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BitwiseXor,S,`^`,None,AST,result,NA,NS,NS,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BoundReference,S, ,None,project,result,S,S,S,S,S,S,S,S,PS,S,S,S,S,NS,PS,PS,PS,NS,NS,NS
BoundReference,S, ,None,AST,result,S,S,S,S,S,S,S,S,PS,S,NS,NS,NS,NS,NS,NS,NS,NS,NS,NS
CaseWhen,S,`when`,None,project,predicate,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
CaseWhen,S,`when`,None,project,value,S,S,S,S,S,S,S,S,PS,S,S,S,S,NS,PS,PS,PS,NS,NS,NS
CaseWhen,S,`when`,None,project,result,S,S,S,S,S,S,S,S,PS,S,S,S,S,NS,PS,PS,PS,NS,NS,NS

1 change: 1 addition & 0 deletions tools/generated_files/321/operatorsScore.csv
@@ -76,6 +76,7 @@ BitwiseAnd,4
BitwiseNot,4
BitwiseOr,4
BitwiseXor,4
BoundReference,4
CaseWhen,4
Cbrt,4
Ceil,4

2 changes: 2 additions & 0 deletions tools/generated_files/321/supportedExprs.csv
@@ -112,6 +112,8 @@ BitwiseXor,S,`^`,None,project,result,NA,S,S,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BitwiseXor,S,`^`,None,AST,lhs,NA,NS,NS,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BitwiseXor,S,`^`,None,AST,rhs,NA,NS,NS,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BitwiseXor,S,`^`,None,AST,result,NA,NS,NS,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BoundReference,S, ,None,project,result,S,S,S,S,S,S,S,S,PS,S,S,S,S,NS,PS,PS,PS,NS,NS,NS
BoundReference,S, ,None,AST,result,S,S,S,S,S,S,S,S,PS,S,NS,NS,NS,NS,NS,NS,NS,NS,NS,NS
CaseWhen,S,`when`,None,project,predicate,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
CaseWhen,S,`when`,None,project,value,S,S,S,S,S,S,S,S,PS,S,S,S,S,NS,PS,PS,PS,NS,NS,NS
CaseWhen,S,`when`,None,project,result,S,S,S,S,S,S,S,S,PS,S,S,S,S,NS,PS,PS,PS,NS,NS,NS

1 change: 1 addition & 0 deletions tools/generated_files/321cdh/operatorsScore.csv
@@ -76,6 +76,7 @@ BitwiseAnd,4
BitwiseNot,4
BitwiseOr,4
BitwiseXor,4
BoundReference,4
CaseWhen,4
Cbrt,4
Ceil,4

2 changes: 2 additions & 0 deletions tools/generated_files/321cdh/supportedExprs.csv
@@ -112,6 +112,8 @@ BitwiseXor,S,`^`,None,project,result,NA,S,S,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BitwiseXor,S,`^`,None,AST,lhs,NA,NS,NS,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BitwiseXor,S,`^`,None,AST,rhs,NA,NS,NS,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BitwiseXor,S,`^`,None,AST,result,NA,NS,NS,S,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
BoundReference,S, ,None,project,result,S,S,S,S,S,S,S,S,PS,S,S,S,S,NS,PS,PS,PS,NS,NS,NS
BoundReference,S, ,None,AST,result,S,S,S,S,S,S,S,S,PS,S,NS,NS,NS,NS,NS,NS,NS,NS,NS,NS
CaseWhen,S,`when`,None,project,predicate,S,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA,NA
CaseWhen,S,`when`,None,project,value,S,S,S,S,S,S,S,S,PS,S,S,S,S,NS,PS,PS,PS,NS,NS,NS
CaseWhen,S,`when`,None,project,result,S,S,S,S,S,S,S,S,PS,S,S,S,S,NS,PS,PS,PS,NS,NS,NS

1 change: 1 addition & 0 deletions tools/generated_files/322/operatorsScore.csv
@@ -76,6 +76,7 @@ BitwiseAnd,4
BitwiseNot,4
BitwiseOr,4
BitwiseXor,4
BoundReference,4
CaseWhen,4
Cbrt,4
Ceil,4