
Commit

Merge remote-tracking branch 'origin/staging' into improvement/hocon-reload
mk-software-pl committed Aug 13, 2024
2 parents 03a4f1e + fb5b0ab commit 680090f
Showing 606 changed files with 9,109 additions and 4,754 deletions.
3 changes: 0 additions & 3 deletions .github/workflows/pr.yml
@@ -612,9 +612,6 @@ jobs:
- repo: nussknacker-quickstart
workflow_id: pr.yml
ref: staging
-  - repo: nussknacker-quickstart
-    workflow_id: benchmark-workflow.yml
-    ref: staging
- repo: nussknacker-sample-components
workflow_id: pr.yml
ref: staging
7 changes: 7 additions & 0 deletions .github/workflows/release.yml
@@ -106,6 +106,13 @@ jobs:
sudo rm -rf "$AGENT_TOOLSDIRECTORY/Python"
sudo rm -rf "$AGENT_TOOLSDIRECTORY/PyPy"
sudo rm -rf "$AGENT_TOOLSDIRECTORY/CodeQL"
+sudo rm -rf /usr/lib/firefox
+sudo rm -rf /opt/google/chrome
+sudo rm -rf /opt/microsoft/msedge
+sudo rm -rf /opt/microsoft/powershell
+sudo rm -rf /usr/local/lib/android
+sudo rm -rf /usr/local/share/chromium
+sudo rm -rf /usr/local/share/powershell
- name: "Build"
env:
SONATYPE_USERNAME: ${{ secrets.SONATYPE_USER }}
10 changes: 5 additions & 5 deletions README.md
@@ -33,18 +33,18 @@ Algorithms for making such decisions can be developed by programmers. With Nussk

An essential part of Nussknacker is a visual design tool for decision algorithms (scenarios in Nussknacker's speak). It allows not-so-technical users, like analysts or business people, to author decision logic in an imperative, easy-to-follow and understandable way. Scenario author uses prebuilt components to define the decision logic - routing, filtering, data transformations, aggregations in time windows (Flink engine only - see below), enrichments with data from external databases or OpenAPI endpoints, applications of ML models, etc. Once authored, with a click of a button, scenarios are deployed for execution. And can be changed and redeployed anytime there’s a need.

-The way the data are processed and features available depend on the processing mode and engine used.
+The way the data are processed and the features available depend on the processing mode and engine used.

Nussknacker supports three [processing modes](https://nussknacker.io/documentation/docs/about/ProcessingModes/): streaming, request-response and batch (planned in version 1.16). In streaming mode, Nussknacker uses Kafka as its primary interface: input streams of data and output streams of decisions. In request-response mode, it exposes HTTP endpoints with OpenAPI definitions.

-There are two engines to which scenarios can be deployed: Flink and Light. Check out [this document](https://nussknacker.io/documentation/docs/about/engines/) to understand which of the two fits your use case better.
+There are two engines to which scenarios can be deployed: Flink and Lite. Check out [this document](https://nussknacker.io/documentation/docs/about/engines/) to understand which of the two fits your use case better.

## Why Nussknacker

-Nussknacker promises to make developing and deploying real-time decision algorithms as easy as it is to crunch [data at rest](https://en.wikipedia.org/wiki/Data_at_rest) with spreadsheets. Hundreds of millions of non-programmers create spreadsheets to crunch data at rest these days. The same should be possible with real-time data - and this is our promise with Nussknacker. If this promise is fulfilled, domain experts and developers can focus on tasks that each of these two groups is most happy to perform. Domain experts can author the decision algorithms and developers can solve problems beyond the reach of tools like Nussknacker.
+Nussknacker is a tool for those who want to act on real-time data as easily as it is with [data at rest](https://en.wikipedia.org/wiki/Data_at_rest) and spreadsheets. Hundreds of millions of non-programmers create spreadsheets to crunch data at rest these days. The same should be possible with real-time data - and this is our promise with Nussknacker. If this promise is fulfilled, domain experts and developers can focus on tasks that each group is most happy to perform. Domain experts can author the decision algorithms and developers can solve problems beyond the reach of tools like Nussknacker.

We discovered that several factors heavily influence the development of algorithms that work with real-time data, including expectations placed on the tools used:
-- **Domain experts** - often, these are domain experts who conceptualize the algorithms, and the expertise required is very domain specific. Without proper tools for converting algorithms to code, domain experts have to delegate this work to programmers who are proficient in multiple tools, programming languages, and technologies. This approach costs money and takes time. With Nussknacker, domain experts build the algorithm from prefabricated blocks. The trick is to make these prefabricated blocks infinitely flexible to allow for any data transformation and flow control condition. Nussknacker achieves this by using [SpEL](https://nussknacker.io/documentation/docs/scenarios_authoring/Intro/#spel), an easy-to-learn expression language.
+- **Domain experts** - often, these are domain experts who conceptualize the algorithms, and the expertise required is very domain-specific. Without proper tools for converting algorithms to code, domain experts have to delegate this work to programmers who are proficient in multiple tools, programming languages, and technologies. This approach costs money and takes time. With Nussknacker, domain experts build the algorithm from prefabricated blocks. The trick is to make these prefabricated blocks infinitely flexible to allow for any data transformation and flow control condition. Nussknacker achieves this by using [SpEL](https://nussknacker.io/documentation/docs/scenarios_authoring/Intro/#spel), an easy-to-learn expression language.
- **Experimentation** - the algorithms may require a lot of experimentation before one gets them right. If so, the iteration time to implement a change, deploy it, and see the result should be in single minutes if not seconds. With Nussknacker, non-technical users can achieve iteration time below one minute.
- **Data enrichment** - the data stream's or request's informational content can be very limited - for example, Call Data Records (CDRs), clickstream, and sensor readouts. After initial filtering, if one needs to build useful algorithms, the original data has to be enriched with data from external sources. Nussknacker supports SQL, OpenAPI and ML enrichments. As all of them can be treated as a function call they blend very well with the expression language used by Nussknacker and do not add any additional complexity to algorithm authoring.
- **Productivity** - if low-code solutions want to be considered tools rather than toys, they must offer features available in professional developer toolkits. Nussknacker Designer has built-in [syntax checking, code completion](https://nussknacker.io/documentation/docs/about/KeyFeatures/#smart-code-suggestions-and-validation), versioning, [debugging, and testing support](https://nussknacker.io/documentation/docs/next/scenarios_authoring/TestingAndDebugging/).
@@ -74,7 +74,7 @@ Nussknacker is typically used as a component of a larger system, but it can be u
- [Typical deployment](https://nussknacker.io/documentation/docs/about/typical%20implementation/Streaming/)
- [Authoring scenarios with Nussknacker](https://nussknacker.io/documentation/docs/scenarios_authoring/Intro/)
- [Customer success story](https://nussknacker.io/case-studies/real-time-marketing-for-a-telecom-service-provider/)
-- [Nussknacker Enterprise](https://nussknacker.io/documentation/about/NussknackerEnterprise/)
+- [Nussknacker Enterprise](https://nussknacker.io/documentation/docs/about/NussknackerEnterprise/)

## Quickstart

@@ -8,7 +8,7 @@ import pl.touk.nussknacker.engine.api.{Context, MethodToInvoke, ParamName, Servi
import pl.touk.nussknacker.engine.build.ScenarioBuilder
import pl.touk.nussknacker.engine.canonicalgraph.CanonicalProcess
import pl.touk.nussknacker.engine.graph.expression.Expression
-import pl.touk.nussknacker.engine.spel.Implicits._
+import pl.touk.nussknacker.engine.spel.SpelExtension._
import pl.touk.nussknacker.engine.util.SynchronousExecutionContextAndIORuntime

import java.util.concurrent.TimeUnit
@@ -24,7 +24,7 @@ class ManyParamsInterpreterBenchmark {
private val process: CanonicalProcess = ScenarioBuilder
.streaming("t1")
.source("source", "source")
-.enricher("e1", "out", "service", (1 to 20).map(i => s"p$i" -> ("''": Expression)): _*)
+.enricher("e1", "out", "service", (1 to 20).map(i => s"p$i" -> ("''".spel: Expression)): _*)
.emptySink("sink", "sink")

private val interpreterIOSyncService = prepareIoInterpreter(
@@ -8,7 +8,7 @@ import pl.touk.nussknacker.engine.api.component.ComponentDefinition
import pl.touk.nussknacker.engine.api.process.ServiceExecutionContext
import pl.touk.nussknacker.engine.build.ScenarioBuilder
import pl.touk.nussknacker.engine.canonicalgraph.CanonicalProcess
-import pl.touk.nussknacker.engine.spel.Implicits._
+import pl.touk.nussknacker.engine.spel.SpelExtension._
import pl.touk.nussknacker.engine.util.SynchronousExecutionContextAndIORuntime

import java.util.concurrent.TimeUnit
@@ -25,8 +25,8 @@ class OneParamInterpreterBenchmark {
private val process: CanonicalProcess = ScenarioBuilder
.streaming("t1")
.source("source", "source")
-.buildSimpleVariable("v1", "v1", "{a:'', b: 2}")
-.enricher("e1", "out", "service", "p1" -> "''")
+.buildSimpleVariable("v1", "v1", "{a:'', b: 2}".spel)
+.enricher("e1", "out", "service", "p1" -> "''".spel)
// Uncomment to assess impact of costly variables
// .buildSimpleVariable("v2", "v2", "{a:'', b: #out, c: {'d','d','ss','aa'}.?[#this.substring(0, 1) == ''] }")
// .buildSimpleVariable("v3", "v3", "{a:'', b: #out, c: {'d','d','ss','aa'}.?[#this.substring(0, 1) == ''] }")
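Several hunks in this commit replace the implicit string-to-`Expression` conversions from `pl.touk.nussknacker.engine.spel.Implicits._` with an explicit `.spel` extension method from `SpelExtension._`. A hypothetical, simplified sketch of that pattern (the `Expression` type is reduced to a stub; this is not Nussknacker's actual implementation):

```scala
// Stub standing in for pl.touk.nussknacker.engine.graph.expression.Expression
final case class Expression(language: String, expression: String)

object SpelExtension {
  // Explicit opt-in: callers write "...".spel, making every conversion site
  // visible, unlike a blanket implicit String => Expression conversion.
  implicit class SpelExpressionOps(private val raw: String) extends AnyVal {
    def spel: Expression = Expression("spel", raw)
  }
}

object Demo extends App {
  import SpelExtension._
  println("{a:'', b: 2}".spel) // Expression(spel,{a:'', b: 2})
}
```

The design choice is discoverability: an explicit `.spel` call site can be grepped and reasoned about, whereas the old implicit conversion silently turned any `String` into an `Expression` wherever the import was in scope.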
@@ -3,13 +3,14 @@ package pl.touk.nussknacker.engine.benchmarks.suggester
import pl.touk.nussknacker.engine.api.dict.embedded.EmbeddedDictDefinition
import pl.touk.nussknacker.engine.api.process.ClassExtractionSettings
import pl.touk.nussknacker.engine.dict.{SimpleDictQueryService, SimpleDictRegistry}
-import pl.touk.nussknacker.ui.suggester.{CaretPosition2d, ExpressionSuggester}
+import pl.touk.nussknacker.ui.suggester.ExpressionSuggester
import pl.touk.nussknacker.engine.testing.ModelDefinitionBuilder
import pl.touk.nussknacker.engine.api.dict.UiDictServices
import pl.touk.nussknacker.engine.api.typed.typing.{Typed, TypingResult}
import pl.touk.nussknacker.engine.definition.clazz.{ClassDefinitionExtractor, ClassDefinitionSet}
import pl.touk.nussknacker.engine.graph.expression.Expression
import pl.touk.nussknacker.engine.graph.expression.Expression.Language
+import pl.touk.nussknacker.engine.util.CaretPosition2d

import java.time.{Duration, LocalDateTime}
import scala.concurrent.duration.{Duration => ScalaDuration}
35 changes: 12 additions & 23 deletions build.sbt
@@ -105,9 +105,6 @@ lazy val publishSettings = Seq(
def defaultMergeStrategy: String => MergeStrategy = {
// remove JPMS module descriptors (a proper soultion would be to merge them)
case PathList(ps @ _*) if ps.last == "module-info.class" => MergeStrategy.discard
-// this prevents problem with table api in runtime:
-// https://stackoverflow.com/questions/60436823/issue-when-flink-upload-a-job-with-stream-sql-query
-case PathList("org", "codehaus", "janino", "CompilerFactory.class") => MergeStrategy.discard
// we override Spring's class and we want to keep only our implementation
case PathList(ps @ _*) if ps.last == "NumberUtils.class" => MergeStrategy.first
// merge Netty version information files
@@ -487,7 +484,7 @@ lazy val distribution: Project = sbt
List(
(flinkDeploymentManager / assembly).value -> "managers/nussknacker-flink-manager.jar",
(liteK8sDeploymentManager / assembly).value -> "managers/lite-k8s-manager.jar",
-(liteEmbeddedDeploymentManager / assembly).value -> "managers/lite-embedded-manager.jar"
+(liteEmbeddedDeploymentManager / assembly).value -> "managers/lite-embedded-manager.jar",
)
},
componentArtifacts := {
@@ -511,8 +508,8 @@
},
devArtifacts := {
modelArtifacts.value ++ List(
-(flinkDevModel / assembly).value -> "model/devModel.jar",
-(devPeriodicDM / assembly).value -> "managers/devPeriodicDM.jar",
+(flinkDevModel / assembly).value -> "model/devModel.jar",
+(flinkPeriodicDeploymentManager / assembly).value -> "managers/nussknacker-flink-periodic-manager.jar",
)
},
Universal / packageName := ("nussknacker" + "-" + version.value),
@@ -640,6 +637,7 @@ lazy val flinkDeploymentManager = (project in flink("management"))
lazy val flinkPeriodicDeploymentManager = (project in flink("management/periodic"))
.settings(commonSettings)
.settings(assemblyNoScala("nussknacker-flink-periodic-manager.jar"): _*)
+.settings(publishAssemblySettings: _*)
.settings(
name := "nussknacker-flink-periodic-manager",
libraryDependencies ++= {
@@ -715,18 +713,6 @@ lazy val flinkDevModelJava = (project in flink("management/dev-model-java"))
flinkComponentsUtils % Provided
)

-lazy val devPeriodicDM = (project in flink("management/dev-periodic-dm"))
-  .settings(commonSettings)
-  .settings(assemblyNoScala("devPeriodicDm.jar"): _*)
-  .settings(
-    name := "nussknacker-dev-periodic-dm",
-    libraryDependencies ++= {
-      Seq(
-      )
-    }
-  )
-  .dependsOn(flinkPeriodicDeploymentManager, deploymentManagerApi % Provided)

lazy val flinkTests = (project in flink("tests"))
.settings(commonSettings)
.settings(
@@ -1163,8 +1149,7 @@ lazy val flinkComponentsUtils = (project in flink("components-utils"))
name := "nussknacker-flink-components-utils",
libraryDependencies ++= {
Seq(
-"org.apache.flink" % "flink-streaming-java" % flinkV % Provided,
-"org.apache.flink" % "flink-metrics-dropwizard" % flinkV,
+"org.apache.flink" % "flink-streaming-java" % flinkV % Provided,
)
}
)
@@ -1805,6 +1790,7 @@ lazy val flinkKafkaComponents = (project in flink("components/kafka"))
componentsUtils % Provided
)

+// TODO: check if any flink-table / connector / format dependencies' scope can be limited
lazy val flinkTableApiComponents = (project in flink("components/table"))
.settings(commonSettings)
.settings(assemblyNoScala("flinkTable.jar"): _*)
@@ -1817,6 +1803,9 @@ lazy val flinkTableApiComponents = (project in flink("components/table"))
"org.apache.flink" % "flink-table-api-java-bridge" % flinkV,
"org.apache.flink" % "flink-table-planner-loader" % flinkV,
"org.apache.flink" % "flink-table-runtime" % flinkV,
+"org.apache.flink" % "flink-clients" % flinkV,
+"org.apache.flink" % "flink-connector-files" % flinkV, // needed for testing data generation
+"org.apache.flink" % "flink-json" % flinkV, // needed for testing data generation
)
}
)
@@ -1826,6 +1815,7 @@ lazy val flinkTableApiComponents = (project in flink("components/table"))
commonUtils % Provided,
componentsUtils % Provided,
flinkComponentsUtils % Provided,
+jsonUtils % Provided,
)

lazy val copyClientDist = taskKey[Unit]("copy designer client")
@@ -2011,7 +2001,7 @@ lazy val designer = (project in file("designer/server"))
liteEmbeddedDeploymentManager % Provided,
liteK8sDeploymentManager % Provided,
developmentTestsDeploymentManager % Provided,
-devPeriodicDM % Provided,
+flinkPeriodicDeploymentManager % Provided,
schemedKafkaComponentsUtils % Provided,
)

@@ -2041,7 +2031,7 @@ lazy val e2eTests = (project in file("e2e-tests"))
)
.enablePlugins(BuildInfoPlugin)
.settings(buildInfoSettings)
-.dependsOn(testUtils % Test)
+.dependsOn(testUtils % Test, scenarioApi % Test, designer % Test)

lazy val doTest = Seq(
Test / testOptions += Tests.Setup { () =>
@@ -2105,7 +2095,6 @@ lazy val modules = List[ProjectReference](
flinkDevModel,
flinkDevModelJava,
flinkTableApiComponents,
-devPeriodicDM,
defaultModel,
openapiComponents,
scenarioCompiler,
@@ -27,17 +27,33 @@ import scala.util.Try

sealed trait ScenarioSpecificData extends TypeSpecificData

-case class FragmentSpecificData(docsUrl: Option[String] = None) extends TypeSpecificData {
-  override def toMap: Map[String, String] = Map(FragmentSpecificData.docsUrlName -> docsUrl.getOrElse(""))
-  override def metaDataType: String = FragmentSpecificData.typeName
+case class FragmentSpecificData(
+    docsUrl: Option[String] = None,
+    componentGroup: Option[String] = None, // None means the fragment is in the default group for fragments
+    icon: Option[String] = None
+) extends TypeSpecificData {
+
+  override def toMap: Map[String, String] = Map(
+    FragmentSpecificData.docsUrlName -> docsUrl.getOrElse(""),
+    FragmentSpecificData.componentGroupName -> componentGroup.getOrElse(""),
+    FragmentSpecificData.iconName -> icon.getOrElse("")
+  )
+
+  override def metaDataType: String = FragmentSpecificData.typeName
}

object FragmentSpecificData {
-  val typeName = "FragmentSpecificData"
-  val docsUrlName = "docsUrl"
+  val typeName = "FragmentSpecificData"
+  val docsUrlName = "docsUrl"
+  val componentGroupName = "componentGroup"
+  val iconName = "icon"

def apply(properties: Map[String, String]): FragmentSpecificData = {
-    FragmentSpecificData(docsUrl = mapEmptyStringToNone(properties.get(docsUrlName)))
+    FragmentSpecificData(
+      docsUrl = mapEmptyStringToNone(properties.get(docsUrlName)),
+      componentGroup = mapEmptyStringToNone(properties.get(componentGroupName)),
+      icon = mapEmptyStringToNone(properties.get(iconName))
+    )
}

}
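The `FragmentSpecificData` change above persists the new optional fields through the same empty-string convention already used for `docsUrl`: `None` is written as `""`, and `""` reads back as `None`. A simplified, self-contained sketch of that round-trip (the helper name `mapEmptyStringToNone` comes from the diff; the rest is illustrative, not the actual Nussknacker class):

```scala
// Sketch: optional metadata persisted as Map[String, String], with
// "" standing for an absent value in both directions.
final case class FragmentMeta(
    docsUrl: Option[String],
    componentGroup: Option[String],
    icon: Option[String]
) {
  def toMap: Map[String, String] = Map(
    "docsUrl"        -> docsUrl.getOrElse(""),
    "componentGroup" -> componentGroup.getOrElse(""),
    "icon"           -> icon.getOrElse("")
  )
}

object FragmentMeta {
  // Mirrors the diff's mapEmptyStringToNone: "" round-trips back to None.
  private def mapEmptyStringToNone(value: Option[String]): Option[String] =
    value.filter(_.nonEmpty)

  def fromMap(props: Map[String, String]): FragmentMeta = FragmentMeta(
    docsUrl = mapEmptyStringToNone(props.get("docsUrl")),
    componentGroup = mapEmptyStringToNone(props.get("componentGroup")),
    icon = mapEmptyStringToNone(props.get("icon"))
  )
}

object RoundTripDemo extends App {
  val meta = FragmentMeta(Some("https://docs"), None, None)
  assert(FragmentMeta.fromMap(meta.toMap) == meta) // round-trip holds
}
```

This convention keeps the serialized property map total (every key always present), which simplifies diffing and form binding at the cost of reserving `""` as a sentinel.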
@@ -0,0 +1,9 @@
+package pl.touk.nussknacker.engine.api.definition
+
+import io.circe.generic.JsonCodec
+
+@JsonCodec case class FixedExpressionValueWithIcon(expression: String, label: String, icon: String)
+
+object FixedExpressionValueWithIcon {
+  val nullFixedValue: FixedExpressionValueWithIcon = FixedExpressionValueWithIcon("", "", "")
+}
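The new file above derives a circe codec via `@JsonCodec`. A hedged sketch of what the derived codec yields under standard circe generic-derivation semantics (illustrative only, not part of the commit; requires the `circe-generic` dependency and macro annotations enabled, e.g. `-Ymacro-annotations` on Scala 2.13):

```scala
import io.circe.generic.JsonCodec
import io.circe.syntax._

@JsonCodec
final case class FixedExpressionValueWithIcon(expression: String, label: String, icon: String)

object CodecDemo extends App {
  val v = FixedExpressionValueWithIcon("#input", "Input", "icon.svg")
  // The derived codec serializes each field under its declared name,
  // producing roughly: {"expression":"#input","label":"Input","icon":"icon.svg"}
  println(v.asJson.noSpaces)
}
```

Deriving the codec at the definition site keeps the wire format in lockstep with the case class, so adding a field later automatically extends the JSON shape.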
@@ -50,10 +50,12 @@ object TypeSpecificDataTestData {
val requestResponseFullProperties: Map[String, String] = Map("slug" -> "exampleSlug")

// fragment
-val fragmentEmptyTypeData: FragmentSpecificData = FragmentSpecificData(None)
-val fragmentFullTypeData: FragmentSpecificData = FragmentSpecificData(docsUrl = Some("exampleUrl"))
-val fragmentEmptyProperties: Map[String, String] = Map("docsUrl" -> "")
-val fragmentFullProperties: Map[String, String] = Map("docsUrl" -> "exampleUrl")
+val fragmentEmptyTypeData: FragmentSpecificData = FragmentSpecificData(None, None, None)
+val fragmentFullTypeData: FragmentSpecificData =
+  FragmentSpecificData(docsUrl = Some("exampleUrl"), componentGroup = Some("someGroup"), icon = Some("someIcon"))
+val fragmentEmptyProperties: Map[String, String] = Map("docsUrl" -> "", "componentGroup" -> "", "icon" -> "")
+val fragmentFullProperties: Map[String, String] =
+  Map("docsUrl" -> "exampleUrl", "componentGroup" -> "someGroup", "icon" -> "someIcon")

val flinkMetaDataName = "StreamMetaData"
val liteStreamMetaDataName = "LiteStreamMetaData"
@@ -17,12 +17,6 @@ import pl.touk.nussknacker.engine.api.component.{AllProcessingModesComponent, Co
//from java to scala one and is seems difficult to convert java CustomStreamTransformer, Service etc. into scala ones
abstract class CustomStreamTransformer extends Component with AllProcessingModesComponent {

-  /**
-   * deprecated - use ContextTransformation.join instead
-   */
-  // TODO: remove after full switch to ContextTransformation API
-  def canHaveManyInputs: Boolean = false

// For now it is only supported by Flink streaming runtime
def canBeEnding: Boolean = false

