Packages

package spark

Type Members

  1. trait SparkStreamlet extends Streamlet[SparkStreamletContext] with Serializable

    The base class for defining Spark streamlets. Derived classes need to override createLogic to provide the custom implementation for the behavior of the streamlet.

    Here's an example:

    // new custom `SparkStreamlet`
    object MySparkProcessor extends SparkStreamlet {
      // Step 1: Define inlets and outlets. Note for the outlet you need to specify
      //         the partitioner function explicitly
      val in = AvroInlet[Data]("in")
      val out = AvroOutlet[Simple]("out", _.name)
    
      // Step 2: Define the shape of the streamlet. In this example the streamlet
      //         has 1 inlet and 1 outlet
      val shape = StreamletShape(in, out)
    
      // Step 3: Provide custom implementation of `SparkStreamletLogic` that defines
      //         the behavior of the streamlet
      override def createLogic() = new SparkStreamletLogic {
        override def buildStreamingQueries = {
          val dataset = readStream(in)
          val outStream = dataset.select($"name").as[Simple]
          val query = writeStream(outStream, out, OutputMode.Append)
          Seq(query)
        }
      }
    }
    Annotations
    @deprecated
    Deprecated

    (Since version 2.2.0) Use contrib-sbt-spark library instead, see https://github.com/lightbend/cloudflow-contrib

  2. abstract case class SparkStreamletContext(streamletDefinition: StreamletDefinition, session: SparkSession) extends StreamletContext with Product with Serializable
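
    The constructor parameters expose the StreamletDefinition of the deployed streamlet and the SparkSession it runs on. Below is a minimal sketch of reading those members, assuming a SparkStreamletContext instance named ctx is in scope (for example, the context supplied to the streamlet at runtime); the describe helper is hypothetical:

    // Hedged sketch: only uses the constructor members declared in the signature above.
    def describe(ctx: SparkStreamletContext): Unit = {
      // the active SparkSession for this streamlet
      val spark = ctx.session
      println(s"Running on Spark ${spark.version}")
      // streamletDefinition describes the deployed streamlet instance
      println(s"Streamlet definition: ${ctx.streamletDefinition}")
    }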
    Annotations
    @deprecated
    Deprecated

    (Since version 2.2.0) Use contrib-sbt-spark library instead, see https://github.com/lightbend/cloudflow-contrib

  3. abstract class SparkStreamletLogic extends StreamletLogic[SparkStreamletContext]

    Provides an entry-point for defining the behavior of a SparkStreamlet. Override the method buildStreamingQueries to build the collection of StreamingQuery instances that need to run as part of the business logic for the SparkStreamlet.

    Here's an example of how to provide a specialized implementation of SparkStreamletLogic as part of implementing a custom SparkStreamlet:

    // new custom `SparkStreamlet`
    object MySparkProcessor extends SparkStreamlet {
      // define inlets, outlets and shape
    
      // provide custom implementation of `SparkStreamletLogic`
      override def createLogic() = new SparkStreamletLogic {
        override def buildStreamingQueries = {
          val dataset = readStream(in)
          val outStream = dataset.select($"name").as[Simple]
          val query = writeStream(outStream, out, OutputMode.Append)
          Seq(query)
        }
      }
    }
    Annotations
    @deprecated
    Deprecated

    (Since version 2.2.0) Use contrib-sbt-spark library instead, see https://github.com/lightbend/cloudflow-contrib

  4. case class StreamletQueryExecution(queries: Vector[StreamingQuery]) extends Product with Serializable
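
    Below is a minimal sketch of wrapping running queries in a StreamletQueryExecution and stopping them, assuming query1 and query2 are StreamingQuery values produced by writeStream calls as in the examples above:

    // Hedged sketch: `queries` is the Vector declared in the case class signature above;
    // stop() is the standard Spark StreamingQuery operation.
    val execution = StreamletQueryExecution(Vector(query1, query2))
    execution.queries.foreach(_.stop())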
    Annotations
    @deprecated
    Deprecated

    (Since version 2.2.0) Use contrib-sbt-spark library instead, see https://github.com/lightbend/cloudflow-contrib

Deprecated Value Members

  1. object SparkStreamletRuntime extends StreamletRuntime with Product with Serializable
    Annotations
    @deprecated
    Deprecated

    (Since version 2.2.0) Use contrib-sbt-spark library instead, see https://github.com/lightbend/cloudflow-contrib

  2. object StreamletQueryExecution extends Serializable
    Annotations
    @deprecated
    Deprecated

    (Since version 2.2.0) Use contrib-sbt-spark library instead, see https://github.com/lightbend/cloudflow-contrib
