
cloudflow.spark.kafka

SparkStreamletContextImpl

class SparkStreamletContextImpl extends SparkStreamletContext

Annotations
@deprecated
Deprecated

(Since version 2.2.0) Use contrib-sbt-spark library instead, see https://github.com/lightbend/cloudflow-contrib

Linear Supertypes
SparkStreamletContext, Serializable, Serializable, Product, Equals, StreamletContext, AnyRef, Any

Instance Constructors

  1. new SparkStreamletContextImpl(streamletDefinition: StreamletDefinition, session: SparkSession, config: Config)

Type Members

  1. case class MountedPathUnavailableException(volumeMount: VolumeMount) extends Exception with Product with Serializable
    Definition Classes
    StreamletContext

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def checkpointDir(dirName: String): String

Returns the absolute path to a mounted shared storage that can be used to store reliable checkpoints. Reliable checkpoints let the Spark application persist its state across restarts and resume from where it last stopped.

    returns

    the absolute path to a mounted shared storage

    Definition Classes
SparkStreamletContextImpl → SparkStreamletContext
  6. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  7. val config: Config

The full configuration for the Streamlet, containing all deployment-time configuration parameters on top of the normal configuration.

    Definition Classes
SparkStreamletContextImpl → StreamletContext
  8. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  9. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  10. def findTopicForPort(port: StreamletPort): Topic

Get the topic (savepoint path) that the given port is mapped to.

    port

    the StreamletPort

    returns

the Topic the port is mapped to

    Definition Classes
    StreamletContext
    Exceptions thrown

    TopicForPortNotFoundException if there is no mapping found

  11. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  12. def getMountedPath(volumeMount: VolumeMount): Path

The path mounted for a VolumeMount request from a streamlet. In a clustered deployment, the mounted path will correspond to the requested mount path in the VolumeMount definition. In a local environment, this path will be replaced by a local folder. See the configuration sketch after the member list.

    volumeMount

    the volumeMount declaration for which we want to obtain the mounted path.

    returns

    the path where the volume is mounted.

    Definition Classes
    StreamletContext
    Exceptions thrown

MountedPathUnavailableException if the path is not available.

  13. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  14. val maxOffsetsPerTrigger: Long
  15. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  16. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  17. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  18. def readStream[In](inPort: CodecInlet[In])(implicit encoder: Encoder[In], typeTag: scala.reflect.api.JavaUniverse.TypeTag[In]): Dataset[In]

Stream from the underlying external storage and return it as a Dataset[In]. See the usage sketch after the member list.

    inPort

    the inlet port to read from

    returns

    the data read as Dataset[In]

    Definition Classes
SparkStreamletContextImpl → SparkStreamletContext
  19. def runtimeBootstrapServers(topic: Topic): String

The runtime 'bootstrap.servers' for the given topic. This is provided by the cloudflow-operator when creating the streamlet's 'cloudflow.runner' configuration. A 'bootstrap.servers' will always be provided as long as a 'default' Kafka cluster is defined when the cloudflow-operator is installed. See the Kafka sketch after the member list.

    Definition Classes
    StreamletContext
  20. val session: SparkSession
    Definition Classes
    SparkStreamletContext
  21. val storageDir: String
  22. final def streamletConfig: Config

The subset of configuration specific to a single named instance of a streamlet.

    A Streamlet can specify the set of environment- and instance-specific configuration keys it will use during runtime through configParameters. Those keys will then be made available through this configuration.

    An empty configuration will be returned if the streamlet doesn't contain any configuration parameters.

    Definition Classes
    StreamletContext
  23. def streamletRef: String

The streamlet reference which identifies the streamlet in the blueprint. It is used in a Streamlet for logging and metrics, referring back to the streamlet instance using a name recognizable by the user.

    Definition Classes
    StreamletContext
  24. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  25. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  26. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  27. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  28. def writeStream[Out](stream: Dataset[Out], outPort: CodecOutlet[Out], outputMode: OutputMode, optionalTrigger: Option[Trigger] = None)(implicit encoder: Encoder[Out], typeTag: scala.reflect.api.JavaUniverse.TypeTag[Out]): StreamingQuery

Start the execution of a StreamingQuery that writes the given stream to external storage using the designated outPort.

    stream

the Dataset to be written by the StreamingQuery

    outPort

    the port used to write the result of execution of the StreamingQuery

    outputMode

the output mode used to write. Valid values are Append, Update, and Complete.

    returns

    the StreamingQuery that starts executing

    Definition Classes
SparkStreamletContextImpl → SparkStreamletContext
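
Usage sketch. The readStream, writeStream and checkpointDir members above are typically used together from a streamlet's logic. The following is a minimal, illustrative sketch only: the Data case class, the runQuery helper and its parameters are hypothetical, the import paths are inferred from the types in the signatures above, and encoders are assumed to come from the context's SparkSession implicits.

import cloudflow.spark.SparkStreamletContext
import cloudflow.streamlets.{ CodecInlet, CodecOutlet }
import org.apache.spark.sql.Dataset
import org.apache.spark.sql.streaming.{ OutputMode, StreamingQuery }

// Hypothetical payload type; a real streamlet would use its Avro/Protobuf data type.
case class Data(id: String, value: Long)

// Hypothetical helper: read from an inlet, transform, and start a query on an outlet.
// In a real application the context is supplied to the streamlet logic by the runtime.
def runQuery(
    context: SparkStreamletContext,
    in: CodecInlet[Data],
    out: CodecOutlet[Data]): StreamingQuery = {
  import context.session.implicits._                  // encoders for the case class

  val input: Dataset[Data] = context.readStream(in)   // stream from the inlet's topic
  val evens = input.filter(_.value % 2 == 0)          // arbitrary transformation

  // Start the StreamingQuery that writes the result to the outlet.
  // A hand-rolled sink could instead place its checkpoint on shared storage,
  // e.g. by passing context.checkpointDir("my-sink") as its checkpointLocation.
  context.writeStream(evens, out, OutputMode.Append)
}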
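
Configuration sketch. A similar illustration for the configuration-related members (streamletConfig, getMountedPath, streamletRef). The describe helper, the "records-per-second" key and the dataMount parameter are assumptions; a real streamlet declares them through its configParameters and volumeMounts, and the Path is assumed to be java.nio.file.Path.

import java.nio.file.Path
import cloudflow.streamlets.{ StreamletContext, VolumeMount }

// Hypothetical helper: read a declared config parameter and resolve a volume mount.
def describe(context: StreamletContext, dataMount: VolumeMount): Unit = {
  // Deployment-time value of a configuration parameter declared by the streamlet.
  val rate: String = context.streamletConfig.getString("records-per-second")
  // Throws MountedPathUnavailableException if the volume was not provisioned.
  val dataPath: Path = context.getMountedPath(dataMount)
  println(s"[${context.streamletRef}] records-per-second=$rate data=$dataPath")
}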
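
Kafka sketch. Finally, an illustration of the Kafka-related members (findTopicForPort, runtimeBootstrapServers), for example to wire up a custom Kafka client next to the built-in stream support. Only members documented on this page are used; the kafkaCoordinates helper name and the import path of Topic are assumptions.

import cloudflow.streamlets.{ StreamletContext, StreamletPort, Topic }

// Hypothetical helper: resolve the topic bound to a port and the Kafka
// bootstrap servers configured for it at runtime.
def kafkaCoordinates(context: StreamletContext, port: StreamletPort): (Topic, String) = {
  val topic = context.findTopicForPort(port)          // throws TopicForPortNotFoundException if unmapped
  val servers = context.runtimeBootstrapServers(topic)
  (topic, servers)
}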
