class SparkStreamletContextImpl extends SparkStreamletContext
- Annotations
- @deprecated
- Deprecated
(Since version 2.2.0) Use contrib-sbt-spark library instead, see https://github.com/lightbend/cloudflow-contrib
Linear Supertypes
- SparkStreamletContext
- Serializable
- Product
- Equals
- StreamletContext
- AnyRef
- Any
Instance Constructors
- new SparkStreamletContextImpl(streamletDefinition: StreamletDefinition, session: SparkSession, config: Config)
Type Members
- case class MountedPathUnavailableException(volumeMount: VolumeMount) extends Exception with Product with Serializable
- Definition Classes
- StreamletContext
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##(): Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def checkpointDir(dirName: String): String
Returns the absolute path to a mounted shared storage that can be used to store reliable checkpoints. Reliable checkpoints let the Spark application persist its state across restarts and resume from where it last stopped.
- returns
the absolute path to a mounted shared storage
- Definition Classes
- SparkStreamletContextImpl → SparkStreamletContext
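For illustration, a hedged sketch of how the returned path is commonly used as the `checkpointLocation` of a Spark Structured Streaming query; the `context` value, the `df` Dataset, the output path, and the "my-query" directory name are assumptions, not part of this API:

```scala
// Sketch only: assumes a SparkStreamletContext `context` and a streaming
// Dataset `df` are in scope, e.g. inside a SparkStreamletLogic.
val checkpoint: String = context.checkpointDir("my-query")

val query = df.writeStream
  .format("parquet")
  .option("path", "/tmp/out") // hypothetical sink location
  // Reliable checkpoints let the query resume from where it stopped:
  .option("checkpointLocation", checkpoint)
  .start()
```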
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native()
- val config: Config
The full configuration for the Streamlet, containing all deployment-time configuration parameters on top of the normal configuration
- Definition Classes
- SparkStreamletContextImpl → StreamletContext
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( classOf[java.lang.Throwable] )
- def findTopicForPort(port: StreamletPort): Topic
Gets the savepoint path (topic name) for the given port.
- port
the StreamletPort
- returns
the savepoint path
- Definition Classes
- StreamletContext
- Exceptions thrown
TopicForPortNotFoundException
if there is no mapping found
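As a hedged sketch, resolving the topic for an inlet might look like this; `context` and the inlet `in` are assumptions standing in for values defined by a concrete streamlet:

```scala
// Sketch only: `in` is a CodecInlet declared on the streamlet and mapped
// to a topic in the blueprint.
val topic: Topic = context.findTopicForPort(in)
// A TopicForPortNotFoundException is thrown if the blueprint
// maps no topic to this port.
```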
- final def getClass(): Class[_]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- def getMountedPath(volumeMount: VolumeMount): Path
The path mounted for a VolumeMount request from a streamlet. In a clustered deployment, the mounted path will correspond to the requested mount path in the VolumeMount definition. In a local environment, this path will be replaced by a local folder.
- volumeMount
the volumeMount declaration for which we want to obtain the mounted path.
- returns
the path where the volume is mounted.
- Definition Classes
- StreamletContext
- Exceptions thrown
MountedPathUnavailableException
if the path is not available.
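A hedged sketch of typical usage; the VolumeMount declaration (name, mount path, access mode) is an assumed example and must match what the streamlet requests via its volume mounts:

```scala
// Sketch only: a streamlet that declared this volume mount can resolve
// the actual path at runtime (a local folder when running locally).
val referenceData = VolumeMount("reference-data", "/mnt/data", ReadOnlyMany)

val mounted: java.nio.file.Path = context.getMountedPath(referenceData)
val rulesFile = mounted.resolve("rules.json") // hypothetical file on the volume
```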
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- val maxOffsetsPerTrigger: Long
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- def readStream[In](inPort: CodecInlet[In])(implicit encoder: Encoder[In], typeTag: scala.reflect.api.JavaUniverse.TypeTag[In]): Dataset[In]
Streams from the underlying external storage and returns the data as a Dataset[In].
- inPort
the inlet port to read from
- returns
the data read as Dataset[In]
- Definition Classes
- SparkStreamletContextImpl → SparkStreamletContext
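A hedged sketch of reading an inlet; the `Data` case class and the `in` inlet are assumptions standing in for the streamlet's own types:

```scala
// Sketch only: assumes `case class Data(key: String, value: Long)` with an
// implicit Encoder in scope, and `in: CodecInlet[Data]` declared on the streamlet.
val incoming: Dataset[Data] = context.readStream(in)

// The result is a regular streaming Dataset, so standard operations apply:
val positives = incoming.filter(d => d.value > 0)
```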
- def runtimeBootstrapServers(topic: Topic): String
The runtime 'bootstrap.servers' for the given topic. This is provided by the cloudflow-operator when creating the streamlet's 'cloudflow.runner' configuration. A 'bootstrap.servers' will always be provided as long as a 'default' Kafka cluster is defined during the install of the cloudflow-operator.
- Definition Classes
- StreamletContext
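A hedged sketch; combining this with findTopicForPort to configure, say, a hand-rolled Kafka source is an assumed scenario, not something this API prescribes:

```scala
// Sketch only: resolve the topic for an inlet, then its bootstrap servers.
val topic: Topic = context.findTopicForPort(in)
val servers: String = context.runtimeBootstrapServers(topic)
// `servers` holds a Kafka 'bootstrap.servers' string such as "host:9092".
```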
- val session: SparkSession
- Definition Classes
- SparkStreamletContext
- val storageDir: String
- final def streamletConfig: Config
The subset of configuration specific to a single named instance of a streamlet.
A Streamlet can specify the set of environment- and instance-specific configuration keys it will use during runtime through configParameters. Those keys will then be made available through this configuration.
An empty configuration will be returned if the streamlet doesn't contain any configuration parameters.
- Definition Classes
- StreamletContext
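A hedged sketch of declaring a configuration parameter and reading it back through streamletConfig; the parameter name and the surrounding streamlet definition are assumptions:

```scala
// Sketch only: inside a streamlet definition.
val RecordsPerSecond = IntegerConfigParameter(
  "records-per-second",
  "Target number of records emitted per second.")

override def configParameters = Vector(RecordsPerSecond)

// At runtime, the value is available under the parameter's key:
val rate: Int = context.streamletConfig.getInt(RecordsPerSecond.key)
```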
- def streamletRef: String
The streamlet reference which identifies the streamlet in the blueprint. It is used in a Streamlet for logging and metrics, referring back to the streamlet instance using a name recognizable by the user.
- Definition Classes
- StreamletContext
- final def synchronized[T0](arg0: ⇒ T0): T0
- Definition Classes
- AnyRef
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native()
- def writeStream[Out](stream: Dataset[Out], outPort: CodecOutlet[Out], outputMode: OutputMode, optionalTrigger: Option[Trigger] = None)(implicit encoder: Encoder[Out], typeTag: scala.reflect.api.JavaUniverse.TypeTag[Out]): StreamingQuery
Starts the execution of a StreamingQuery that writes the stream to external storage using the designated outPort.
- stream
the stream whose data is written by the StreamingQuery
- outPort
the port used to write the result of execution of the
StreamingQuery
- outputMode
the output mode used to write. Valid values: Append, Update, Complete
- returns
the StreamingQuery that starts executing
- Definition Classes
- SparkStreamletContextImpl → SparkStreamletContext
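A hedged sketch of writing results to an outlet; `results`, `out`, and the choice of Append mode are assumptions:

```scala
// Sketch only: `results: Dataset[Out]` and `out: CodecOutlet[Out]` are
// assumed to be defined by the streamlet; an Encoder[Out] is implicit.
val query: StreamingQuery = context.writeStream(results, out, OutputMode.Append())
// The returned query is already started; the runner normally manages
// its lifecycle, so awaiting termination here is not usually needed.
```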