Set Up Example Project and Configure Build

The sources for the example described below can be found in the application called sensor-data-scala.
The default build tool for Cloudflow applications is sbt, but Maven is also supported.

Full sources for all Cloudflow example applications can be found in the examples folder of the cloudflow project on GitHub. To use the examples directly, remember to run export CLOUDFLOW_VERSION=2.2.0 before invoking sbt.
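
For example, from the root of one of the example applications:

    export CLOUDFLOW_VERSION=2.2.0
    sbt compile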

A typical Cloudflow application uses the organization shown below. We will implement the example in Scala.

  1. In a convenient location, such as my-cloudflow-example, create the following directory structure (a one-command shortcut is shown after this list):

       |-project
       |--cloudflow-plugins.sbt
       |-src
       |---main
       |-----avro
       |-----blueprint
       |-----resources
       |-----scala
       |-------sensordata
       |-build.sbt

    As we move through the process, the leaf-level directories of the tree above will contain the following:

    • project/cloudflow-plugins.sbt : contains the Cloudflow sbt plugin name and version.

    • avro : the Avro schemas of the domain objects

    • blueprint : the blueprint of the application in a file named blueprint.conf

    • scala : the source code of the application under the package name sensordata

    • build.sbt : the sbt build script
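
    On a Unix-like shell, you can create the directories of this layout in one command:

        mkdir -p project src/main/avro src/main/blueprint src/main/resources src/main/scala/sensordata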

The sbt build script

Cloudflow provides sbt plugins for the Akka, Spark, and Flink runtimes. The plugins speed up development by adding the required dependencies and abstracting away much of the boilerplate needed to build a complete application. You can use multiple runtimes in the same application, provided that each runtime is defined in its own sub-project, as sketched below.
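
For example, a minimal sketch of a multi-runtime build (the sub-project and directory names here are illustrative, not part of this example):

    lazy val root = (project in file("."))
      .enablePlugins(CloudflowApplicationPlugin)
      .aggregate(akkaIngress, flinkProcessor)

    lazy val akkaIngress = (project in file("akka-ingress"))
      .enablePlugins(CloudflowAkkaPlugin)

    lazy val flinkProcessor = (project in file("flink-processor"))
      .enablePlugins(CloudflowFlinkPlugin)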

In this example, we use the CloudflowAkkaPlugin that provides the building blocks for developing a Cloudflow application with Akka Streams. In addition to the backend-specific plugin, we need to add the CloudflowApplicationPlugin that provides the image-building and local-running capabilities to the project.

  1. Create a build.sbt file with the following contents and save it at the same level as your src directory:

    lazy val sensorData = (project in file("."))
        .enablePlugins(CloudflowApplicationPlugin, CloudflowAkkaPlugin)
        .settings(
          scalaVersion := "2.13.3",
          runLocalConfigFile := Some("src/main/resources/local.conf"), (1)
          runLocalLog4jConfigFile := Some("src/main/resources/log4j.xml"), (2)
          name := "sensor-data-scala",
    
          libraryDependencies ++= Seq(
            "com.lightbend.akka"     %% "akka-stream-alpakka-file"  % "1.1.2",
            "com.typesafe.akka"      %% "akka-http-spray-json"      % "10.1.12",
            "ch.qos.logback"         %  "logback-classic"           % "1.2.3",
            "com.typesafe.akka"      %% "akka-http-testkit"         % "10.1.12" % "test",
            "org.scalatest"          %% "scalatest"                 % "3.0.8"  % "test"
          )
        )
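
    (1) The configuration file used when running the application locally; see Local configuration below.
    (2) The Log4j configuration file used for local runs, also shown below.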

    The script is a standard Scala sbt build file, extended with the Cloudflow plugin for Akka Streams and the Cloudflow application plugin. Optionally, you can also add the Scalafmt plugin to keep the code style consistent across the application.
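
    If you choose to add Scalafmt, one common way to enable it is with the sbt-scalafmt plugin in the project directory (the version below is illustrative):

        addSbtPlugin("org.scalameta" % "sbt-scalafmt" % "2.4.2")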

  2. Create the file project/cloudflow-plugins.sbt and add the Cloudflow plugin dependency:

    addSbtPlugin("com.lightbend.cloudflow" % "sbt-cloudflow" % "2.2.0")
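
    If you prefer to pick up the CLOUDFLOW_VERSION environment variable mentioned earlier, a sketch of the same line can read it with a fallback (the fallback to 2.2.0 is an assumption):

        // Assumption: default to 2.2.0 when CLOUDFLOW_VERSION is not set.
        addSbtPlugin("com.lightbend.cloudflow" % "sbt-cloudflow" % sys.env.getOrElse("CLOUDFLOW_VERSION", "2.2.0"))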

Local configuration

To run the example locally, you must provide the two resource files referenced in build.sbt.

  1. Create a log4j.xml file with the following contents and save it in your src/main/resources/ directory:

    <?xml version="1.0" encoding="UTF-8" ?>
    <!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
    <log4j:configuration debug="true"
                         xmlns:log4j='http://jakarta.apache.org/log4j/'>
    
        <appender name="console" class="org.apache.log4j.ConsoleAppender">
            <layout class="org.apache.log4j.PatternLayout">
                <param name="ConversionPattern"
                       value="%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n" />
            </layout>
        </appender>
    
        <root>
            <level value="DEBUG" />
            <appender-ref ref="console" />
        </root>
    
    </log4j:configuration>

    You can tune the logging verbosity by changing the level (e.g., to INFO):

    <level value="INFO" />
  2. Create a local.conf file with the following contents and save it in your src/main/resources/ directory as well:

    cloudflow {
      streamlets {
        # configures the log-level configuration parameter of the valid-logger streamlet
        valid-logger {
          config-parameters {
            log-level = "info"
          }
        }
        # configures the volume mount for the file-ingress streamlet, when added to the blueprint
        file-ingress {
          volume-mounts {
            source-data-mount = "/tmp/cloudflow"
          }
        }
      }
    }
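
Here, /tmp/cloudflow is the local directory that backs the source-data-mount volume mount when the application runs locally. For context, a streamlet declares the configuration parameters that files like local.conf override. A minimal sketch, assuming the cloudflow.streamlets API and omitting the surrounding streamlet class, might look like this:

    import cloudflow.streamlets.StringConfigParameter

    // Hypothetical sketch of the "log-level" parameter that local.conf overrides above.
    // A streamlet exposes it by overriding configParameters, e.g. Vector(LogLevel).
    object LoggerParameters {
      val LogLevel = StringConfigParameter(
        "log-level",
        "Provide one of the following log levels: debug, info, warning or error.",
        Some("debug"))
    }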

What’s next

Now, let’s define the Avro schema.