Set up example project and configure build

Full sources for all Cloudflow example applications can be found in the examples folder of the cloudflow project on GitHub. The sources for the example described below are in the application called sensor-data-scala.

A typical Cloudflow application uses the project organization shown below. We will implement the example in Scala.

  1. In a convenient location, such as my-cloudflow-example, create the following directory structure:

       |-project
       |--cloudflow-plugins.sbt
       |-src
       |---main
       |-----avro
       |-----blueprint
       |-----resources
       |-----scala
       |-------sensordata
       |-build.sbt

    As we move through the process, the leaf-level directories of the tree above will contain the following:

    • project/cloudflow-plugins.sbt : contains the Cloudflow sbt plugin name and version.

    • avro : the Avro schemas of the domain objects

    • blueprint : the blueprint of the application, in a file named blueprint.conf (a sketch follows these steps)

    • resources : logging configuration (logback.xml) and local sandbox configuration (local.conf)

    • scala : the source code of the application under the package name sensordata

    • build.sbt : the sbt build script

  2. From the top-level directory, initialize git (you will add and commit files as you create them):

    git init

    The Cloudflow sbt plugin assumes that your project is version-managed with git. It uses git commit information to generate dynamic version tags for the Docker images it produces. If the project does not have an initialized git index with at least one commit, some commands fail with a "not a git repository" error.
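
    For example, once the files described in the following sections are in place, record that first commit:

    git add .
    git commit -m "Initial commit"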
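
To give an early idea of what the blueprint directory will hold, here is a minimal sketch of a blueprint.conf. It is not the final blueprint for sensor-data-scala: the streamlet names and classes are illustrative, and the real blueprint takes shape as we build the application.

    blueprint {
      streamlets {
        // Illustrative streamlet names, each mapped to a streamlet class
        // in the sensordata package.
        http-ingress = sensordata.SensorDataHttpIngress
        metrics = sensordata.SensorDataToMetrics
      }
      connections {
        // Wire the ingress output to the metrics input.
        http-ingress.out = [metrics.in]
      }
    }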

The sbt build script

Cloudflow provides sbt plugins for the supported runtimes: Akka Streams, Spark, and Flink. The plugins speed development by abstracting away much of the boilerplate needed to build a complete application. You can use multiple runtimes in the same application; a sketch of a multi-runtime build follows the steps below. In this example, we use only the CloudflowAkkaStreamsApplicationPlugin, which provides the building blocks for developing a Cloudflow application with Akka Streams.

  1. Create a build.sbt file with the following contents and save it at the same level as your src directory:

    import sbt._
    import sbt.Keys._
    
    lazy val sensorData = (project in file("."))
      .enablePlugins(CloudflowAkkaStreamsApplicationPlugin)
      .settings(
        libraryDependencies ++= Seq(
          "com.lightbend.akka"     %% "akka-stream-alpakka-file"  % "1.1.2",
          "com.typesafe.akka"      %% "akka-http-spray-json"      % "10.1.11",
          "ch.qos.logback"         %  "logback-classic"           % "1.2.3",
          "com.typesafe.akka"      %% "akka-http-testkit"         % "10.1.11" % "test"
        ),
        name := "sensor-data",
        organization := "com.lightbend",
    
        scalaVersion := "2.12.10",
        crossScalaVersions := Vector(scalaVersion.value)
      )

    The script is a standard Scala sbt build file, with the addition of the Cloudflow plugin for Akka Streams.

  2. Add the Cloudflow plugin in project/cloudflow-plugins.sbt:

    addSbtPlugin("com.lightbend.cloudflow" % "sbt-cloudflow" % "1.3.3")
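
As mentioned above, an application can combine runtimes. One common shape, sketched below, is a root project that enables an application plugin and depends on sub-projects that enable the matching library plugins for their runtimes. This is only a sketch: the sparkStreamlets module and its directory are hypothetical, and you should check the Cloudflow documentation for the exact multi-project setup in your release.

    // Root project: assembles the application and holds the blueprint.
    lazy val pipeline = (project in file("."))
      .enablePlugins(CloudflowAkkaStreamsApplicationPlugin)
      .dependsOn(sparkStreamlets)

    // Hypothetical sub-project holding Spark-based streamlets.
    lazy val sparkStreamlets = (project in file("spark-streamlets"))
      .enablePlugins(CloudflowSparkLibraryPlugin)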
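
After reloading sbt, the plugin's tasks become available in the build. For example, once the blueprint and streamlets from the following sections exist, you can ask sbt to check that the blueprint wires them together correctly; verifyBlueprint is the task sbt-cloudflow provides for this, though task names can differ between releases:

    sbt verifyBlueprint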

What’s next

Now, let’s define the Avro schema.