case class SilkConfig(version: Option[Int], defaultClassName: Option[String], packingLogicPath: Option[String], pathFormat: String, groups: Map[String, GroupConfig], sensors: SensorMap, classes: Map[String, ClassConfig]) extends Product with Serializable
SiLK data spool configuration.
- version
The version of the config file format used.
- defaultClassName
The default class to be examined if none is specified.
- packingLogicPath
The path to the plugin to be loaded by the packer for determining where to pack flows.
- pathFormat
The format used for filenames in the data spool.
- groups
The sensor groups defined in this configuration.
- sensors
The sensors defined in this configuration, usable as a value of type Map[Sensor, SensorConfig].
- classes
The classes defined in this configuration.
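A loaded configuration can be inspected directly through these fields. The sketch below is illustrative only: it assumes the import path org.cert.netsa.io.silk (per the package overview later in this document) and a config value produced by whatever loader created it, and it uses only the members documented here.

    import org.cert.netsa.io.silk.SilkConfig  // assumed package path

    // Summarize a loaded SiLK configuration using its documented fields.
    def describe(config: SilkConfig): Unit = {
      println(s"config format version: ${config.version.map(_.toString).getOrElse("unspecified")}")
      println(s"default class: ${config.defaultClassName.getOrElse("<none>")}")
      println(s"path format: ${config.pathFormat}")
      println(s"classes: ${config.classes.keys.mkString(", ")}")
      println(s"groups: ${config.groups.keys.mkString(", ")}")
    }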
Inheritance: SilkConfig → Serializable → Product → Equals → AnyRef → Any
Instance Constructors
- new SilkConfig(version: Option[Int], defaultClassName: Option[String], packingLogicPath: Option[String], pathFormat: String, groups: Map[String, GroupConfig], sensors: SensorMap, classes: Map[String, ClassConfig])
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- val classes: Map[String, ClassConfig]
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @native()
- val defaultClassName: Option[String]
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def fileInfoToPath(t: (Instant, FlowType, Sensor)): String
Given a tuple containing a SiLK Record's starting time, FlowType, and Sensor, return a partial path, relative to the root of the SiLK data repository, to the hourly file holding that record.
- def filenameToGlobInfo(path: String): Option[(Instant, FlowType, Sensor)]
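From their signatures, fileInfoToPath and filenameToGlobInfo appear to be near-inverses: one renders a (start time, FlowType, Sensor) triple into a spool path, and the other recovers the triple from a path that matches this config. A minimal sketch, assuming the org.cert.netsa.io.silk import path and FlowType/Sensor values taken from the config itself (for example, keys of flowTypes and sensors); the timestamp and the path in the comment are illustrative only.

    import java.time.Instant
    import org.cert.netsa.io.silk.{FlowType, Sensor, SilkConfig}

    // Render the path of one hourly file, then parse it back.
    def roundTrip(config: SilkConfig, ft: FlowType, sensor: Sensor): Unit = {
      val hour = Instant.parse("2024-01-15T13:00:00Z")
      val path = config.fileInfoToPath((hour, ft, sensor))
      // e.g. "in/2024/01/15/in-S0_20240115.13" under a conventional SiLK
      // layout; the actual form is whatever pathFormat specifies.
      config.filenameToGlobInfo(path) match {
        case Some((t, f, s)) => println(s"hour=$t flowType=$f sensor=$s")
        case None            => println(s"path did not match this config: $path")
      }
    }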
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable])
- def flowTypes: FlowTypeMap
The flowtypes defined in this configuration in any class, usable as a value of type Map[FlowType, FlowTypeConfig].
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- def globAll: String
A string using shell glob syntax which matches all data files for this config.
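Since the glob is expressed relative to the repository root, it can be handed to any glob-aware tool. A sketch using java.nio's PathMatcher; the repository root below is a hypothetical local path.

    import java.nio.file.{FileSystems, Files, Path, Paths}
    import org.cert.netsa.io.silk.SilkConfig  // assumed package path
    import scala.jdk.CollectionConverters._

    // List every regular file under `root` whose repository-relative
    // path matches this config's glob.
    def dataFiles(config: SilkConfig, root: Path = Paths.get("/data/silk")): Iterator[Path] = {
      val matcher = FileSystems.getDefault.getPathMatcher("glob:" + config.globAll)
      Files.walk(root).iterator().asScala
        .filter(p => Files.isRegularFile(p) && matcher.matches(root.relativize(p)))
    }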
- val groups: Map[String, GroupConfig]
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- val packingLogicPath: Option[String]
- val pathFormat: String
- def productElementNames: Iterator[String]
- Definition Classes
- Product
- val sensors: SensorMap
- def supportsSensorDescriptions: Boolean
Returns true if the config version supports sensor descriptions.
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- val version: Option[Int]
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
This is documentation for Mothra, a collection of Scala and Spark library functions for working with Internet-related data. Some modules contain APIs of general use to Scala programmers. Some modules make those tools more useful on Spark data-processing systems.
Please see the documentation for the individual packages for more details on their use.
Scala Packages
These packages are useful in Scala code without involving Spark:
org.cert.netsa.data
This package, which is collected as the netsa-data library, provides types for working with various kinds of information:
- org.cert.netsa.data.net: types for working with network data
- org.cert.netsa.data.time: types for working with time data
- org.cert.netsa.data.unsigned: types for working with unsigned integral values
org.cert.netsa.io.ipfix
The netsa-io-ipfix library provides tools for reading and writing IETF IPFIX data from various connections and files.
org.cert.netsa.io.silk
To read and write CERT NetSA SiLK file formats and configuration files, use the netsa-io-silk library.
org.cert.netsa.util
The "junk drawer" of netsa-util so far provides only two features: First, a method for equipping Scala scala.collection.Iterators with exception handling. And second, a way to query the versions of NetSA libraries present in a JVM at runtime.
Spark Packages
These packages require the use of Apache Spark:
org.cert.netsa.mothra.datasources
Spark datasources for CERT file types. This package contains utility features which add methods to Apache Spark DataFrameReader objects, allowing IPFIX and SiLK flows to be opened using simple spark.read... calls. The mothra-datasources library contains both IPFIX and SiLK functionality, while mothra-datasources-ipfix and mothra-datasources-silk contain only what's needed for the named datasource.
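As an illustration, reading flow data might look like the following sketch. The reader method name (silkFlow), the path, and the column names are assumptions based on the description above and on SiLK field conventions, not a verbatim API reference; consult the datasources package documentation for the exact calls.

    import org.apache.spark.sql.SparkSession
    import org.cert.netsa.mothra.datasources._  // assumed: adds reader methods to DataFrameReader

    val spark = SparkSession.builder().appName("mothra-example").getOrCreate()
    // Hypothetical reader method and repository path:
    val flows = spark.read.silkFlow("hdfs:///data/silk")
    flows.select("sIP", "dIP", "bytes").show(5)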
org.cert.netsa.mothra.analysis
A grab-bag of analysis helper functions and example analyses.
org.cert.netsa.mothra.functions
This single Scala object provides Spark SQL functions for working with network data. It is the entirety of the mothra-functions library.