object HeaderEntry
Type Members
- case class Annotation(annotation: String) extends HeaderEntry with Product with Serializable
Header entry used to store a generic comment or annotation about the file.
- annotation
The content of the text note.
- case class Bag(keyType: Short, keyLength: Short, counterType: Short, counterLength: Short) extends HeaderEntry with Product with Serializable
Header entry used to store information particular to binary bag files. Not used for flow data.
- case class IPSet(childNode: Int, leafCount: Int, leafSize: Int, nodeCount: Int, nodeSize: Int, rootIndex: Int) extends HeaderEntry with Product with Serializable
Header entry used to store information particular to binary IP sets. Not used for flow data.
- case class Invocation(commandLine: String) extends HeaderEntry with Product with Serializable
Header entry used to store the command line history, with one instance per command invoked to produce the file.
- commandLine
UNIX command line used to generate this file.
- case class PackedFile(startTime: Long, flowtypeId: FlowType, sensorId: Sensor) extends HeaderEntry with Product with Serializable
Header entry used for data files generated by rwflowpack. It records fields that are the same for, or simplify, every record in a packed file: record times are stored as offsets from startTime, and flowtypeId and sensorId apply to all records. (The sketch after this list shows how an absolute record time is reconstructed.)
- startTime
Base start time, in milliseconds since the UNIX epoch. Times in this file are expressed as a delta from this base start time.
- flowtypeId
SiLK flow type for all flows in this file.
- sensorId
SiLK sensor ID for all flows in this file.
- case class PrefixMap(version: Int, mapName: String) extends HeaderEntry with Product with Serializable
Header entry used to store information particular to prefix maps (pmaps). Not used for flow data.
- version
The version of this header entry (always 1).
- mapName
Human-readable name of this prefix map.
- case class ProbeName(probeName: String) extends HeaderEntry with Product with Serializable
Header entry used to store the textual name of the probe where flow data was originally collected.
- probeName
The name of the probe from which data was collected.
- case class Unknown(id: Int, data: Array[Byte]) extends HeaderEntry with Product with Serializable
Header entry used for headers that are unrecognized.
- id
Numeric header entry ID.
- data
Binary header entry data.
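Taken together, these case classes behave as variants of a small algebraic data type, so header processing is naturally written as a pattern match. The sketch below is illustrative only: the import path and the entries sequence are assumptions (this page does not document how entries are obtained from a file), while the constructors and fields are as listed above.

import org.cert.netsa.io.silk.HeaderEntry  // assumed import path

// Illustrative sketch: `entries` is a hypothetical Seq[HeaderEntry].
def describe(entries: Seq[HeaderEntry]): Unit =
  entries.foreach {
    case HeaderEntry.Annotation(note) =>
      println(s"note: $note")
    case HeaderEntry.Invocation(cmd) =>
      println(s"produced by: $cmd")
    case HeaderEntry.PackedFile(startTime, flowtypeId, sensorId) =>
      // Record times in a packed file are stored as offsets from this
      // base time, so an absolute time is startTime + offset.
      println(s"packed file: base=$startTime flowtype=$flowtypeId sensor=$sensorId")
    case HeaderEntry.ProbeName(name) =>
      println(s"probe: $name")
    case HeaderEntry.Unknown(id, data) =>
      println(s"unrecognized entry $id (${data.length} bytes)")
    case other =>
      println(s"other entry: $other")
  }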
Value Members
All value members are standard methods inherited from AnyRef and Any (==, equals, hashCode, toString, wait, and so on); none are specific to HeaderEntry.
This is documentation for Mothra, a collection of Scala and Spark library functions for working with Internet-related data. Some modules contain APIs of general use to Scala programmers. Some modules make those tools more useful on Spark data-processing systems.
Please see the documentation for the individual packages for more details on their use.
Scala Packages
These packages are useful in Scala code without involving Spark:
org.cert.netsa.data
This package, which is collected as the netsa-data library, provides types for working with various kinds of information:
- org.cert.netsa.data.net - types for working with network data
- org.cert.netsa.data.time - types for working with time data
- org.cert.netsa.data.unsigned - types for working with unsigned integral values
org.cert.netsa.io.ipfix
The netsa-io-ipfix library provides tools for reading and writing IETF IPFIX data from various connections and files.
org.cert.netsa.io.silk
To read and write CERT NetSA SiLK file formats and configuration files, use the netsa-io-silk library.
org.cert.netsa.util
The "junk drawer" of netsa-util so far provides only two features: first, a method for equipping Scala scala.collection.Iterators with exception handling; and second, a way to query the versions of NetSA libraries present in a JVM at runtime.
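The iterator feature is only summarized here, so the following is a concept sketch in plain Scala rather than the netsa-util API itself; the name wrapSafely is invented for illustration.

import scala.util.Try

// Concept sketch: wrap an Iterator so that an exception thrown while
// producing an element is captured as a Failure instead of escaping.
def wrapSafely[A](it: Iterator[A]): Iterator[Try[A]] =
  new Iterator[Try[A]] {
    def hasNext: Boolean = it.hasNext
    def next(): Try[A] = Try(it.next())
  }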
Spark Packages
These packages require the use of Apache Spark:
org.cert.netsa.mothra.datasources
Spark datasources for CERT file types. This package contains utility features which add methods to Apache Spark DataFrameReader objects, allowing IPFIX and SiLK flows to be opened using simple spark.read... calls. The mothra-datasources library contains both IPFIX and SiLK functionality, while mothra-datasources-ipfix and mothra-datasources-silk contain only what's needed for the named datasource.
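As an illustration of the spark.read pattern described above, the sketch below uses the standard Spark DataFrameReader API; the format name "silk" and the input path are assumptions, not names confirmed by this page.

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("mothra-example").getOrCreate()

// Assumption: the datasource registers a short format name such as "silk";
// check the datasource packages for the names they actually provide.
val flows = spark.read.format("silk").load("/data/silk/flows")
flows.printSchema()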
org.cert.netsa.mothra.analysis
A grab-bag of analysis helper functions and example analyses.
org.cert.netsa.mothra.functions
This single Scala object provides Spark SQL functions for working with network data. It is the entirety of the mothra-functions library.