Packages

  • package root

    This is documentation for Mothra, a collection of Scala and Spark library functions for working with Internet-related data. Some modules contain APIs of general use to Scala programmers. Some modules make those tools more useful on Spark data-processing systems.

    Please see the documentation for the individual packages for more details on their use.

    Scala Packages

    These packages are useful in Scala code without involving Spark:

    org.cert.netsa.data

    This package, which is collected as the netsa-data library, provides types for working with various kinds of information.

    org.cert.netsa.io.ipfix

    The netsa-io-ipfix library provides tools for reading and writing IETF IPFIX data from various connections and files.

    org.cert.netsa.io.silk

    To read and write CERT NetSA SiLK file formats and configuration files, use the netsa-io-silk library.

    org.cert.netsa.util

    The "junk drawer" of netsa-util so far provides only two features: First, a method for equipping Scala scala.collection.Iterators with exception handling. And second, a way to query the versions of NetSA libraries present in a JVM at runtime.

    Spark Packages

    These packages require the use of Apache Spark:

    org.cert.netsa.mothra.datasources

    Spark datasources for CERT file types. This package contains utility features that add methods to Apache Spark DataFrameReader objects, allowing IPFIX and SiLK flows to be opened using simple spark.read... calls.

    The mothra-datasources library contains both IPFIX and SiLK functionality, while mothra-datasources-ipfix and mothra-datasources-silk contain only what's needed for the named datasource.
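
    As a minimal sketch of that usage (assuming a SparkSession named spark; the reader method names and input paths below are illustrative assumptions, not confirmed API, so consult the datasources package documentation for the actual calls):

    import org.cert.netsa.mothra.datasources._

    // Hypothetical paths; substitute real local or HDFS locations.
    val silkDf  = spark.read.silkFlow("hdfs:///data/silk/rwfiles")
    val ipfixDf = spark.read.ipfix("hdfs:///data/ipfix/files")

    // The loaded flow records are then ordinary DataFrames.
    silkDf.select("sIP", "dIP", "dPort").show()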

    org.cert.netsa.mothra.analysis

    A grab-bag of analysis helper functions and example analyses.

    org.cert.netsa.mothra.functions

    This single Scala object provides Spark SQL functions for working with network data. It is the entirety of the mothra-functions library.
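
    As an illustration only (ipInNetwork is a hypothetical function name, not taken from the library; consult the object's members for the functions it actually provides), such functions compose with ordinary Spark SQL expressions:

    import org.apache.spark.sql.functions.col
    import org.cert.netsa.mothra.functions._

    // `ipInNetwork` is hypothetical: keep only flows whose source
    // address falls inside an RFC 1918 private block.
    val internal = flowsDf.filter(ipInNetwork(col("sIP"), "10.0.0.0/8"))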

    Definition Classes
    root
  • package org
    Definition Classes
    root
  • package cert
    Definition Classes
    org
  • package netsa
    Definition Classes
    cert
  • package io
    Definition Classes
    netsa
  • package silk

    SiLK file formats, data types, and methods to read them, including support for reading them from Spark.

    RWRec is the type of SiLK flow records.

    You can use RWRecReader to read SiLK files from Scala, including compressed files if Hadoop native libraries are available. For example:

    import org.cert.netsa.io.silk.RWRecReader
    import java.io.FileInputStream
    
    // Open a SiLK raw flow file and iterate over its records.
    val inputFile = new FileInputStream("path/to/silk/rw/file")
    
    for ( rec <- RWRecReader.ofInputStream(inputFile) ) {
      println(rec.sIP)  // the source IP address of each flow record
    }
    Definition Classes
    io
    See also

    org.cert.netsa.mothra.datasources.silk.flow for working with SiLK data in Spark using the Mothra SiLK datasource.

  • package config
    Definition Classes
    silk
  • package io
    Definition Classes
    silk
  • BagDataType
  • BagReader
  • BagResult
  • BagWriter
  • CompressionMethod
  • FileFormat
  • FlowType
  • Header
  • HeaderEntry
  • IPSetReader
  • IPSetWriter
  • PrefixMapProtocolPortPair
  • PrefixMapReader
  • PrefixMapResult
  • RWRec
  • RWRecReader
  • RWRecWriter
  • Sensor
  • SilkConfig
  • SilkDataFormatException
  • SilkVersion
  • TCPState

object HeaderEntry

Linear Supertypes
AnyRef, Any

Type Members

  1. case class Annotation(annotation: String) extends HeaderEntry with Product with Serializable

    Header entry used to store a generic comment or annotation about the file.

    annotation

    Content of text note.

  2. case class Bag(keyType: Short, keyLength: Short, counterType: Short, counterLength: Short) extends HeaderEntry with Product with Serializable

    Header entry used to store information particular to binary bag files. Not used for flow data.

  3. case class IPSet(childNode: Int, leafCount: Int, leafSize: Int, nodeCount: Int, nodeSize: Int, rootIndex: Int) extends HeaderEntry with Product with Serializable

    Header entry used to store information particular to binary IP sets. Not used for flow data.

  4. case class Invocation(commandLine: String) extends HeaderEntry with Product with Serializable

    Header entry used to store the command line history, with one instance per command invoked to produce the file.

    commandLine

    UNIX command line used to generate this file.

  5. case class PackedFile(startTime: Long, flowtypeId: FlowType, sensorId: Sensor) extends HeaderEntry with Product with Serializable

    Header entry used for data files generated by rwflowpack. It specifies the following fields, which are simplified or the same for all entries in a packed file. (Times are offsets from startTime; flowtypeId and sensorId are the same for all records.)

    startTime

    Base start time, in milliseconds since the UNIX epoch. Times in this file are expressed as a delta from this base start time.

    flowtypeId

    SiLK flow type for all flows in this file.

    sensorId

    SiLK sensor ID for all flows in this file.

  6. case class PrefixMap(version: Int, mapName: String) extends HeaderEntry with Product with Serializable

    Header entry used to store information particular to prefix maps (pmaps). Not used for flow data.

    version

    The version of this header entry (always 1).

    mapName

    Textual human-readable name of this prefix map.

  7. case class ProbeName(probeName: String) extends HeaderEntry with Product with Serializable

    Header entry used to store the textual name of the probe where flow data was originally collected.

    probeName

    The name of the probe from which data was collected.

  8. case class Unknown(id: Int, data: Array[Byte]) extends HeaderEntry with Product with Serializable

    Header entry used for unrecognized header entries.

    id

    Numeric header entry ID.

    data

    Binary header entry data.
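
As a minimal sketch of working with these types (the entries and field values below are hypothetical, constructed purely for illustration), code that has obtained a file's header entries can pattern-match each one to its concrete case class:

    import org.cert.netsa.io.silk.HeaderEntry

    // Hypothetical entries such as might appear in a SiLK file header.
    val entries: Seq[HeaderEntry] = Seq(
      HeaderEntry.Annotation("sample data for testing"),
      HeaderEntry.Invocation("rwfilter --proto=6 --pass=stdout"),
      HeaderEntry.ProbeName("S0_probe")
    )

    entries.foreach {
      case HeaderEntry.Annotation(note)  => println(s"annotation: $note")
      case HeaderEntry.Invocation(cmd)   => println(s"invocation: $cmd")
      case HeaderEntry.ProbeName(name)   => println(s"probe: $name")
      case HeaderEntry.PackedFile(start, ft, sensor) =>
        // Record times in a packed file are deltas from this base time.
        println(s"packed at $start ms since the epoch for $ft/$sensor")
      case HeaderEntry.Unknown(id, data) => println(s"unknown entry $id (${data.length} bytes)")
      case other                         => println(s"other entry: $other")
    }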

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @native()
  6. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  7. def equals(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef → Any
  8. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable])
  9. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  10. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  11. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  12. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  13. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  14. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  15. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  16. def toString(): String
    Definition Classes
    AnyRef → Any
  17. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  18. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  19. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
