Packages

  • package root

    This is documentation for Mothra, a collection of Scala and Spark library functions for working with Internet-related data. Some modules contain APIs of general use to Scala programmers. Some modules make those tools more useful on Spark data-processing systems.

    Please see the documentation for the individual packages for more details on their use.

    Scala Packages

    These packages are useful in Scala code without involving Spark:

    org.cert.netsa.data

This package, which is collected as the netsa-data library, provides types for working with various kinds of Internet-related information.

    org.cert.netsa.io.ipfix

    The netsa-io-ipfix library provides tools for reading and writing IETF IPFIX data from various connections and files.

    org.cert.netsa.io.silk

    To read and write CERT NetSA SiLK file formats and configuration files, use the netsa-io-silk library.

    org.cert.netsa.util

The "junk drawer" netsa-util library currently provides two features: a method for equipping Scala scala.collection.Iterators with exception handling, and a way to query the versions of NetSA libraries present in a JVM at runtime.
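The iterator exception handling can be pictured with a plain-Scala sketch. The wrapper name and shape below are illustrative only, not the actual netsa-util API:

```scala
// Hypothetical sketch: wrap an Iterator so that an exception thrown while
// fetching the next element ends iteration instead of propagating.
// This is NOT the netsa-util API, just an illustration of the idea.
def tolerant[A](underlying: Iterator[A]): Iterator[A] = new Iterator[A] {
  private def advance(): Option[A] =
    try { if (underlying.hasNext) Some(underlying.next()) else None }
    catch { case _: Exception => None }

  private var nextValue: Option[A] = advance()

  def hasNext: Boolean = nextValue.isDefined
  def next(): A = {
    val v = nextValue.get
    nextValue = advance()
    v
  }
}
```

The real library may differ in naming, error reporting, and which exceptions it intercepts; consult the netsa-util documentation for the actual interface.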

    Spark Packages

    These packages require the use of Apache Spark:

    org.cert.netsa.mothra.datasources

Spark datasources for CERT file types. This package contains utility features that add methods to Apache Spark DataFrameReader objects, allowing IPFIX and SiLK flows to be opened using simple spark.read... calls.

    The mothra-datasources library contains both IPFIX and SiLK functionality, while mothra-datasources-ipfix and mothra-datasources-silk contain only what's needed for the named datasource.
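Under those assumptions, usage looks roughly like the following sketch. The format names and paths are placeholders, not confirmed API; the actual reader methods added by the datasources may differ:

```scala
// Hypothetical sketch of opening flow data via the Mothra datasources.
// The format names ("ipfix", "silk-flow") and the paths are placeholders.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("mothra-example").getOrCreate()

val ipfixDf = spark.read.format("ipfix").load("hdfs:///data/ipfix/")
val silkDf  = spark.read.format("silk-flow").load("hdfs:///data/silk/")

ipfixDf.printSchema()
```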

    org.cert.netsa.mothra.analysis

    A grab-bag of analysis helper functions and example analyses.

    org.cert.netsa.mothra.functions

    This single Scala object provides Spark SQL functions for working with network data. It is the entirety of the mothra-functions library.

    Definition Classes
    root
  • package org
    Definition Classes
    root
  • package cert
    Definition Classes
    org
  • package netsa
    Definition Classes
    cert
  • package io
    Definition Classes
    netsa
  • package silk

SiLK file formats, data types, and methods to read them, including support for reading them from Spark.

    RWRec is the type of SiLK flow records.

    You can use RWRecReader to read SiLK files from Scala, including compressed files if Hadoop native libraries are available. For example:

    import org.cert.netsa.io.silk.RWRecReader
    import java.io.FileInputStream
    
    val inputFile = new FileInputStream("path/to/silk/rw/file")
    
    for ( rec <- RWRecReader.ofInputStream(inputFile) ) {
      println(rec.sIP)
    }
    Definition Classes
    io
    See also

    org.cert.netsa.mothra.datasources.silk.flow for working with SiLK data in Spark using the Mothra SiLK datasource.

  • package config
    Definition Classes
    silk
  • package io
    Definition Classes
    silk
  • BagDataType
  • BagReader
  • BagResult
  • BagWriter
  • CompressionMethod
  • FileFormat
  • FlowType
  • Header
  • HeaderEntry
  • IPSetReader
  • IPSetWriter
  • PrefixMapProtocolPortPair
  • PrefixMapReader
  • PrefixMapResult
  • RWRec
  • RWRecReader
  • RWRecWriter
  • Sensor
  • SilkConfig
  • SilkDataFormatException
  • SilkVersion
  • TCPState

case class Header(fileFlags: Byte, fileFormat: FileFormat, fileVersion: Byte, compressionMethod: CompressionMethod, silkVersion: SilkVersion, recordSize: Short, recordVersion: Short, headerEntries: IndexedSeq[HeaderEntry]) extends Product with Serializable

A SiLK file header, including contained header entries. Supports only the "new-style" header format (SiLK versions 1.0+).

fileFlags

The bits encoding file flags. Currently the only flag records whether the file is big-endian.

fileFormat

The SiLK file format contained within this file.

fileVersion

The SiLK file version, specifically the version of the header format.

compressionMethod

The compression method used by data in this file.

silkVersion

The version of SiLK used to create this file.

recordSize

The size of individual (uncompressed) records in this file.

recordVersion

The record version of the file format.

headerEntries

Sequence of additional extensible header records of various types.

See also

Header.isBigEndian
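The documented fields above can be read directly off a Header instance. A minimal sketch, assuming a header value obtained elsewhere (for example, while reading a SiLK file):

```scala
// Sketch: summarizing a Header using only the fields documented above.
// Assumes `header` was obtained from reading a SiLK file elsewhere.
def describeHeader(header: Header): String =
  s"format=${header.fileFormat} " +
  s"recordVersion=${header.recordVersion} " +
  s"recordSize=${header.recordSize} " +
  s"compression=${header.compressionMethod} " +
  s"silkVersion=${header.silkVersion}"
```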

Linear Supertypes
Serializable, Product, Equals, AnyRef, Any

Instance Constructors

  1. new Header(fileFlags: Byte, fileFormat: FileFormat, fileVersion: Byte, compressionMethod: CompressionMethod, silkVersion: SilkVersion, recordSize: Short, recordVersion: Short, headerEntries: IndexedSeq[HeaderEntry])

    fileFlags

The bits encoding file flags. Currently the only flag records whether the file is big-endian.

    fileFormat

    The SiLK file format contained within this file.

    fileVersion

The SiLK file version, specifically the version of the header format.

    compressionMethod

    The compression method used by data in this file.

    silkVersion

    The version of SiLK used to create this file.

    recordSize

    The size of individual (uncompressed) records in this file.

    recordVersion

    The record version of the file format.

    headerEntries

    Sequence of additional extensible header records of various types.

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. def annotations: Seq[String]

    Annotations made on this SiLK file, if any.

  5. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  6. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @native()
  7. val compressionMethod: CompressionMethod
  8. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  9. val fileFlags: Byte
  10. val fileFormat: FileFormat
  11. val fileVersion: Byte
  12. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable])
  13. def flowTypeId: Option[FlowType]

    Optional SiLK flow type for all flows in this packed file.

  14. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  15. val headerEntries: IndexedSeq[HeaderEntry]
  16. def invocations: Seq[String]

    Command-lines used to produce this SiLK file, if any.

  17. def isBigEndian: Boolean

    True if data within the records of this file are stored in big-endian (MSB first) format.

  18. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  19. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  20. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  21. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  22. def probeName: Option[String]

Optional probe name recorded with this SiLK file.

  23. def productElementNames: Iterator[String]
    Definition Classes
    Product
  24. val recordSize: Short
  25. val recordVersion: Short
  26. def sensorId: Option[Sensor]

    Optional SiLK sensor ID for all flows in this packed file.

  27. val silkVersion: SilkVersion
  28. def startTimeOffset: Option[Long]

Optional base start time, in milliseconds since the UNIX epoch. Times in this packed file are expressed as a delta from this base start time.

  29. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  30. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  31. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  32. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
  33. def writeTo(outputStream: OutputStream): Unit

    Writes the header to the provided output stream.
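As the startTimeOffset entry above notes, record times in a packed file are deltas from an optional base start time. A minimal sketch of that arithmetic, with invented values for illustration:

```scala
// Illustration of the startTimeOffset semantics described above.
// The numeric values here are invented for the example.
val startTimeOffset: Option[Long] = Some(1600000000000L) // ms since UNIX epoch
val recordDelta: Long = 4250L // a record's time, stored as a delta in ms

// Absolute start time of the record, when the header provides a base.
val absoluteStartMillis: Option[Long] = startTimeOffset.map(_ + recordDelta)
```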
