Packages

  • package root

    This is documentation for Mothra, a collection of Scala and Spark library functions for working with Internet-related data. Some modules contain APIs of general use to Scala programmers. Some modules make those tools more useful on Spark data-processing systems.

    Please see the documentation for the individual packages for more details on their use.

    Scala Packages

    These packages are useful in Scala code without involving Spark:

    org.cert.netsa.data

    This package, which is collected as the netsa-data library, provides types for working with various kinds of information.

    org.cert.netsa.io.ipfix

    The netsa-io-ipfix library provides tools for reading and writing IETF IPFIX data from various connections and files.

    org.cert.netsa.io.silk

    To read and write CERT NetSA SiLK file formats and configuration files, use the netsa-io-silk library.

    org.cert.netsa.util

    The "junk drawer" of netsa-util so far provides only two features: First, a method for equipping Scala scala.collection.Iterators with exception handling. And second, a way to query the versions of NetSA libraries present in a JVM at runtime.

    Spark Packages

    These packages require the use of Apache Spark:

    org.cert.netsa.mothra.datasources

    Spark datasources for CERT file types. This package contains utility features which add methods to Apache Spark DataFrameReader objects, allowing IPFIX and SiLK flows to be opened using simple spark.read... calls.

    The mothra-datasources library contains both IPFIX and SiLK functionality, while mothra-datasources-ipfix and mothra-datasources-silk contain only what's needed for the named datasource.
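    A hedged sketch of what reading flow data might look like. The Spark session setup is standard Spark API; the reader method name (silkFlow) and the path are assumptions for illustration and may differ from the actual datasources API:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("mothra-example").getOrCreate()

// Importing the datasources package adds reader methods to
// spark.read via implicit conversions (method name assumed here).
import org.cert.netsa.mothra.datasources._

val flows = spark.read.silkFlow("hdfs:///path/to/silk/data")  // assumed method name
flows.select("sIP", "dIP").show()
```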

    org.cert.netsa.mothra.analysis

    A grab-bag of analysis helper functions and example analyses.

    org.cert.netsa.mothra.functions

    This single Scala object provides Spark SQL functions for working with network data. It is the entirety of the mothra-functions library.

    Definition Classes
    root
  • package org
    Definition Classes
    root
  • package cert
    Definition Classes
    org
  • package netsa
    Definition Classes
    cert
  • package io
    Definition Classes
    netsa
  • package silk

    SiLK file formats, data types, and methods to read them, including support for reading them from Spark.

    RWRec is the type of SiLK flow records.

    You can use RWRecReader to read SiLK files from Scala, including compressed files if Hadoop native libraries are available. For example:

    import org.cert.netsa.io.silk.RWRecReader
    import java.io.FileInputStream
    
    val inputFile = new FileInputStream("path/to/silk/rw/file")
    
    for ( rec <- RWRecReader.ofInputStream(inputFile) ) {
      println(rec.sIP)
    }
    Definition Classes
    io
    See also

    org.cert.netsa.mothra.datasources.silk.flow for working with SiLK data in Spark using the Mothra SiLK datasource.

  • package config
    Definition Classes
    silk
  • package io
    Definition Classes
    silk
  • BagDataType
  • BagReader
  • BagResult
  • BagWriter
  • CompressionMethod
  • FileFormat
  • FlowType
  • Header
  • HeaderEntry
  • IPSetReader
  • IPSetWriter
  • PrefixMapProtocolPortPair
  • PrefixMapReader
  • PrefixMapResult
  • RWRec
  • RWRecReader
  • RWRecWriter
  • Sensor
  • SilkConfig
  • SilkDataFormatException
  • SilkVersion
  • TCPState

class BagWriter extends AnyRef

A writer of binary SiLK Bag files.

To include a header in the Bag file that specifies the type of the key and counter, run setKeyType() and/or setCounterType() prior to writing the Bag.

Example:
  1. This example reads the contents of "example.bag" and writes them to "copy.bag", where the keys are IP addresses:

    val in = new java.io.FileInputStream("example.bag")
    val out = new java.io.FileOutputStream("copy.bag")
    val bag = BagReader.ofInputStream(in) match {
      case BagResult.IPAddressBag(iter) => iter
      case other => sys.error(s"unexpected Bag contents: $other")
    }
    val writer = BagWriter.toOutputStream(out)
    // keyType and counterType are Options; copy them only if present
    bag.keyType.foreach(writer.setKeyType)
    bag.counterType.foreach(writer.setCounterType)
    writer.appendIPAddresses(bag)
    writer.close()
See also

the companion object for more details

Linear Supertypes
AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. def appendIPAddresses[T <: IPAddress](iter: Iterator[(T, Long)]): Unit

    Iterates over the (key, counter) pairs, where each key is an IPAddress, and writes the values to the output stream as a SiLK Bag.

    Expects the IPAddresses in the Iterator to be in sorted order (numerically ascending).

    Expects all IPAddresses in the Iterator to be of the same size; that is, either all are IPv4Address or all are IPv6Address.

    Writes the file's header if it has not been written yet. The type of the key and counter may no longer be changed once this function is called.

    This function may be called successfully multiple times as long as the IPAddresses across the various calls are the same size and are in sorted order.

    Calls to this function may not be mixed with calls to appendIntegers().

  5. def appendIntegers(iter: Iterator[(Int, Long)]): Unit

    Iterates over the (key, counter) pairs, where each key is an Int, and writes the values to the output stream as a SiLK Bag.

    Expects the Ints in the Iterator to be in sorted order (numerically ascending).

    Writes the file's header if it has not been written yet. The type of the key and counter may no longer be changed once this function is called.

    This function may be called successfully multiple times as long as the keys across the various calls are in sorted order.

    Calls to this function may not be mixed with calls to appendIPAddresses().
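    For example, the sorted-order precondition can be met by sorting the pairs by key before appending. The appendIntegers call itself is shown as a comment, since it requires a live BagWriter:

```scala
// Build (key, counter) pairs and sort them into numerically
// ascending key order, as appendIntegers() requires.
val counts = Map(443 -> 9120L, 80 -> 1543L, 22 -> 77L)
val sorted = counts.toSeq.sortBy(_._1)

// Precondition check: keys are in ascending order.
assert(sorted.map(_._1) == sorted.map(_._1).sorted)

// With a BagWriter `writer` in hand, one would then call:
//   writer.appendIntegers(sorted.iterator)
println(sorted)
```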

  6. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  7. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @native()
  8. def close(): Unit

    Closes the output stream.

    Writes the SiLK file header to the output stream if it has not been written, writes any buffered records, closes the output stream, and releases resources.

  9. val compressionMethod: CompressionMethod
  10. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  11. def equals(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef → Any
  12. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable])
  13. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  14. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  15. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  16. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  17. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  18. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  19. val out: DataOutputStream
  20. def setCounterType(counterType: BagDataType): Unit

    Sets the type of the counter. The value is written into the output stream's header.

    Exceptions thrown

    java.lang.IllegalArgumentException if called after the file's header has been written.

  21. def setKeyType(keyType: BagDataType): Unit

    Sets the type of the key. The value is written into the output stream's header.

    Exceptions thrown

    java.lang.IllegalArgumentException if called after the file's header has been written.

  22. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  23. def toString(): String
    Definition Classes
    AnyRef → Any
  24. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  25. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  26. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
  27. def wasHeaderWritten: Boolean

    Whether the Bag file's header has been written; that is, whether one of the append methods has been called.

    returns

    true once an append method has been called
