Packages

  • package root

    This is documentation for Mothra, a collection of Scala and Spark library functions for working with Internet-related data. Some modules contain APIs of general use to Scala programmers. Some modules make those tools more useful on Spark data-processing systems.

    Please see the documentation for the individual packages for more details on their use.

    Scala Packages

    These packages are useful in Scala code without involving Spark:

    org.cert.netsa.data

    This package, which is collected as the netsa-data library, provides types for working with various kinds of information; see the org.cert.netsa.data package entry below for details.

    org.cert.netsa.io.ipfix

    The netsa-io-ipfix library provides tools for reading and writing IETF IPFIX data from various connections and files.

    org.cert.netsa.io.silk

    To read and write CERT NetSA SiLK file formats and configuration files, use the netsa-io-silk library.

    org.cert.netsa.util

    The "junk drawer" of netsa-util so far provides only two features: First, a method for equipping Scala scala.collection.Iterators with exception handling. And second, a way to query the versions of NetSA libraries present in a JVM at runtime.

    Spark Packages

    These packages require the use of Apache Spark:

    org.cert.netsa.mothra.datasources

    Spark datasources for CERT file types. This package contains utility features which add methods to Apache Spark DataFrameReader objects, allowing IPFIX and SiLK flows to be opened using simple spark.read... calls.

    The mothra-datasources library contains both IPFIX and SiLK functionality, while mothra-datasources-ipfix and mothra-datasources-silk contain only what's needed for the named datasource.

    org.cert.netsa.mothra.analysis

    A grab-bag of analysis helper functions and example analyses.

    org.cert.netsa.mothra.functions

    This single Scala object provides Spark SQL functions for working with network data. It is the entirety of the mothra-functions library.
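
    The object's function names are not listed on this page, so the snippet below is only a shape sketch: a hand-written Spark SQL UDF stands in for the kind of network-data helper the library provides, and nothing here is the actual mothra-functions API.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, udf}

    val spark = SparkSession.builder().master("local[*]").getOrCreate()
    import spark.implicits._

    // Hypothetical stand-in: classify a port number by IANA range.
    val portClass = udf { (port: Int) =>
      if (port < 1024) "well-known"
      else if (port < 49152) "registered"
      else "dynamic"
    }

    val flows = Seq(("10.0.0.1", 22), ("10.0.0.2", 51234)).toDF("sIP", "dPort")
    flows.select(col("sIP"), portClass(col("dPort")).as("dPortClass")).show()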

  • package org
  • package cert
  • package netsa
  • package data

    The org.cert.netsa.data.net package is for working with network-related data. This includes types for IP addresses, port numbers, protocol numbers, and the like. Many of these types have namespaces managed by IANA, and the types provide mechanisms for looking up names from numbers and vice-versa based on embedded copies of IANA's tables.
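
    As a plain-Scala illustration of those lookups (a three-entry excerpt only; the real types embed full copies of IANA's tables, and their names are not shown here):

    // A fragment of IANA's protocol-number registry, queried in both
    // directions, as the org.cert.netsa.data.net types allow.
    val protocolName = Map(1 -> "ICMP", 6 -> "TCP", 17 -> "UDP")
    val protocolNumber = protocolName.map(_.swap)

    println(protocolName(6))       // TCP
    println(protocolNumber("UDP")) // 17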

    In org.cert.netsa.data.time you can find an Ordering for Java LocalDate objects, and a type LocalDateSet for working with sets of those dates.
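
    The dedicated Ordering is needed because java.time.LocalDate implements Comparable[ChronoLocalDate] rather than Comparable[LocalDate], so Scala derives no implicit Ordering for it. A hand-rolled equivalent (not the library's own definition) might look like:

    import java.time.LocalDate

    // Order dates by epoch day; with this in scope, collection methods
    // such as sorted, min, and max work directly on LocalDate.
    implicit val localDateOrdering: Ordering[LocalDate] =
      Ordering.by((d: LocalDate) => d.toEpochDay)

    val days = Seq(LocalDate.of(2024, 3, 1), LocalDate.of(2024, 1, 15))
    println(days.sorted.head) // 2024-01-15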

    Finally, org.cert.netsa.data.unsigned contains types for working with unsigned integer values.
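
    The JVM has no unsigned primitive types, which is the gap these types fill. A standard-library sketch of the problem (assuming nothing from org.cert.netsa.data.unsigned):

    // The bit pattern 0xFFFFFFFF is -1 as a signed Int; reading the
    // same bits as an unsigned value requires widening to Long.
    val raw: Int = 0xFFFFFFFF
    println(raw)                                   // -1
    println(java.lang.Integer.toUnsignedLong(raw)) // 4294967295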

  • package io
  • package mothra
  • package analysis
  • package datasources

    This package contains the Mothra datasources, along with mechanisms for working with those datasources. The primary novel feature of these datasources is the fields mechanism.

    To use the IPFIX or SiLK data sources, you can use the following methods added by the implicit CERTDataFrameReader on DataFrameReader after importing from this package:

    import org.cert.netsa.mothra.datasources._
    val silkDF = spark.read.silkFlow()                                    // to read from the default SiLK repository
    val silkRepoDF = spark.read.silkFlow(repository="...")                // to read from an alternate SiLK repository
    val silkFilesDF = spark.read.silkFlow("/path/to/silk/files")          // to read from loose SiLK files
    val ipfixDF = spark.read.ipfix(repository="/path/to/mothra/data/dir") // for packed Mothra IPFIX data
    val ipfixS3DF = spark.read.ipfix(s3Repository="bucket-name")          // for packed Mothra IPFIX data from an S3 bucket
    val ipfixFilesDF = spark.read.ipfix("/path/to/ipfix/files")           // for loose IPFIX files

    (The additional methods are defined on the implicit class CERTDataFrameReader.)

    Using the fields method allows you to configure which SiLK or IPFIX fields you wish to retrieve. (This is particularly important for IPFIX data, as IPFIX files may contain a great many possible fields organized in various ways.)

    import org.cert.netsa.mothra.datasources._
    val silkDF = spark.read.fields("sIP", "dIP").silkFlow(...)
    val ipfixDF = spark.read.fields("sourceIPAddress", "destinationIPAddress").ipfix(...)

    Both of these dataframes will contain only the source and destination IP addresses from the specified data sources. You may also provide column names different from the source field names:

    val silkDF = spark.read.fields("server" -> "sIP", "client" -> "dIP").silkFlow(...)
    val ipfixDF = spark.read.fields("server" -> "sourceIPAddress", "client" -> "destinationIPAddress").ipfix(...)

    You may also mix the mapped and the default names in one call:

    val df = spark.read.fields("sIP", "dIP", "s" -> "sensor").silkFlow(...)
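
    Whichever way the data is loaded, the result is an ordinary Spark DataFrame, so standard Spark operations apply from there; for example (only the documented fields and silkFlow calls above plus stock Spark API):

    import org.apache.spark.sql.functions.desc

    // Top ten source addresses by flow count from the default repository.
    val topTalkers = spark.read.fields("sIP", "dIP").silkFlow()
      .groupBy("sIP").count()
      .orderBy(desc("count"))
      .limit(10)
    topTalkers.show()
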
    See also

    IPFIX datasource

    SiLK flow datasource

  • object functions
  • package util

package mothra

Package Members

  1. package analysis
  2. package datasources

    This package contains the Mothra datasources, along with mechanisms for working with those datasources.

Value Members

  1. object functions

    A collection of Spark SQL functions for use with network data.
