object ExportStream
An ExportStream factory.
Type Members
- trait DataWrittenCallback extends AnyRef
Supports a callback mechanism when data is written to an ExportStream. Enable the callback by setting the dataWrittenCallback of an ExportStream.
- Since
1.3.1
- trait MessageTimestamp extends AnyRef
Describes a class that may be used to generate a timestamp when creating an IPFIX message.
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- def appendTo(path: Path, model: InfoModel): ExportStream
Creates a new ExportStream that appends to an existing file.
- path
The file to append IPFIX records to
- model
The information model for elements
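Example: a minimal sketch of appending records to a file with this overload. How the InfoModel is obtained, and the close() call, are assumptions not documented on this page.
```scala
import java.nio.file.Paths
import org.cert.netsa.io.ipfix.{ExportStream, InfoModel}

// Minimal sketch: assumes `model` was obtained elsewhere, and that
// ExportStream provides a close() method (an assumption; not shown here).
def appendExample(model: InfoModel): Unit = {
  val stream = ExportStream.appendTo(Paths.get("flows.ipfix"), model)
  try {
    // ... construct and write IPFIX records to `stream` here ...
  } finally {
    stream.close() // assumption: releases the underlying file handle
  }
}
```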
- def appendTo(channel: FileChannel, model: InfoModel): ExportStream
Creates a new ExportStream that appends to an existing file when given a java.nio.FileChannel open for reading and writing to that file. Creates a new SessionGroup using model, reads the IPFIX data from channel and adds the Templates to the Session, then creates an ExportStream to append records to the file.
- channel
An open read-write handle to the file to append IPFIX records to
- model
The information model for elements
- Exceptions thrown
java.lang.RuntimeException
if the underlying file contains records in more than one observation domain.
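Example: a minimal sketch of the FileChannel variant. The channel must be opened for both reading and writing, since the factory first reads the file's existing Templates before appending.
```scala
import java.nio.channels.FileChannel
import java.nio.file.{Paths, StandardOpenOption}
import org.cert.netsa.io.ipfix.{ExportStream, InfoModel}

// Minimal sketch: assumes `model` was obtained elsewhere.
def appendViaChannel(model: InfoModel): Unit = {
  // Open read-write: the factory reads existing Templates, then appends.
  val channel = FileChannel.open(
    Paths.get("flows.ipfix"),
    StandardOpenOption.READ, StandardOpenOption.WRITE)
  val stream = ExportStream.appendTo(channel, model)
  // ... write records; per the docs above, creation throws a
  // RuntimeException if the file contains records from more than one
  // observation domain ...
}
```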
- def apply(outputStream: DataOutputStream, session: Session): ExportStream
Creates a new ExportStream.
- outputStream
Where to write the IPFIX data.
- session
The session to use.
- def apply(outputStream: WritableByteChannel, session: Session): ExportStream
Creates a new ExportStream.
- outputStream
Where to write the IPFIX data.
- session
The session to use.
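Example: a minimal sketch using both apply overloads. It assumes `session` was configured elsewhere; ExportStream(...) invokes apply(...).
```scala
import java.io.{DataOutputStream, FileOutputStream}
import java.nio.channels.Channels
import org.cert.netsa.io.ipfix.{ExportStream, Session}

// Minimal sketch: assumes `session` was configured elsewhere.
def writeExamples(session: Session): Unit = {
  // 1. Writing to a DataOutputStream:
  val out = new DataOutputStream(new FileOutputStream("out1.ipfix"))
  val streamA = ExportStream(out, session)

  // 2. Writing to any WritableByteChannel:
  val channel = Channels.newChannel(new FileOutputStream("out2.ipfix"))
  val streamB = ExportStream(channel, session)
}
```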
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @native()
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable])
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
- object TimestampNow extends MessageTimestamp
The default timestamp generator.
This is documentation for Mothra, a collection of Scala and Spark library functions for working with Internet-related data. Some modules contain APIs of general use to Scala programmers. Some modules make those tools more useful on Spark data-processing systems.
Please see the documentation for the individual packages for more details on their use.
Scala Packages
These packages are useful in Scala code without involving Spark:
org.cert.netsa.data
This package, which is collected as the netsa-data library, provides types for working with various kinds of information:
- org.cert.netsa.data.net - types for working with network data
- org.cert.netsa.data.time - types for working with time data
- org.cert.netsa.data.unsigned - types for working with unsigned integral values
org.cert.netsa.io.ipfix
The netsa-io-ipfix library provides tools for reading and writing IETF IPFIX data from various connections and files.
org.cert.netsa.io.silk
To read and write CERT NetSA SiLK file formats and configuration files, use the netsa-io-silk library.
org.cert.netsa.util
The "junk drawer" of netsa-util so far provides only two features: first, a method for equipping Scala scala.collection.Iterators with exception handling; and second, a way to query the versions of NetSA libraries present in a JVM at runtime.
Spark Packages
These packages require the use of Apache Spark:
org.cert.netsa.mothra.datasources
Spark datasources for CERT file types. This package contains utility features which add methods to Apache Spark DataFrameReader objects, allowing IPFIX and SiLK flows to be opened using simple spark.read... calls (see the sketch after this list). The mothra-datasources library contains both IPFIX and SiLK functionality, while mothra-datasources-ipfix and mothra-datasources-silk contain only what's needed for the named datasource.
org.cert.netsa.mothra.analysis
A grab-bag of analysis helper functions and example analyses.
org.cert.netsa.mothra.functions
This single Scala object provides Spark SQL functions for working with network data. It is the entirety of the mothra-functions library.
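As an illustration of the spark.read... style mentioned under org.cert.netsa.mothra.datasources, here is a hedged sketch. The package import adding methods to DataFrameReader is described above, but the exact ipfix method name and the input path are assumptions, not confirmed signatures.
```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("ipfix-example").getOrCreate()

// Assumption: importing the datasources package object enriches
// DataFrameReader with an `ipfix` method, per the description above.
// The input path is hypothetical.
import org.cert.netsa.mothra.datasources._
val flows = spark.read.ipfix("hdfs:///data/ipfix")
flows.printSchema()
```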