case class Header(fileFlags: Byte, fileFormat: FileFormat, fileVersion: Byte, compressionMethod: CompressionMethod, silkVersion: SilkVersion, recordSize: Short, recordVersion: Short, headerEntries: IndexedSeq[HeaderEntry]) extends Product with Serializable
A SiLK file header, including its contained header entries. Only the "new-style" header format (SiLK versions 1.0 and later) is supported.
- fileFlags
The bits encoding file flags. Currently the only defined flag indicates whether the file is big-endian.
- fileFormat
The SiLK file format contained within this file.
- fileVersion
The SiLK file version; specifically, the version of the header format.
- compressionMethod
The compression method used by data in this file.
- silkVersion
The version of SiLK used to create this file.
- recordSize
The size, in bytes, of individual (uncompressed) records in this file.
- recordVersion
The record version of the file format.
- headerEntries
Sequence of additional extensible header records of various types.
- See also
Header.isBigEndian
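For orientation, here is a minimal sketch that summarizes a parsed header using only the members documented on this page. The import path is assumed from the library's package name, and how a Header is read from a file is out of scope here:

```scala
import org.cert.netsa.io.silk.Header  // package path assumed from this library's layout

// Sketch only: build a one-line summary from the documented fields.
def describe(header: Header): String = {
  val endian = if (header.isBigEndian) "big-endian" else "little-endian"
  s"${header.fileFormat} (record v${header.recordVersion}, " +
    s"${header.recordSize}-byte records), $endian, " +
    s"compression ${header.compressionMethod}, " +
    s"written by SiLK ${header.silkVersion}, " +
    s"${header.headerEntries.size} header entries"
}
```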
Instance Constructors
- new Header(fileFlags: Byte, fileFormat: FileFormat, fileVersion: Byte, compressionMethod: CompressionMethod, silkVersion: SilkVersion, recordSize: Short, recordVersion: Short, headerEntries: IndexedSeq[HeaderEntry])
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- def annotations: Seq[String]
Annotations made on this SiLK file, if any.
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @native()
- val compressionMethod: CompressionMethod
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- val fileFlags: Byte
- val fileFormat: FileFormat
- val fileVersion: Byte
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable])
- def flowTypeId: Option[FlowType]
Optional SiLK flow type for all flows in this packed file.
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- val headerEntries: IndexedSeq[HeaderEntry]
- def invocations: Seq[String]
The command lines used to produce this SiLK file, if any.
- def isBigEndian: Boolean
True if data within the records of this file are stored in big-endian (MSB first) format.
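Because this flag governs how multi-byte fields in the records must be decoded, one natural use is choosing a java.nio byte order; a small sketch:

```scala
import java.nio.ByteOrder

// Sketch: pick a byte order for decoding this file's record bytes
// based on the header's endianness flag.
def byteOrder(header: Header): ByteOrder =
  if (header.isBigEndian) ByteOrder.BIG_ENDIAN else ByteOrder.LITTLE_ENDIAN
```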
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- def probeName: Option[String]
Optional probe name recorded with the SiLK file.
- def productElementNames: Iterator[String]
- Definition Classes
- Product
- val recordSize: Short
- val recordVersion: Short
- def sensorId: Option[Sensor]
Optional SiLK sensor ID for all flows in this packed file.
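The packed-file metadata accessors (flowTypeId, sensorId, and probeName, documented above) each return an Option, since not every file carries them. A sketch of consuming them:

```scala
// Sketch: report packed-file metadata, falling back when absent.
def packingSummary(header: Header): String = {
  val flowType = header.flowTypeId.fold("(none)")(_.toString)
  val sensor   = header.sensorId.fold("(none)")(_.toString)
  val probe    = header.probeName.getOrElse("(none)")
  s"flow type: $flowType, sensor: $sensor, probe: $probe"
}
```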
- val silkVersion: SilkVersion
- def startTimeOffset: Option[Long]
Optional base start time, in milliseconds since the UNIX epoch. Times in this packed file are expressed as a delta from this base start time.
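A sketch of how a consumer might combine this offset with a per-record delta; the `deltaMillis` parameter is hypothetical, since record decoding is not covered on this page:

```scala
import java.time.Instant

// Sketch: recover an absolute start time from the file's base time plus
// a record's delta (hypothetical parameter).
def absoluteStart(header: Header, deltaMillis: Long): Option[Instant] =
  header.startTimeOffset.map(base => Instant.ofEpochMilli(base + deltaMillis))
```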
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
- def writeTo(outputStream: OutputStream): Unit
Writes the header to the provided output stream.
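For example, persisting a header to a file using this method (this writes only the header, not any records):

```scala
import java.io.{BufferedOutputStream, FileOutputStream}

// Sketch: write a header to the named file via the documented writeTo method.
def writeHeader(header: Header, path: String): Unit = {
  val out = new BufferedOutputStream(new FileOutputStream(path))
  try header.writeTo(out) finally out.close()
}
```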
This is the documentation for Mothra, a collection of Scala and Spark library functions for working with Internet-related data. Some modules contain APIs of general use to Scala programmers; others make those tools more useful on Spark data-processing systems.
Please see the documentation for the individual packages for more details on their use.
Scala Packages

These packages are useful in Scala code without involving Spark:

- org.cert.netsa.data
  This package, collected as the netsa-data library, provides types for working with various kinds of information:
  - org.cert.netsa.data.net: types for working with network data
  - org.cert.netsa.data.time: types for working with time data
  - org.cert.netsa.data.unsigned: types for working with unsigned integral values
- org.cert.netsa.io.ipfix
  The netsa-io-ipfix library provides tools for reading and writing IETF IPFIX data from various connections and files.
- org.cert.netsa.io.silk
  To read and write CERT NetSA SiLK file formats and configuration files, use the netsa-io-silk library.
- org.cert.netsa.util
  The "junk drawer" of netsa-util so far provides only two features: first, a method for equipping Scala scala.collection.Iterators with exception handling (a generic sketch of the idea follows this list); and second, a way to query the versions of NetSA libraries present in a JVM at runtime.
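The iterator feature can be pictured as follows. This is a generic sketch of the idea only, not the netsa-util API, whose actual names are not shown on this page:

```scala
// Generic illustration (NOT the netsa-util API): wrap an Iterator so that
// exceptions thrown while producing elements are reported and the offending
// elements skipped.
def skippingFailures[A](underlying: Iterator[A])(onError: Throwable => Unit): Iterator[A] =
  new Iterator[A] {
    private[this] var buffered: Option[A] = advance()

    // Pull the next element, reporting and skipping any that throw.
    private[this] def advance(): Option[A] = {
      while (underlying.hasNext) {
        try return Some(underlying.next())
        catch { case scala.util.control.NonFatal(e) => onError(e) }
      }
      None
    }

    def hasNext: Boolean = buffered.isDefined
    def next(): A = {
      val a = buffered.getOrElse(throw new NoSuchElementException("next on empty iterator"))
      buffered = advance()
      a
    }
  }
```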
Spark Packages

These packages require the use of Apache Spark:

- org.cert.netsa.mothra.datasources
  Spark datasources for CERT file types. This package contains utility features which add methods to Apache Spark DataFrameReader objects, allowing IPFIX and SiLK flows to be opened using simple spark.read... calls (a hedged sketch follows this list). The mothra-datasources library contains both IPFIX and SiLK functionality, while mothra-datasources-ipfix and mothra-datasources-silk contain only what's needed for the named datasource.
- org.cert.netsa.mothra.analysis
  A grab-bag of analysis helper functions and example analyses.
- org.cert.netsa.mothra.functions
  This single Scala object provides Spark SQL functions for working with network data. It is the entirety of the mothra-functions library.
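A hedged sketch of such a call; the enrichment import brings the added DataFrameReader methods into scope, but the `silkFlow` method name below is an assumption, since the exact reader API is not shown on this page:

```scala
import org.apache.spark.sql.SparkSession

object ReadFlowsExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("read-flows").getOrCreate()

    // Enriches spark.read with the datasource methods described above.
    import org.cert.netsa.mothra.datasources._
    val flows = spark.read.silkFlow("/data/silk/in")  // method name assumed

    flows.printSchema()
    spark.stop()
  }
}
```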