case class UnionField[+T](fs: IPFIXField[T]*) extends IPFIXField[T] with LazyLogging with Product with Serializable
A field which returns all of the results of the fields given as its arguments. It's an error if the arguments don't all produce the same result type.
- Note
This is an experimental interface and is likely to be removed or made private in a future version.
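The union behavior described above (all results of all argument fields, in order) can be sketched independently of the library. The `Field`, `ConstField`, and `Union` names below are hypothetical stand-ins, not the Mothra API:

```scala
// Minimal stand-in illustrating UnionField's semantics.
// `Field`, `ConstField`, and `Union` are hypothetical, not the Mothra API.
trait Field[+T] {
  def extract(rec: Map[String, Any]): Iterator[T]
}

// A field that always yields fixed values, for demonstration only.
final case class ConstField[T](values: T*) extends Field[T] {
  def extract(rec: Map[String, Any]): Iterator[T] = values.iterator
}

// The union concatenates each member field's results, in argument order.
final case class Union[T](fs: Field[T]*) extends Field[T] {
  def extract(rec: Map[String, Any]): Iterator[T] =
    fs.iterator.flatMap(_.extract(rec))
}

val u = Union(ConstField(1, 2), ConstField(3))
val out = u.extract(Map.empty).toList  // List(1, 2, 3)
```

Because the type parameter is shared, the compiler unifies the argument fields to a common result type, matching the "same result type" requirement stated above.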
Linear Supertypes
- UnionField
- Product
- Equals
- LazyLogging
- IPFIXField
- Serializable
- AnyRef
- Any
Instance Constructors
- new UnionField(fs: IPFIXField[T]*)
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @native()
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def extract(rec: Record, filename: String): Iterator[T]
An Iterator of values for this field for the given IPFIX record from the given file.
- rec
IPFIX record for data to be pulled from.
- filename
Filename or equivalent for source reporting.
- Definition Classes
- UnionField → IPFIXField
- final def extractOne(rec: Record, filename: String): Option[T]
The first value produced by this field, if any.
- Definition Classes
- IPFIXField
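The "first value produced, if any" behavior of `extractOne` corresponds to taking the head of the extraction iterator. A stand-alone sketch (the `extractAll` parameter is a hypothetical stand-in for `extract`, not the Mothra API):

```scala
// Sketch of `extractOne` semantics: the first extracted value, if any.
// `extractAll` stands in for a field's extract method.
def extractOne[T](extractAll: () => Iterator[T]): Option[T] = {
  val it = extractAll()
  if (it.hasNext) Some(it.next()) else None
}

extractOne(() => Iterator(5, 6))   // Some(5)
extractOne(() => Iterator.empty)   // None
```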
- def filterCheck(f: Filter, constraints: Seq[Constraint]): FilterResult
Whether the given filter (which is on this field, regardless of the name in the Filter object) will include a file with the given constraints.
- returns
Passes if every record will pass the filter, Fails if every record will fail the filter, or Maybe if some records will pass and some will fail the filter.
- Definition Classes
- IPFIXField
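The three-valued result described above can be modeled as a small ADT. The sketch below also shows one plausible way such results might combine under a conjunction of filters; the names and the `and` rule are illustrative assumptions, not the Mothra implementation:

```scala
// Illustrative three-valued filter result mirroring the Passes / Fails /
// Maybe outcomes described above (names are hypothetical stand-ins).
sealed trait FilterResult
case object Passes extends FilterResult // every record passes
case object Fails  extends FilterResult // every record fails
case object Maybe  extends FilterResult // some pass, some fail

// One plausible combination rule for a logical AND of two filters
// (an assumption for illustration, not the library's actual logic):
def and(a: FilterResult, b: FilterResult): FilterResult = (a, b) match {
  case (Fails, _) | (_, Fails) => Fails  // any certain failure dominates
  case (Passes, Passes)        => Passes // both certain to pass
  case _                       => Maybe  // otherwise the outcome is mixed
}
```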
- def filterDates(f: Filter): Option[LocalDateSet]
The set of dates such that a record starting on that date might satisfy the given filter (which is on this field regardless of the name in the Filter object).
- returns
Some(set) if some set of dates will pass the filter, or None if no dates will ever pass the filter.
- Definition Classes
- IPFIXField
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable])
- val fs: IPFIXField[T]*
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- lazy val logger: Logger
- Attributes
- protected
- Definition Classes
- LazyLogging
- Annotations
- @transient()
- final def map[U](sqlType: DataType, f: (T) => U): IPFIXField[U]
The results of this field mapped through the given function. The provided sqlType argument should be the appropriate type for the result.
- Definition Classes
- IPFIXField
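Mapping a field amounts to mapping its result iterator, with the caller supplying the result's SQL type since it cannot be inferred at runtime. A stand-alone sketch of just the value-mapping part (`mapField` is a hypothetical helper, not the Mothra API):

```scala
// Sketch of `map` semantics over extracted values: mapping a field is
// mapping its result iterator. `mapField` is illustrative, not Mothra's API.
def mapField[T, U](extract: () => Iterator[T], f: T => U): () => Iterator[U] =
  () => extract().map(f)

val base    = () => Iterator(1, 2, 3)
val doubled = mapField(base, (n: Int) => n * 2)
doubled().toList  // List(2, 4, 6)
```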
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- def productElementNames: Iterator[String]
- Definition Classes
- Product
- val sqlType: DataType
The SQL type of values of this field.
- Definition Classes
- UnionField → IPFIXField
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- UnionField → AnyRef → Any
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
- final def |[U >: T](other: IPFIXField[U]): IPFIXField[U]
The union of this field and the given field. This produces all of the results of this field first, followed by any results of the other field.
- Definition Classes
- IPFIXField
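The ordering described above (this field's results first, then the other's) and the widening to a common supertype `U >: T` can be modeled with plain iterator concatenation. A hedged sketch, not the library's implementation:

```scala
// Models the `|` operator's described behavior: left field's results
// first, then the right's, widened to a common supertype U >: T.
def union[T, U >: T](left: Iterator[T], right: Iterator[U]): Iterator[U] =
  left ++ right

val xs = union(Iterator(1, 2), Iterator(3)).toList  // List(1, 2, 3)
```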
This is documentation for Mothra, a collection of Scala and Spark library functions for working with Internet-related data. Some modules contain APIs of general use to Scala programmers. Some modules make those tools more useful on Spark data-processing systems.
Please see the documentation for the individual packages for more details on their use.
Scala Packages
These packages are useful in Scala code without involving Spark:
- org.cert.netsa.data
This package, which is collected as the netsa-data library, provides types for working with various kinds of information:
  - org.cert.netsa.data.net - types for working with network data
  - org.cert.netsa.data.time - types for working with time data
  - org.cert.netsa.data.unsigned - types for working with unsigned integral values
- org.cert.netsa.io.ipfix
The netsa-io-ipfix library provides tools for reading and writing IETF IPFIX data from various connections and files.
- org.cert.netsa.io.silk
To read and write CERT NetSA SiLK file formats and configuration files, use the netsa-io-silk library.
- org.cert.netsa.util
The "junk drawer" of netsa-util so far provides only two features: first, a method for equipping Scala Iterators with exception handling; and second, a way to query the versions of NetSA libraries present in a JVM at runtime.
Spark Packages
These packages require the use of Apache Spark:
- org.cert.netsa.mothra.datasources
Spark datasources for CERT file types. This package contains utility features which add methods to Apache Spark DataFrameReader objects, allowing IPFIX and SiLK flows to be opened using simple spark.read... calls. The mothra-datasources library contains both IPFIX and SiLK functionality, while mothra-datasources-ipfix and mothra-datasources-silk contain only what's needed for the named datasource.
- org.cert.netsa.mothra.analysis
A grab-bag of analysis helper functions and example analyses.
- org.cert.netsa.mothra.functions
This single Scala object provides Spark SQL functions for working with network data. It is the entirety of the mothra-functions library.