case object ListSemantics extends ShortEnum[ListSemantics] with Product with Serializable
IPFIX Structured Data (List) Type Semantics as defined in RFC 6313
Structured data type semantics are provided in order to express the relationship among multiple list elements in a Structured Data Information Element.
See https://www.iana.org/assignments/ipfix/ipfix.xhtml#ipfix-structured-data-types-semantics
May be converted to and from Short values.
Linear Supertypes
- ListSemantics
- Serializable
- Product
- Equals
- ShortEnum
- ValueEnum
- AnyRef
- Any
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- def apply(v: Short): ListSemantics
The IPFIX list semantic type for this Short value.
- Exceptions thrown
java.util.NoSuchElementException
if the value represents no known list semantic type.
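For illustration, a minimal sketch of converting a raw semantics octet into a ListSemantics entry. The import path and the numeric codes used here are assumptions (the codes are taken from the IANA registry linked above: 0 = NoneOf, 1 = ExactlyOneOf, 2 = OneOrMoreOf, 3 = AllOf, 4 = Ordered, 255 = Undefined), not from this page.

```scala
import org.cert.netsa.io.ipfix.ListSemantics  // assumed package, per the netsa-io-ipfix description below

// apply converts a raw Short code to its ListSemantics entry.
val semantics: ListSemantics = ListSemantics(3: Short)  // assumed to be AllOf per the IANA registry

// apply throws java.util.NoSuchElementException for unknown codes, so wrap
// untrusted input if a lookup failure should not abort processing:
val unknown: Option[ListSemantics] = scala.util.Try(ListSemantics(42: Short)).toOption  // None for an unassigned code
```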
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @native()
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable])
- final macro def findValues: IndexedSeq[ListSemantics]
- Attributes
- protected
- Definition Classes
- ShortEnum
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- def productElementName(n: Int): String
- Definition Classes
- Product
- def productElementNames: Iterator[String]
- Definition Classes
- Product
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- lazy val values: IndexedSeq[ListSemantics]
Collection of all known valid list semantics.
- Definition Classes
- ListSemantics → ValueEnum
- final lazy val valuesToEntriesMap: Map[Short, ListSemantics]
- Definition Classes
- ValueEnum
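As a small sketch of how these two members relate: both reflect the same set of entries, with valuesToEntriesMap keyed by the Short code. Only the formatting in the println is illustrative.

```scala
import org.cert.netsa.io.ipfix.ListSemantics  // assumed package

// values lists every known entry.
val known: IndexedSeq[ListSemantics] = ListSemantics.values

// valuesToEntriesMap pairs each Short code with its entry.
ListSemantics.valuesToEntriesMap.foreach { case (code, semantic) =>
  println(s"code $code -> $semantic")
}
```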
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
- def withName(name: String): ListSemantics
The IPFIX list semantic type identified by the name name.
- Exceptions thrown
java.util.NoSuchElementException
if the name represents no known list semantic.
- def withNameOpt(name: String): Option[ListSemantics]
Finds the IPFIX IE list semantic value whose name is name, as an Option. The function searches linearly through the list of list semantics.
- returns
The list semantic represented by name.
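A brief sketch of name-based lookup; the exact spelling of the accepted names (for example whether "Ordered" or the IANA form "ordered" is expected) is an assumption here, not documented above.

```scala
import org.cert.netsa.io.ipfix.ListSemantics  // assumed package

// withName throws java.util.NoSuchElementException on an unknown name.
val ordered: ListSemantics = ListSemantics.withName("Ordered")  // name spelling assumed

// withNameOpt returns None instead of throwing, which suits untrusted input.
ListSemantics.withNameOpt("notASemantic") match {
  case Some(s) => println(s"known list semantic: $s")
  case None    => println("unknown list semantic name")
}
```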
- def withValue(i: Short): ListSemantics
- Definition Classes
- ValueEnum
- Annotations
- @SuppressWarnings()
- def withValueEither(i: Short): Either[NoSuchMember[Short, ValueEnumEntry[Short]], ListSemantics]
- Definition Classes
- ValueEnum
- def withValueOpt(i: Short): Option[ListSemantics]
- Definition Classes
- ValueEnum
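These three ValueEnum lookups differ only in how an unknown code is reported; a minimal sketch (the specific codes are assumptions based on the IANA registry):

```scala
import org.cert.netsa.io.ipfix.ListSemantics  // assumed package

val strict: ListSemantics = ListSemantics.withValue(4: Short)               // throws if the code were unknown
val lenient: Option[ListSemantics] = ListSemantics.withValueOpt(99: Short)  // None for an unassigned code
val explained = ListSemantics.withValueEither(99: Short)                    // Left(NoSuchMember(...)) for an unassigned code
```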
- case object AllOf extends ListSemantics with Product with Serializable
IPFIX list semantic value denoting that all of the elements in the list are properties of the data record.
- case object ExactlyOneOf extends ListSemantics with Product with Serializable
IPFIX list semantic value denoting that only a single element in the list is a property of the data record.
- case object NoneOf extends ListSemantics with Product with Serializable
IPFIX list semantic value denoting that none of the elements are properties of the data record.
- case object OneOrMoreOf extends ListSemantics with Product with Serializable
IPFIX list semantic value denoting that one or more elements in the list are properties of the data record.
- case object Ordered extends ListSemantics with Product with Serializable
IPFIX list semantic value denoting elements in the list are ordered.
- case object Undefined extends ListSemantics with Product with Serializable
IPFIX list semantic value denoting that the list semantics are not specified.
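For example, code that has already decoded the semantics of a list might branch on these case objects roughly as follows (a sketch only; any surrounding decoding API is out of scope here):

```scala
import org.cert.netsa.io.ipfix.ListSemantics  // assumed package

def describe(semantics: ListSemantics): String = semantics match {
  case ListSemantics.NoneOf       => "no element in the list applies to the record"
  case ListSemantics.ExactlyOneOf => "exactly one element applies"
  case ListSemantics.OneOrMoreOf  => "at least one element applies"
  case ListSemantics.AllOf        => "every element applies"
  case ListSemantics.Ordered      => "elements apply in a significant order"
  case ListSemantics.Undefined    => "the exporter did not specify the semantics"
}
```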
This is documentation for Mothra, a collection of Scala and Spark library functions for working with Internet-related data. Some modules contain APIs of general use to Scala programmers. Some modules make those tools more useful on Spark data-processing systems.
Please see the documentation for the individual packages for more details on their use.
Scala Packages
These packages are useful in Scala code without involving Spark:
org.cert.netsa.data
This package, which is collected as the netsa-data library, provides types for working with various kinds of information:
org.cert.netsa.data.net
- types for working with network data
org.cert.netsa.data.time
- types for working with time data
org.cert.netsa.data.unsigned
- types for working with unsigned integral values
org.cert.netsa.io.ipfix
The netsa-io-ipfix library provides tools for reading and writing IETF IPFIX data from various connections and files.
org.cert.netsa.io.silk
To read and write CERT NetSA SiLK file formats and configuration files, use the netsa-io-silk library.
org.cert.netsa.util
The "junk drawer" of netsa-util so far provides only two features: first, a method for equipping Scala scala.collection.Iterators with exception handling; and second, a way to query the versions of NetSA libraries present in a JVM at runtime.
Spark Packages
These packages require the use of Apache Spark:
org.cert.netsa.mothra.datasources
Spark datasources for CERT file types. This package contains utility features which add methods to Apache Spark DataFrameReader objects, allowing IPFIX and SiLK flows to be opened using simple spark.read... calls. The mothra-datasources library contains both IPFIX and SiLK functionality, while mothra-datasources-ipfix and mothra-datasources-silk contain only what's needed for the named datasource.
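As a hedged sketch of what such a call might look like, assuming the package's implicits add an ipfix method to DataFrameReader; the import, method name, and path below are illustrative assumptions rather than documented API:

```scala
import org.apache.spark.sql.SparkSession
import org.cert.netsa.mothra.datasources._  // assumed to bring the DataFrameReader extensions into scope

val spark = SparkSession.builder().getOrCreate()

// Assumed extension method; consult the datasources package documentation for the real name.
val flows = spark.read.ipfix("hdfs:///data/ipfix")  // path is illustrative
flows.printSchema()
```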
org.cert.netsa.mothra.analysis
A grab-bag of analysis helper functions and example analyses.
org.cert.netsa.mothra.functions
This single Scala object provides Spark SQL functions for working with network data. It is the entirety of the mothra-functions library.