sealed abstract class IpfixSet extends AnyRef
The IpfixSet class represents an IPFIX Set. (The name is "IpfixSet" to avoid conflicts with the standard Scala Set).
A Set is a generic term for a collection of records that have a similar structure. A Set consists of a 4-byte Set Header and one or more records.
There are three types of Sets: (1) Data Sets contain IPFIX flow records and are represented by the RecordSet class; (2) Template Sets and (3) Option Template Sets contain templates (or schemas) that describe the representation of the data in Data Sets, and both are represented by the TemplateSet class.
To create a Set from IPFIX data stored in a ByteBuffer that was read as part of a message, use
val set = IpfixSet.fromBuffer(buffer, message)
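For example, a minimal, illustrative sketch (it assumes buffer is a java.nio.ByteBuffer positioned at the start of a Set and message is the enclosing message that has already been read; only fromBuffer, recordIterator, templateIterator, id, and size are taken from this page):

val set = IpfixSet.fromBuffer(buffer, message)

// A Data Set (RecordSet) yields its Data Records here; a TemplateSet yields
// an empty iterator.
set.recordIterator.foreach(record => println(record))

// A Template Set (TemplateSet) yields its Templates here; a RecordSet yields
// an empty iterator.
set.templateIterator.foreach(template => println(template))

println(s"Set ${set.id} contains ${set.size} item(s)")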
- Since
  1.3.1. This class was previously called Set.
- See also
  The companion object for more details.
Abstract Value Members
- abstract def size: Int
Returns the number of items (Data Records or Template Records) in the Set.
- abstract def toBuffer(buffer: ByteBuffer): ByteBuffer
Appends this IpfixSet to a buffer for writing to an IPFIX stream.
- Exceptions thrown
java.nio.BufferOverflowException
if there is not enough room in the buffer for all items in the Set. The buffer's position when an error is thrown is unknown.
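A hedged sketch of one way a caller might deal with this (the initial buffer size and the retry strategy below are illustrative assumptions, not part of the API):

import java.nio.{BufferOverflowException, ByteBuffer}

// Serialize a Set, retrying with a larger buffer on overflow. Because the
// buffer's position after an overflow is unspecified, the partially written
// buffer is discarded rather than reused.
def writeSet(set: IpfixSet, initialSize: Int = 1024): ByteBuffer = {
  try {
    set.toBuffer(ByteBuffer.allocate(initialSize))
  } catch {
    case _: BufferOverflowException => writeSet(set, initialSize * 2)
  }
}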
Concrete Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @native()
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable])
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- final val id: Int
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- def recordIterator: Iterator[Record]
Returns an Iterator over the Records in this IpfixSet, returning an empty Iterator if this is a TemplateSet.
- final val session: Session
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def templateIterator: Iterator[Template]
Returns an Iterator over the Templates in this IpfixSet, returning an empty Iterator if this is a RecordSet.
- def toString(): String
Returns a String representation of the Set.
- Definition Classes
- IpfixSet → AnyRef → Any
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
This is documentation for Mothra, a collection of Scala and Spark library functions for working with Internet-related data. Some modules contain APIs of general use to Scala programmers. Some modules make those tools more useful on Spark data-processing systems.
Please see the documentation for the individual packages for more details on their use.
Scala Packages
These packages are useful in Scala code without involving Spark:
org.cert.netsa.data
This package, which is collected as the netsa-data library, provides types for working with various kinds of information:
- org.cert.netsa.data.net - types for working with network data
- org.cert.netsa.data.time - types for working with time data
- org.cert.netsa.data.unsigned - types for working with unsigned integral values
org.cert.netsa.io.ipfix
The netsa-io-ipfix library provides tools for reading and writing IETF IPFIX data from various connections and files.
org.cert.netsa.io.silk
To read and write CERT NetSA SiLK file formats and configuration files, use the netsa-io-silk library.
org.cert.netsa.util
The "junk drawer" of netsa-util so far provides only two features: First, a method for equipping Scala scala.collection.Iterators with exception handling. And second, a way to query the versions of NetSA libraries present in a JVM at runtime.
Spark Packages
These packages require the use of Apache Spark:
org.cert.netsa.mothra.datasources
Spark datasources for CERT file types. This package contains utility features which add methods to Apache Spark DataFrameReader objects, allowing IPFIX and SiLK flows to be opened using simple spark.read... calls. The mothra-datasources library contains both IPFIX and SiLK functionality, while mothra-datasources-ipfix and mothra-datasources-silk contain only what's needed for the named datasource. A brief usage sketch appears at the end of this page.
org.cert.netsa.mothra.analysis
A grab-bag of analysis helper functions and example analyses.
org.cert.netsa.mothra.functions
This single Scala object provides Spark SQL functions for working with network data. It is the entirety of the mothra-functions library.
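As a hypothetical usage sketch for the datasources package described above (the ipfix(...) reader method and the package-object import are assumptions drawn from that description, not verified signatures; check the datasources package documentation for the exact reader names):

import org.apache.spark.sql.SparkSession
import org.cert.netsa.mothra.datasources._   // assumed to enrich DataFrameReader

val spark = SparkSession.builder().appName("ipfix-example").getOrCreate()

// Assumed reader method added by the datasources package; the path is a placeholder.
val flows = spark.read.ipfix("/path/to/ipfix/data")
flows.printSchema()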