class BagWriter extends AnyRef
A writer of binary SiLK Bag files.
To include a header in the Bag file that specifies the type of the key
and counter, call setKeyType()
and/or setCounterType()
prior to
writing the Bag.
This example reads the contents of "example.bag" and writes it to "copy.bag", where the keys are IP addresses:
val in = new java.io.FileInputStream("example.bag")
val out = new java.io.FileOutputStream("copy.bag")
val bag = BagReader.ofInputStream(in) match {
  case BagResult.IPAddressBag(iter) => iter
  case _ => throw new Exception("example.bag does not contain IP addresses")
}
val writer = BagWriter.toOutputStream(out)
bag.keyType.foreach { t => writer.setKeyType(t) }
bag.counterType.foreach { t => writer.setCounterType(t) }
writer.appendIPAddresses(bag)
writer.close()
- See also
the companion object for more details
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- def appendIPAddresses[T <: IPAddress](iter: Iterator[(T, Long)]): Unit
Iterates over the (key, counter) pairs, where each key is an IPAddress, and writes the values to the output stream as a SiLK Bag.
Expects the IPAddresses in the Iterator to be in sorted order (numerically ascending).
Expects all IPAddresses in the Iterator to be of the same size; that is, either all are IPv4Address or all are IPv6Address.
Writes the file's header if it has not been written yet. The type of the key and counter may no longer be changed once this function is called.
This function may be called successfully multiple times as long as the IPAddresses across the various calls are the same size and are in sorted order.
Calls to this function may not be mixed with calls to appendIntegers().
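Because appendIPAddresses() requires keys in numerically ascending order, callers typically sort before writing. A minimal pure-Scala sketch of that ordering requirement, using a hypothetical ipv4ToLong helper (dotted-quad string to its numeric value) as a stand-in for the library's IPAddress ordering:

```scala
// Hypothetical helper: numeric value of an IPv4 dotted-quad string.
// Stands in for the library's IPAddress ordering in this sketch.
def ipv4ToLong(s: String): Long =
  s.split('.').foldLeft(0L)((acc, octet) => (acc << 8) | octet.toLong)

// (key, counter) pairs in arbitrary order; all keys are IPv4 here, so
// the "same size" requirement is satisfied.
val pairs = Seq("10.0.0.2" -> 5L, "10.0.0.1" -> 3L, "192.168.0.1" -> 1L)

// Sort numerically ascending before handing an iterator to the writer:
val sorted = pairs.sortBy { case (ip, _) => ipv4ToLong(ip) }
// writer.appendIPAddresses(...) would then receive keys in this order.
```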
- def appendIntegers(iter: Iterator[(Int, Long)]): Unit
Iterates over the (key, counter) pairs where each key is an Int and writes the values to the output stream as a SiLK Bag.
Expects the Ints in the Iterator to be in sorted order (numerically ascending).
Writes the file's header if it has not been written yet. The type of the key and counter may no longer be changed once this function is called.
This function may be called successfully multiple times as long as the keys across the various calls are in sorted order.
Calls to this function may not be mixed with calls to appendIPAddresses().
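A sketch of preparing input for appendIntegers(): counts may be collected in any order, but the (key, counter) pairs must be sorted by key before writing. The writer value is assumed to come from BagWriter.toOutputStream as in the class example above.

```scala
// Tally some integer keys (e.g. port numbers) in arbitrary order.
val counts = Map(443 -> 9L, 53 -> 4L, 80 -> 7L)

// appendIntegers() expects keys in numerically ascending order,
// so sort before building the iterator.
val sortedPairs = counts.toSeq.sortBy(_._1)

// writer.appendIntegers(sortedPairs.iterator)  // writer: a BagWriter
// A later call must continue the ascending key order (here, keys > 443)
// and must not be mixed with calls to appendIPAddresses().
```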
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @native()
- def close(): Unit
Closes the output stream.
Writes the SiLK file header to the output stream if it has not been written, writes any buffered records, closes the output stream, and releases resources.
- val compressionMethod: CompressionMethod
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable])
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- val out: DataOutputStream
- def setCounterType(counterType: BagDataType): Unit
Sets the type of the counter. The value is written into the output stream's header.
- Exceptions thrown
java.lang.IllegalArgumentException
if called after the file's header has been written.
- def setKeyType(keyType: BagDataType): Unit
Sets the type of the key. The value is written into the output stream's header.
- Exceptions thrown
java.lang.IllegalArgumentException
if called after the file's header has been written.
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
- def wasHeaderWritten: Boolean
Whether the Bag file's header has been written--that is, whether one of the append methods has been called.
- returns
true once appendIntegers() or appendIPAddresses() has been called
This is documentation for Mothra, a collection of Scala and Spark library functions for working with Internet-related data. Some modules contain APIs of general use to Scala programmers. Some modules make those tools more useful on Spark data-processing systems.
Please see the documentation for the individual packages for more details on their use.
Scala Packages
These packages are useful in Scala code without involving Spark:
org.cert.netsa.data
This package, which is collected as the netsa-data library, provides types for working with various kinds of information:
- org.cert.netsa.data.net - types for working with network data
- org.cert.netsa.data.time - types for working with time data
- org.cert.netsa.data.unsigned - types for working with unsigned integral values
org.cert.netsa.io.ipfix
The netsa-io-ipfix library provides tools for reading and writing IETF IPFIX data from various connections and files.
org.cert.netsa.io.silk
To read and write CERT NetSA SiLK file formats and configuration files, use the netsa-io-silk library.
org.cert.netsa.util
The "junk drawer" of netsa-util so far provides only two features: First, a method for equipping Scala scala.collection.Iterators with exception handling. And second, a way to query the versions of NetSA libraries present in a JVM at runtime.
Spark Packages
These packages require the use of Apache Spark:
org.cert.netsa.mothra.datasources
Spark datasources for CERT file types. This package contains utility features which add methods to Apache Spark DataFrameReader objects, allowing IPFIX and SiLK flows to be opened using simple spark.read... calls. The mothra-datasources library contains both IPFIX and SiLK functionality, while mothra-datasources-ipfix and mothra-datasources-silk contain only what's needed for the named datasource.
org.cert.netsa.mothra.analysis
A grab-bag of analysis helper functions and example analyses.
org.cert.netsa.mothra.functions
This single Scala object provides Spark SQL functions for working with network data. It is the entirety of the mothra-functions library.