object RWRecWriter
The RWRecWriter companion object provides factory methods for creating RWRecWriter instances.
Value Members
-   final  def !=(arg0: Any): Boolean
- Definition Classes
 - AnyRef → Any
 
 -   final  def ##: Int
- Definition Classes
 - AnyRef → Any
 
 -   final  def ==(arg0: Any): Boolean
- Definition Classes
 - AnyRef → Any
 
 -   final  def asInstanceOf[T0]: T0
- Definition Classes
 - Any
 
 -    def clone(): AnyRef
- Attributes
 - protected[lang]
 - Definition Classes
 - AnyRef
 - Annotations
 - @throws(classOf[java.lang.CloneNotSupportedException]) @native()
 
 -   final  def eq(arg0: AnyRef): Boolean
- Definition Classes
 - AnyRef
 
 -    def equals(arg0: AnyRef): Boolean
- Definition Classes
 - AnyRef → Any
 
 -    def finalize(): Unit
- Attributes
 - protected[lang]
 - Definition Classes
 - AnyRef
 - Annotations
 - @throws(classOf[java.lang.Throwable])
 
 -    def forOutputStream(s: OutputStream, compressionMethod: CompressionMethod = CompressionMethod.NONE, order: ByteOrder = ByteOrder.nativeOrder()): RWRecWriter
Creates and returns a writer that writes RWRecs as a binary SiLK RWRec stream to the output stream s using the default file format and version. The default format preserves as much information as possible, but note that this may limit compatibility with older versions of SiLK. For finer-grained control over the output format, consider using forOutputStreamFormat or forOutputStreamPrecision. If compressionMethod is provided, compresses output using that method; otherwise output is not compressed. If order is provided, writes data in the specified byte order; otherwise uses native order.

 -    def forOutputStreamFormat(s: OutputStream, fileFormat: FileFormat, recordVersion: Short, compressionMethod: CompressionMethod = CompressionMethod.NONE, order: ByteOrder = ByteOrder.nativeOrder()): RWRecWriter
Creates and returns a writer that writes RWRecs as a binary SiLK RWRec stream to the output stream s using the requested file format and record version, if available. Throws SilkDataFormatException if the format is not supported. If compressionMethod is provided, compresses output using that method; otherwise output is not compressed. If order is provided, writes data in the specified byte order; otherwise uses native order.

 -    def forOutputStreamPrecision(s: OutputStream, timePrecision: ChronoUnit, compressionMethod: CompressionMethod = CompressionMethod.NONE, order: ByteOrder = ByteOrder.nativeOrder()): RWRecWriter
Creates and returns a writer that writes RWRecs as a binary SiLK RWRec stream to the output stream s using the default file format and record version for the requested time precision. If compressionMethod is provided, compresses output using that method; otherwise output is not compressed. If order is provided, writes data in the specified byte order; otherwise uses native order.

 -   final  def getClass(): Class[_ <: AnyRef]
- Definition Classes
 - AnyRef → Any
 - Annotations
 - @native()
 
 -    def hashCode(): Int
- Definition Classes
 - AnyRef → Any
 - Annotations
 - @native()
 
 -   final  def isInstanceOf[T0]: Boolean
- Definition Classes
 - Any
 
 -   final  def ne(arg0: AnyRef): Boolean
- Definition Classes
 - AnyRef
 
 -   final  def notify(): Unit
- Definition Classes
 - AnyRef
 - Annotations
 - @native()
 
 -   final  def notifyAll(): Unit
- Definition Classes
 - AnyRef
 - Annotations
 - @native()
 
 -   final  def synchronized[T0](arg0: => T0): T0
- Definition Classes
 - AnyRef
 
 -    def toString(): String
- Definition Classes
 - AnyRef → Any
 
 -   final  def wait(): Unit
- Definition Classes
 - AnyRef
 - Annotations
 - @throws(classOf[java.lang.InterruptedException])
 
 -   final  def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
 - AnyRef
 - Annotations
 - @throws(classOf[java.lang.InterruptedException])
 
 -   final  def wait(arg0: Long): Unit
- Definition Classes
 - AnyRef
 - Annotations
 - @throws(classOf[java.lang.InterruptedException]) @native()
 
 
Deprecated Value Members
-    def toOutputStream(s: OutputStream, compressionMethod: CompressionMethod = CompressionMethod.NONE): RWRecWriter
- Annotations
 - @deprecated
 - Deprecated
 (Since version netsa-io-silk 1.7.0) Please use the RWRecWriter.forOutputStream family of methods instead for control over the output format.
 
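The factory methods above can be used as in the following sketch. It assumes the netsa-io-silk library is on the classpath and that RWRecWriter lives in org.cert.netsa.io.silk (as the package overview suggests); the file name and the commented alternatives are illustrative, not prescribed by this page.

```scala
// Sketch: obtaining RWRecWriter instances via the factory methods.
// Assumes netsa-io-silk is available; "flows.rw" is an arbitrary example path.
import java.io.FileOutputStream
import java.nio.ByteOrder
import java.time.temporal.ChronoUnit
import org.cert.netsa.io.silk.{CompressionMethod, RWRecWriter}

val out = new FileOutputStream("flows.rw")

// Default file format and version, uncompressed, native byte order:
val writer = RWRecWriter.forOutputStream(out)

// Alternatively, pass the optional parameters explicitly:
// val writer = RWRecWriter.forOutputStream(
//   out,
//   compressionMethod = CompressionMethod.NONE,
//   order = ByteOrder.BIG_ENDIAN)

// Or select the default format for a given time precision:
// val writer = RWRecWriter.forOutputStreamPrecision(out, ChronoUnit.MILLIS)
```

Note that forOutputStreamFormat additionally requires a FileFormat and a record version, and throws SilkDataFormatException when the requested combination is not supported.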
This is documentation for Mothra, a collection of Scala and Spark library functions for working with Internet-related data. Some modules contain APIs of general use to Scala programmers. Some modules make those tools more useful on Spark data-processing systems.
Please see the documentation for the individual packages for more details on their use.
Scala Packages
These packages are useful in Scala code without involving Spark:
org.cert.netsa.data
This package, which is collected as the netsa-data library, provides types for working with various kinds of information:
- org.cert.netsa.data.net - types for working with network data
- org.cert.netsa.data.time - types for working with time data
- org.cert.netsa.data.unsigned - types for working with unsigned integral values

org.cert.netsa.io.ipfix
The netsa-io-ipfix library provides tools for reading and writing IETF IPFIX data from various connections and files.

org.cert.netsa.io.silk
To read and write CERT NetSA SiLK file formats and configuration files, use the netsa-io-silk library.

org.cert.netsa.util
The "junk drawer" of netsa-util so far provides only two features: first, a method for equipping Scala Iterators with exception handling; and second, a way to query the versions of NetSA libraries present in a JVM at runtime.

Spark Packages
These packages require the use of Apache Spark:
org.cert.netsa.mothra.datasources
Spark datasources for CERT file types. This package contains utility features which add methods to Apache Spark DataFrameReader objects, allowing IPFIX and SiLK flows to be opened using simple spark.read... calls. The mothra-datasources library contains both IPFIX and SiLK functionality, while mothra-datasources-ipfix and mothra-datasources-silk contain only what's needed for the named datasource.

org.cert.netsa.mothra.analysis
A grab-bag of analysis helper functions and example analyses.

org.cert.netsa.mothra.functions
This single Scala object provides Spark SQL functions for working with network data. It is the entirety of the mothra-functions library.