package unsigned
A variety of unsigned integral types, and new methods on the built-in integral types for working with them.
Import the implicit conversions from this package to add toUInt
methods and the like to standard Scala types.
Features
The overall pattern for each integral type (UByte, UShort, UInt, ULong) is the following (using UByte as the example):
Unsigned values can be constructed from signed Byte and Int values using UByte(b: Byte) and UByte(i: Int).
x.toUByte, x.toUShort, etc. and x.toByte, x.toShort, etc. methods are included.
All of the expected comparison, arithmetic, and bitwise operations are present. In addition, UByte extends Comparable, is equipped with an Ordering, and is a member of the Integral type class.
UByte.MinValue and UByte.MaxValue are defined.
If you import implicits.ByteUnsignedConversions, then x.toUByte, etc. methods will be available by implicit conversion on Byte values.
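For example, the whole pattern might look as follows. This is a minimal sketch, assuming the org.cert.netsa.data.unsigned package path, the literal import named above, and that the Ordering and Integral instances are resolved implicitly:

    import org.cert.netsa.data.unsigned.UByte
    import org.cert.netsa.data.unsigned.implicits.ByteUnsignedConversions

    val a = UByte(0xF0.toByte)          // 240, constructed from a signed Byte
    val b = UByte(15)                   // 15, constructed from an Int

    val back: Byte = a.toByte           // -16: the same bits, viewed as signed
    assert(a > b)                       // unsigned comparison: 240 > 15
    assert(UByte.MinValue.toInt == 0)
    assert(UByte.MaxValue.toInt == 255)

    // The Ordering and Integral instances support generic code:
    val sorted = Seq(a, b).sorted       // Seq(UByte(15), UByte(240))
    def total[T](xs: Seq[T])(implicit I: Integral[T]): T =
      xs.foldLeft(I.zero)(I.plus)
    val sum = total(Seq(UByte(1), UByte(2)))  // UByte(3)

    // With ByteUnsignedConversions in scope, Byte values gain toUByte:
    val c = (-1: Byte).toUByte          // UByte(255)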
- Note
If you are concerned with efficiency, do not create arrays of unsigned values, as they will be boxed into objects. Instead, create arrays of normal signed values and convert to and from unsigned when getting and setting the values.
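For instance, a sketch of that approach under the same assumptions as above:

    import org.cert.netsa.data.unsigned.UByte

    val raw = new Array[Byte](1024)     // unboxed primitive storage

    def get(i: Int): UByte = UByte(raw(i))               // convert on read
    def set(i: Int, v: UByte): Unit = raw(i) = v.toByte  // convert on write

    set(0, UByte(200))
    assert(get(0).toInt == 200)         // round-trips as unsigned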
Type Members
- final class UByte extends AnyVal with Comparable[UByte] with ScalaNumericAnyConversions
An unsigned 8-bit value, also known as an octet, unsigned byte, or uint8.
- final class UInt extends AnyVal with Comparable[UInt] with ScalaNumericAnyConversions
An unsigned 32-bit value, also known as an unsigned integer, or uint32.
- final class ULong extends AnyVal with Comparable[ULong] with ScalaNumericAnyConversions
An unsigned 64-bit value, also known as an unsigned long or uint64.
- final class UShort extends AnyVal with Comparable[UShort] with ScalaNumericAnyConversions
An unsigned 16-bit value, also known as an unsigned short or uint16.
This is documentation for Mothra, a collection of Scala and Spark library functions for working with Internet-related data. Some modules contain APIs of general use to Scala programmers. Some modules make those tools more useful on Spark data-processing systems.
Please see the documentation for the individual packages for more details on their use.
Scala Packages
These packages are useful in Scala code without involving Spark:
org.cert.netsa.data
This package, which is collected as the netsa-data library, provides types for working with various kinds of information:
- org.cert.netsa.data.net - types for working with network data
- org.cert.netsa.data.time - types for working with time data
- org.cert.netsa.data.unsigned - types for working with unsigned integral values
org.cert.netsa.io.ipfix
The netsa-io-ipfix library provides tools for reading and writing IETF IPFIX data from various connections and files.
org.cert.netsa.io.silk
To read and write CERT NetSA SiLK file formats and configuration files, use the netsa-io-silk library.
org.cert.netsa.util
The "junk drawer" of netsa-util so far provides only two features: first, a method for equipping Scala scala.collection.Iterators with exception handling, and second, a way to query the versions of NetSA libraries present in a JVM at runtime.
Spark Packages
These packages require the use of Apache Spark:
org.cert.netsa.mothra.datasources
Spark datasources for CERT file types. This package contains utility features which add methods to Apache Spark DataFrameReader objects, allowing IPFIX and SiLK flows to be opened using simple spark.read... calls. The mothra-datasources library contains both IPFIX and SiLK functionality, while mothra-datasources-ipfix and mothra-datasources-silk contain only what's needed for the named datasource.
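For example, a hedged sketch of opening IPFIX data; the ipfix reader method and the import shown here are assumptions about how the DataFrameReader extensions are exposed, not confirmed API:

    import org.apache.spark.sql.SparkSession
    import org.cert.netsa.mothra.datasources._  // assumed to enrich spark.read

    val spark = SparkSession.builder().appName("flows").getOrCreate()

    // Hypothetical reader call for IPFIX records under a directory:
    val flows = spark.read.ipfix("hdfs:///data/ipfix")
    flows.printSchema()
    spark.stop()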
org.cert.netsa.mothra.analysis
A grab-bag of analysis helper functions and example analyses.
org.cert.netsa.mothra.functions
This single Scala object provides Spark SQL functions for working with network data. It is the entirety of the mothra-functions library.