sealed trait Constraint extends ConstraintSparkVerImpl

A constraint on the value of a field within a storage partition. For example, a partition with the constraint Constraint.EQ("protocolIdentifier", "6") on it would guarantee that every record in that partition has the value 6 for its protocolIdentifier field.
Constraints are typically matched against the filters of a specific query to determine whether the combination of filters and constraints may produce any matches, or whether the partition can be discarded entirely.
This mechanism is exposed so that function fields may override the constraints used for evaluation.
- Note
This is an experimental interface and is likely to be removed or made private in a future version.
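To make the pruning mechanism concrete, here is a minimal sketch of how an equality constraint could be matched against a query filter. The FilterResult outcomes (Passes, Fails, Maybe) follow the descriptions on this page; the class names and representations below are illustrative assumptions, not the library's actual code.

```scala
// Sketch (assumed names): constraint-vs-filter matching for partition pruning.
sealed trait FilterResult
case object Passes extends FilterResult // every record satisfies the filter
case object Fails  extends FilterResult // no record can satisfy it: skip the partition
case object Maybe  extends FilterResult // cannot decide: the partition must be read

// A hypothetical equality constraint: field == value for every record.
final case class EqConstraint(field: String, value: String) {
  def checkEQ(v: Any): FilterResult =
    if (v == value) Passes else Fails
}

object PruningDemo {
  def main(args: Array[String]): Unit = {
    val c = EqConstraint("protocolIdentifier", "6")
    println(c.checkEQ("6"))  // Passes
    println(c.checkEQ("17")) // Fails: this partition can be discarded
  }
}
```

A Fails result lets the reader skip the partition without scanning it; Maybe forces a full read, so precise check implementations directly reduce I/O.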
Inheritance
- Constraint
- ConstraintSparkVerImpl
- AnyRef
- Any
Abstract Value Members
- abstract def checkEQ(v: Any): FilterResult
Whether a filter F = v can match this constraint
- Definition Classes
- Constraint → ConstraintSparkVerImpl
- abstract def checkGE(v: Any): FilterResult
Whether a filter F >= v can match this constraint
- Definition Classes
- Constraint → ConstraintSparkVerImpl
- abstract def checkGT(v: Any): FilterResult
Whether a filter F > v can match this constraint
- Definition Classes
- Constraint → ConstraintSparkVerImpl
- abstract def checkIsNull: FilterResult
Whether a filter IS_NULL(F) can match this constraint
- Definition Classes
- Constraint → ConstraintSparkVerImpl
- abstract def checkLE(v: Any): FilterResult
Whether a filter F <= v can match this constraint
- Definition Classes
- Constraint → ConstraintSparkVerImpl
- abstract def checkLT(v: Any): FilterResult
Whether a filter F < v can match this constraint
- Definition Classes
- Constraint → ConstraintSparkVerImpl
- abstract val field: String
The name of the field to which this constraint applies
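Taken together, the abstract members above could be implemented for a hypothetical equality constraint roughly as follows. This is a sketch under assumed FilterResult values and numeric handling; the library's actual representations may differ.

```scala
sealed trait FilterResult
case object Passes extends FilterResult
case object Fails  extends FilterResult
case object Maybe  extends FilterResult

// Hypothetical: a constraint guaranteeing `field` == `value` (a Long) for
// every record in the partition. Filter values of other types cannot be
// compared, so those checks answer Maybe rather than wrongly pruning.
final case class EqSketch(field: String, value: Long) {
  private def cmp(v: Any)(decide: Long => Boolean): FilterResult = v match {
    case n: Long => if (decide(n)) Passes else Fails
    case n: Int  => if (decide(n.toLong)) Passes else Fails
    case _       => Maybe
  }
  def checkEQ(v: Any): FilterResult = cmp(v)(value == _)
  def checkGE(v: Any): FilterResult = cmp(v)(value >= _)
  def checkGT(v: Any): FilterResult = cmp(v)(value > _)
  def checkLE(v: Any): FilterResult = cmp(v)(value <= _)
  def checkLT(v: Any): FilterResult = cmp(v)(value < _)
  def checkIsNull: FilterResult = Fails // the field always has a value here
}
```

Note the asymmetry: each check compares the constraint's known value against the filter's bound, e.g. a filter F >= v can match an EQ constraint exactly when value >= v.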
Concrete Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @native()
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable])
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- def hasValueFor(fieldName: String): Boolean
Whether this constraint positively affirms a value for the given field (false for IS_NULL). This is useful for determining whether a field should be considered as ruling out other possible information element sources in a gauntlet.
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- final def matchesFilter(filter: Filter): FilterResult
Whether this constraint matches the given filter, and how (Passes, Fails, Maybe, or Nulls).
- final def matchesFilterImpl(filter: Filter): FilterResult
Whether this constraint matches the given filter, and how (Passes, Fails, Maybe, or Nulls).
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
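The hasValueFor and matchesFilter members above can be modeled as follows. The filter shapes, constraint cases, and result values are assumptions based on this page's descriptions, not the library's actual types.

```scala
object MatchSketch {
  sealed trait FilterResult
  case object Passes extends FilterResult
  case object Fails  extends FilterResult
  case object Maybe  extends FilterResult

  // Hypothetical filter shapes corresponding to the check* members.
  sealed trait Filter
  final case class EqFilter(field: String, v: Any) extends Filter
  final case class IsNullFilter(field: String)     extends Filter

  // Hypothetical constraints: one pinning a concrete value, one asserting null.
  sealed trait ConstraintSketch { def field: String }
  final case class EqC(field: String, value: Any) extends ConstraintSketch
  final case class IsNullC(field: String)         extends ConstraintSketch

  // hasValueFor: true only when the constraint affirms a concrete value
  // for the named field (false for IS_NULL, as documented above).
  def hasValueFor(c: ConstraintSketch, fieldName: String): Boolean = c match {
    case EqC(f, _)  => f == fieldName
    case IsNullC(_) => false
  }

  // matchesFilter: dispatch a filter to the appropriate check.
  def matchesFilter(c: ConstraintSketch, filter: Filter): FilterResult =
    (c, filter) match {
      case (EqC(f, value), EqFilter(g, v)) if f == g =>
        if (value == v) Passes else Fails
      case (IsNullC(f), IsNullFilter(g)) if f == g => Passes
      case (EqC(f, _), IsNullFilter(g)) if f == g  => Fails
      case _ => Maybe // unrelated field or undecidable combination
    }
}
```

The catch-all Maybe case is the conservative default: a constraint on one field says nothing about filters on another, so such partitions must still be read.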
This is documentation for Mothra, a collection of Scala and Spark library functions for working with Internet-related data. Some modules contain APIs of general use to Scala programmers. Some modules make those tools more useful on Spark data-processing systems.
Please see the documentation for the individual packages for more details on their use.
Scala Packages
These packages are useful in Scala code without involving Spark:
- org.cert.netsa.data
  This package, which is collected as the netsa-data library, provides types for working with various kinds of information:
  - org.cert.netsa.data.net - types for working with network data
  - org.cert.netsa.data.time - types for working with time data
  - org.cert.netsa.data.unsigned - types for working with unsigned integral values
- org.cert.netsa.io.ipfix
  The netsa-io-ipfix library provides tools for reading and writing IETF IPFIX data from various connections and files.
- org.cert.netsa.io.silk
  To read and write CERT NetSA SiLK file formats and configuration files, use the netsa-io-silk library.
- org.cert.netsa.util
  The "junk drawer" of netsa-util so far provides only two features: first, a method for equipping Scala scala.collection.Iterators with exception handling; and second, a way to query the versions of NetSA libraries present in a JVM at runtime.
Spark Packages
These packages require the use of Apache Spark:
- org.cert.netsa.mothra.datasources
  Spark datasources for CERT file types. This package contains utility features which add methods to Apache Spark DataFrameReader objects, allowing IPFIX and SiLK flows to be opened using simple spark.read... calls. The mothra-datasources library contains both IPFIX and SiLK functionality, while mothra-datasources-ipfix and mothra-datasources-silk contain only what's needed for the named datasource.
- org.cert.netsa.mothra.analysis
  A grab-bag of analysis helper functions and example analyses.
- org.cert.netsa.mothra.functions
  This single Scala object provides Spark SQL functions for working with network data. It is the entirety of the mothra-functions library.