Class

toolkit.neuralnetwork.function

CrossEntropySoftmaxes


case class CrossEntropySoftmaxes(left: DifferentiableField, right: DifferentiableField, refInputIsPDF: Boolean = true, safeMode: Boolean = true) extends DifferentiableField with Product with Serializable

The cross-entropy loss function applied to the softmax of the input relative to the reference signal. This loss function is commonly used for training a classification network. Unlike the similarly-named "CrossEntropySoftMax", this class computes a cross-entropy softmax individually for each image representation of the batch. As such, its output is not a single scalar, but instead a vector of length batchSize. This allows the class to be tested by the existing test infrastructure.
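
For reference, a sketch of the per-batch-element computation that the name describes, assuming left holds the logits x_b and right holds the reference distribution r_b for batch element b (the toolkit's exact kernel is not reproduced here):

  % Cross-entropy of the softmax, computed separately for each batch element.
  \[
    \mathrm{loss}_b = -\sum_{c} r_{b,c}\,
        \log\!\left( \frac{e^{x_{b,c}}}{\sum_{c'} e^{x_{b,c'}}} \right),
    \qquad b = 1,\dots,\text{batchSize}
  \]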

left

The input signal, typically a classification output

right

The reference signal, typically a one-hot code representing a class label

refInputIsPDF

If true, the right reference input for each element of the batch sums to 1 (i.e. it is a probability distribution).

safeMode

If true, protect against generating NaNs for large inputs (>100).
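
A minimal construction sketch, assuming logits and labels are existing DifferentiableField instances (hypothetical names) and that DifferentiableField lives in toolkit.neuralnetwork (adjust the import to your build):

  import toolkit.neuralnetwork.DifferentiableField
  import toolkit.neuralnetwork.function.CrossEntropySoftmaxes

  // Hypothetical inputs: `logits` is the network's classification output and
  // `labels` is the one-hot reference signal, one code per batch element.
  val logits: DifferentiableField = ???   // e.g. output of the final layer
  val labels: DifferentiableField = ???   // one-hot class labels

  // One loss value per batch element (a vector of length batchSize),
  // using the defaults refInputIsPDF = true and safeMode = true.
  val loss = CrossEntropySoftmaxes(logits, labels)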

Linear Supertypes
Serializable, Serializable, Product, Equals, DifferentiableField, GradientPropagation, DifferentiableFieldOps, BasicOps, AnyRef, Any

Instance Constructors

  1. new CrossEntropySoftmaxes(left: DifferentiableField, right: DifferentiableField, refInputIsPDF: Boolean = true, safeMode: Boolean = true)

    left

    The input signal, typically a classification output

    right

    The reference signal, typically a one-hot code representing a class label

    refInputIsPDF

    If true, the right reference input for each element of the batch sums to 1 (i.e. it is a probability distribution).

    safeMode

    If true, protect against generating NaNs for large inputs (>100).

Value Members

  1. final def !=(arg0: Any): Boolean
     Definition Classes: AnyRef → Any
  2. final def ##(): Int
     Definition Classes: AnyRef → Any
  3. def *(that: Float): DifferentiableField
     Definition Classes: DifferentiableFieldOps
  4. def *(that: DifferentiableField): DifferentiableField
     Definition Classes: DifferentiableFieldOps
  5. def +(that: Float): DifferentiableField
     Definition Classes: DifferentiableFieldOps
  6. def +(that: DifferentiableField): DifferentiableField
     Definition Classes: DifferentiableFieldOps
  7. def -(that: Float): DifferentiableField
     Definition Classes: DifferentiableFieldOps
  8. def -(that: DifferentiableField): DifferentiableField
     Definition Classes: DifferentiableFieldOps
  9. def /(that: Float): DifferentiableField
     Definition Classes: DifferentiableFieldOps
  10. def /(that: DifferentiableField): DifferentiableField
      Definition Classes: DifferentiableFieldOps
  11. final def ==(arg0: Any): Boolean
      Definition Classes: AnyRef → Any
  12. def activateSGD(initField: libcog.Field = ScalarField(1f), invokeCallbacks: Boolean = true): Unit
      Definition Classes: GradientPropagation
  13. def add(input: DifferentiableField, c: Float): DifferentiableField
      Definition Classes: BasicOps
  14. def add(left: DifferentiableField, right: DifferentiableField): DifferentiableField
      Definition Classes: BasicOps
  15. final def asInstanceOf[T0]: T0
      Definition Classes: Any
  16. var backward: Option[libcog.Field]
      Definition Classes: DifferentiableField
  17. def backwardCallback(back: libcog.Field): Unit
      Definition Classes: DifferentiableField
  18. val batchSize: Int
  19. def clone(): AnyRef
      Attributes: protected[java.lang]
      Definition Classes: AnyRef
      Annotations: @throws( ... )
  20. def divide(input: DifferentiableField, c: Float): DifferentiableField
      Definition Classes: BasicOps
  21. def divide(left: DifferentiableField, right: DifferentiableField): DifferentiableField
      Definition Classes: BasicOps
  22. final def eq(arg0: AnyRef): Boolean
      Definition Classes: AnyRef
  23. def finalize(): Unit
      Attributes: protected[java.lang]
      Definition Classes: AnyRef
      Annotations: @throws( classOf[java.lang.Throwable] )
  24. val forward: libcog.Field
  25. final def getClass(): Class[_]
      Definition Classes: AnyRef → Any
  26. var gradientBinding: Option[GradientBinding]
      Definition Classes: DifferentiableField
  27. val gradientConsumer: Boolean
      Definition Classes: DifferentiableField
  28. val inputs: Map[Symbol, GradientPort]
  29. final def isInstanceOf[T0]: Boolean
      Definition Classes: Any
  30. val left: DifferentiableField

      The input signal, typically a classification output

  31. def multiply(input: DifferentiableField, c: Float): DifferentiableField
      Definition Classes: BasicOps
  32. def multiply(left: DifferentiableField, right: DifferentiableField): DifferentiableField
      Definition Classes: BasicOps
  33. final def ne(arg0: AnyRef): Boolean
      Definition Classes: AnyRef
  34. final def notify(): Unit
      Definition Classes: AnyRef
  35. final def notifyAll(): Unit
      Definition Classes: AnyRef
  36. def pow(input: DifferentiableField, n: Float): DifferentiableField

      Raise a node to a fixed power. Cog has two pow() function signatures corresponding to both integer and non-integer powers. The integer case is detected here and special-cased (instead of having a separate PowN node for this).

      If the power n is anything other than a positive integer, make sure the inputs are always positive or NaNs will result (see the plain-Scala sketch after this member list).

      input

      the input signal

      n

      the power to raise the input to

      Definition Classes: BasicOps
  37. val refInputIsPDF: Boolean

      If true, the right reference input for each element of the batch sums to 1 (i.e. it is a probability distribution).

  38. val right: DifferentiableField

      The reference signal, typically a one-hot code representing a class label

  39. val safeMode: Boolean

      If true, protect against generating NaNs for large inputs (>100); see the softmax overflow sketch after this member list.

  40. def subtract(input: DifferentiableField, c: Float): DifferentiableField
      Definition Classes: BasicOps
  41. def subtract(left: DifferentiableField, right: DifferentiableField): DifferentiableField
      Definition Classes: BasicOps
  42. final def synchronized[T0](arg0: ⇒ T0): T0
      Definition Classes: AnyRef
  43. def totalDerivative(): libcog.Field
      Definition Classes: GradientPropagation
  44. final def wait(): Unit
      Definition Classes: AnyRef
      Annotations: @throws( ... )
  45. final def wait(arg0: Long, arg1: Int): Unit
      Definition Classes: AnyRef
      Annotations: @throws( ... )
  46. final def wait(arg0: Long): Unit
      Definition Classes: AnyRef
      Annotations: @throws( ... )
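
As referenced from pow (member 36) above: a plain-Scala illustration of the NaN hazard with non-integer powers. This uses only the JVM's floating-point pow, not the toolkit's pow node.

  // Positive bases are safe for any power; negative bases are only safe for
  // integer powers. A negative base with a fractional power yields NaN.
  val ok     = math.pow(4.0, 2.5)    // 32.0
  val intPow = math.pow(-4.0, 2.0)   // 16.0: integer power, any sign is fine
  val bad    = math.pow(-4.0, 2.5)   // NaN: negative base, fractional power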

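As referenced from safeMode (member 39) above: a plain-Scala sketch of why a naive softmax produces NaNs for large inputs, and the standard max-subtraction rewrite that avoids it. This is only background for the flag; the toolkit's actual guard is not reproduced here.

  // Naive softmax: exp overflows to Infinity for large inputs, and
  // Infinity / Infinity is NaN.
  def naiveSoftmax(x: Seq[Double]): Seq[Double] = {
    val e = x.map(math.exp)
    e.map(_ / e.sum)
  }

  // Numerically safe variant: subtracting the maximum leaves the result
  // unchanged mathematically but keeps exp's argument at or below zero.
  def safeSoftmax(x: Seq[Double]): Seq[Double] = {
    val m = x.max
    val e = x.map(v => math.exp(v - m))
    e.map(_ / e.sum)
  }

  naiveSoftmax(Seq(1000.0, 1001.0))  // List(NaN, NaN)
  safeSoftmax(Seq(1000.0, 1001.0))   // List(~0.269, ~0.731)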