Apache Daffodil (incubating) Scala API

Packages

  • package root

    org.apache.daffodil.sapi - Provides the classes necessary to compile DFDL schemas, parse and unparse files using the compiled objects, and retrieve results and parsing diagnostics

    org.apache.daffodil.sapi.logger - Provides the classes necessary to receive logging messages from Daffodil.

    org.apache.daffodil.sapi.debugger - Provides the classes necessary to perform parse tracing or create a custom debugger

    Definition Classes
    root
  • package org
    Definition Classes
    root
  • package apache
    Definition Classes
    org
  • package daffodil
    Definition Classes
    apache
  • package sapi

    Provides the classes necessary to compile DFDL schemas, parse and unparse files using the compiled objects, and retrieve results and parsing diagnostics

    Overview

    The Daffodil object is a factory object used to create a Compiler. The Compiler provides a method to compile a provided DFDL schema into a ProcessorFactory, which in turn creates a DataProcessor:

    val c = Daffodil.compiler()
    val pf = c.compileFile(file)
    val dp = pf.onPath("/")

    The DataProcessor provides the necessary functions to parse and unparse data, returning a ParseResult or UnparseResult, respectively. These contain information about the parse/unparse, such as whether or not the processing succeeded and any diagnostic information.

    Parse

    The DataProcessor.parse method accepts input data to parse in the form of an InputSourceDataInputStream and an InfosetOutputter to determine the output representation of the infoset (e.g. Scala XML Nodes, JDOM2 Documents, etc.):

    val scalaOutputter = new ScalaXMLInfosetOutputter()
    val is = new InputSourceDataInputStream(data)
    val pr = dp.parse(is, scalaOutputter)
    val node = scalaOutputter.getResult

    The DataProcessor.parse method is thread-safe and may be called multiple times without the need to create other data processors. However, InfosetOutputters are not thread-safe, so a unique instance is required per thread. An InfosetOutputter should call InfosetOutputter.reset before reuse (or a new one should be allocated). For example:

    val scalaOutputter = new ScalaXMLInfosetOutputter()
    files.foreach { f =>
      scalaOutputter.reset()
      val is = new InputSourceDataInputStream(new FileInputStream(f))
      val pr = dp.parse(is, scalaOutputter)
      val node = scalaOutputter.getResult
    }

    One can repeat calls to parse() using the same InputSourceDataInputStream to continue parsing where the previous parse ended. For example:

    val is = new InputSourceDataInputStream(dataStream)
    val scalaOutputter = new ScalaXMLInfosetOutputter()
    var keepParsing = true
    while (keepParsing) {
      scalaOutputter.reset()
      val pr = dp.parse(is, scalaOutputter)
      ...
      keepParsing = !pr.location().isAtEnd() && !pr.isError()
    }
    Unparse

    The same DataProcessor used for parse can be used to unparse an infoset via the DataProcessor.unparse method. An InfosetInputter provides the infoset to unparse, with the unparsed data written to the provided java.nio.channels.WritableByteChannel. For example:

    val inputter = new ScalaXMLInfosetInputter(node)
    val ur = dp.unparse(inputter, wbc)
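
    Putting the two steps together, a minimal parse/unparse round trip might look like the following sketch (assuming is is an existing InputSourceDataInputStream and wbc a java.nio.channels.WritableByteChannel, as above):

    // Parse data into a Scala XML infoset, then unparse that infoset back out.
    val outputter = new ScalaXMLInfosetOutputter()
    val pr = dp.parse(is, outputter)
    if (!pr.isError()) {
      val inputter = new ScalaXMLInfosetInputter(outputter.getResult)
      val ur = dp.unparse(inputter, wbc)
    }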
    Failures and Diagnostics

    It is possible that failures occur during the creation of the ProcessorFactory, DataProcessor, or ParseResult. However, rather than throwing an exception on error (e.g. invalid DFDL schema, parse error, etc.), these classes extend WithDiagnostics, which is used to determine if an error occurred and to retrieve any diagnostic information (see Diagnostic) related to that step. Thus, before continuing, one must check WithDiagnostics.isError. For example:

    val pf = c.compileFile(file)
    if (pf.isError()) {
      val diags = pf.getDiagnostics()
      diags.foreach { d =>
        System.out.println(d.toString())
      }
      return -1
    }
    Saving and Reloading Parsers

    In some cases, it may be beneficial to save a parser and reload it. For example, when starting up, it may be quicker to reload an already compiled parser than to compile it from scratch. To save a DataProcessor:

    val dp = pf.onPath("/")
    dp.save(saveFile)

    And to restore a saved DataProcessor:

    val dp = c.reload(saveFile)
    val pr = dp.parse(is, scalaOutputter)
    Definition Classes
    daffodil
  • package debugger

    Provides the classes necessary to perform parse tracing or create a custom debugger

    Overview

    Daffodil comes with one prebuilt debugger, the TraceDebuggerRunner, which outputs verbose information during parsing that can aid in debugging a DFDL schema. For example, the TraceDebuggerRunner can be used like so:

    val tdr = new TraceDebuggerRunner()
    Daffodil.setDebugger(tdr)

    Additionally, one may create a custom debugger runner by implementing the methods in DebuggerRunner.

    Once the debugger is set, it must then be turned on, like so:

    Daffodil.setDebugging(true)
    Definition Classes
    sapi
  • package infoset

    Defines various classes used to control the representation of the infoset for parse and unparse. Classes that extend InfosetOutputter are provided to the DataProcessor.parse method to determine how to output an infoset. These classes are not guaranteed to be thread-safe. Classes that extend InfosetInputter are provided to the DataProcessor.unparse method to determine how to read in an infoset. A new InfosetInputter is required for each call to unparse().
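
    A sketch of this pattern, reusing the names from the parse and unparse examples in the sapi overview (dp, is, and wbc are assumed to already exist):

    // One InfosetOutputter per thread, reset before each parse ...
    val outputter = new ScalaXMLInfosetOutputter()
    outputter.reset()
    val pr = dp.parse(is, outputter)

    // ... but a fresh InfosetInputter for every call to unparse().
    val inputter = new ScalaXMLInfosetInputter(outputter.getResult)
    val ur = dp.unparse(inputter, wbc)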

    Definition Classes
    sapi
  • package io
    Definition Classes
    sapi
  • package logger

    Provides the classes necessary to receive logging messages from Daffodil.

    Overview

    Daffodil comes with three prebuilt log writers: ConsoleLogWriter, FileLogWriter, and NullLogWriter.

    To use one of these log writers, create an instance of it and pass it to Daffodil.setLogWriter. For example, to write all logs to /var/log/daffodil.log:

    val lw = new FileLogWriter(new File("/var/log/daffodil.log"))
    Daffodil.setLogWriter(lw)

    One may also change the log level using Daffodil.setLoggingLevel, which defaults to LogLevel.Info if not set. For example, to change the log level to LogLevel.Warning:

    Daffodil.setLoggingLevel(LogLevel.Warning)
    Definition Classes
    sapi
  • Compiler
  • Daffodil
  • DataLocation
  • DataProcessor
  • Diagnostic
  • InvalidParserException
  • InvalidUsageException
  • LocationInSchemaFile
  • ParseResult
  • ProcessorFactory
  • UnparseResult
  • ValidationMode
  • WithDiagnostics

class Compiler extends AnyRef

Compile DFDL schemas into a ProcessorFactory, or reload saved parsers into a DataProcessor.

Linear Supertypes
AnyRef, Any
Ordering
  1. Alphabetic
  2. By Inheritance
Inherited
  1. Compiler
  2. AnyRef
  3. Any
  1. Hide All
  2. Show All
Visibility
  1. Public
  2. All

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @native() @throws( ... )
  6. def compileFile(schemaFile: File): ProcessorFactory

    Compile DFDL schema file into a ProcessorFactory

    To allow jar-file packaging (where schema files might be part of a jar), it is recommended to use Compiler.compileSource instead.

    schemaFile

    DFDL schema file used to create a ProcessorFactory.

    returns

    ProcessorFactory used to create DataProcessor(s). Must check ProcessorFactory.isError before using it.

    Annotations
    @throws( classOf[java.io.IOException] )
  7. def compileSource(uri: URI): ProcessorFactory

    Compile DFDL schema source into a ProcessorFactory

    uri

    URI of DFDL schema file used to create a ProcessorFactory.

    returns

    ProcessorFactory used to create DataProcessor(s). Must check ProcessorFactory.isError before using it.

    Annotations
    @throws( classOf[java.io.IOException] )
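
    For example, a schema packaged as a jar resource can be compiled via its URI (a sketch; the resource path is hypothetical):

    // Look up a schema on the classpath and compile it from its URI.
    val schemaUri = getClass.getResource("/com/example/format.dfdl.xsd").toURI
    val pf = c.compileSource(schemaUri)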
  8. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  12. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  13. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  14. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  15. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  16. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  17. def reload(savedParser: ReadableByteChannel): DataProcessor

    Reload a saved parser from a java.nio.channels.ReadableByteChannel

    savedParser

    java.nio.channels.ReadableByteChannel of a saved parser, created with DataProcessor.save

    returns

    DataProcessor used to parse data. Must check DataProcessor.isError before using it.

    Exceptions thrown

    InvalidParserException if the file is not a valid saved parser.
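    A sketch of reloading a parser saved as a jar resource (the resource path is hypothetical):

    // Wrap a classpath resource stream in a ReadableByteChannel and reload it.
    val stream = getClass.getResourceAsStream("/com/example/savedParser.bin")
    val channel = java.nio.channels.Channels.newChannel(stream)
    val dp = c.reload(channel)
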

  18. def reload(savedParser: File): DataProcessor

    Reload a saved parser from a file

    To allow jar-file packaging (where the savedParser might be part of a jar), it is recommended to use the other version of Compiler.reload, whose argument is a java.nio.channels.ReadableByteChannel for a saved parser.

    savedParser

    file of a saved parser, created with DataProcessor.save

    returns

    DataProcessor used to parse data. Must check DataProcessor.isError before using it.

    Exceptions thrown

    InvalidParserException if the file is not a valid saved parser.

  19. def setDistinguishedRootNode(name: String, namespace: String): Unit

    Specify a global element to be the root of the DFDL Schema to start parsing

    name

    name of the root node

    namespace

    namespace of the root node. Set to the empty string to specify no namespace. Set to null to have Daffodil infer the namespace.
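
    For example, to start parsing at a hypothetical global element named "record" that has no namespace:

    c.setDistinguishedRootNode("record", "")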

  20. def setExternalDFDLVariable(name: String, namespace: String, value: String): Unit

    Set the value of a DFDL variable

    name

    name of the variable

    namespace

    namespace of the variable. Set to the empty string to specify no namespace. Set to null to have Daffodil infer the namespace.

    value

    value to set the variable to

  21. def setExternalDFDLVariables(extVarsFile: File): Unit

    Read external variables from a Daffodil configuration file

    extVarsFile

    file to read DFDL variables from.

    See also

    Daffodil Configuration File - Daffodil configuration file format

  22. def setExternalDFDLVariables(extVarsMap: Map[String, String]): Unit

    Set the value of multiple DFDL variables

    extVarsMap

    a map of key/value pairs, where the key is the variable name and the value is the value of the variable. The key may be preceded by a string of the form "{namespace}" to define a namespace for the variable. If preceded by "{}", then no namespace is used. When not preceded by "{namespace}", Daffodil will infer the namespace.
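
    A sketch of the key format (the variable names and namespace URI are hypothetical):

    c.setExternalDFDLVariables(Map(
      "{http://example.com/ns}var1" -> "value1", // explicit namespace
      "{}var2" -> "value2",                      // explicitly no namespace
      "var3" -> "value3"                         // namespace inferred by Daffodil
    ))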

  23. def setTunable(tunable: String, value: String): Unit

    Set a Daffodil tunable parameter

    tunable

    name of the tunable parameter to set.

    value

    value of the tunable parameter to set

    See also

    Tunable Parameters - list of tunable names and default values

  24. def setTunables(tunables: Map[String, String]): Unit

    Set the value of multiple tunable parameters

    tunables

    a map of key/value pairs, where the key is the tunable name and the value is the value to set it to

    See also

    Tunable Parameters - list of tunable names and default values

  25. def setValidateDFDLSchemas(value: Boolean): Unit

    Enable/disable DFDL validation of resulting infoset with the DFDL schema

    value

    true to enable validation, false to disable

  26. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  27. def toString(): String
    Definition Classes
    AnyRef → Any
  28. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  29. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  30. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @throws( ... )

Inherited from AnyRef

Inherited from Any

Ungrouped