Data Flow Facilitator for Machine Learning (dffml) v0.2.1 Release Notes

Release Date: 2019-06-07
  • [0.2.1] - 2019-06-07

    ➕ Added

    • Definition spec field to specify a class that represents the key value pairs of
      definitions whose primitive type is a dictionary
    • 📚 Auto-generation of documentation for operation implementations, models, and
      sources. Generated docs include information on configuration options and
      inputs and outputs for operation implementations.
    • Async helpers got an aenter_stack method which creates and returns a
      contextlib.AsyncExitStack after entering all of the contexts passed to it
      (see the sketch after this list).
    • Example of how to use the Data Flow Facilitator / Orchestrator / Operations by
      writing a Python meta static analysis tool, shouldi
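
    A simplified sketch of the aenter_stack idea, using only the standard
    library (a hypothetical illustration, not dffml's actual helper or its real
    signature): each async context manager is entered on a shared
    contextlib.AsyncExitStack, and the stack is returned so the caller can
    unwind everything at once.

        import asyncio
        import contextlib


        async def aenter_stack(contexts):
            # Hypothetical helper: enter each async context manager on one
            # shared AsyncExitStack and hand the stack back to the caller.
            stack = contextlib.AsyncExitStack()
            for ctx in contexts:
                await stack.enter_async_context(ctx)
            return stack


        class Example:
            async def __aenter__(self):
                print("entered")
                return self

            async def __aexit__(self, exc_type, exc, tb):
                print("exited")


        async def main():
            stack = await aenter_stack([Example(), Example()])
            # ... use the entered contexts here ...
            await stack.aclose()  # exits both contexts in reverse order


        asyncio.run(main())

    Exiting (or closing) the returned stack exits every context entered on it,
    in reverse order, which is what makes this pattern convenient for tearing
    down many async resources together.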

    🔄 Changed

    • OperationImplementation add_label and add_orig_label methods now use op.name
      instead of ENTRY_POINT_ORIG_LABEL and ENTRY_POINT_NAME.
    • 👉 Make output specs and remap arguments optional for Operations CLI commands.
    • 🔋 The feature skeleton project is now the operations skeleton project

    🛠 Fixed

    • MemoryOperationImplementationNetwork instantiates OperationImplementations
      using their withconfig() method.
    • MemorySource is now decorated with entry_point
    • MemorySource takes arguments correctly via config_set and config_get
    • skel modules have long_description_content_type set to "text/markdown"
    • Base Orchestrator __aenter__ and __aexit__ methods were moved to the
      Memory Orchestrator because they are specific to that config.
    • Async helper aenter_stack uses inspect.isfunction so that it also binds
      lambdas (see the sketch below)
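
    To see why inspect.isfunction matters for that last fix, the short sketch
    below (standard library only, not dffml's actual code) shows that
    inspect.isfunction returns True for lambdas as well as for ordinary
    functions, so a check based on it lets lambdas be bound to an instance in
    the same way.

        import inspect
        import types


        def regular(self):
            return "regular"


        as_lambda = lambda self: "lambda"


        class Holder:
            pass


        holder = Holder()

        for func in (regular, as_lambda):
            # True for both the def-defined function and the lambda, so both
            # can be bound to the instance the same way.
            assert inspect.isfunction(func)
            bound = types.MethodType(func, holder)
            print(bound())  # prints "regular", then "lambda"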