Release 3.2.1 (01/15/2020)
=============================
- DM GUI Changes:
- Reworked configuration options for DAQs and uploads; those now support
including/excluding file extensions
- Fixed bug that prevented using configuration options and additional
keywords at the same time
Release 3.2.0 (11/05/2019)
=============================
- Implemented ability to include or exclude file extensions for uploads and DAQs
- New API Options for starting DAQ/Upload
* includeFileExtensions
* excludeFileExtensions
- New CLI Options for starting DAQ/Upload
* include-extensions
* exclude-extensions
- Added new ESAF interfaces for support of extended data retrieval
- Enabled URL scheme for transfer plugin configuration; this allows
multiple gridftp hosts per deployment
- Updated support software to allow offsite installation
- Added settings page
- Allowed user to specify refresh rates per module
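The include/exclude extension options introduced in 3.2.0 can be illustrated with a small filtering sketch. This is not DM's actual implementation; the precedence rule (an exclude list always wins, an empty include list admits everything) is an assumption for illustration only.

```python
from pathlib import Path

def should_upload(name, include_exts=None, exclude_exts=None):
    """Decide whether a file participates in an upload or DAQ.

    include_exts/exclude_exts are iterables of extensions such as
    ["h5", "tif"]. ASSUMPTION: excludes take precedence, and an
    empty/None include list admits every file.
    """
    ext = Path(name).suffix.lstrip(".").lower()
    if exclude_exts and ext in {e.lower() for e in exclude_exts}:
        return False
    if include_exts:
        return ext in {e.lower() for e in include_exts}
    return True
```

With this rule, passing both option kinds at once (the combination fixed in 3.2.1) is well defined: a file must survive the exclude list and then match the include list.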
Release 3.0.1 (08/09/2019)
=============================
- Fixed issue with unchecked additions to metadata catalog
- Updated support software:
- OpenJDK (replaces Oracle JDK)
- Payara (replaces Glassfish)
- Added 12IDB utilities
Release 3.0.0 (07/26/2019)
=============================
- Added python web service API for downloading files
- Updated DB schema for experiment data archival support
- Web Portal changes:
- Added storage views
- Added connection between station and experiment types
- Updated experiment views with storage and root path fields
=============================
- Moved code repo to gitlab
- Updated release/upgrade utilities
Release 2.5.0 (06/23/2019)
=============================
- Added pprint (pretty-print) option to display format for all commands
- Added support for repeated commands via repeatPeriod/repeatUntil/maxRepeats
keys in workflow definitions
- Implemented ability to stop processing jobs
- New command:
* stop-processing-job
- Enhanced support for processing multiple files with a given workflow
- New command:
* process-files
- Updated workflow API documentation
- DM Station GUI fixes/enhancements:
- Fixed issue with modifying user list before saving experiment
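The repeatPeriod/repeatUntil/maxRepeats workflow keys can be pictured with a scheduling sketch. The function below is illustrative only and assumes plain numeric timestamps; the real workflow engine's semantics may differ.

```python
def next_run_time(last_run, repeat_period, runs_so_far,
                  max_repeats=None, repeat_until=None):
    """Return the next scheduled run time, or None when repetition stops.

    Mirrors the idea behind repeatPeriod/repeatUntil/maxRepeats: stop once
    max_repeats executions have happened, or when the next run would fall
    past repeat_until; otherwise run again repeat_period seconds after the
    previous run. (Hypothetical helper, not the DM workflow engine.)
    """
    if max_repeats is not None and runs_so_far >= max_repeats:
        return None
    candidate = last_run + repeat_period
    if repeat_until is not None and candidate > repeat_until:
        return None
    return candidate
```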
=============================
- Added system LDAP utilities for verifying and creating groups
- Added support for retrieving single processing job stage via the --stage
option for the get-processing-job command
- Prevented core files from being uploaded
- Prevented creation of experiment names with spaces
- Fixed file limit problem in compression utilities
- DM Station GUI fixes/enhancements:
- Fixed issue with experiment user list modification
- Added tab for monitoring processing jobs
- Restored workflow tab
- Optimized amount of data transferred with file metadata listing

- Implemented limit on number of files that can be retrieved using
a single list-experiment-files command or API call
- Added more diagnostic output to compression/decompression utilities
- Added logging and ability to use DM_CONSOLE_LOG_LEVEL environment
variable to specify logging level
- Resolved issue with undefined self key in exception hook
- Added support for single file upload
- Added support for experiment root path
- Added warning dialog for experiment updates
- Added a number of tooltips
- Optimized handling of file metadata for experiments with large
number of files
- Added support for pagination while fetching full list of files
- Implemented progress dialog when fetching a very large set of files
- Resolved issue with filter boxes layout not matching data table
default column sizes
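The DM_CONSOLE_LOG_LEVEL variable mentioned above follows a common pattern: read a level name from the environment and fall back to a default. A minimal sketch (the format string and WARNING default are assumptions, not DM's actual configuration):

```python
import logging
import os

def configure_console_logging(default="WARNING"):
    """Configure root logging from the DM_CONSOLE_LOG_LEVEL environment
    variable (e.g. DEBUG, INFO, WARNING), falling back to a default.
    Illustrative sketch; DM's real logging setup may differ."""
    name = os.environ.get("DM_CONSOLE_LOG_LEVEL", default).upper()
    level = getattr(logging, name, logging.WARNING)
    logging.basicConfig(level=level,
                        format="%(levelname)s %(name)s: %(message)s")
    return level
```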
Release 2.2.2 (02/11/2019)
=============================
- Updated deployment scripts
- Fixed several Python 2/3 compatibility issues
Release 2.2.1 (02/04/2019)
=============================
- Updated API documentation
- DM Station GUI fixes/enhancements:
- Added file compression capability
=============================
- Added ESAF interfaces to APS DB service
- Moved all ESAF/BSS command line utilities to APS DB service
- New command:
* list-beamlines
- Converted software to Python 3, but retained Python 2 compatibility
- Introduced support for API pip and conda packages
- Added support for a single file upload via the --file-path option to
upload commands
- Added support for arbitrary experiment path under the ${DM_STATION_NAME}
directory via the --root-path option to the following commands:
* add-experiment
* update-experiment
* ${DM_STATION_NAME}-daq
* ${DM_STATION_NAME}-upload
Note that if this option is not used, the experiment will be located under
the experiment type folder, as before.
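The layout rule in the note above can be sketched as a path-resolution helper. The function and the station name in the test are hypothetical; only the two-way rule (root path under the station directory vs. the experiment type folder) comes from the notes.

```python
from pathlib import PurePosixPath

def experiment_storage_path(station, experiment, experiment_type,
                            root_path=None):
    """Illustrative layout rule: with --root-path the experiment lives at
    <station>/<root_path>/<experiment>; without it, under the experiment
    type folder as before. Not DM's actual resolution code."""
    base = PurePosixPath(station)
    if root_path:
        return base / root_path / experiment
    return base / experiment_type / experiment
```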
=============================
- Added estimated completion time to upload monitoring
- Introduced additional cataloging plugin enhancements that speed up
uploads of large datasets
- Developed initial image receiver service and utilities that can accept
Area Detector NTNDArray frames and save them into files; currently
supported formats are SDDS, JSON, HDF5, and JPEG.
- New commands:
* start-image-receiver
* stop-image-receiver
- Added new APS DB service for accessing beamline scheduling system
Release 2.0.0 (10/15/2018)
=============================
- Modified DM DB to support arbitrary experiment root paths
=============================
- Added generic script processing plugin for DAQ service
- Parallelized cataloging plugin; this significantly increased performance
for uploads using directory mode
- New commands:
* compress-files
* decompress-files
- Added support for deleting files
- New commands:
* delete-file
* delete-files
- Modified stat-file utility, which now requires explicit --md5sum flag to
retrieve checksum
- Added new --path-pattern option to the list-files command
- DM Station GUI fixes/enhancements:
- Added initial GUI test suite
- Added right-click copying functionality
- Fixed CPU usage bug
- Fixed incorrect handling of user selections
- Fixed focus policy for several buttons
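The stat-file change above (checksums only on explicit --md5sum) reflects the cost of hashing large files. A sketch of the underlying idea, streaming the file in chunks so memory use stays flat; the function name and returned dict shape are hypothetical:

```python
import hashlib
import os

def file_stat(path, md5sum=False, chunk_size=1 << 20):
    """Return basic file stats; compute the MD5 checksum only when
    explicitly requested, since hashing large files is expensive.
    Illustrative only, not the DM stat-file utility itself."""
    info = {"size": os.path.getsize(path)}
    if md5sum:
        digest = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        info["md5sum"] = digest.hexdigest()
    return info
```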
=============================
- Added support for command output in html format
- Added support for requesting only subset of keys for list of processing jobs
- Resolved issues with symbolic links for uploading directories and files
via rsync
- Fixed DAQ mode issue with double counting of observed files under certain
circumstances
- Fixed problem with cataloging of HDF5 files that contain non-ascii metadata
- DM Station GUI fixes/enhancements:
- Fixed issue with user permissions when creating new experiments
- Fixed upload configuration issue
- Fixed caching issues for experiment file metadata and users
- Added live metadata browsing while files load in background
- Added sorting for user table
- Added ability to copy table information to clipboard
- Improved selection behavior on tables
- Added support for retrieving file metadata for a given experiment in
batches; the list-experiment-files command gets --skip and --limit options
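Batched retrieval with skip/limit typically loops until a short page signals the end. A minimal client-side sketch, assuming a page-fetching callable (the termination rule is an assumption, not DM's documented behavior):

```python
def fetch_all(fetch_page, limit=500):
    """Retrieve a full collection in batches, in the spirit of the
    --skip/--limit options: keep asking for the next page until a page
    shorter than the limit signals the end. Illustrative helper only."""
    items, skip = [], 0
    while True:
        page = fetch_page(skip=skip, limit=limit)
        items.extend(page)
        if len(page) < limit:
            return items
        skip += limit
```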
- Simplified workflow and processing job management by allowing DM session user
to be used as default workflow and/or processing job owner
- Modified commands:
* list-workflows
* get-workflow
* delete-workflow
* start-processing-job
* list-processing-jobs
- DM Station GUI fixes/enhancements:
- Added support for DM_DATA_DIRECTORY_MAP environment variable; this allows
browsing data locally, while moving data using GridFTP third-party transfers
- Enhanced support for viewing large file metadata collections
- Enhanced algorithm for processing of existing files; this resolved issues
with multiple simultaneous DAQs involving large number of pre-existing
files that need to be processed
- Added support for retrieving experiment dataset statistics:
- New command:
* get-file-collection-stats
- Added support for output variables in workflow definitions
- Added generic HDF5 file metadata reader
- Developed DAQ/PROC service integration via new DAQ plugin; this enables
development and deployment of fully automated data acquisition/processing
pipelines based on user-defined workflows
- Modified commands (all get --workflow-name, --workflow-owner,
--workflow-job-owner and --workflow-args options):
* start-daq
* upload
* ${DM_STATION_NAME}-daq
* ${DM_STATION_NAME}-upload
- Added HTTPS support for accessing file metadata catalog via Mongo Express
- DM Station GUI fixes/enhancements:
- Added support for viewing experiment file metadata and file collection
statistics
- Added support for DM_BEAMLINE_MANAGERS environment variable
- Improved dialogs for DAQ/upload configuration options
=============================
- DM Station GUI fixes/enhancements:
- Added integration with ESAF DB
- Added new output formatting option "--display-format=key-per-line" that is
common to all commands
- Introduced service monitoring infrastructure, which enabled deployment of
Nagios-based system monitoring
- New commands:
* get-service-status
* clear-service-status
- Added integration with APS ESAF DB via new APIs and CLIs
- New commands:
* list-esafs: list sector ESAFs by year
* get-esafs: retrieve ESAF by id
- Modified commands (new ESAF id option for setting experiment users)
* ${DM_STATION_NAME}-daq
* ${DM_STATION_NAME}-upload
- Enhanced DAQ processing framework with the introduction of metaclasses
for plugins and supporting utilities; this enables using different
processing chains depending on data location or other
distinguishing criteria
- DM Station GUI fixes/enhancements:
- Added timer for automatic refresh of DAQ/upload status screens
- Split configuration options for DAQs/uploads
- Fixed issue with duplicate proposal users
- Used current date as default for experiment start/end dates
=============================
- Added DS interface and Java API for downloading files
- Added APIs and CLIs to clear (force service to forget) DAQs and uploads
- New commands:
* clear-daq
* clear-upload
- Introduced APSU-related enhancements:
- Processing for SDDS metadata in MongoDB cataloging plugin
- Component Database processing plugin
- Fixed issues with buttons that require previous item selection
- Added start date to list of DAQs and uploads
- Added ability to remove completed DAQs or uploads from the top-level list
=============================
- Introduced DM Station GUI (accessed via dm-station-gui command)
- Introduced automated system test framework for DM stations
- Added --process-existing option to start-daq and <station>-daq
commands; this will cause upload of existing files when starting DAQs
=============================
- Added APIs and CLIs to update experiment attributes and metadata
catalog, as well as to delete experiments
- New commands:
* update-experiment
* update-experiment-files
* get-async-update-status
=============================
- Modified scheduling algorithm for DAQs/uploads to simplify status monitoring
- Introduced integration with Beamline Scheduling System:
- New commands:
* list-runs
* list-proposals
* get-proposal
- Added the following options to the add-experiment command:
--proposal-id: automatically add to experiment all
users associated with a given beamline proposal
--run: look for beamline proposal in a given run (current run is the
default)
--users: comma-separated list of usernames to be added to the experiment
as users
- Added the following options for managing DAQs:
--duration: DAQ will be stopped automatically after given
number of days or hours
--dest-directory: files will be uploaded into
a specific directory relative to experiment root path
--upload-data-directory-on-exit: when DAQ finishes, upload of the given
data directory will be executed automatically
--upload-dest-directory-on-exit: specifies destination directory for
upload after DAQ completes
- Added the following options for managing uploads:
--dest-directory: files will be uploaded into
a specific directory relative to experiment root path
- Introduced framework for beamline-specific tools; added
beamline-specific commands that combine adding new experiment with running
DAQs or uploads: dm-${DM_STATION_NAME}-daq and dm-${DM_STATION_NAME}-upload
- Introduced Sphinx as the Python API documentation framework
- Resolved possible timeout issue when starting DAQ or directory upload
with a directory containing large number of files
- Simplified data directory command line option for beamlines that use
gridftp (via DM_DATA_DIRECTORY_MAP environment variable)
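The DM_DATA_DIRECTORY_MAP idea (rewriting a local data directory into the form a GridFTP host sees) can be sketched as a prefix substitution. The "local=remote" comma-separated syntax below is an assumption for illustration; the notes do not spell out the variable's actual format.

```python
import os

def map_data_directory(path, env_value=None):
    """Rewrite a local data directory into its remote (GridFTP-visible)
    form using a mapping of the form "local1=remote1,local2=remote2".

    ASSUMPTION: the mapping syntax here is invented for illustration;
    consult the DM documentation for the real DM_DATA_DIRECTORY_MAP format.
    """
    raw = env_value if env_value is not None else os.environ.get(
        "DM_DATA_DIRECTORY_MAP", "")
    for pair in filter(None, raw.split(",")):
        local, _, remote = pair.partition("=")
        if path.startswith(local):
            return remote + path[len(local):]
    return path  # no mapping matched; use the path unchanged
```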
=============================
- Introduced concept of experiment station and redesigned authorization
mechanisms to allow beamline managers to manage their stations; all
APIs and CLIs now conform to the new authorization scheme
- Modified get-experiments utility to allow retrieving list of experiments
for a given station
- Cleaned up web portal by removing unused views, and enabled station
management functionality
- GPFS DDN (extrepid) has replaced xstor as the main APS storage
- CLI changes:
- add-experiment command requires station name (can be set from env.
variable); experiment type can be specified using type name
- get-experiments command requires station name for beamline managers (can
be set from env. variable)
- start-experiment command is now optional
=============================
- Resolved issue with incorrect accounting of processing errors for DAQs
- Improved DAQ processing algorithm to avoid resource starvation between
simultaneous DAQs and uploads
- Enhanced monitoring status information for both DAQs and uploads
=============================
- Introduced new framework and utilities for synchronizing users with
APS DB
- Resolved several issues with special characters in file names for
gridftp transfer plugin
=============================
- Added SFTP file system observer agent
- Developed processing for HDF5 metadata in MongoDB cataloging plugin
- Modified catalog API and service interfaces to use file collections on
a per-experiment basis
=============================
- Resolved issue with upload command for directories containing large
number of files
- Implemented enhanced upload processing algorithm to avoid resource
starvation between simultaneous DAQs and uploads
- Added new polling file system observer agent as option for monitoring
directories
- Reworked catalog API and corresponding MongoDB interfaces to use unique
experiment file paths, rather than file names
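Keying the catalog by unique experiment-relative path rather than bare file name prevents identically named files in different subdirectories from colliding. A toy illustration of the keying change (not the actual MongoDB interface):

```python
class ExperimentCatalog:
    """Toy illustration of cataloging by unique experiment-relative file
    path rather than bare file name, so files named alike in different
    subdirectories no longer collide. Not the real DM catalog API."""

    def __init__(self):
        self._files = {}

    def add(self, experiment, rel_path, metadata):
        # The (experiment, relative path) pair is the unique key.
        self._files[(experiment, rel_path)] = metadata

    def get(self, experiment, rel_path):
        return self._files.get((experiment, rel_path))
```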
=============================
- Added dm-list-daqs and dm-list-uploads commands
- Resolved issue with newly created directories treated as files for
real-time data acquisitions

=============================
- Developed directory processing mode for uploads; in this mode file transfer
plugins transfer entire directories as opposed to individual files
- Added dm-get-processing-plugins command
- Resolved working directory issue that may occur with simultaneous uploads
- Enhanced upload/daq performance and functionality (hidden files are not
processed; for uploads system can detect files that had been processed
already; improved handling and reporting of processing errors)
- Source file checksum is calculated for rsync/gridftp plugins by default
- Added dm-stop-upload command
- Resolved Globus Online user authorization delay issue
- Introduced framework and user interfaces for tracking progress of file
uploads and data acquisitions in DAQ service
- Added ability to monitor multiple directories for the same experiment
simultaneously (required changes to DAQ service REST interfaces)
- Enhanced start/stop DAQ and upload commands to use DM_FILE_SERVER_URL
environment variable
- Added user interfaces and utilities that enable experiment data download
from machines that have SSH access to the storage host
=============================
- Added file system observer agent interface for DAQ service
- Implemented FTP file system observer for DAQ service
- Added interfaces for deleting user experiment role in DS service
- Introduced java REST API framework, and specific experiment DS service API
- Web Portal notifies DS service about experiment user modifications
- Implemented Single Sign-On solution for backend services
- Enabled user authentication via login file
- Added file stat (with checksum) interface in DS web service
- After adding user role to experiment via command line, user is also
added to experiment group (if one exists)
- Added rsync file transfer plugin with checksum and delete
- A number of minor modifications made in preparation for test deployment at
- Developed initial version of Cataloging Web Service based on MongoDB
- Developed sample processing plugins: file metadata catalog, SDDS processing,
SGE job submission
=============================
- Implemented storage permission management and user group management
- Developed common file processing service plugin framework
=============================
- Functional web portal (user, experiment, and policy pages)
- Developed web service and its API/CLI frameworks
- Developed initial version of Data Storage Web Service
- Developed initial version of Data Acquisition Web Service;
- DAQ service can monitor file system on a detector node and subsequently
transfer data to storage