Standard tasks

Table 3 below lists the available standard tasks for the MO Job Service Provider at the time this document was compiled.

Category Task name Description
DSS Document Export Document Exports a document to disk.
DSS GIS Manager DFSMerge Runs the DFSMerge tool.
DSS GIS Manager Dfsu2Dfs2 Converts DFSU file type formats to DFS2.
DSS GIS Manager DFSWrite Runs the DFSWrite tool.
DSS GIS Manager GFSBuilder Runs the GFS Builder task.
DSS GIS Manager GFSDownloader Runs the GFS Downloader task.
DSS GIS Manager RasterCalculator Runs the Raster Calculator tool.
DSS GIS Manager RasterConversion Converts rasters to a specific file type.
DSS GIS Manager RasterProject Re-projects rasters to another projection.
DSS GIS Manager RasterReclassification Runs the Raster Reclassification tool.
DSS Job ExitIfRunning Exits if a job is running.
DSS Job RunJob Task for running a job.
DSS Job TagCurrentJob Tags the current job with a user-defined text.
DSS Scenario ApproveSimulation Approves a simulation.
DSS Scenario CloneScenario Clones a scenario
DSS Scenario SimulationJobInstanceLog Copy the job instance log to a simulation
DSS Scenario RenameSimulation Renames a simulation
DSS Scenario ExistsSimulation Checks if a simulation exists
DSS Scenario GetScenarioInfo Returns information about a scenario
DSS Scenario RunScenario Runs a scenario
DSS Miscellaneous CheckFTPFile Checks the remote file in the FTP service.
DSS Miscellaneous CopyDirectory Copies a directory recursively.
DSS Miscellaneous DownloadFiles Downloads files.
DSS Miscellaneous Email Task for sending e-mails.
DSS Miscellaneous ExecuteSavedTool Allows saved tools to be used in jobs, removing the need for embedding them in scripts
DSS Miscellaneous GetCultureName Gets the current culture.
DSS Miscellaneous JobHelper Task for handling job items.
DSS Miscellaneous MakeZip Job task for making a zip file.
DSS Miscellaneous SaveBlob Job task for saving data into a blob table.
DSS Miscellaneous Sleep Job task for blocking the job for a specified period or until a named event occurs.
DSS Miscellaneous UnzipFiles Job task for unzipping a zip file.
DSS Script RunScript Runs a Script Manager script.
DSS Settings GetWorkspaceSettings Job task for getting a workspace setting.
DSS Settings SetWorkspaceSettings Job task for setting a workspace setting.
DSS Spreadsheet SaveSpreadsheetTimeseries Saves a range from a Spreadsheet Manager spreadsheet as a time series.
DSS Spreadsheet UpdateSpreadsheet Updates and gets spreadsheet cell values.
DSS Spreadsheet CopySpreadsheet Copies a spreadsheet.
DSS Time DateTimes The DateTimes task is responsible for constructing date times.
DSS Time SetTimeStamp Sets the time stamp on a file or in a registry key
DSS Time MakeTimeStamp Makes a time stamp.
DSS Time GetTimeStamp Gets a time stamp from a file or a registry key
DSS Time series AppendTimeseries Appends one time series to another time series.
DSS Time series CalculateTimeseriesQuantile Calculates time series quantiles
DSS Time series CreateTimeseriesGroup Creates a time series group
DSS Time series ExistsTimeseriesGroup Checks if a time series group exists
DSS Time series CopyTimeseries Copies a time series from one group to another.
DSS Time series CopyTimeseriesByGroup Copies all time series in one group to another group
DSS Time series CopyTimeseriesByGroupPeriod Copies a period of all time series in a group to another group
DSS Time series CopyTimeseriesByPeriod Copies a period of a time series to another group
DSS Time series ExistsTimeseries Checks if a time series exists
DSS Time series ExportTimeseries Exports a time series to the file system
DSS Time series ImportTimeseries Imports a time series
DSS Time series InsertTimeseriesValues Insert values in a time series
DSS Time series RunToolHierarchy Runs a tool hierarchy
DSS Time series MoveTimeseries Moves a time series from one group to another group
DSS Time series RenameTimeseries Renames a time series
DSS Time series ResampleTimeseries Resamples a time series
DSS Time series SmoothTimeseries Smooths a time series.
DSS Time series SumTimeseries Sums time series.
DSS Time series TimeseriesStatistics The time series statistics task will loop through the rows in a specified spreadsheet, and given the column that contains a time series path, it will calculate the specified statistics, and return the result to a specified column in the same spreadsheet.
DSS Time series RunToolSequence Runs a tool sequence
DSS Time series ExistTimeseriesValues Checks if a time series value exists
DSS Time series TrimTimeSeries Trims a time series.
DSS Workflow RunWorkflow Job task to execute a Workflow Manager workflow.
External Model FinalizeModel Task for finalizing hotstart.
External Model RunModel Job task for running models manually without using the MW scenario manager.
Maintenance ManageChangeLog Task for managing change log entries.
Maintenance ManageInitialConditions Task to clean initial conditions before a specified date from the database.
Maintenance ManageEventLog Task for managing the event log.
Maintenance ManageJobLogs Helps to manage job logs by deleting the job logs before a specified date from the database. It also gives an option to export the deleted job logs to an export folder for archiving.
Maintenance ManageSimulations Helps to manage simulations by deleting the simulations before a specified date from the database. It also gives an option to export the deleted simulations to an export folder for archiving.
Maintenance RemoveRasterTimeSteps Removes time steps from a raster
Maintenance RemoveTimeseries Removes a time series
Maintenance RemoveTimeseriesGroup Removes a time series group
Maintenance RemoveTimeSeriesValues Removes time steps from a time series. Begin and End dates can be specified.
Maintenance Vacuum Task for doing a vacuum on the PostgreSQL database of the connection
MSBuild Task CallTarget Invokes the specified targets within the project file.
MSBuild Task CombinePath Combines the specified paths into a single path.
MSBuild Task ConvertToAbsolutePath Converts a relative path, or reference, into an absolute path.
MSBuild Task Copy Copies files on the file system to a new location.
MSBuild Task CreateProperty Populates properties with the values passed in. This allows values to be copied from one property or string to another.
MSBuild Task Delete Deletes the specified files.
MSBuild Task Error Stops a build and logs an error based on an evaluated conditional statement.
MSBuild Task Exec Runs the specified program or command with the specified arguments.
MSBuild Task FindInList In a specified list, finds an item that has the matching itemspec.
MSBuild Task FindUnderPath Determines which items in the specified item collection have paths that are in or below the specified folder.
MSBuild Task MakeDir Creates directories and, if necessary, any parent directories.
MSBuild Task Message The Message task allows MSBuild projects to issue messages to loggers at different steps in the build process.
MSBuild Task OnError The OnError task causes one or more targets to execute, if the ContinueOnError attribute is false for a failed task.
MSBuild Task ReadLinesFromFile Reads a list of items from a text file.
MSBuild Task RemoveDir Removes the specified directories and all of their files and subdirectories.
MSBuild Task RemoveDuplicates Removes duplicate items from the specified item collection.
MSBuild Task Warning Logs a warning during a build based on an evaluated conditional statement.
MSBuild Task WriteLinesToFile Writes the paths of the specified items to the specified text file.
Time Series Factory Db2File Task responsible for extracting data from the database into files on the disk. The main area for configuration is a spreadsheet.
Time Series Factory File2Db Task responsible for extracting data from files into the database. The main area for configuration is a spreadsheet.
Time Series Factory TimeSeriesFactory The TimeSeriesFactory task is responsible for constructing time series. It works in conjunction with File2Db and Db2File.

Table 3 Task descriptions

DSS GIS Manager

Dfsu2Dfs2

Purpose:

Converts DFSU file type formats to DFS2.

Description:

Converts DFSU file type formats to DFS2.

Properties:

Property Description
Dx Gets or sets the Dx. If not specified Dx will be calculated by the x-extent of the input DFSU file divided by J.
Dy Gets or sets the Dy. If not specified Dy will be calculated by the y-extent of the input DFSU file divided by K.
Dz Gets or sets the Dz. If not specified Dz will be set to 1.
GridOriginX Gets or sets the longitude. If not specified the x-origin of the input DFSU file will be used.
GridOriginY Gets or sets the latitude. If not specified the y-origin of the input DFSU file will be used.
J Gets or sets the J. If not specified K will be used.
K Gets or sets the K. If not specified J or 1000 will be used.
L Gets or sets the L. If not specified the number of layers of the input DFSU file will be used.
LandValue Gets or sets the land value. Zero by default.
OutputDfs2FileName Gets or sets the output DFS2 file name. If not specified the input DFSU filename will be used.
Projection Gets or sets the projection. If not specified the projection of the input DFSU file will be used.
Rotation Gets or sets the rotation. Zero by default.
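
The default rules above can be summarised in a short sketch. This is purely an illustration of the documented fallbacks; the function and argument names are assumptions and not part of the task's API:

    def dfsu2dfs2_defaults(x_extent, y_extent, n_layers,
                           j=None, k=None, dx=None, dy=None, dz=None):
        # K falls back to J, or 1000 if neither is given
        k = k if k is not None else (j if j is not None else 1000)
        # J falls back to K
        j = j if j is not None else k
        # Dx/Dy fall back to the x/y extent divided by J/K, Dz to 1
        dx = dx if dx is not None else x_extent / j
        dy = dy if dy is not None else y_extent / k
        dz = dz if dz is not None else 1
        # L falls back to the number of layers in the input DFSU file
        return {"J": j, "K": k, "L": n_layers, "Dx": dx, "Dy": dy, "Dz": dz}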

DFSWrite

Purpose:

Writes DFS2 files from rasters in the database.

Description:

Writes DFS2 files from rasters in the database.

Properties:

Property Description
FilePath Gets or sets the path of the file to export
InputRaster Gets or sets the input Raster. raster path or memory:[name].
ClearOutputObject Gets or sets ClearOutputObject. memory:[name].
CoordinateSystem Gets or sets the Coordinate system as string
EumItem Gets or sets the MikeZero eum item for data export. If empty - eum item taken from temporal raster
EumUnit Gets or sets the MikeZero eum item unit for data export. If empty - eum item taken from temporal raster.
ItemName Gets or sets the MikeZero item name for data export. If empty - eum item taken from temporal raster.
StartTime Gets or sets the start time for raster to be saved into DFS2. If empty all time steps exported.

GFSBuilder

Purpose:

Writes DFS2 files from .grb2 files in a folder on disk.

Description:

Writes DFS2 files from .grb2 files in a folder on disk.

Properties:

Property Description
Calculations Gets or sets the calculations to apply. The order of the calculations is significant, and will be applied in order to the variables.
EumTypes Gets or sets the EumTypes to apply. For a list of all available EumTypes, right-click on Timeseries Manager database and choose Add new time series, then find the EumType in the Variable dropdown.
EumUnits Gets or sets the EumUnits to apply. For a list of all available EumUnits, right-click on Timeseries Manager database and choose Add new time series, then find the EumUnit in the Unit dropdown.
GFSFolder Gets or sets the GFS folder. This folder will keep all downloaded GFS files, because files can be reused on subsequent runs.
Offset Gets or sets the offset property in hours (positive or negative). This integer will be added to now or PackageStart in order to determine the final start date.
PackageLength Gets or sets the total package length in hours. Common examples are 24 or 12.
ResultPath Gets or sets the path of the final result dfs2 file. This file will be deleted if it exists.
UseNow Gets or sets a value indicating whether now should be used as the package start date. The current time will be converted to UTC.
Variables Gets or sets the variables to process. The order of the variables must match the order of other properties eg: calculation; units.
PackageStart Gets or sets the package start property (UTC) in format yyyyMMdd, e.g. 20170827. This will be used to determine the package start date. Can be overridden by UseNow.
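
As a rough illustration of how the package start date is resolved from UseNow, PackageStart and Offset (the function name is hypothetical, not the task's API):

    from datetime import datetime, timedelta, timezone

    def resolve_package_start(use_now, package_start=None, offset_hours=0):
        # UseNow: current time converted to UTC; otherwise PackageStart in yyyyMMdd
        if use_now:
            base = datetime.now(timezone.utc)
        else:
            base = datetime.strptime(package_start, "%Y%m%d")
        # Offset (hours, positive or negative) is added to determine the final start
        return base + timedelta(hours=offset_hours)

    # e.g. resolve_package_start(False, "20170827", offset_hours=-6)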

GFSDownloader

Purpose:

Downloads and imports GFS files from NCEP.

Description:

The Global Forecast System (GFS) is a weather forecast model produced by the National Centers for Environmental Prediction (NCEP). Dozens of atmospheric and land-soil variables are available through this dataset, from temperatures, winds, and precipitation to soil moisture and atmospheric ozone concentration. The entire globe is covered by the GFS at a base horizontal resolution of 18 miles (28 kilometers) between grid points, which is used by the operational forecasters who predict weather out to 16 days in the future. Horizontal resolution drops to 44 miles (70 kilometers) between grid points for forecasts between one week and two weeks.

Properties:

Property Description
BoundingBox Gets or sets the bounding box in the format [leftlon],[rightlon],[toplat],[bottomlat] e.g. Eastern Europe's Black Sea would be similar to 25.94,44.08,47.52,39.5
ForecastLength Gets or sets the total forecast length in hours. Common examples are 120 or 192. Note that the time step is 1h during the first 120 hours, 3h in the following 120-240h and 12h up to 384 hours.
GFSFolder Gets or sets the GFS folder. This folder will keep all downloaded GFS files, because files can be reused on subsequent runs.
KeepDays Gets or sets the number of days of GFS files to keep. Must be between 0 and 11. E.g. if set to 5, then any file with a forecast time older than 5 days will be deleted.
Offset Gets or sets the offset property in days. This integer will be added to now or ForecastDate in order to determine the final forecast date.
UseNow Gets or sets a value indicating whether now should be used as the forecast date. The current time will be converted to UTC.
Variables Gets or sets the variables to download. When changing this, consider that the GFS files you previously downloaded will not reflect these changes, so building a forecast from older data may be problematic (see variables in the table below).
Cycle Gets or sets the forecast cycle. Should be 0, 6, 12, or 18 (UTC). If -1, the latest cycle will be used.
ForecastDate Gets or sets the forecast date property (UTC) in format yyyyMMdd, e.g. 20170827. This will be used to determine the forecast publish date. Can be overridden by UseNow.
ImportTo Gets or sets the GIS Manager group that the GFS data is imported to.

Variables

Variable Code Level
Pressure reduced to MSL PRMSL Mean sea level
u-component of wind UGRD 10 m above ground
v-component of wind VGRD 10 m above ground
Total precipitation APCP Entire atmosphere
Total cloud cover TCDC Surface
Temperature TMP 2m above ground
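
A hedged sketch of the cycle selection and KeepDays clean-up described above. It is illustrative only (for example, it uses file modification times as a stand-in for the forecast time) and is not the task's implementation:

    import os
    from datetime import datetime, timedelta, timezone

    def resolve_cycle(cycle):
        # Cycle must be 0, 6, 12 or 18 (UTC); -1 means "use the latest available cycle"
        if cycle == -1:
            return max(c for c in (0, 6, 12, 18) if c <= datetime.now(timezone.utc).hour)
        if cycle not in (0, 6, 12, 18):
            raise ValueError("Cycle must be 0, 6, 12, 18 or -1")
        return cycle

    def prune_gfs_folder(gfs_folder, keep_days):
        # Delete GFS files older than KeepDays (0-11) days
        cutoff = datetime.now(timezone.utc) - timedelta(days=keep_days)
        for name in os.listdir(gfs_folder):
            path = os.path.join(gfs_folder, name)
            modified = datetime.fromtimestamp(os.path.getmtime(path), timezone.utc)
            if modified < cutoff:
                os.remove(path)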

RasterCalculator

Purpose:

Runs raster statistics on specific rasters in the database using the Raster Calculator tool.

Description:

Runs raster statistics on specific rasters in the database using the Raster Calculator tool.

Properties:

Property Description
Formula Gets or sets the formula to apply for the calculation. Use syntax normally found in spreadsheets, but with raster references instead of cell references.
InputRaster Gets or sets the input Raster. raster path or memory:[name].
OutputRaster Gets or sets the output Raster. raster path or memory:[name].
ClearOutputObject Gets or sets ClearOutputObject. memory:[name].

RasterConversion

Purpose:

Converts rasters to specific file type.

Description:

Converts rasters to specific file type.

Properties:

Property Description
FilePath Gets or sets the FilePath.
FileType Gets or sets the FileType(GRIB or NetCdf).
OutputRaster Gets or sets the output Raster. raster path or memory:[name].
Variable Gets or sets the variable which to convert.
CoordinateSystem Gets or sets the Coordinate system as string.

RasterProject

Purpose:

Re-projects rasters in the database to another projection.

Description:

Re-projects rasters in the database to another projection.

Properties:

Property Description
CoordinateSystem Gets or sets the coordinate system to project to.
InputRaster Gets or sets the input Raster. raster path or memory:[name].
InterpolationMethod Gets or sets the interpolation method to use (NearestNeighbor, Bilinear or Cubic).
OutputRaster Gets or sets the output Raster. raster path or memory:[name].
RasterProcessor Gets or sets the raster processor to use when executing the tool. PostGIS Raster Processor is the default processor. Other processors can be added and used by the tool.
ClearOutputObject Gets or sets ClearOutputObject. memory:[name].

RasterReclassification

Purpose:

Reclassifies rasters using the Raster Reclassification tool.

Description:

Reclassifies rasters using the Raster Reclassification tool.

Properties:

Property Description
InputRaster Gets or sets the input Raster. raster path or memory:[name].
MappingType Gets or sets a value defining the type of mapping. If single value, values are mapped one-to-one. Otherwise, a range of values is mapped to a new value.
OutputRaster Gets or sets the output Raster. raster path or memory:[name].
ClearOutputObject Gets or sets ClearOutputObject. memory:[name].
KeepUnmappedValues Gets or sets a value indicating whether the unmapped values are kept.
Mapping Gets or sets the mapping of values from the original raster to the reclassified raster. Missing or No Data values can be represented by using the string values null, nothing, nodata, novalue, missing, missingdata, or missing value. These can be used as either the original value or the new value.
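
A minimal sketch of the reclassification semantics described above (single-value versus range mapping, no-data keywords and KeepUnmappedValues). The helper names are hypothetical and the code is not the tool's implementation:

    NODATA_KEYWORDS = {"null", "nothing", "nodata", "novalue",
                       "missing", "missingdata", "missing value"}

    def parse_value(text):
        # The documented no-data keywords map to None, everything else to a float
        return None if str(text).strip().lower() in NODATA_KEYWORDS else float(text)

    def reclassify(value, mapping, single_value=True, keep_unmapped=True):
        # mapping: {old: new} in single-value mode, or [((low, high), new), ...] in range mode
        if single_value:
            if value in mapping:
                return mapping[value]
        else:
            for (low, high), new in mapping:
                if low <= value <= high:
                    return new
        # KeepUnmappedValues decides whether unmatched values are kept or dropped
        return value if keep_unmapped else None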

DSS Miscellaneous

CopyDirectory

Purpose:

This task copies a directory of files from the Source directory to the Destination directory as specified. If DeleteBeforeCopy is set to true, the destination directory is deleted before the copy occurs. Subfolders are copied recursively.

Properties:

Property Description
SourceDirectory Source directory
DestinationDirectory Destination directory
DeleteBeforeCopy Delete Destination directory before copying current Source data
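
The behaviour can be sketched with the Python standard library; this is an illustration of the documented semantics, not the task's code:

    import shutil

    def copy_directory(source_directory, destination_directory, delete_before_copy=False):
        # DeleteBeforeCopy: remove the destination before copying
        if delete_before_copy:
            shutil.rmtree(destination_directory, ignore_errors=True)
        # Recursively copy the source directory, including subfolders
        shutil.copytree(source_directory, destination_directory, dirs_exist_ok=True)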

JobHelper

The JobHelper is a dedicated task to simplify working with job items (see the Use of job items section earlier in this document), by providing options for e.g. extracting specific item elements.

Task parameter Type Description
Action String, required Defines the action to be performed by the JobHelper task.
InputItems1 Items, required for a subset of the actions See usage in Table 5
InputItems2 Items, required for a subset of the actions See usage in Table 5
Position String, required for a subset of the actions See usage in Table 5
OutputString String, output See usage in Table 5
OutputItems Items, output See usage in Table 5
Separator String See usage in Table 5

Table 4 JobHelper task parameters

Action Description Parameter use
GetItem Gets the item at the specified position. InputItems1
Position
OutputString
GetItemCount Gets the number of elements in the item list InputItems1
OutputString
GetLastItem Gets the last element in the item list InputItems1
OutputString
RemoveDuplicates Removes duplicate items InputItems1
OutputItems
GetCommonItems Gets an item list with the common elements InputItems1
InputItems2
OutputItems
GetDistinctItems Gets an item list with the distinct elements InputItems1
InputItems2
OutputItems
StringToItemCol Convert a string to an item collection InputString
Separator
OutputItems
ItemColToString Combine an item collection into a string InputItems1
Separator
OutputString

Table 5 JobHelper action values
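
The actions in Table 5 can be illustrated with a small Python sketch operating on plain lists and strings. It mirrors the documented semantics only (the reading of GetDistinctItems is one plausible interpretation) and is not the task's implementation:

    def job_helper(action, input_items1=None, input_items2=None,
                   position=None, input_string=None, separator=","):
        if action == "GetItem":
            return input_items1[int(position)]          # item at the specified position
        if action == "GetItemCount":
            return str(len(input_items1))               # number of elements
        if action == "GetLastItem":
            return input_items1[-1]                     # last element
        if action == "RemoveDuplicates":
            return list(dict.fromkeys(input_items1))    # duplicates removed, order kept
        if action == "GetCommonItems":
            return [i for i in input_items1 if i in input_items2]
        if action == "GetDistinctItems":
            return [i for i in input_items1 if i not in input_items2]
        if action == "StringToItemCol":
            return input_string.split(separator)        # string to item collection
        if action == "ItemColToString":
            return separator.join(input_items1)         # item collection to string
        raise ValueError("Unknown action: " + action)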

DSS Time handling

DateTimes

Purpose:

Set date and time

Description:

If UseNow is true, the DateTime will be set to the current date and time. Else, the DateTime specified by the property will be offset by the number of days as specified by the Offset property.

Properties:

Property Description
DateTime Date and time specified to be used
UseNow Boolean stating whether the current date and time should be used
Offset Number of days to offset the DateTime
Output Output
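
A minimal sketch of the documented logic (UseNow, DateTime and Offset); function and argument names are illustrative:

    from datetime import datetime, timedelta

    def date_times(date_time=None, use_now=False, offset_days=0):
        # If UseNow is true the current date/time is used, otherwise the DateTime property
        base = datetime.now() if use_now else date_time
        # The result is offset by the specified number of days
        return base + timedelta(days=offset_days)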

External scenario execution tasks

In those cases where the Scenario Manager does not support the required functionality, the external scenario execution tasks can be used. The tasks with a primary focus on model execution are described in the table below, where they are listed in the order they are typically used. These tasks allow very flexible, easy and transparent configuration of the model execution.

Name Description
InitialiseHotstart This task prepares model files including identifying and copying hotstart files as well as modifying times in input files.
RunModel This task runs models and captures output information from the models.
FinalizeHotstart This task archives model runs for use by InitialiseHotstart.
CopyDirectory Copies a directory recursively.
DateTimes Provides the ability to construct date times based on input date times or the current time and allows offsetting the date times.

The intended model execution workflow is illustrated below. Notice how the Time series factory task can be used to prepare input time series before running the model.

  1. The Job runner executes the job.
  2. The database contains all the time series.
  3. The Current folder contains the model that shall be executed.
  4. The History folder contains the historical model that provides the hotstart files.
  5. The Master folder contains the model template.

The detailed documentation for the External scenario execution tasks is provided below.

InitializeModel

Purpose:

This task initialises the model and hotstart conditions.

Description:

The Models folder structure contains three folders: Current, Master and History, sitting in the root folder structure. The Current folder is wiped clean; all data is deleted. All model data is then copied from Master to Current.

Hotstart files are copied from History to Current. The files come from previous job runs. In the Current folder, the hotstart files are set up to work with the model, with the date time values set according to the job settings from the Initialise target.

Properties:

Property Description
Folder Sub path to model folder and files. E.g C:\data\MIKE IPO Software and Demo data\IRMA\ForecastModelHD
StartTimes Array of StartTimes in the format yyyy-MM-dd HH-mm-ss
EndTimes Array of EndTimes in the format yyyy-MM-dd HH-mm-ss
SimulationFileNames List of model Simulation file names for hotstart.
ModelTypes List of supported Model Types to initialize.

  • MIKE11
  • MIKESHE
  • AutoCAL
  • MOUSE
  • MIKE21
  • MIKE21FM
  • MIKE21SW
CurrentFolderName Folder name under the specified Folder to use as the working folder.
ForceHotstartTimes Boolean to ensure Start and End times are used.
ForceHotstartTimesList Boolean list to ensure Start and End times are used.
HistoryTime History time that overwrites the first start time to identify the history folder. Use the format yyyy-MM-dd HH-mm-ss.
HotStart Specifies if hot starting should be done.
HotstartElements Result file for input to hotstart.
HotStartFolder Folder to initialize its hotstart from.
NoHotstartTimeStep Integer specifying the time step number to use if no hotstart file is available for a time step.
ResultElements Result output files from hotstart.
TextLog Write log into a text file InitializedHotstart.Log.
TimeOfForecast Date time containing the Time Of Forecast (yyyy-MM-dd HH-mm-ss)

Sample properties, where EndTimes and Folder are taken from job properties.
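
A rough sketch of the Current/Master/History handling described above. The History subfolder naming shown here is an assumption, and the code is illustrative only, not the task's implementation:

    import os
    import shutil

    def initialise_model(root, history_time):
        current = os.path.join(root, "Current")
        master = os.path.join(root, "Master")
        # History folder identified by a time stamp (naming scheme assumed)
        history = os.path.join(root, "History", history_time.strftime("%Y-%m-%d %H-%M-%S"))

        shutil.rmtree(current, ignore_errors=True)       # Current is wiped clean
        shutil.copytree(master, current)                 # all model data copied from Master
        if os.path.isdir(history):                       # hotstart files from a previous run
            shutil.copytree(history, current, dirs_exist_ok=True)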

    RunModel

    Purpose:

    Run hydraulic models and optimiser.

    Description:

This task executes the models. The task calls the executable for each model when it is required to run.

    Properties:

    Property Description
SimulationFileName Name of simulation file for model and/or optimiser.
    ModelType Type of Model or Optimiser. The supported types of models are listed below.

  • MIKE11
  • MOUSEHD
  • MOUSERUNOFF
  • MIKESHE
  • AutoCAL
  • MIKE21
  • MIKE21FM
  • MIKE21SW
  • MIKEFLOOD
  • MIKEFLOODGPU
Run64Bit Run the model in 64-bit mode (default=True). True will tell the engine to look for the executable in the x64 subfolder.
TextLog Write log into a text file RunModel.Log.

    Sample properties. $(HDModelFolder) is a job property defined on the job.
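
A small sketch of the documented Run64Bit behaviour (looking for the executable in the x64 subfolder). The function name and folder layout are assumptions:

    import os

    def resolve_engine_path(engine_folder, executable, run_64_bit=True):
        # Run64Bit=True (default) makes the engine look in the x64 subfolder
        subfolder = "x64" if run_64_bit else ""
        return os.path.join(engine_folder, subfolder, executable)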

    FinalizeModel

    Purpose:

Finalises the model, archiving any hotstart files necessary for subsequent model runs.

    Description:

    The Current folder is copied to the new History folder, which is DateTime stamped by StartTime and EndTime.

    This will delete all files and sub folders in the new history folder, determined by entries in CleanupElements, and excluding what is specified in CleanupExcludeElements.

This will then delete the History folder if it contains more files and folders than specified by the Keep input property.

    Properties:

    Property Description
    Folder Sub path to model folder and files
    StartTime Start DateTime for time stamping new history folder
    EndTime End DateTime for time stamping new history folder
    Success Value indicating whether finalizing hotstart was a success.
    Keep Number of days to keep hotstart files for
    CleanupElements List of Elements to Clean up
    CleanupExcludeElements List of Elements to exclude in cleanup
    DestinationHistoryFolder Output folder is a sub folder of the specified Folder named with the start and endtime.
    TextLog Write log into a text file FinalizeModel.Log.

Sample properties. Job properties are defined on the job for EndTime, Folder and StartTime.
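
A hedged sketch of the finalisation steps described above (copy Current to a time-stamped History folder, clean up listed elements, prune old history). The folder-name format and the age-based pruning rule are assumptions, not the task's implementation:

    import os
    import shutil
    from datetime import datetime, timedelta

    def finalize_model(folder, start_time, end_time, keep_days,
                       cleanup_elements=(), cleanup_exclude=()):
        # New history folder stamped with start and end time (format assumed)
        stamp = "{0:%Y-%m-%d %H-%M-%S}_{1:%Y-%m-%d %H-%M-%S}".format(start_time, end_time)
        history = os.path.join(folder, "History", stamp)
        shutil.copytree(os.path.join(folder, "Current"), history)

        # Delete CleanupElements from the new history folder, except excluded ones
        for element in cleanup_elements:
            if element in cleanup_exclude:
                continue
            target = os.path.join(history, element)
            if os.path.isdir(target):
                shutil.rmtree(target, ignore_errors=True)
            elif os.path.exists(target):
                os.remove(target)

        # Prune history folders older than Keep days (pruning rule assumed)
        cutoff = datetime.now() - timedelta(days=keep_days)
        history_root = os.path.join(folder, "History")
        for name in os.listdir(history_root):
            path = os.path.join(history_root, name)
            if datetime.fromtimestamp(os.path.getmtime(path)) < cutoff:
                shutil.rmtree(path, ignore_errors=True)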

    Maintenance

    ManageChangeLog

    Purpose:

Manages the change log of the database.

    Description:

    Able to delete and/or export the change log.

    Properties:

    Property Description
    DeleteLog Specifies if the change log should be deleted.
    End End date of the query of change logs to be handled.
    Export Indicate if the change log should be exported.
    ExportFile The file to export the change log to.

If the export file is not specified, a file will be written into the temporary DHIDSS folder %temp%\DHIDSS. Note that this folder will be cleaned up, so the export files may not be retained.
    Start Start date of the query of change logs to be handled.

    ManageEventLog

    Purpose:

Manages the event log of the database.

Description:

The ManageEventLog job task is able to export and/or delete events in the events table of the MIKE OPERATIONS database.

Properties:

Property Description
DeleteLog Specifies if the event log should be deleted.
End End date of the query of event logs to be handled.
EventType A comma separated string of event types to delete.
The event types are specified as integer values (e.g. 1,2).
This field is optional. If not specified, all event types will be managed.

Event Types:

  • Information = 1
  • Warning = 2
  • Error = 3
  • Critical = 4
  • AuditSuccess = 5
  • AuditFailure = 6

Export Value indicating if the event log should be exported.
ExportFile The file to export the event log to.
If the export file is not specified, a file will be written into the temporary DHIDSS folder %temp%\DHIDSS. Note that this folder will be cleaned up, so the export files may not be retained.
OlderThanDays Specifies a number of days that events should be older than. Events older than the specified value will be managed when the start date is the minimum value and the end date is the maximum value.
This means that start and end dates cannot be specified at the same time as OlderThanDays.
Start Start date of the query of event logs to be handled.

ManageSimulations

Purpose:

Maintains simulations of a scenario.

Description:

Will delete simulations of a scenario.

Properties:

Property Description
Scenario The id or full path to the scenario to manage.
Export Value indicating if simulations of the scenario should be exported.
ExportFolder Folder where the simulations are exported to if Export=True.
ForceDelete Delete simulations even if the status of a simulation is Running.
OlderThan Will delete simulations older than a specified number of hours.

    RemoveRasterTimeSteps

    Purpose:

    Removes raster time steps from a raster.

    Description:

Removes time steps from a raster within the period specified by the Begin and End dates. The raster is specified by its id or full path.

    Properties:

    Property Description
    Raster The id or full path to the raster to maintain.
    Begin The start date from where to delete raster time steps.
End The end date to where to delete raster time steps.

    RemoveTimeSeriesValues

    Purpose:

Maintains time steps in a time series.

    Description:

Removes time steps from a time series. Begin and End dates can be specified.

    Properties:

    Property Description
    TimeSeries Id or full path to the time series to manage.
    Begin Start date from where to delete time steps of the time series.
    End End date to where to delete time steps of the time series.

    Vacuum

    Purpose:

Reclaims storage occupied by dead tuples in the PostgreSQL database.

    Description:

    Task for doing a vacuum on the PostgreSQL database of the connection.

    VACUUM reclaims storage occupied by dead tuples. In normal PostgreSQL operation, tuples that are deleted or obsoleted by an update are not physically removed from their table; they remain present until a VACUUM is done. Therefore it's necessary to do VACUUM periodically, especially on frequently-updated tables.

    If no Table is specified, VACUUM processes every table in the current database. Specifying the Table parameter, VACUUM processes only that table.

    Plain VACUUM (without FULL) simply reclaims space and makes it available for re-use. This form of the command can operate in parallel with normal reading and writing of the table, as an exclusive lock is not obtained. However, extra space is not returned to the operating system (in most cases); it's just kept available for re-use within the same table. VACUUM FULL rewrites the entire contents of the table into a new disk file with no extra space, allowing unused space to be returned to the operating system. This form is much slower and requires an exclusive lock on each table while it is being processed.

VACUUM FULL is only needed when you have a table that is mostly dead rows, i.e. the vast majority of its contents have been deleted. It should not be used for table optimization or periodic maintenance, as it is generally counterproductive. In most cases the freed space will be promptly re-allocated, possibly increasing file-system-level fragmentation and requiring file system space allocations that are slower than just re-using existing free space within a table.

    When you run VACUUM FULL on a table, that table is locked for the duration of the operation, so nothing else can work with the table. VACUUM FULL is much slower than a normal VACUUM, so the table may be unavailable for a while.

    Properties:

    Property Description
    AnalyseMode Specifies if the vacuum should be run in analyse mode. Updates statistics used by the planner to determine the most efficient way to execute a query.

    After adding or deleting a large number of rows, it might be a good idea to issue a VACUUM ANALYZE command for the affected table. This will update the system catalogs with the results of all recent changes, and allow the PostgreSQL query planner to make better choices in planning queries.
FullMode Specifies if the vacuum should be a FULL vacuum. When doing a full vacuum without using the Table option, all tables of the database are locked. It is recommended to do a FULL vacuum on frequently updated tables.
Table Single table to vacuum. If left empty, all tables of the database are vacuumed.
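
Assuming direct access to the PostgreSQL connection, the equivalent maintenance could be sketched with psycopg2 as below. Note that VACUUM cannot run inside a transaction block, hence autocommit; this is an illustration, not the task's implementation:

    import psycopg2
    from psycopg2 import sql

    def vacuum(dsn, table=None, full_mode=False, analyse_mode=False):
        # Build the command from the FullMode and AnalyseMode options
        command = "VACUUM"
        if full_mode:
            command += " FULL"
        if analyse_mode:
            command += " ANALYZE"
        with psycopg2.connect(dsn) as conn:
            conn.autocommit = True            # VACUUM must run outside a transaction
            with conn.cursor() as cur:
                if table:
                    # Vacuum a single table, quoting the identifier safely
                    cur.execute(sql.SQL(command + " {}").format(sql.Identifier(table)))
                else:
                    cur.execute(command)      # vacuum every table in the database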

    Time series factory tasks

The purpose of the three time series factory tasks is to allow the user to construct new time series, either from scratch or based on existing time series, by using a number of built-in operations. The time series factory task suite consists of three tasks that are typically configured in a shared spreadsheet. The functionality is summarised below.

    Name Description
    TimeseriesFactory This task constructs time series for the models based on configuration stored in spreadsheets using a variety of different rules.
    Db2File This task extracts time series from the database and updates input files accordingly. It uses configuration stored in the same spreadsheets as the Hierarchy task.
File2Db This task works in the opposite way of Db2File: it extracts model results from res11 and dfs0 files and stores them in the database. The configuration is stored in the same type of spreadsheets as the Db2File and Time series factory tasks.

    Time series factory

    Purpose:

Prepares time series in the correct structure for use in hydraulic models. The structure format is defined in spreadsheets, and the task reads these spreadsheets to produce input time series that the models can process.

    Description:

The Time series factory functionality allows for construction of time series in the database using a variety of different methods. The time series and the methods to apply to them are defined in spreadsheets in the Spreadsheet Manager and come in column pairs, where the first column specifies one to many methods to apply to the time series defined in the second column. Processing this time series results in one time series segment, to which the result of processing subsequent column pairs is appended.

The spreadsheets are normally used in conjunction with the File2Db and Db2File tasks that occupy columns A through E, where column D is the resulting time series created by the Timeseries factory task.

    The following methods are available:

    Method Description
    ExtrapolateFirst(DateTimeX) This method copies the first value in the time series backwards to DateTimeX unless a value already exists at that point in time or earlier. If the time series is empty, the method does nothing. The new value gets the date time DateTimeX.
    ExtrapolateLast(DateTimeX) This method copies the last value in the time series forwards to DateTimeX unless a value already exists at that point in time or later. If the time series is empty, the method does nothing. The new value gets the DateTimeX.
    None This method simply passes through the time series, effectively operating like a copy
    InsertValueAt(DateTimeX, value) Inserts a value at the specific point in time unless the time series already covers this point in time. If the time series is empty, the value is always inserted. The value is either a double or null.
ExtractPeriod(DateTimeX, DateTimeY) Extracts all values in the period defined (>= is used for both date times).
ExtractPeriod(DateTimeX, DateTimeY, true, false) Extracts all values in the period defined. Allows specifying whether the = should be included in >= for each date time.
    DeleteFrom(DateTimeX) Copy timeseries, omitting values from specified date time and onwards
    DeleteTo(DateTimeX) Copy timeseries, omitting values up to and including specified date time
    Multiply(value) Multiply all values in timeseries by value specified.
    Offset(X) Offset every point in timeseries by length of hours specified. This basically adds time to every point in the timeseries, thereby time shifting the timeseries.
    Replace(oldvalue, newvalue) Replaces any occurrence of oldvalue in the specified timeseries, with the new value.
    Min Input source time series are concatenated together, and the resulting time series for use by the models contains a minimum value for each and every time series point that exists in the concatenation of the source time series.
    Max Input source time series are concatenated together, and the resulting time series for use by the models contains a maximum value for each and every time series point that exists in the concatenation of the source time series.
Add Multiple time series are added together time step by time step; the resulting time series contains the sum for each time step in the input time series. The start of the resulting time series is equal to the maximum of all input starts, and conversely the end is the minimum of all input ends. Interpolation is used to determine input values so that every date time between start and end is added. Please note that if this method is used in conjunction with other methods, it must be used first, as it takes multiple time series as input arguments. It also appends the results of previous processing from earlier method/time series column pairs; hence it is used to allow hierarchy methods to prepare time series before being added.
    DropBeforeFirst(value) This method adds a time step to the time series, one second before the first time step present, and with the value specified as input to the method
DropHoursAfter(hours, value) This method adds a time step to the time series after the last time step present. The method takes two arguments: the first is the number of hours after the current last time step at which to insert the new time step, and the second is the value to be inserted.
DropAfterLast(value) This method adds a time step to the time series, one second after the last time step present, and with the value specified as input to the method.
DropInGaps This method adds two time steps in any gap. The gap and the value to be added are specified as input to the method; the time steps are placed 1 s after the first value and 1 s before the second value of the gap, to fill in the gap.
    CreateNew(valuetype, xaxistype, xaxisunit, yaxisvariable, yaxisunit) Allows creation of empty time series. An example is CreateNew(Mean_Step_Accumulated,Non_Equidistant_Calendar,second,Pumping Rate,m^3/yr)
MultiplyMonths(J, F, M, A, M, J, J, A, S, O, N, D) Multiplies monthly factors onto the time series. The factors use a period (.) as decimal separator.

The task will proceed and process all rows in all sheets even if there are errors. All errors are logged to the Job Manager output as well as added to a list. When the task terminates, it fails if this list contains entries.
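
To illustrate the semantics of a few of the methods above, here is a minimal sketch operating on a list of (datetime, value) pairs; it is not the task's implementation:

    def extrapolate_first(series, date_time_x):
        # Copy the first value backwards to date_time_x unless a value already
        # exists at or before that point; do nothing for an empty series
        if not series or series[0][0] <= date_time_x:
            return list(series)
        return [(date_time_x, series[0][1])] + list(series)

    def delete_from(series, date_time_x):
        # Copy the series, omitting values from date_time_x and onwards
        return [(t, v) for t, v in series if t < date_time_x]

    def multiply(series, factor):
        # Multiply all values by the factor
        return [(t, v * factor) for t, v in series]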

    Properties:

    Property Description
    Spreadsheet Name of spreadsheet
    Worksheets List of worksheet names to traverse
Replace Replace allows for the definition of statements that are replaced into any cell value. The parameter takes the form of e.g. Val1=Val2;Val3=Val4, which means that all occurrences of Val1 will be replaced with Val2. Likewise, Val3 is replaced with Val4.
ExcludeQuality Numbers to filter out of the input time series. These represent time series values with bad quality, having the corresponding quality flag set.
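
The Replace property format (Val1=Val2;Val3=Val4) can be illustrated with a small sketch of how such a specification could be applied to a cell value; the function name is hypothetical:

    def apply_replace(cell_value, replace_spec):
        # Apply a "Val1=Val2;Val3=Val4" replacement specification to a cell value
        for pair in filter(None, replace_spec.split(";")):
            old, new = pair.split("=", 1)
            cell_value = cell_value.replace(old, new)
        return cell_value

    # e.g. apply_replace("ts/Val1/x", "Val1=Val2;Val3=Val4") returns "ts/Val2/x"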

    Db2File

    Purpose:

    Export timeseries to file

    Description:

This task exports time series to dfs0 files. The task reads through a spreadsheet to obtain the specification of the time series to export and the resulting dfs0 files to be generated; the spreadsheet also contains important data unit and type information for source and destination.

    Properties:

    Property Description
    Spreadsheet Spreadsheet containing export specifications.
    Worksheet Worksheet in spreadsheet containing export specifications
Folder Destination folder for the time series files.
Replace Replace allows for the definition of statements that are replaced into any cell value. The parameter takes the form of e.g. Val1=Val2;Val3=Val4, which means that all occurrences of Val1 will be replaced with Val2. Likewise, Val3 is replaced with Val4.

    File2Db

    Purpose:

    Import timeseries from file.

    Description:

This task imports time series from dfs0 files. The task reads through a spreadsheet to obtain the specification of the time series to import and the dfs0 files to be read; the spreadsheet also contains important data unit and type information for source and destination.

    Properties:

    Property Description
    SpreadsheetName Spreadsheet containing import specifications
    WorksheetName Worksheet in spreadsheet containing import specifications
Folder Source folder for the time series files.
StartTime Import data not before this date and time.
EndTime Import data not after this date and time.
Replace Replace allows for the definition of statements that are replaced into any cell value. The parameter takes the form of e.g. Val1=Val2;Val3=Val4, which means that all occurrences of Val1 will be replaced with Val2. Likewise, Val3 is replaced with Val4.

    Run Scenario Task

Run Scenario

    Purpose:

    Run scenario task.

    Description:

This task can be created as shown in the following screenshot:

    Properties:

    Property Description
    Display name Display name of the task
    Scenario Gets or sets the scenario id or fully qualified path
AfterOutputCreated Gets or sets the value of the after output delegate - a script to run after output has been retrieved from the adapter. The format is either scriptpath or scriptpath@arguments, where arguments represents a list of arguments to the script starting from the second argument (the first is always the IAdapter). Arguments are separated by a vertical bar (|), e.g. /myascript@1|some text
AfterSimulationScript Gets or sets the value of the after simulation delegate - a script to run after the model has run. The format is either scriptpath or scriptpath@arguments, where arguments represents a list of arguments to the script starting from the second argument (the first is always the IAdapter). Arguments are separated by a vertical bar (|), e.g. /myascript@1|some text
Approve Gets or sets a value indicating whether the simulation should be approved or not.
BeforeSimulationScript Gets or sets the value of the before simulation delegate - a script to run before the model is started. The format is either scriptpath or scriptpath@arguments, where arguments represents a list of arguments to the script starting from the second argument (the first is always the IAdapter). Arguments are separated by a vertical bar (|), e.g. /myascript@1|some text
    EndTime Gets or sets the value of end date.
    OutputTimeseriesDisposition Gets or sets the value of output timeseries disposition.
    OutputTimeseriesGroup Gets or sets the value of output timeseries group.
    StartTime Gets or sets the value of Start date.
    TimeOfForecast Gets or sets the value of time of forecast.
    Condition Optional attribute. The condition to be evaluated.
    ContinueOnError Optional attribute. A Boolean attribute that defaults to false if not specified.
    Enabled To enable or disable the task.

    How to set up an after/before simulation script in Run Scenario Task

The before simulation script, like the after simulation script, in the RunScenario task in the Job Manager takes at least one argument (the adapter object) but typically two. The first argument is always the adapter object; the second is of type object and is used to pass additional parameters. These additional parameters are specified in the job task behind the script name, separated by vertical bars (|). For example: testing@par1|par2|par3

    The script testing needs a definition such as:

def testing(adapter, args):
    '''<Script>
    <Author>admin</Author>
    <Description>Run Scenario pre-processing script to do something</Description>
    <Parameters>
        <Parameter name="adapter" type="IAdapter">Handle to the Generic Adapter at runtime.</Parameter>
        <Parameter name="args" type="object">List of strings.</Parameter>
    </Parameters>
    </Script>'''
    # args holds the additional parameters, e.g. ['par1', 'par2', 'par3'] for testing@par1|par2|par3
    pass