
Introduction

A common problem is disk usage when keeping tens of thousands of completely finished tasks in translate5.

Therefore a dedicated workflow action "deleteOldEndedTasks" exists, which deletes tasks matching a configurable filter.

Backup before deletion

For warranty reasons the tasks can be exported in multiple formats and backed up to a configurable SFTP location or local folder before deletion.

The backup is done as follows:

  • export in the original format
  • export in XLF2 format - containing change marks
  • zip those files and move the archive to the configured backup location

Usage and Configuration

In order to use this feature, add one or more entries (with different filters) to the LEK_workflow_action table; currently no CLI or UI tool is available for this.

See Workflow Action and Notification Customization for details on that database table.


The line(s) added to the workflow action table should look like this:

  • trigger: doCronPeriodical
  • actionClass: editor_Workflow_Actions
  • action: deleteOldEndedTasks
  • parameters: the filter and backup configuration in JSON format, explained in detail below
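
Since no CLI or UI tool is available, the entry has to be inserted manually, for example with SQL. The following is a minimal sketch only: it assumes the column names shown above and that the parameters column holds the JSON configuration; verify the actual schema of LEK_workflow_action in your installation before running it.

-- hypothetical insert; check column names and required fields against your schema
INSERT INTO LEK_workflow_action (`trigger`, `actionClass`, `action`, `parameters`)
VALUES (
    'doCronPeriodical',
    'editor_Workflow_Actions',
    'deleteOldEndedTasks',
    '{"limit": 5, "workflowSteps": ["no workflow", "workflowEnded"]}'
);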

The periodical cron API endpoint should be used as the trigger. Since each call is limited to 5 tasks by default, a daily trigger would delete and back up only 5 tasks per day.
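
As a sketch only: assuming the standard translate5 cron endpoint path and that the calling machine is allowed to trigger cron actions (see the translate5 cron configuration), a cron job calling the periodical endpoint hourly instead of daily would process up to 5 tasks per call, i.e. up to 120 tasks per day. Host and path are placeholders and have to be adapted to your installation.

# hypothetical crontab entry, runs every hour; verify the endpoint of your installation
0 * * * *  curl -s "https://your-translate5-host/editor/cron/periodical"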

Multiple configurations with different filter sets (for example different workflow states for different client ids) can be added.

Parameters JSON explained

By default only tasks with status "end" are considered for deletion, unless the workflowSteps config described below is given.

With a workflowSteps config given, tasks in status "end", "open" or "unconfirmed" AND with a matching workflow step are selected.

Parameters for task filtering (needed for deletion and backup)

  • limit (integer, optional, defaults to 5): The maximum number of tasks deleted / backed up per execution.
  • workflowSteps (string[], optional, defaults to an empty array []): The workflow steps the task must match; ["no workflow", "workflowEnded"] is the most sensible choice in most cases.
  • clientIds (integer[], optional, defaults to an empty array []): The list of client IDs the task must match (the numeric ID, not the customer number!).
  • lifetimeType (string, "modified" or "created", optional, defaults to "modified"): Defines whether the last modified timestamp or the created timestamp of the task is evaluated against the next parameter.
  • taskLifetimeDays (integer, optional): The age in days a task must exceed to be deleted. If not defined here, the system config "runtimeOptions.taskLifetimeDays" is used; if that is also not defined, it defaults to 100.
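
For illustration, a filter-only configuration combining these parameters could look like this (all values are examples); it would select up to 5 tasks with status "end" belonging to clients 123 or 321 that were created more than 30 days ago:

{
    "limit": 5,
    "clientIds": [123, 321],
    "lifetimeType": "created",
    "taskLifetimeDays": 30
}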

Parameters for task backup (in the same config object)

  • filesystem (string, "sftp" or "local", needed for backup): "sftp" backs up to an SFTP location, "local" backs up to a local folder.
  • targetPath (string, needed for backup): The path with task pattern on the target system, for example "/backup/task-{id}-{taskNr}-{taskName}.zip". For SFTP the targetPath must contain the whole absolute path on the server; you get that path after logging in to your SFTP server with an ordinary SFTP tool. All values given in {curly braces} are replaced with the same-named value from the task JSON object.
  • host (string, needed for SFTP backup): The SFTP server hostname / IP address.
  • port (integer, optional for SFTP backup): The port to be used for the network connection.
  • username (string, needed for SFTP backup): The SFTP login username.
  • password (string, needed for SFTP backup): The SFTP login password.

Example: delete 5 tasks with workflow step "no workflow" or "workflowEnded" (or, implicitly, task status "end")

The tasks are additionally filtered to those whose modified timestamp is older than X days, where X is configured in runtimeOptions.taskLifetimeDays.

Parameters JSON:
{"limit": 5, "workflowSteps": ["no workflow", "workflowEnded"]}

Same example with local backup and for clients with ID 123 or 321 only (limit and workflowSteps remain optional):

{
    "filesystem": "local",
    "targetPath": "/backup/task-{id}-{taskNr}-{taskName}.zip",
    "limit": 5,
    "clientIds": [123, 321],
    "workflowSteps": ["no workflow", "workflowEnded"]
}

Same example with SFTP backup

{
    "filesystem": "sftp",
    "targetPath": "/home/USERNAME/backup/task-{id}-{taskNr}-{taskName}.zip",
    "host": "HOSTNAME_OR_IP",
    "port": 1234,
    "username": "USERNAME",
    "password": "PASSWORD"
}

Apart from the local filesystem and SFTP, WebDAV and several cloud services can be addressed if they are integrated. Get in contact with MittagQI for more information.

The port is optional and must be given as an integer.
