Introduction

A common problem is disk usage when keeping tens of thousands of completely finished tasks in translate5.

Therefore a special workflow action "deleteOldEndedTasks" exists, which deletes tasks selected by a configurable filter.

Backup before deletion

For warranty reasons, the tasks can be exported in multiple formats to a definable SFTP space before deletion.

The backup is done as follows:

  • export in the original format
  • export in XLF2 format, containing change marks
  • zip those files and move them to a configurable backup space

Usage and Configuration

In order to use this feature, add one or more entries (with different filters) to the LEK_workflow_action table - currently no CLI or UI tool is available for this.

Note

See Workflow Action and Notification Customization for details to that database table.


The added line(s) in the workflow action table should look like:

trigger | actionClass | action | parameters
doCronDaily | editor_Workflow_Actions | finishOverduedTaskUserAssoc | –
doCronPeriodical | editor_Workflow_Actions | deleteOldEndedTasks | the filter configuration in JSON format, explained in detail below

(The doCronDaily row shows an already existing action for comparison: it checks the deadline dates of a task assignment and, if it is overdue, finishes it for all lectors, triggering normal workflow handlers if needed.)

As trigger endpoint, the periodical cron API endpoint should be used. Since each call is limited to 5 tasks by default, a daily trigger would delete and back up only 5 tasks per day.
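Since the entry must be added manually, a minimal SQL sketch could look like the following. The column names are assumed from the table header shown above - verify them against your actual LEK_workflow_action schema before running.

```sql
-- Hypothetical sketch: register the deleteOldEndedTasks action.
-- Column names are assumed from the table above; verify against your schema.
-- `trigger` is a reserved word in MySQL, hence the backticks.
INSERT INTO LEK_workflow_action (`trigger`, actionClass, action, parameters)
VALUES (
    'doCronPeriodical',
    'editor_Workflow_Actions',
    'deleteOldEndedTasks',
    '{"limit": 5, "workflowSteps": ["no workflow", "workflowEnded"]}'
);
```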

Info

Multiple configurations with different filter sets (for example different workflow states for different client ids) can be added.


Parameters JSON explained

Warning

By default, only tasks with status "end" are considered for deletion, unless the workflowSteps config described below is given.

With a given workflowSteps config, tasks in status end, open, or unconfirmed AND with matching workflow steps are selected.

Parameters for task filtering (needed for deletion and backup)

The action deletes all tasks where the task status is 'end' and where the last modified date is older than X days (X being the Zf_configuration property taskLifetimeDays). Only a limited number of tasks is deleted at once; the limit defaults to 5 and can be set as a parameter. Additionally, "workflowSteps" can be provided: in that case, besides tasks with the default "end" status, tasks in one of the given workflow steps are handled as well. Using the doCronPeriodical trigger makes the most sense for this action.

parameter | type | mandatory / default | description
limit | integer | optional, defaults to 5 | The maximum number of tasks deleted / backed up per execution
workflowSteps | string[] | optional, empty array "[]" | The workflow steps the task must match, for example: {"limit": 5, "workflowSteps": ["no workflow", "workflowEnded"]}
clientIds | integer[] | optional, empty array "[]" | The list of client IDs (the pure numeric ID, not the customer number!) the task must match
lifetimeType | string, "modified" or "created" | optional, defaults to "modified" | Defines whether the last modified timestamp or the created timestamp of the task is evaluated against taskLifetimeDays
taskLifetimeDays | integer | optional, see description | Can be defined here; if not defined, the system config "runtimeOptions.taskLifetimeDays" is used; if that is also undefined, defaults to 100
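A filter-only configuration combining the parameters above could look like this - the parameter names come from the table, while the concrete values are purely illustrative:

```json
{
    "limit": 10,
    "lifetimeType": "created",
    "taskLifetimeDays": 30,
    "clientIds": [123],
    "workflowSteps": ["workflowEnded"]
}
```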

Parameters for task backup (in the same config object)

parameter | type | mandatory / default | description
filesystem | string, "sftp" or "local" | needed for backup | sftp: backup to an SFTP location; local: backup to a local folder
targetPath | string | needed for backup | The path with task pattern on the target system, for example "/backup/task-{id}-{taskNr}-{taskName}.zip". All values given in {curly braces} are replaced with the same-named value from the task JSON object. For SFTP, targetPath must contain the whole absolute path on the server; you get that after logging in to your SFTP server with an ordinary SFTP tool.
host | string | needed for SFTP backup | The SFTP server hostname / IP address
port | integer | optional for SFTP backup | The port to be used for the network connection
username | string | needed for SFTP backup | SFTP login username
password | string | needed for SFTP backup | SFTP login password
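The {curly braces} substitution described for targetPath can be sketched as follows. This is a hypothetical illustration of the pattern, not translate5's actual code, and the task values are made up:

```python
import re

def resolve_target_path(pattern: str, task: dict) -> str:
    """Replace every {name} placeholder with the same-named task value."""
    return re.sub(r"\{(\w+)\}", lambda m: str(task[m.group(1)]), pattern)

# Made-up task values, purely for illustration:
task = {"id": 42, "taskNr": "P-100", "taskName": "manual"}
print(resolve_target_path("/backup/task-{id}-{taskNr}-{taskName}.zip", task))
# → /backup/task-42-P-100-manual.zip
```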

Example: delete up to 5 tasks per run that are in workflow step "no workflow" or "workflowEnded", or implicitly have task status "end"

The tasks are additionally filtered for tasks whose modified timestamp is older than X days, where X is configured in runtimeOptions.taskLifetimeDays.

Code Block
Parameters JSON:
{"limit": 5, "workflowSteps": ["no workflow", "workflowEnded"]}

Same example with local backup and for clients with ID 123 or 321 only ("limit" and "workflowSteps" are optional):

Code Block
{
    "filesystem": "local",
    "targetPath": "/backup/task-{id}-{taskNr}-{taskName}.zip",
    "limit": 5,
    "clientIds": [123, 321],
    "workflowSteps": ["no workflow", "workflowEnded"]
}

Same example with SFTP backup

Code Block

{
    "filesystem": "sftp",
    "targetPath": "/home/USERNAME/backup/task-{id}-{taskNr}-{taskName}.zip",
    "host": "HOSTNAME_OR_IP",
    "port": 1234,
    "username": "USERNAME",
    "password": "PASSWORD"
}

Apart from the local filesystem and SFTP, WebDAV and several cloud services can be addressed if integrated. Get in contact with MittagQI for more information.
