The Distribution job distributes data from a server to selected agents. It is ideal, for example, for delivering software updates or important documents to remote workstations.
Mobile devices cannot participate in Distribution jobs.
This job scans the files in a selected source folder and syncs them to remote agents. It is reusable, so new files can be added to the source folder later. To re-use the job, manually launch it with the "Start" button. At this point the source agent rescans the folder and uploads the new files. Files already present are re-hashed and compared across the agents; matching files are not re-synced.
Here is how a Distribution job is created and configured through a step-by-step wizard:
1) Go to the JOBS tab -> Create new job, pick Distribution, and click Next.
2) Give the job a name and description. Both are optional, and the defaults can be used. Check the option to use SHA2 hashing if preferred. Note that with SHA2, agents running Connect version 2.0 or older will not be able to participate in this job. Click Next.
3) Pick the source agent - the agent that will upload data to the others. Only one agent can be picked.
4) Create a group of destination agents, add the desired agents to it, and assign a custom Profile and Schedule if necessary.
Note: instead of creating a new group, you can use an existing one, provided its agents do not produce Conflicts.
5) On the Path step, specify the source and destination share paths.
When using the default Path Macro, the Agent distributes the directory that the macro points to unless you specify a subfolder there.
Using Agent tags is also possible. See here for more details on using tags.
6) The Distribution job supports post commands, which make it possible to run a command once the transfer is complete. Triggers define the moment when the script is executed. This step can be skipped.
Before file-indexing begins: right after the job is created, the agent starts indexing files in the specified directory. A script triggered at this moment can "cook the files before serving", for example re-arrange them, add or remove files, and the like, so that the folder is indexed and distributed exactly the way you need.
After an agent completes downloading: the script runs on each destination agent after it finishes downloading. Other agents may still be downloading the files, so it is recommended not to remove or update the distributed files with this trigger.
After all agents complete downloading: as opposed to the trigger above, the script runs only after all destination agents have finished downloading all the files.
7) Job scheduler defines when the job will be launched:
Run now - right after creation;
Run at - at the preferred date/time (local agents' time);
Repeat manually - job won't start until manually launched with "Start" button;
Repeat hourly - the job will run every N hours, where N is an integer number of hours.
Repeat daily - the job will run every N days at the selected time, where N is an integer number of days.
Repeat weekly - the job will run on selected days of the week; additionally, you can set the exact time of day.
For all periodic schedules (hourly, daily, and weekly) it is possible to select the starting and ending points for the job.
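The interaction between a periodic interval and the start/end bounds can be sketched as follows. This is illustrative only; the function name and signature are assumptions, and the scheduler's actual logic is internal to the product.

```python
from datetime import datetime, timedelta
from typing import Optional

def next_run(last_run: datetime, every_n_hours: int,
             start: datetime, end: datetime) -> Optional[datetime]:
    # The job fires every N hours, but only inside the [start, end] window:
    # runs before the starting point are deferred to it, and runs past the
    # ending point are dropped entirely.
    candidate = max(last_run + timedelta(hours=every_n_hours), start)
    return candidate if candidate <= end else None
```

The same clamping idea applies to the daily and weekly schedules, just with day-sized steps.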
8) Review the job details and save.
Right after that, the source agent will index the specified source share and upload the data to the destination agents. The specified script/command will be executed at the selected trigger.