Monitoring
Drilling into a specific job from the Jobs page will take you to the job monitoring dashboard. Here you will find a list of job executions:
Each job execution represents an interval of data that the job is processing. You can track the latest job executions that have completed, as well as those that are still running.
A job execution can be in any of the following statuses:
The following KPIs can be tracked for the execution:
Principles
The data range (from data start time to data end time) represents the span of data that is being processed by the job execution:
The dates correspond to the $event_time, which reflects the increment date configured for your job. The range size (e.g., 1 minute, 15 minutes, etc.) is determined by the job interval: either COMMIT_INTERVAL when available, or RUN_INTERVAL.
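To make the interval logic concrete, here is a minimal sketch in Python of how a span of data is divided into fixed-size data ranges. The helper name split_into_ranges is hypothetical and only illustrates the concept; the platform computes ranges internally.

```python
from datetime import datetime, timedelta

def split_into_ranges(start: datetime, end: datetime, interval: timedelta):
    """Divide [start, end) into consecutive data ranges of one interval each.

    `interval` stands in for the job's COMMIT_INTERVAL (or RUN_INTERVAL
    when no commit interval applies). Helper name is illustrative only.
    """
    ranges = []
    cursor = start
    while cursor < end:
        range_end = min(cursor + interval, end)
        ranges.append((cursor, range_end))  # (data start time, data end time)
        cursor = range_end
    return ranges

# Example: a job with a 15-minute interval covering one hour of data
for data_start, data_end in split_into_ranges(
    datetime(2024, 1, 1, 0, 0),
    datetime(2024, 1, 1, 1, 0),
    timedelta(minutes=15),
):
    print(data_start, "->", data_end)  # four ranges, 15 minutes each
```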
Data ranges can be processed concurrently. Newer data ranges may begin processing alongside previous executions that are still running. This can happen in two scenarios:
Backlog of History: You created a job that starts processing data from the earliest available time. Catching up on this historical backlog runs parallel job executions, and the size of each execution is determined by the job's COMMIT_INTERVAL.
Job Backlog: The time it takes to process a single execution is longer than the time span between executions. For example, you scheduled a job to run every minute, but a particular minute takes longer to process, so newer minutes start processing in parallel (see the sketch after this list).
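As a back-of-the-envelope illustration of the job backlog scenario (not an exact platform formula; the function name is hypothetical), you can estimate how many executions end up in flight at once when processing lags the schedule:

```python
import math

def concurrent_executions(processing_seconds: float, run_interval_seconds: float) -> int:
    """Rough estimate of executions running at once when each execution
    takes longer than the interval between scheduled runs."""
    return math.ceil(processing_seconds / run_interval_seconds)

# A job scheduled every minute whose executions take ~3.5 minutes
# ends up with roughly 4 executions running in parallel.
print(concurrent_executions(210, 60))  # -> 4
```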
While executions can proceed concurrently, data commits to the target system are always sequential. Data ranges are loaded in order, ensuring that a newer data range is never loaded before an older one. Parallel processing applies only to the operations preceding the commit phase.
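This pattern can be modeled as workers processing ranges in parallel while a single committer applies results strictly in range order. The sketch below is a simplified Python illustration, not the platform's actual implementation; process and run_job are hypothetical names.

```python
from concurrent.futures import ThreadPoolExecutor

def process(data_range):
    # Stand-in for reading and transforming one data range; in practice
    # this may take longer than the gap between scheduled executions.
    start, end = data_range
    return f"results for {start}-{end}"

def run_job(data_ranges, max_parallel=4):
    committed = []
    with ThreadPoolExecutor(max_workers=max_parallel) as pool:
        # Executions for several ranges run concurrently...
        futures = [pool.submit(process, r) for r in data_ranges]
        # ...but results are committed strictly in range order: a newer
        # range waits until every older range has been committed.
        for data_range, future in zip(data_ranges, futures):
            committed.append((data_range, future.result()))
    return committed

# Ranges 2 and 3 may finish processing before range 1, yet range 1
# is always committed first.
print(run_job([(1, 2), (2, 3), (3, 4)]))
```

Iterating over the futures in submission order and blocking on each result is what enforces the sequential commit: faster newer ranges simply wait their turn.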