UP20010 Source Data Not Found

A new ingestion job could not find data at the specified location.

Possible Causes

For example, in the following job, no files were found in the orders folder of the store-data bucket:

CREATE SYNC JOB load_orders_raw_data_from_s3
    CONTENT_TYPE = JSON
AS COPY FROM S3 s3_connection 
    LOCATION = 's3://store-data/orders/' 
INTO default_glue_catalog.stores_data.orders_raw_data; 

Possible Solutions

  • Verify that the data exists at the location specified in the job.

  • Verify you configured the job to read from the correct source location (topic/stream/bucket and path/etc.).

  • If data is expected to arrive in the source at a later point in time, include the SKIP_ALL_VALIDATIONS or SKIP_VALIDATIONS job option in the job creation statement. This option instructs Upsolver to skip specific validations so that you can create a job that reads from a source that currently has no data. The validation parameter depends on the source:

    • Amazon S3 - the source has not been created yet: SKIP_VALIDATIONS = ('EMPTY_PATH')

    • Apache Kafka - the topic has not been created yet: SKIP_VALIDATIONS = ('MISSING_TOPIC')

    • Amazon Kinesis - the stream has not been created yet: SKIP_VALIDATIONS = ('MISSING_STREAM')

    • Any other source: SKIP_VALIDATIONS = ('MISSING_DATA')
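As a sketch, the S3 job from the example above could be created with the empty-path validation skipped (connection, location, and table names are carried over from that example; see the jobs reference for the exact option syntax):

```sql
CREATE SYNC JOB load_orders_raw_data_from_s3
    CONTENT_TYPE = JSON
    -- Skip the empty-path check so the job can be created
    -- before any files land under s3://store-data/orders/
    SKIP_VALIDATIONS = ('EMPTY_PATH')
AS COPY FROM S3 s3_connection
    LOCATION = 's3://store-data/orders/'
INTO default_glue_catalog.stores_data.orders_raw_data;
```

Once data starts arriving at the location, the job begins ingesting it without further changes.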


Visit the jobs reference to learn more about the SKIP_ALL_VALIDATIONS and SKIP_VALIDATIONS job options.
