Transformation
Transformation jobs copy your source data to your target and are written using familiar SQL code. These Quickstart guides will provide you with the essential skills to write a transformation job; beyond that, Upsolver also includes an extensive list of functions and operators that you can use to build advanced solutions for your requirements.
Before writing a transformation job, ensure you have a connection to your target and that your ingestion job is running.
Upsert Data to the Target Table
Perform inserts and updates in your target data using the INSERT and MERGE statements.
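To give a sense of the shape of such a job, here is a minimal sketch of a MERGE-based upsert. It assumes a staged source table and a target table in a Glue catalog; the connection, table, and column names (default_glue_catalog, orders_raw_data, orderid, and so on) are illustrative placeholders, and the full guide covers the exact options.

```sql
-- Sketch only: catalog, table, and column names are placeholders.
CREATE JOB upsert_orders
    START_FROM = BEGINNING
    RUN_INTERVAL = 1 MINUTE
AS MERGE INTO default_glue_catalog.analytics.orders AS target
USING (
    SELECT orderid,
           nettotal AS total,
           $event_time AS last_seen
    FROM default_glue_catalog.upsolver_samples.orders_raw_data
    WHERE $event_time BETWEEN run_start_time() AND run_end_time()
) AS source
ON target.orderid = source.orderid
WHEN MATCHED THEN REPLACE
WHEN NOT MATCHED THEN INSERT MAP_COLUMNS_BY_NAME;
```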
Delete Data from the Target Table
Find out how to use the MERGE statement to delete rows from the target that have been deleted in the source.
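As a rough illustration of the pattern, the sketch below adds a conditional DELETE clause to a MERGE job. It assumes the source rows carry an is_deleted flag; all names are placeholders.

```sql
-- Sketch only: assumes the source exposes an is_deleted flag; names are placeholders.
CREATE JOB sync_orders_with_deletes
    RUN_INTERVAL = 1 MINUTE
AS MERGE INTO default_glue_catalog.analytics.orders AS target
USING (
    SELECT orderid, nettotal AS total, is_deleted
    FROM default_glue_catalog.upsolver_samples.orders_raw_data
    WHERE $event_time BETWEEN run_start_time() AND run_end_time()
) AS source
ON target.orderid = source.orderid
WHEN MATCHED AND source.is_deleted THEN DELETE
WHEN MATCHED THEN REPLACE
WHEN NOT MATCHED THEN INSERT MAP_COLUMNS_BY_NAME;
```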
Aggregate and Output Data
Understand how to aggregate your data when transforming it in the staging zone.
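For orientation, an aggregating job is an ordinary INSERT job with a GROUP BY over the staged data, along the lines of the sketch below; table and column names are placeholders.

```sql
-- Sketch only: table and column names are placeholders.
CREATE JOB aggregate_orders
    START_FROM = BEGINNING
    RUN_INTERVAL = 1 MINUTE
AS INSERT INTO default_glue_catalog.analytics.orders_by_customer MAP_COLUMNS_BY_NAME
SELECT customer_id,
       COUNT(*)      AS order_count,
       SUM(nettotal) AS total_revenue
FROM default_glue_catalog.upsolver_samples.orders_raw_data
WHERE $event_time BETWEEN run_start_time() AND run_end_time()
GROUP BY customer_id;
```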
Join Two Data Streams
Learn how to join multiple data streams into one table using a MATERIALIZED VIEW.
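As a sketch of the idea, the materialized view below keeps the latest values per key from one stream, and a transformation job then joins the second stream against it. The names and the LAST() aggregation are illustrative, so check the guide for the exact syntax your targets support.

```sql
-- Sketch only: names are placeholders.
-- Materialized view holding the latest customer details per key.
CREATE MATERIALIZED VIEW default_glue_catalog.analytics.customer_details AS
SELECT customer_id,
       LAST(email)   AS email,
       LAST(country) AS country
FROM default_glue_catalog.upsolver_samples.customers_raw_data
GROUP BY customer_id;

-- Job that enriches the orders stream by joining against the view.
CREATE JOB join_orders_with_customers
    RUN_INTERVAL = 1 MINUTE
AS INSERT INTO default_glue_catalog.analytics.enriched_orders MAP_COLUMNS_BY_NAME
SELECT o.orderid,
       o.nettotal AS total,
       c.email,
       c.country
FROM default_glue_catalog.upsolver_samples.orders_raw_data AS o
LEFT JOIN default_glue_catalog.analytics.customer_details AS c
       ON c.customer_id = o.customer_id
WHERE $event_time BETWEEN run_start_time() AND run_end_time();
```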
Output to Amazon Athena
Learn how to create a job that writes to Amazon Athena.
Output to Amazon Redshift
Find out how to write a transformation job that writes to Amazon Redshift.
Output to Amazon S3
Discover how you can create a job that copies your data to Amazon S3.
Output to Elasticsearch
Learn how to write a transformation job to copy your data to Elasticsearch.
Output to Snowflake
Write a job that copies your data to a Snowflake table.
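The output guides above all follow the same basic job shape: an INSERT (or MERGE) job that reads from the staging table and writes to a table reference built from the relevant target connection. The sketch below is a rough, Athena-flavored example; the names are placeholders, and each guide covers the connection-specific details for its target.

```sql
-- Sketch only: names are placeholders; swap the target reference for your
-- Redshift, S3, Elasticsearch, or Snowflake connection as described in each guide.
CREATE JOB load_orders_to_target
    START_FROM = BEGINNING
    RUN_INTERVAL = 1 MINUTE
AS INSERT INTO default_glue_catalog.analytics.orders_transformed MAP_COLUMNS_BY_NAME
SELECT orderid,
       nettotal AS total,
       $event_time AS loaded_at
FROM default_glue_catalog.upsolver_samples.orders_raw_data
WHERE $event_time BETWEEN run_start_time() AND run_end_time();
```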