First, you’ll use the following COPY statements to load new data from Amazon S3 into the tables in your Amazon Redshift TICKIT database in the target cluster.
copy venue from 's3://<region-specific-bucket-name>/resize/etl_venue_pipe.txt'
CREDENTIALS 'aws_access_key_id=<Your-Access-Key-ID>;aws_secret_access_key=<Your-Secret-Access-Key>'
delimiter '|';

copy category from 's3://<region-specific-bucket-name>/resize/etl_category_pipe.txt'
CREDENTIALS 'aws_access_key_id=<Your-Access-Key-ID>;aws_secret_access_key=<Your-Secret-Access-Key>'
delimiter '|';

copy date from 's3://<region-specific-bucket-name>/resize/etl_date_pipe.txt'
CREDENTIALS 'aws_access_key_id=<Your-Access-Key-ID>;aws_secret_access_key=<Your-Secret-Access-Key>'
delimiter '|';

copy event from 's3://<region-specific-bucket-name>/resize/etl_events_pipe.txt'
CREDENTIALS 'aws_access_key_id=<Your-Access-Key-ID>;aws_secret_access_key=<Your-Secret-Access-Key>'
delimiter '|' timeformat 'YYYY-MM-DD HH:MI:SS';
You must replace <Your-Access-Key-ID> and <Your-Secret-Access-Key> with your own credentials, and <region-specific-bucket-name> with the name of a bucket in the same region as your cluster. We recommend that you use temporary credentials to perform this COPY operation. For more information about using temporary credentials, see Temporary security credentials in the Amazon Redshift Database Developer Guide; a sketch of a COPY statement that uses temporary credentials follows the table below. Use the following table to find the correct bucket name for your region.
<region-specific-bucket-name>    Region
awssampledb                      US East (N. Virginia)
awssampledbuswest2               US West (Oregon)
awssampledbeucentral1            EU (Frankfurt)
awssampledbeuwest1               EU (Ireland)
awssampledbapsoutheast1          Asia Pacific (Singapore)
awssampledbapsoutheast2          Asia Pacific (Sydney)
awssampledbapnortheast1          Asia Pacific (Tokyo)
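When you use temporary credentials, as recommended above, the CREDENTIALS string also carries the session token. The following is a sketch only; the placeholder values are illustrative, and you should confirm the exact format in the Temporary security credentials topic referenced above.

copy venue from 's3://<region-specific-bucket-name>/resize/etl_venue_pipe.txt'
CREDENTIALS 'aws_access_key_id=<temporary-access-key-id>;aws_secret_access_key=<temporary-secret-access-key>;token=<temporary-token>'
delimiter '|';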
Note
In this exercise, you load sample data from existing Amazon S3 buckets that are owned by Amazon Redshift. The bucket permissions are configured to allow everyone read access to the sample data files. If you want to load your own data, you must have your own Amazon S3 bucket. For information about creating a bucket and uploading data, go to Creating a Bucket and Uploading Objects into Amazon S3 in the Amazon Simple Storage Service Console User Guide.
Step 6: Rename the Source and Target Clusters
After you verify that your target cluster is up to date with any data needed from the ETL process, you can switch to the target cluster. If you are able to update your data sources to use the new target cluster, you can skip this section. However, if the target cluster must keep the same name as the source cluster, you’ll need to perform a few manual steps to make the switch. These steps involve renaming the source and target clusters, and both clusters will be unavailable for a short time while the renames are in progress.
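If you prefer to script the switch, the same renames can be performed with the AWS CLI modify-cluster command instead of the console. The following is a sketch with placeholder cluster identifiers; each rename makes the affected cluster unavailable for a short time while its identifier and endpoint change.

# Rename the source cluster out of the way.
aws redshift modify-cluster --cluster-identifier examplecluster \
    --new-cluster-identifier examplecluster-source

# Give the target cluster the source cluster's original name.
aws redshift modify-cluster --cluster-identifier examplecluster-target \
    --new-cluster-identifier examplecluster

To make the switch in the console instead, use the following steps.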
1. Open the Amazon Redshift console.