Cloud Composer: Copying BigQuery Tables Across Different Locations

Difficulty Level
Intermediate
Prerequisites
none
Lab Duration
60 Min
About Lab
Overview
In this lab, you will learn how to create and run an Apache Airflow workflow in Cloud Composer that completes the following tasks:
- Reads the list of tables to copy from a config file
- Exports the listed tables from a BigQuery dataset located in the US to Cloud Storage
- Copies the exported tables from US to EU Cloud Storage buckets
- Imports the tables into the target BigQuery dataset in the EU
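The steps above can be sketched in plain Python before wiring them into Airflow operators: the workflow first parses the config file listing the tables, then derives, for each table, the source table ID, the intermediate Cloud Storage URIs, and the EU destination table. The config format, dataset names, and bucket names below are illustrative assumptions, not the lab's actual values.

```python
import json

# Hypothetical config: a JSON list of table names to copy.
# The actual lab config file and its format may differ.
CONFIG = '["orders", "customers", "line_items"]'

# Assumed names, for illustration only.
SOURCE_DATASET = "my_us_dataset"   # BigQuery dataset in the US
TARGET_DATASET = "my_eu_dataset"   # BigQuery dataset in the EU
US_BUCKET = "my-us-bucket"         # Cloud Storage bucket in the US
EU_BUCKET = "my-eu-bucket"         # Cloud Storage bucket in the EU


def build_copy_plan(config_text: str) -> list:
    """For each table named in the config, derive the identifiers
    each workflow step needs: the export source, the intermediate
    GCS URIs, and the final EU destination table."""
    tables = json.loads(config_text)
    plan = []
    for table in tables:
        plan.append({
            "source_table": f"{SOURCE_DATASET}.{table}",
            "export_uri": f"gs://{US_BUCKET}/{table}.avro",
            "copy_uri": f"gs://{EU_BUCKET}/{table}.avro",
            "target_table": f"{TARGET_DATASET}.{table}",
        })
    return plan


if __name__ == "__main__":
    for step in build_copy_plan(CONFIG):
        print(step["source_table"], "->", step["target_table"])
```

In the actual DAG, each entry of such a plan would typically drive one export, one bucket-to-bucket copy, and one load task per table.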
What you will learn
You will learn how to create a Cloud Composer workflow.
Prerequisites
- You should know how to start and end labs on the QwikSkills platform, and how to log in to the GCP Console.
- You should be familiar with the GCP Console UI, including opening services from the navigation menu.
- If this is your first lab, you can learn the above prerequisites from this lab.