
I'd like to import BigQuery data into Bigtable using Google Cloud Composer.

Exporting BigQuery rows in Avro format to GCS succeeded. However, importing the Avro data into Bigtable did not.

The error says:

```
Caused by: org.apache.avro.AvroTypeException: Found Root, expecting com.google.cloud.teleport.bigtable.BigtableRow, missing required field key
```

I guess the schemas on the BigQuery and Bigtable sides need to match each other, but I have no idea how to do this.

Eric Lee
  • Does this answer your question? [how to export bigquery to bigtable using airflow? schema issue](https://stackoverflow.com/questions/68663873/how-to-export-bigquery-to-bigtable-using-airflow-schema-issue) – Krish Aug 09 '21 at 13:29

1 Answer


For every record read from the Avro files:

  • Attributes present in the files and in the table are loaded into the table.
  • Attributes present in the file but not in the table are subject to ignore_unknown_fields.
  • Attributes that exist in the table but not in the file will use their default value, if there is one set.
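
The error above means the Avro files exported by BigQuery use BigQuery's own record schema (named `Root`), while the Dataflow template expects records in a `BigtableRow` schema. For reference, that schema looks roughly like the following (a sketch based on the `bigtable.avsc` file in the DataflowTemplates repository; field details may vary by template version):

```json
{
  "type": "record",
  "name": "BigtableRow",
  "namespace": "com.google.cloud.teleport.bigtable",
  "fields": [
    {"name": "key", "type": "bytes"},
    {"name": "cells", "type": {
      "type": "array",
      "items": {
        "type": "record",
        "name": "BigtableCell",
        "fields": [
          {"name": "family", "type": "string"},
          {"name": "qualifier", "type": "bytes"},
          {"name": "timestamp", "type": "long"},
          {"name": "value", "type": "bytes"}
        ]
      }
    }}
  ]
}
```

So a plain BigQuery Avro export cannot be fed to the template directly; each record has to be reshaped into a row key plus a list of cells first.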

The links below are helpful.

[1] https://cloud.google.com/dataflow/docs/guides/templates/provided-batch#cloud-storage-avro-to-bigtable

[2] https://github.com/GoogleCloudPlatform/DataflowTemplates/blob/master/src/main/resources/schema/avro/bigtable.avsc

[3] Avro to BigTable - Schema issue?
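
To make the schemas match, each flat BigQuery record must be reshaped into a row key plus a list of cells before it is written to Avro. Here is a minimal Python sketch of that reshaping step (the function name, the `key_field` and `family` parameters, and the choice to UTF-8-encode every value are my own assumptions, not part of the template):

```python
import time

def bq_row_to_bigtable_row(bq_record, key_field, family="cf"):
    """Reshape one flat BigQuery record (a dict) into the BigtableRow layout.

    key_field: which BigQuery column becomes the Bigtable row key (assumption).
    family:    column family to place every other column under (assumption).
    """
    # Bigtable cell timestamps are expressed in microseconds.
    ts = int(time.time() * 1_000_000)
    cells = [
        {
            "family": family,
            "qualifier": name.encode("utf-8"),   # column name -> qualifier bytes
            "timestamp": ts,
            "value": str(value).encode("utf-8"), # simplistic: stringify values
        }
        for name, value in bq_record.items()
        if name != key_field and value is not None
    ]
    return {"key": str(bq_record[key_field]).encode("utf-8"), "cells": cells}

# Example: one BigQuery record becomes a keyed row with one cell.
row = bq_row_to_bigtable_row({"id": 42, "name": "alice"}, key_field="id")
# row["key"] is b"42"; row["cells"][0]["qualifier"] is b"name"
```

The resulting dicts could then be written with an Avro library against the `BigtableRow` schema from link [2] and imported with the template.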

Lakshmi