Angelo Cesaro

Apache Kafka and BigQuery connector with batch loads

Updated: Jan 8

BigQuery has two ways to ingest data:

1. Streaming inserts, which write rows directly to BigQuery.

2. Batch load jobs, which read data from a GCS bucket and load it into BigQuery in bulk.


The main difference between the two approaches is cost. Streaming data directly to BigQuery is billed by the volume of data streamed and is subject to a quota of 100,000 rows per second per table. Conversely, BigQuery batch loads themselves are free; you pay only for the data staged in the GCS bucket.

The Kafka BigQuery sink connector uses the streaming API by default, but a beta feature (enableBatchLoad) enables the other route: staging records in GCS and loading them into BigQuery with batch load jobs.
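As a sketch, a connector configuration enabling batch loading might look like the following. The project, dataset, bucket, topic, and keyfile values are placeholders, and exact property names can vary between connector versions, so check the configuration reference for the version you run:

```
{
  "name": "bigquery-batch-sink",
  "config": {
    "connector.class": "com.wepay.kafka.connect.bigquery.BigQuerySinkConnector",
    "topics": "events",
    "project": "my-gcp-project",
    "defaultDataset": "my_dataset",
    "keyfile": "/path/to/service-account.json",

    "enableBatchLoad": "events",
    "gcsBucketName": "my-staging-bucket",
    "autoCreateBucket": "true",
    "batchLoadIntervalSec": "120"
  }
}
```

With a configuration along these lines, records from the listed topics are written to the GCS bucket and periodically loaded into BigQuery by batch load jobs instead of being streamed row by row.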



Documentation:

- Google BigQuery Quotas and Limits

- Google BigQuery Sink Connector for Confluent


Get in touch to find out more!

We'll be happy to discuss your needs

Send us an email at contact@cesaro.io
