Data Export Quickstart
Learn how to export data from ZettaBlock for offline usage, including analysis, backup, or integration into external systems.
Currently, you can export data in the following ways:
- via a CSV file download
- via the Analytics API (when CSV download is too manual)
- via the GraphQL API (to surface data in your own dApp)
- via an ETL pipeline
- via webhooks (coming soon)
- via Kafka Streams (coming soon)
- by connecting to the Data Lake from a Jupyter Notebook
| Option | Limit | Summary | Common Use Case |
| --- | --- | --- | --- |
| CSV Download | 5 MB per query | Click the download button after executing a SQL query. | Download SQL query results as a CSV file within the UI. |
| Analytics API | 1 concurrent call (Free and Build); 10+ concurrent calls (Enterprise); 10 MB download/day (Free); 1 GB download/day (Build); custom (Enterprise) | 1. Create from any SQL input. 2. Access through GraphQL. 3. Export data in any format. | Get data from ad-hoc queries: exploratory use cases, or tables that don't update often (usually metadata). |
| Transactional/GraphQL API | 1 call/sec (Free); 5 calls/sec (Build); custom, e.g., 1k calls/sec (Enterprise) | 1. Serve as a GraphQL API. 2. Build your own API with the desired transformation logic. | Serve data directly to dApps/apps that need low-latency (~10 ms) responses. |
| ETL pipeline* | None | 1. ZettaBlock-managed ELT pipelines (integrity and uptime guaranteed). 2. Self-serve ELT connectors (users provide DB credentials). 3. Export to BigQuery, Snowflake, S3, Databricks, MongoDB, Postgres, and other popular databases and warehouses. | The default choice for analytics use cases, e.g., ingesting new Polygon transactions every 24 hours. |
*To discuss the option of exporting data via ETL Pipeline, contact our team directly.
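Both API options in the table are accessed over GraphQL. The sketch below shows how a GraphQL request body and headers might be assembled in Python; the endpoint URL, the `X-API-KEY` header name, and the `records` query are placeholders, not ZettaBlock's actual schema — substitute the values shown for your API in the ZettaBlock dashboard.

```python
import json

# Hypothetical endpoint and API key -- replace with the values from your
# ZettaBlock dashboard. The URL and auth header name are assumptions.
API_URL = "https://api.zettablock.com/api/v1/graphql"
API_KEY = "YOUR_API_KEY"

def build_graphql_request(query, variables=None):
    """Assemble the URL, headers, and JSON body for a GraphQL POST request."""
    return {
        "url": API_URL,
        "headers": {
            "Content-Type": "application/json",
            "X-API-KEY": API_KEY,  # header name is an assumption
        },
        "body": json.dumps({"query": query, "variables": variables or {}}),
    }

# Example: fetch a page of rows from a hypothetical `records` query.
request = build_graphql_request(
    """
    query Records($limit: Int!) {
      records(limit: $limit) { rows }
    }
    """,
    {"limit": 100},
)

# To send it, POST request["body"] to request["url"] with request["headers"],
# e.g. via urllib.request or the `requests` library.
```

Keeping request construction separate from transport makes it easy to respect the per-plan rate limits above, for example by queueing calls on the Free tier's 1 call/sec budget.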
CSV Download
To download a CSV file of your query results, navigate to this panel:
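Once downloaded, the CSV (capped at 5 MB per query) can be processed with standard tooling. A minimal sketch using Python's standard library, with made-up sample columns standing in for real query output:

```python
import csv
import io

# Stand-in for the contents of a downloaded results file; the column names
# here are illustrative sample data, not a fixed ZettaBlock schema.
sample_csv = """block_number,tx_hash,value
18000000,0xabc,1.5
18000001,0xdef,2.0
"""

# In practice you would use: open("results.csv", newline="")
rows = list(csv.DictReader(io.StringIO(sample_csv)))

print(len(rows))           # 2
print(rows[0]["tx_hash"])  # 0xabc
```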