Exploring cryptocurrency prices with QuestDB and Google Data Studio

Various crypto market research websites such as Glassnode, Messari, and TradingView provide free and easy ways to analyze market trends. However, not all the advanced charting functions are available for free, with some of the best features locked behind a paywall. If you want to explore these datasets using free, open-source alternatives, you can get started with less than five minutes of setup.
Prerequisites

- Google Cloud account
- gcloud SDK
- Helm 3
- Kaggle account
- Optional: Coinbase API access
Building a dataset of cryptocurrency prices

In this tutorial, we will use the precompiled dataset of cryptocurrency prices from Kaggle, but if you would like to build your own dataset, see our Realtime crypto tracker tutorial that uses Kafka and QuestDB.
The Cryptocurrency Historical Prices dataset on Kaggle provides a compiled list of cryptocurrency prices pulled from CoinMarketCap. Log in or create a free Kaggle account to download the various CSV files of coin prices over time. Since the overarching narrative in 2021 was the emergence of layer 1 protocols, I will look at the Ethereum, Cardano, and Solana datasets.
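If you prefer the command line, the dataset can also be fetched with the Kaggle CLI. The commands below are only a sketch: `<owner>/<dataset-slug>` is a placeholder for the identifier shown on the dataset's Kaggle page, and downloading the files from the browser works just as well.

```bash
# Hypothetical download via the Kaggle CLI (requires API credentials in ~/.kaggle/kaggle.json).
# Replace <owner>/<dataset-slug> with the identifier from the dataset's Kaggle page.
kaggle datasets download -d <owner>/<dataset-slug>

# Kaggle delivers a zip archive named after the slug; unpack the CSV files.
unzip -o <dataset-slug>.zip -d crypto-prices
```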
Google Data Studio setup

Although Google Data Studio can connect to any publicly-accessible PostgreSQL endpoint, I'll be using Google Cloud since I've written about running QuestDB on GKE Autopilot before. In your Google Cloud account:
- Navigate to Google Kubernetes Engine
- Enable the Kubernetes Engine API
- Click on Create Cluster (you’ll need to have a billing account added)
For this demo, I decided to use the Standard mode with one node in the us-east1-c zone, accepting the other default settings:
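If you prefer the CLI over the Cloud Console, a roughly equivalent cluster can be created with gcloud. This is just a sketch: the cluster name is an arbitrary placeholder and the other settings mirror the defaults accepted above.

```bash
# Sketch of the equivalent gcloud command (the steps above use the Cloud Console).
# "questdb-cluster" is an arbitrary name; adjust the zone and node count as needed.
gcloud container clusters create questdb-cluster \
  --zone us-east1-c \
  --num-nodes 1
```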

Once the cluster is created, fetch the cluster credentials via the gcloud CLI to authenticate:
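The exact command depends on the cluster name and zone you chose; with a placeholder cluster name it looks like this:

```bash
# Fetch credentials so kubectl and helm can talk to the new cluster.
# Substitute your own cluster name and zone.
gcloud container clusters get-credentials questdb-cluster --zone us-east1-c
```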
Running QuestDB on GKE using Helm

QuestDB provides an official Helm chart that deploys a single-pod StatefulSet with a 50Gi persistent volume claim (PVC) by default. To make the PostgreSQL wire endpoint accessible to Google Data Studio, we need to change the service type and set the PostgreSQL expose field to true. Create a new values.yaml file with the following contents:
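The snippet below writes the file from the shell; the key layout is assumed from the QuestDB Helm chart at the time of writing, so verify it against `helm show values questdb/questdb` for your chart version.

```bash
# Write values.yaml (key names assumed from the QuestDB Helm chart; verify with
# `helm show values questdb/questdb` if the chart schema has changed).
cat <<'EOF' > values.yaml
service:
  # A LoadBalancer makes the endpoints reachable from outside the cluster
  type: LoadBalancer
  expose:
    postgresql:
      # Expose the PostgreSQL wire protocol so Google Data Studio can connect
      enabled: true
      port: 8812
EOF
```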
Then deploy to our GKE cluster:
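Assuming the official QuestDB Helm repository is used (the release name questdb is arbitrary):

```bash
# Add the official QuestDB chart repository and install with our overrides.
helm repo add questdb https://helm.questdb.io/
helm repo update
helm install questdb questdb/questdb -f values.yaml
```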
Wait for the pod to become healthy and provision a load balancer:
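Both can be watched from the command line, for example:

```bash
# Watch until the QuestDB pod reports Running/Ready...
kubectl get pods -w

# ...and until the service shows an EXTERNAL-IP (this can take a minute or two).
kubectl get svc -w
```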

Under Kubernetes Engine > Services & Ingress, check the IP address of the QuestDB service. There is a link you can follow to port 9000, which will bring you to the QuestDB web UI. To insert data, click the Upload icon on the left-hand panel and import the CSV data:
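If you would rather script the upload, QuestDB also accepts CSVs over its HTTP import endpoint. In the sketch below, EXTERNAL_IP stands for the load balancer address, and the file and table names are illustrative:

```bash
# Alternative to the web UI: POST a CSV to QuestDB's /imp endpoint.
# EXTERNAL_IP is the service's external IP; file and table names are examples.
curl -F data=@coin_Ethereum.csv "http://EXTERNAL_IP:9000/imp?name=ethereum"
```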

Working with Google Data Studio

Navigate to the Google Data Studio dashboard and click on Create > Report. This automatically brings up an Add data to report tab where we can select PostgreSQL. This is where you fill in the IP address of the external load balancer provisioned by GKE, as well as the default port and credentials:
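A quick way to confirm the endpoint is reachable before filling out the form is to connect with psql. Unless you have overridden them, QuestDB's PostgreSQL defaults are port 8812, user admin, password quest, and database qdb:

```bash
# Optional connectivity check over the PostgreSQL wire protocol.
# EXTERNAL_IP is the load balancer address; 8812/admin/qdb are QuestDB defaults.
psql -h EXTERNAL_IP -p 8812 -U admin -d qdb
```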

Google will treat each table as a data source, so you'll have to import each table separately. By default, Data Studio will create a default table with Symbol and Record Count:

You can modify the Dimension to add the other fields such as Marketcap, High, Low, etc.:

The default Date format that Google Data Studio uses is not compatible with QuestDB, as it relies on the PostgreSQL DATE_TRUNC function, which QuestDB does not yet support. Click on Resource > Manage added data sources and change the Date format to ISO Year Week. This way, it's possible to create a coarse timeseries chart of Marketcap over time:

You can also add multiple dimensions to compare how the market opened against the highest and lowest prices of the day:

Data Studio's graphing capabilities are not especially advanced, so candlesticks and other common financial charts are not supported, but you can import multiple data sources to compare, for example, the price of Cardano vs. Solana over the same time period:

Next steps

Right now QuestDB does not natively support SSL connections. To enable SSL, we can place PgBouncer in front of QuestDB: Google Data Studio connects to PgBouncer over SSL, and a service mesh inside Kubernetes secures the hop from PgBouncer to QuestDB. You can also put QuestDB behind a firewall and only whitelist the IPs that Data Studio uses to access the database.
Google Data Studio may also fall short for serious data analysis needs beyond simple visualizations and auto-aggregations. For more complicated analyses, we will connect QuestDB directly to a Jupyter Notebook to explore our data in more detail using Pandas.