This post comes from Gábor Boros, who has written an excellent tutorial that shows how to build dashboards using Plotly and QuestDB that track and chart stock prices in real-time. Thanks for the submission, Gábor!
If you're working with large amounts of data, efficiently storing raw information will be your first obstacle. The next challenge is to make sense of the data utilizing analytics. One of the fastest ways to convey the state of data is through charts and graphs.
In this tutorial, we will create a real-time streaming dashboard using QuestDB, Celery, Redis, Plotly, and Dash. It will be a fun project with excellent charts to quickly understand the state of a system with beautiful data visualizations.
Plotly defines itself as "the front end for ML and data science models", which describes it really well. Plotly has an "app framework" called Dash which we can use to create web applications quickly and efficiently. Dash abstracts away the boilerplate needed to set up a web server and several handlers for it.
The project will be built from two main components:
- a backend that periodically fetches user-defined stock data from Finnhub, and
- a front-end that utilizes Plotly and Dash to visualize the gathered data on interactive charts
For this tutorial, you will need some experience in Python and basic SQL knowledge. We will use Celery backed by Redis as the message broker and QuestDB as storage to periodically fetch data.
Let's see the prerequisites and jump right in!
The source code for this tutorial is available at the corresponding GitHub repository.
First of all, we are going to create empty directories for our project root and the Python module:
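The commands to do this are not shown in this excerpt; a minimal sketch could look like the following (the project directory name `stock-market-dashboard` is an assumption, pick any name you like; the `app` module name is used throughout the tutorial):

```shell
# Create the project root and the "app" Python module inside it
mkdir -p stock-market-dashboard/app
# An empty __init__.py marks "app" as an importable Python package
touch stock-market-dashboard/app/__init__.py
```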
To install the services required for our project, we will use Docker and Docker
Compose to avoid polluting our host machine. Within the project root, let's
create a file called `docker-compose.yml`. This file describes all the
services the project needs; later on, we will extend it with other
services too.
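The original compose file is not shown in this excerpt; a minimal sketch that starts the two services might look like this (the image tags are assumptions, while 9000, 8812, and 6379 are the default ports of the respective services):

```yaml
version: "3"

services:
  redis:
    image: "redis:latest"
    ports:
      - "6379:6379"

  questdb:
    image: "questdb/questdb:latest"
    ports:
      - "9000:9000"   # web console / REST API
      - "8812:8812"   # PostgreSQL wire protocol
```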
Here we go! When you run
`docker-compose up`, QuestDB and Redis will fire up.
After starting the services, we can access QuestDB's interactive console in the browser (it listens on port 9000 by default).
We could create the database table later, but we will take this opportunity and create the table now since we have already started QuestDB. Connect to QuestDB's interactive console, and run the following SQL statement:
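The exact statement is not shown in this excerpt; a plausible sketch creates a `quotes` table whose columns mirror the fields of a Finnhub quote (the table and column names are assumptions, not necessarily the original schema):

```sql
CREATE TABLE quotes (
    stock_symbol SYMBOL,
    current_price DOUBLE,
    high_price DOUBLE,
    low_price DOUBLE,
    open_price DOUBLE,
    percent_change DOUBLE,
    ts TIMESTAMP
) timestamp(ts);
```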
After executing the command, a success message appears in the bottom left corner confirming the table creation, and the new table shows up in the table list on the right-hand side.
Voilà! The table is ready for use.
As mentioned, our project will have two parts. For now, let's focus on the
routine jobs that will fetch the data from Finnhub. As with every
standard Python project, we use a
`requirements.txt` file to define the
dependencies the project will use. Place the
`requirements.txt` in your project
root with the content below:
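The original file is not shown here; an unpinned sketch matching the dependencies mentioned throughout the tutorial might look like this (in production you would pin exact versions):

```text
# requirements for fetching the data
celery[redis]
finnhub-python
psycopg2-binary
pydantic

# requirements for visualizing the data
dash
pandas
plotly
```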
We can split the requirements into two logical groups:
- requirements for fetching the data, and
- requirements needed to visualize this data
For the sake of simplicity, we did not create two separate requirements files, though in a production environment we would.
In order to let the application communicate with QuestDB utilizing the
PostgreSQL client library, we need to install the
`libpq-dev` package on our system. To install
it, use your package manager; on Windows, you may need to install PostgreSQL on
your machine to get the client library.
Then, create a virtualenv and install the dependencies:
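The exact commands are not shown in this excerpt; a typical sketch looks like this (the virtualenv directory name `.venv` is an assumption):

```shell
# Create and activate an isolated Python environment
python3 -m venv .venv
source .venv/bin/activate
# Install the project dependencies into it
pip install -r requirements.txt
```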
Since the periodic tasks will need to store the fetched quotes, we need to
connect to QuestDB. Therefore, we create a new file in the
`app` package, called
`db.py`. This file contains the PostgreSQL connection pool that will serve as the
base for our connections.
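The original `db.py` is not shown in this excerpt; a minimal sketch might look like the following. It assumes a `settings.database_url` value in `app/settings.py` (described below), and the `connection` helper name is an assumption:

```python
# app/db.py -- a sketch; helper names and settings access are assumptions
from contextlib import contextmanager

from psycopg2.pool import ThreadedConnectionPool

from app.settings import settings

# QuestDB speaks the PostgreSQL wire protocol, so a regular
# psycopg2 connection pool works against it (default port 8812)
pool = ThreadedConnectionPool(minconn=1, maxconn=10, dsn=settings.database_url)


@contextmanager
def connection():
    """Borrow a connection from the pool and return it when done."""
    conn = pool.getconn()
    try:
        yield conn
        conn.commit()
    finally:
        pool.putconn(conn)
```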
Before we jump right into the implementation, we must configure Celery. To
create a configuration used by both the workers and the dashboard, create a
`settings.py` file in the
`app` package. We will use pydantic
to define the configuration. This helps us read the settings from a
file or from environment variables, and prefix them if needed.
To ensure that we do not overwrite any other environment variables, we will set
the prefix to
`SMD`, which stands for "stock market dashboard", our application.
Below you can see the settings file:
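The original settings file is not shown here; a sketch using pydantic's v1-style `BaseSettings` might look like the following. The field names (`celery_broker`, `api_key`, `symbols`, `frequency`) are assumptions; the `database_url` default uses QuestDB's out-of-the-box credentials and port:

```python
# app/settings.py -- a sketch; field names are assumptions
from typing import List

from pydantic import BaseSettings


class Settings(BaseSettings):
    # Connection URL for the Redis broker used by Celery
    celery_broker: str = "redis://127.0.0.1:6379/0"
    # QuestDB's default user, password, port, and database name
    database_url: str = "postgresql://admin:quest@127.0.0.1:8812/qdb"
    # Finnhub API key, set through the environment or a .env file
    api_key: str = ""
    # Stock symbols we want to follow
    symbols: List[str] = ["AAPL", "MSFT"]
    # How often to fetch new quotes, in seconds
    frequency: int = 5

    class Config:
        # Read variables like SMD_API_KEY from the environment or a .env file
        env_prefix = "SMD_"
        env_file = ".env"


settings = Settings()
```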
In the settings, you may notice that we have already defined the
`database_url` setting with unusual default values.
Some bits are missing at the moment. We still have to define the correct
settings and run the worker in a Docker container. To keep our environment
separated, we will use a
`.env` file. One of the most significant advantages of
pydantic settings is that it can read environment variables from
`.env` files.
Let's create a
`.env` file in the project root; afterwards,
your project structure should look like this:
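The original listing is not shown in this excerpt; based on the files created so far, the structure should be roughly:

```text
.
├── app
│   ├── __init__.py
│   ├── db.py
│   └── settings.py
├── .env
├── docker-compose.yml
└── requirements.txt
```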
Add the following content to the `.env` file:
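The original contents are not shown here; a sketch using the `SMD` prefix described above might look like this (the variable names after the prefix are assumptions, and the API key is a placeholder for your own):

```text
# Finnhub sandbox API key (placeholder, use your own)
SMD_API_KEY=your-sandbox-api-key
# Stock symbols to follow, as a JSON list
SMD_SYMBOLS=["AAPL","MSFT","TSLA"]
```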
As you may have guessed, you will need to get your API key for the sandbox environment at this step. To retrieve the key, simply sign up to Finnhub, and your API key will appear on the dashboard after you log in.
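The worker module itself (`app/worker.py`) is not shown in this excerpt. The top of the file, matching the walkthrough below, might be sketched as follows (the `connection` helper and settings field names are assumptions):

```python
# app/worker.py (top) -- a sketch of the imports and client setup
import finnhub
from celery import Celery

from app.db import connection  # assumed helper from app/db.py
from app.settings import settings

# Finnhub client used to fetch quotes
client = finnhub.Client(api_key=settings.api_key)

# Celery application backed by the Redis broker
celery_app = Celery("worker", broker=settings.celery_broker)
```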
Going through the code above:

- In the first few lines, we import the requirements that are needed to fetch and store the data.
- After importing the requirements, we configure the Finnhub client and Celery to use the Redis broker we defined in the application settings.
- To fetch the data periodically per stock symbol, we programmatically create a periodic task for every symbol we defined in the settings.
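The registration snippet is not shown in this excerpt; using Celery's standard signal for this, a sketch could look like the following (`celery_app`, `settings.symbols`, `settings.frequency`, and the `fetch` task are assumptions based on the surrounding description):

```python
@celery_app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # Register one periodic "fetch" task per configured stock symbol
    for symbol in settings.symbols:
        sender.add_periodic_task(settings.frequency, fetch.s(symbol))
```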
The snippet above will register a new periodic task per stock symbol after Celery is connected to the broker.
The last step is to define the
fetch task that does the majority of the work.
Using the Finnhub
client, we get a quote for the given symbol. After the quote
is retrieved successfully, we prepare a SQL query to insert the quote into the
database. At the end of the function, as the last step, we open a connection to
QuestDB and insert the new quote.
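The task itself is not shown in this excerpt; a sketch following the description above could look like this (the table and column names are assumptions, `quote["c"]`, `quote["h"]`, `quote["l"]`, `quote["o"]`, and `quote["pc"]` are Finnhub's current, high, low, open, and previous-close fields, and `systimestamp()` is QuestDB's current-timestamp function):

```python
QUOTES_INSERT = """
INSERT INTO quotes (stock_symbol, current_price, high_price, low_price,
                    open_price, percent_change, ts)
VALUES (%s, %s, %s, %s, %s, %s, systimestamp());
"""


@celery_app.task
def fetch(symbol: str):
    """Fetch the latest quote for the symbol and store it in QuestDB."""
    quote = client.quote(symbol)
    # Percentage change compared to the previous closing price
    percent_change = (quote["c"] - quote["pc"]) / quote["pc"] * 100
    with connection() as conn:
        with conn.cursor() as cursor:
            cursor.execute(
                QUOTES_INSERT,
                (symbol, quote["c"], quote["h"], quote["l"],
                 quote["o"], percent_change),
            )
```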
Congratulations! The worker is ready for use; let's try it out!
Execute the command below in a new terminal window within the virtualenv, and wait a few seconds to let Celery kick in:
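The exact command is not shown in this excerpt; with the module layout assumed above, starting a worker together with the beat scheduler would look roughly like:

```shell
# Run a Celery worker with the embedded beat scheduler for periodic tasks
PYTHONPATH=. celery --app app.worker worker --beat -l info
```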
Soon, you will see that the tasks are scheduled, and the database is slowly filling.
Before proceeding to the visualization steps, let's have a look at what we have built so far:

- we created the project root,
- a `docker-compose.yml` file to manage related services,
- `app/settings.py` that handles our application configuration,
- `app/db.py` configuring the database pool, and
- last but not least, `app/worker.py` that handles the hard work: fetching and storing the data.
At this point, we should have the following project structure:
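The original listing is not shown in this excerpt; based on the files created so far, the structure should be roughly:

```text
.
├── app
│   ├── __init__.py
│   ├── db.py
│   ├── settings.py
│   └── worker.py
├── .env
├── docker-compose.yml
└── requirements.txt
```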
This tutorial is not about writing the necessary style sheets or collecting
static assets, so you only need to copy-paste some code. As the first step,
create an `assets` directory next to the
`app` package with the structure below:
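A minimal version of that structure:

```text
assets/
└── style.css
```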
`style.css` will define the styling for our application. As mentioned above,
Dash saves us from boilerplate code: the
`assets` directory, and the stylesheet in it, are picked up
by default. Add the
`style.css` file to the
`assets` directory.
This is the most interesting part of the tutorial: visualizing the
data we collect. Create a
`main.py` file in the
`app` package, and let's begin
with the imports:
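The original import block is not shown in this excerpt; a sketch matching the rest of the tutorial might look like the following (the exact set of imports depends on your final code):

```python
# app/main.py -- imports (a sketch)
from datetime import datetime, timedelta, timezone

import dash
import pandas as pd
import plotly.graph_objects as go
from dash import dcc, html
from dash.dependencies import Input, Output

from app.db import connection  # assumed helper from app/db.py
from app.settings import settings
```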
With the imports in place, we define some helper functions and constants.
In the first few lines, we define constants: the graph update frequency
(`GRAPH_INTERVAL`) and the colors that will be used for coloring the graph.
After that, we define two helper functions. While
`now` is responsible only for getting the current time in UTC (as Finnhub
returns dates in UTC too),
`get_stock_data` does more. It is the core of
our front-end application: it fetches from QuestDB the stock data that the workers
have stored.
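The original constants and helpers are not shown in this excerpt; a sketch could look like the following (the `COLORS` list, the column names, and the query shape are assumptions):

```python
GRAPH_INTERVAL = 5000  # graph refresh frequency, in milliseconds
COLORS = ["#003f5c", "#58508d", "#bc5090", "#ff6361", "#ffa600"]


def now() -> datetime:
    """Current time in UTC, matching the timezone Finnhub returns."""
    return datetime.now(timezone.utc)


def get_stock_data(start: datetime, end: datetime, symbol: str) -> pd.DataFrame:
    """Fetch the quotes stored by the workers from QuestDB as a DataFrame."""
    query = "SELECT * FROM quotes WHERE ts BETWEEN %s AND %s"
    params = [start, end]
    if symbol:
        # Restrict the result to a single stock symbol when one is selected
        query += " AND stock_symbol = %s"
        params.append(symbol)
    with connection() as conn:
        return pd.read_sql(query, conn, params=params)
```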
Define the initial data frame and the application:
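A sketch of these two definitions, assuming the helpers above (the asset folder path and meta tags are assumptions):

```python
# Pre-populate the dashboard with the latest five hours of data
df = get_stock_data(now() - timedelta(hours=5), now(), "")

app = dash.Dash(
    __name__,
    title="Stock market dashboard",
    assets_folder="../assets",  # assumed location, relative to app/main.py
    meta_tags=[
        {"name": "viewport", "content": "width=device-width, initial-scale=1"},
    ],
)
```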
As you can see above, the initial data frame
(`df`) will contain the latest five
hours of data we have, which is used to pre-populate the application. The application definition
`app` describes the application's
title, asset folder, and some HTML meta tags used during rendering.
Create the application layout that will be rendered as HTML. We won't write HTML; we will use Dash's helpers for that:
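The original layout is not shown in this excerpt; a shortened sketch with the elements the callbacks below rely on might look like this (the element ids are assumptions):

```python
app.layout = html.Div([
    html.H2("Stock market changes"),
    # Dropdown to filter the charts down to a single stock symbol
    dcc.Dropdown(
        id="stock-symbol",
        options=[{"label": s, "value": s} for s in settings.symbols],
        placeholder="Select a stock symbol",
    ),
    dcc.Graph(id="stock-graph"),
    dcc.Graph(id="stock-price-change-graph"),
    # Triggers the graph-refreshing callbacks every GRAPH_INTERVAL ms
    dcc.Interval(id="graph-update-interval",
                 interval=GRAPH_INTERVAL, n_intervals=0),
])
```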
This snippet is a bit longer, though it has only one really interesting part:
`dcc.Interval`, which sets up the periodic graph refresh.
We are nearly finished with our application; the last steps are to define two callbacks that listen to input changes and to the interval discussed above. The first callback generates the graph data and renders a line per stock symbol.
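The original callback is not shown in this excerpt; a sketch, assuming the layout ids and helpers used above, could look like:

```python
@app.callback(
    Output("stock-graph", "figure"),
    [Input("stock-symbol", "value"),
     Input("graph-update-interval", "n_intervals")],
)
def update_stock_graph(symbol, _):
    # Re-fetch the latest five hours of data on every interval tick
    data = get_stock_data(now() - timedelta(hours=5), now(), symbol)
    figure = go.Figure()
    # Render one line per stock symbol present in the result
    for i, (sym, stock) in enumerate(data.groupby("stock_symbol")):
        figure.add_trace(go.Scatter(
            x=stock["ts"],
            y=stock["current_price"],
            name=sym,
            line={"color": COLORS[i % len(COLORS)]},
        ))
    return figure
```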
The other callback is very similar to the previous one; it will be responsible for updating the percentage change representation of the stocks or a given stock.
The last step is to call `run_server` on the `app` object when the script is called from the CLI.
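That final block is short; port 8050 is Dash's default and matches the URL used below:

```python
if __name__ == "__main__":
    # Start the Dash development server
    app.run_server(host="0.0.0.0", port=8050, debug=True)
```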
We are now ready to try our application with actual data. Make sure that the
Docker containers are started, and execute
`PYTHONPATH=. python app/main.py` from
the project root. Then navigate to
http://127.0.0.1:8050/ to see the application in action.
To select only one stock, choose the desired stock symbol in the dropdown field and let the application refresh.
In this tutorial, we've learned how to schedule tasks in Python, store data in QuestDB, and create beautiful dashboards using Plotly and Dash. Although we won't start trading just yet, this tutorial demonstrates how to combine these individually powerful tools to create something bigger and more useful. Thank you for your attention!
If you like this content, we'd love to know your thoughts! Feel free to share your feedback or come and say hello in the QuestDB Community Slack.