QuestDB comes with an HTTP server which exposes a REST API. This guide will
teach you how to use the REST API to create tables, import data, run queries,
and export results with
curl. If you prefer a more visual approach, you can
also use the Web Console.
For more information about our REST API, please consult the REST API reference.
This requires a running instance of QuestDB with port
9000 exposed. You can
learn how to do so in our Docker guide.
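As a quick sketch, one way to start such an instance with Docker (using the official `questdb/questdb` image) is:

```shell
# Start QuestDB in the background, mapping the REST API port 9000
# from the container to the host.
docker run -d -p 9000:9000 --name questdb questdb/questdb
```

Once the container is up, the REST API is reachable at `http://localhost:9000`.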
Get test data
The first step is to get data into the database. Here are some sample files you may want to try. You can use just one of them (we provide example queries for both), but importing both files will let you try an ASOF JOIN across tables.
| Data | Description | Download | File size | Number of rows |
| --- | --- | --- | --- | --- |
| NYC taxi data | 10 years of NYC taxi trips, simplified to 2 trips per hour. Contains ride start and end times, distance, passenger count, fare, tip, and total amount paid. | Download | 16.2 MB | 183,000 |
| NYC weather | 10 years of hourly weather data in central NYC. Contains timestamp, temperature, wind, snow, and more. | Download | 6.7 MB | 137,000 |
With your container running and port 9000 mapped, you can now send curl requests to the database server. This guide shows examples of how to interact with it.
First, we create the tables using
/exec, which allows us to pass SQL
statements. We also specify a designated timestamp column which will be useful
for time based queries and time joins across tables.
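A minimal sketch of the table creation step is shown below. It assumes QuestDB is running on `localhost:9000`; the column names are illustrative assumptions, not the exact layout of the sample CSV files.

```shell
# Schema is illustrative: these column names are assumptions based on
# the dataset descriptions, not the exact CSV headers.
TRIPS_DDL="CREATE TABLE trips (
    pickup_datetime TIMESTAMP,
    dropoff_datetime TIMESTAMP,
    passenger_count INT,
    trip_distance DOUBLE,
    fare_amount DOUBLE,
    tip_amount DOUBLE,
    total_amount DOUBLE
) timestamp(pickup_datetime);"

WEATHER_DDL="CREATE TABLE weather (
    timestamp TIMESTAMP,
    temperature DOUBLE,
    wind_speed DOUBLE,
    snow_depth DOUBLE
) timestamp(timestamp);"

# Send each statement to the /exec endpoint as the 'query' parameter;
# --data-urlencode takes care of URL-encoding the SQL.
curl -G http://localhost:9000/exec --data-urlencode "query=${TRIPS_DDL}"
curl -G http://localhost:9000/exec --data-urlencode "query=${WEATHER_DDL}"
```

The `timestamp(...)` clause designates the timestamp column for each table.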
Note that the table creation step is optional, as QuestDB automatically
recognizes the schema on import. However, creating the tables manually allows us
to specify a designated timestamp column, which will be useful for time-based
queries, and to declare symbol columns, which are more efficient than the
automatically inferred string columns.
We import both files using the
/imp endpoint. Note that we set the name parameter
so the data flows into the tables we just created. Otherwise, the data would be
inserted into a new table named after the file, for example
weather.csv. We also set the
timestamp parameter to mark the designated timestamp column in the CSV file.
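The import step can be sketched as follows, assuming the sample files were saved as `trips.csv` and `weather.csv` in the current directory and that the timestamp columns carry the names used earlier (both are assumptions):

```shell
# 'name' targets the existing table; 'timestamp' marks the designated
# timestamp column in the CSV. File and column names are assumptions.
TRIPS_URL="http://localhost:9000/imp?name=trips&timestamp=pickup_datetime"
WEATHER_URL="http://localhost:9000/imp?name=weather&timestamp=timestamp"

# Upload each file as multipart form data.
curl -F data=@trips.csv "$TRIPS_URL"
curl -F data=@weather.csv "$WEATHER_URL"
```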
In addition to the CSV import, we can also use
/exec to execute INSERT
statements. You can either send all fields or a subset of the schema, as in the
example below. This is useful for sending values in a different order from the
table definition, or for skipping values that are not relevant. Missing values
will be inserted as null.
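A sketch of such a partial insert, using assumed column names from the weather table:

```shell
# Insert a row using only a subset of the weather columns; the omitted
# columns are stored as null. Column names are illustrative.
QUERY="INSERT INTO weather (timestamp, temperature) VALUES (systimestamp(), 22.5)"
curl -G http://localhost:9000/exec --data-urlencode "query=${QUERY}"
```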
As with the CREATE TABLE and
INSERT INTO statements, we can use
/exec to pass SELECT queries.
/exec returns results in JSON.
Here are a few example queries you could run against the dataset.
| Table(s) | Description |
| --- | --- |
| trips | Average week-by-week trip distance over time |
| trips | Average monthly trip duration in minutes |
| trips | Average fare per passenger-count bucket |
| trips | Average tip percentage per passenger-count bucket |
| trips and weather | Joining trips and weather data. This query returns the prevailing weather conditions for every trip in 2017. |
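As a sketch, the first and last queries in the table might look like the following. The column names are assumptions based on the dataset descriptions:

```shell
# Average weekly trip distance over time, using SAMPLE BY on the
# designated timestamp column (column names are assumptions).
QUERY="SELECT pickup_datetime, avg(trip_distance) FROM trips SAMPLE BY 7d"
curl -G http://localhost:9000/exec --data-urlencode "query=${QUERY}"

# ASOF JOIN matches each trip with the latest weather record at or
# before the trip's timestamp.
JOIN_QUERY="SELECT * FROM trips ASOF JOIN weather"
curl -G http://localhost:9000/exec --data-urlencode "query=${JOIN_QUERY}"
```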
You can use the
/exp endpoint to export query results as follows.
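For example, a sketch of exporting a slice of the trips table to a local file (the query itself is illustrative):

```shell
# /exp streams query results back as CSV; redirect the response to a
# file to save it. The LIMIT clause here is just an example.
QUERY="SELECT * FROM trips LIMIT 1000"
curl -G http://localhost:9000/exp --data-urlencode "query=${QUERY}" > trips_export.csv
```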
If you are querying from the Web Console, you can download the results with the
download-to-CSV button.
Shut down and cleanup
As QuestDB is a persistent database, the data will remain after you shut down the server. If you would like to remove it, you can run the following statements to drop the tables.
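The cleanup step can be sketched with the same /exec endpoint:

```shell
# Drop both tables; this permanently deletes the imported data.
DROP_TRIPS="DROP TABLE trips"
DROP_WEATHER="DROP TABLE weather"
curl -G http://localhost:9000/exec --data-urlencode "query=${DROP_TRIPS}"
curl -G http://localhost:9000/exec --data-urlencode "query=${DROP_WEATHER}"
```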