Insert data

This page shows how to insert data into QuestDB using different programming languages and tools. There are three main methods for inserting data into a running instance:

  • InfluxDB line protocol (ILP), which provides flexibility, ease of use, and high ingestion rates
  • Postgres wire protocol, for compatibility with a range of clients or as a fallback to ILP
  • REST API, which can be used for importing datasets from CSV
tip

InfluxDB Line Protocol is the recommended primary ingestion method in QuestDB. To query ingested data, users should utilize the Web Console and REST API on port 9000 or use a Postgres wire client. Methods for querying data are described on the query data documentation page.

Prerequisites#

This page assumes that QuestDB is running and accessible. QuestDB can be run using Docker, the binaries, or Homebrew (for macOS users).

InfluxDB line protocol#

QuestDB implements InfluxDB line protocol which is accessible by default on TCP port 9009. This allows using QuestDB as a drop-in replacement for InfluxDB and other systems implementing the protocol.

This interface is the preferred ingestion method as it provides the following benefits:

  • high-throughput ingestion
  • robust ingestion from multiple sources into tables with dedicated systems for reducing congestion
  • configurable hysteresis via server configuration settings

For additional details on the message format, see the InfluxDB line protocol guide. Details on ports and authentication can be found on the InfluxDB API reference page, and the Telegraf guide covers using the Telegraf agent to collect and send metrics to QuestDB via this protocol.

info
  • Each line protocol message must be delimited with newline \n characters.

  • The timestamp element of InfluxDB line protocol messages is optional; when omitted, the server automatically assigns its system time as the row's timestamp value.

The following Node.js example opens a raw TCP socket and writes newline-delimited ILP rows:

const net = require("net")

const client = new net.Socket()

const HOST = "localhost"
const PORT = 9009

function run() {
  client.connect(PORT, HOST, () => {
    const rows = [
      `trades,name=test_ilp1 value=12.4 ${Date.now() * 1e6}`,
      `trades,name=test_ilp2 value=11.4 ${Date.now() * 1e6}`,
    ]

    rows.forEach((row) => {
      // Each ILP message must be terminated with a newline character
      client.write(`${row}\n`)
    })

    client.destroy()
  })

  client.on("data", function (data) {
    console.log("Received: " + data)
  })

  client.on("close", function () {
    console.log("Connection closed")
  })
}

run()
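The rows sent above interleave the table name, tags, fields, and a nanosecond timestamp into a single line. To make that structure explicit, the formatting can be factored into a small helper; this is an illustrative sketch, not part of any QuestDB client library:

```javascript
// Hypothetical helper: serialize one row into an InfluxDB line protocol message.
// Shape: table,tag1=v1,... field1=v1,... timestamp\n (timestamp in nanoseconds)
function toIlpLine(table, tags, fields, timestampNs) {
  const tagPart = Object.entries(tags)
    .map(([key, value]) => `,${key}=${value}`)
    .join("")
  const fieldPart = Object.entries(fields)
    .map(([key, value]) => `${key}=${value}`)
    .join(",")
  return `${table}${tagPart} ${fieldPart} ${timestampNs}\n`
}

console.log(toIlpLine("trades", { name: "test_ilp1" }, { value: 12.4 }, Date.now() * 1e6))
```

Note that this sketch does not escape spaces or commas in tag and field values, which a production serializer would need to handle.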

Postgres compatibility#

You can insert and query data using the Postgres endpoint that QuestDB exposes. This is accessible via port 8812 by default. More information on the Postgres wire protocol implementation, with details on supported features, can be found on the Postgres API reference page.

This example uses the pg package which allows for quickly building queries using Postgres wire protocol. Details on the use of this package can be found on the node-postgres documentation.

This example uses Date.now() * 1000 to produce values for TIMESTAMP columns, which are microsecond-resolution. This is a naive approach, since Date.now() only has millisecond precision. For accurate microsecond timestamps, the node-microtime package can be used, which makes system calls to tv_usec from C++.

const { Client } = require("pg")

const start = async () => {
  try {
    const client = new Client({
      database: "qdb",
      host: "127.0.0.1",
      password: "quest",
      port: 8812,
      user: "admin",
    })
    await client.connect()

    const createTable = await client.query(
      "CREATE TABLE IF NOT EXISTS trades (ts TIMESTAMP, date DATE, name STRING, value INT) timestamp(ts);",
    )
    console.log(createTable)

    const insertData = await client.query(
      "INSERT INTO trades VALUES($1, $2, $3, $4);",
      [Date.now() * 1000, Date.now(), "node pg example", 123],
    )
    await client.query("COMMIT")
    console.log(insertData)

    for (let rows = 0; rows < 10; rows++) {
      // Providing a 'name' field allows for prepared statements / bind variables
      const query = {
        name: "insert-values",
        text: "INSERT INTO trades VALUES($1, $2, $3, $4);",
        values: [Date.now() * 1000, Date.now(), "node pg prep statement", rows],
      }
      await client.query(query)
    }
    await client.query("COMMIT")

    const readAll = await client.query("SELECT * FROM trades")
    console.log(readAll.rows)

    await client.end()
  } catch (e) {
    console.log(e)
  }
}

start()

REST API#

QuestDB exposes a REST API for compatibility with a wide range of libraries and tools. The REST API is accessible on port 9000 and has the following entrypoints:

  • /imp - import data
  • /exec - execute an SQL statement

More details on the use of these entrypoints can be found on the REST API reference page.
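These endpoints can be called from any HTTP client. As a sketch, the snippet below builds a request URL for /exec, assuming Node 18+ (built-in URL and fetch) and a QuestDB instance at localhost:9000:

```javascript
// Sketch: build a QuestDB /exec request URL with the SQL statement URL-encoded.
function execUrl(query, host = "http://localhost:9000") {
  const url = new URL("/exec", host)
  url.searchParams.set("query", query)
  return url.toString()
}

// The resulting URL can then be fetched, e.g. with Node 18+'s built-in fetch:
// const response = await fetch(execUrl("SELECT * FROM trades"))
// const { columns, dataset } = await response.json()

console.log(execUrl("INSERT INTO trades VALUES('abc', 123456);"))
```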

/imp endpoint#

The /imp endpoint allows for importing a CSV file directly.

This example imports a CSV file with automatic schema detection.

Basic import
curl -F data=@data.csv http://localhost:9000/imp
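The same import can also be scripted. The sketch below assumes Node 18+ (built-in fetch, FormData, and Blob) and a QuestDB instance at localhost:9000; the toCsv helper is an illustrative assumption, not part of any QuestDB library:

```javascript
// Hypothetical helper: build CSV text from an array of objects sharing the same keys.
// Note: values containing commas or quotes would need escaping in real use.
function toCsv(rows) {
  const header = Object.keys(rows[0]).join(",")
  const lines = rows.map((row) => Object.values(row).join(","))
  return [header, ...lines].join("\n")
}

// Sketch: POST the CSV to /imp as a multipart form field named "data"
async function importCsv(csv, tableName) {
  const form = new FormData()
  form.append("data", new Blob([csv]), `${tableName}.csv`)
  const response = await fetch(`http://localhost:9000/imp?name=${tableName}`, {
    method: "POST",
    body: form,
  })
  return response.text() // /imp replies with a plain-text import summary
}

const csv = toCsv([
  { name: "abc", value: 123 },
  { name: "def", value: 456 },
])
// importCsv(csv, "trades").then(console.log)
```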

This example overwrites an existing table, specifies a timestamp format and a designated timestamp column. For more information on the optional parameters for specifying timestamp formats, partitioning and renaming tables, see the REST API documentation.

Providing a user-defined schema
curl \
-F schema='[{"name":"ts", "type": "TIMESTAMP", "pattern": "yyyy-MM-dd - HH:mm:ss"}]' \
-F data=@weather.csv 'http://localhost:9000/imp?overwrite=true&timestamp=ts'

/exec endpoint#

Alternatively, the /exec endpoint can be used to create a table and the INSERT statement can be used to populate it with values:

# Create Table
curl -G \
--data-urlencode "query=CREATE TABLE IF NOT EXISTS trades(name STRING, value INT)" \
http://localhost:9000/exec
# Insert a row
curl -G \
--data-urlencode "query=INSERT INTO trades VALUES('abc', 123456);" \
http://localhost:9000/exec

Note that these two queries can be combined into a single curl request:

curl -G \
--data-urlencode "query=CREATE TABLE IF NOT EXISTS trades(name STRING, value INT);\
INSERT INTO trades VALUES('abc', 123456);" \
http://localhost:9000/exec

Web Console#

By default, QuestDB has an embedded Web Console running at http://[server-address]:9000. When running locally, this is accessible at http://localhost:9000. The Web Console can be used to explore table schemas, visualize query results as tables or graphs, and import datasets from CSV files. For details on these components, refer to the Web Console reference page.