TQS Integration uses QuestDB in industrial telemetry solutions for clients in the Life Science, Pharmaceutical, Energy, and Renewables industries. They turn to QuestDB when they need a time series database that's simple and efficient for data collection, contextualization, visualization, analytics, and managed services for the world's leading companies.
Massively reduced database deployment and maintenance costs
Integrations with developer tools to easily insert and query data
Simple to develop cloud-native solutions
Reliably ingest hundreds of thousands of events per second
Easy to deploy and low-effort to maintain
High performance for monitoring tens of thousands of metrics
TQS Integration builds reference architecture for software applications dealing with industrial telemetry that produce and process hundreds of thousands of events per second. QuestDB is used when they require a time series database for data visualization, real-time analytics, anomaly detection, and predictive maintenance.
In this case study, Senior Data Scientist Holger Amort describes how and why QuestDB is relied upon within the high-performance reference architecture built at TQS Integration.
At TQS Integration, we specialize in software solutions for industrial processes in the Life Science, Pharmaceutical, Energy, and Renewables industries. We deal with vast amounts of industrial telemetry data from sensor and controller instrumentation, manufacturing execution systems, automation, and ERP integration, largely for biopharmaceutical manufacturers. Our solutions enable manufacturers to leverage their process data through advanced software architecture and analytics.
Typically, we take manufacturing data and combine it with an organization's other data sets to contextualize the information. An overview of both process and business information allows our users to make smarter decisions about their manufacturing processes and gain insights into predictive maintenance, anomaly detection, and much more.
The pharmaceutical and biotech industries are driven by regulation, which means that manufacturing processes must be tightly controlled. Ensuring that procedures are followed correctly is mainly accomplished through telemetry from the manufacturing floor itself.
Manufacturers track a range of variables such as temperature, pH, stirring rate of agitators, and many other metrics emitted when controlling manufacturing execution systems. Some manufacturing plants will produce and track between 50,000 and 70,000 process variables.
We're typically sending approximately 100,000 to 150,000 events per second into QuestDB instances. QuestDB's role in the reference architecture we build for our clients is to store hot data, which we typically retain for 30 days. We then downsample this data for lower resolution cold storage in case we need a historical overview of the last 12 months' sensor data, for instance.
When we move hot data over to cold storage, we apply compression algorithms to reduce the data's footprint. As data becomes less critical to act upon, performance is less of a concern, and we can use the downsampled data for aggregate reports, trend analyses, and longer-term predictions.
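To illustrate the kind of downsampling described here, the sketch below averages raw sensor points into fixed time buckets. The table and column names in the SQL string (`sensors`, `metric`, `value`, `ts`) are hypothetical, and the query is a sketch of how QuestDB's `SAMPLE BY` clause can express the same bucketing server-side; the Python function shows the equivalent logic client-side.

```python
# Hypothetical QuestDB query: hourly averages over the 30-day hot window,
# suitable for writing out to lower-resolution cold storage.
DOWNSAMPLE_QUERY = """
SELECT ts, metric, avg(value) AS avg_value
FROM sensors
WHERE ts >= dateadd('d', -30, now())
SAMPLE BY 1h
"""

def downsample(points, bucket_seconds=3600):
    """Client-side equivalent: average (epoch_seconds, value) points
    per fixed-width time bucket."""
    buckets = {}
    for ts, value in points:
        key = ts - (ts % bucket_seconds)  # floor to bucket start
        buckets.setdefault(key, []).append(value)
    return {k: sum(v) / len(v) for k, v in sorted(buckets.items())}
```

An average is only one choice of aggregate; min/max pairs are often kept alongside it so that short excursions survive the downsampling.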
We’re collecting sensor and telemetry data from all available sources in our client’s facilities. This produces different types of time series data; for example, there are slow-moving batch processes, fast-moving purification steps, and high-speed filling lines in a biotechnology facility. Capturing events at different time scales and analyzing them requires a flexible and robust data strategy for acquisition, storage, and analysis.
It’s critical for our clients to have reliable systems based on the value of their raw materials used in production settings. If equipment fails, hundreds of millions of dollars worth of product could be destroyed. These are the areas where we see the value of predictive maintenance and anomaly detection to keep such processes operating reliably.
We have requirements for real-time monitoring of the state of the industrial plants and post-processing for ML models. We’re using QuestDB to analyze historical processes to control and maintain future operations. Anomaly detection is an exciting application that our clients are exploring for future scenarios as we can employ real-time alerts based on predictive modeling of time series data.
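One simple way to frame the real-time alerting mentioned above is a rolling z-score over the incoming stream: flag any point that deviates too far from the recent history. This is a minimal sketch of the idea, not TQS Integration's actual models, and the window and threshold values are illustrative.

```python
from collections import deque

def zscore_alerts(stream, window=50, threshold=3.0):
    """Yield (index, value) for points more than `threshold` standard
    deviations away from the rolling mean of the previous `window` points."""
    history = deque(maxlen=window)
    for i, x in enumerate(stream):
        if len(history) == window:
            mean = sum(history) / window
            var = sum((v - mean) ** 2 for v in history) / window
            std = var ** 0.5
            if std > 0 and abs(x - mean) > threshold * std:
                yield i, x
        history.append(x)
```

In practice the history window would be fed from the hot store rather than held in memory, and the static threshold would be replaced by a predictive model's expected range.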
Because of its high-throughput ingestion, we insert sensor data using the InfluxDB line protocol. When querying the database, we use the PostgreSQL wire protocol from Python to run analytics in SQL and generate some basic visualizations.
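For readers unfamiliar with the InfluxDB line protocol, each row is a single text line of the form `table,tag=value field=value timestamp`, sent over a plain TCP socket (port 9009 is QuestDB's default for this protocol). The sketch below is a deliberately simplified formatter with hypothetical table and field names; production clients handle escaping, string quoting, and type suffixes.

```python
import socket

def ilp_line(table, symbols, columns, ts_ns):
    """Format one InfluxDB line protocol row.
    Simplified: numeric fields only, no escaping of special characters."""
    syms = ",".join(f"{k}={v}" for k, v in symbols.items())
    cols = ",".join(f"{k}={v}" for k, v in columns.items())
    return f"{table},{syms} {cols} {ts_ns}\n"

def send_lines(lines, host="localhost", port=9009):
    """Push a batch of ILP rows to the database over TCP."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall("".join(lines).encode())
```

Reads then go through any PostgreSQL driver pointed at QuestDB's PostgreSQL-compatible endpoint, so the same Python process can ingest over one protocol and query over the other.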
For dashboards, we make use of Grafana quite often, which is an easy way to have an operational overview of the state of a system at any given time. An additional bonus of using open source software like QuestDB is that it allows us to easily test ML and multivariate analysis (MVA) libraries and take advantage of the very rich open-source ecosystem.
For instance, MVA allows us to compute similarity scores describing how well a new signal matches our expectations given a template. We mark equipment failures (or the events that led up to them) and use these to distinguish normal operating conditions from questionable events, highlighted in the screenshot below:
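The template-matching idea can be sketched with a simple correlation score: compare a window of the new signal against a reference profile from known-good runs, and flag windows that diverge. This is an illustrative stand-in for the MVA libraries mentioned above, with a hypothetical threshold.

```python
import math

def similarity(signal, template):
    """Pearson correlation between a signal window and a reference template;
    values near 1.0 mean the new run tracks the expected profile."""
    n = len(template)
    ms = sum(signal) / n
    mt = sum(template) / n
    cov = sum((s - ms) * (t - mt) for s, t in zip(signal, template))
    vs = math.sqrt(sum((s - ms) ** 2 for s in signal))
    vt = math.sqrt(sum((t - mt) ** 2 for t in template))
    return cov / (vs * vt)

def flag_event(signal, template, threshold=0.9):
    """Mark a window as questionable when it diverges from the template."""
    return similarity(signal, template) < threshold
```

Real MVA methods (e.g. PCA-based batch monitoring) score many process variables jointly rather than one signal at a time, but the shape of the workflow is the same: score against a template, alert on divergence.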
“TQS Integration uses QuestDB in data architecture solutions for clients in the Life Science, Pharmaceutical, Energy, and Renewables industries. We use QuestDB when we require a time series database that’s simple and efficient for data collection, contextualization, visualization, and analytics.”
Holger Amort, Senior Data Scientist at TQS Integration