
The Unstructured Message: MQTT's Early Days and Impact on Scientific Research

by Brian Gilmore
8 min read

Welcome to the first episode of our podcast series! As part of our celebration of MQTT’s 25th birthday, we’re excited to kick things off with a deep dive into the history and evolution of MQTT, one of the most pivotal communication protocols in IoT. Our guest for this episode is none other than Dr. Jeremy Frey, a trailblazer who was among the first to publicly document the use of MQTT. Join us as we travel back to the early 2000s and explore the fascinating journey of MQTT from its inception to its profound impact on scientific research and industry.

The Early Days of MQTT

Dr. Frey recounts the early 2000s, a time when the UK research councils were funding initiatives under the banner of e-science. This program aimed to harness modern computation and information systems to enhance research efficiency and productivity. Dr. Frey’s team secured a grant for a project named CombiChem, which focused on combinatorial chemistry, high-throughput experimentation, and the necessary data infrastructures.

Interestingly, while most of the program centered around middleware development, Dr. Frey's group stood out by engaging directly with end users. They delved into how experiments could be automated and designed, developing software to manage data collection and ensuring data quality through electronic laboratory notebooks. These early efforts in integrating digital tools into research laid the groundwork for what would become a transformative approach to laboratory experiments.

The Birth of MQTT in the Lab

One of the key turning points in Dr. Frey’s journey was the collaboration with IBM Hursley, where he met Andy Stanford-Clark. Stanford-Clark introduced MQTT as a solution for efficiently sending data across various sensors and systems in the lab. At that time, the concept of using a lightweight communication protocol to manage laboratory data was revolutionary.

The integration of MQTT enabled Dr. Frey's team to collect data from numerous sensors, automate processes, and even monitor experiments remotely. This capability was particularly groundbreaking, allowing for real-time monitoring and control, even from mobile devices. The implementation of MQTT transformed their approach to experiments, enhancing both efficiency and accuracy.
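The pattern that made this possible is MQTT's publish/subscribe model: sensors publish readings to hierarchical topics, and any number of monitors subscribe to topic filters without the sensors knowing who is listening. A minimal in-memory sketch of MQTT-style topic matching (this is an illustration of the pattern, not the real protocol, and the topic names are hypothetical):

```python
def topic_matches(topic_filter, topic):
    """MQTT-style matching: '+' matches exactly one level, '#' matches the rest."""
    f_levels = topic_filter.split("/")
    t_levels = topic.split("/")
    for i, level in enumerate(f_levels):
        if level == "#":           # multi-level wildcard: matches everything below
            return True
        if i >= len(t_levels):     # filter is deeper than the topic
            return False
        if level not in ("+", t_levels[i]):
            return False
    return len(f_levels) == len(t_levels)

class TinyBroker:
    """In-memory stand-in for a broker, just to show the fan-out pattern."""
    def __init__(self):
        self._subs = []  # list of (topic_filter, callback)

    def subscribe(self, topic_filter, callback):
        self._subs.append((topic_filter, callback))

    def publish(self, topic, payload):
        for flt, cb in self._subs:
            if topic_matches(flt, topic):
                cb(topic, payload)

broker = TinyBroker()
readings = []
# One subscription picks up temperature from every bench in the lab.
broker.subscribe("lab/+/temperature", lambda t, p: readings.append((t, p)))
broker.publish("lab/bench1/temperature", 21.4)
broker.publish("lab/bench1/pressure", 101.3)   # not matched by the filter
```

A real deployment would use an MQTT client library (for example, Eclipse Paho) talking to an actual broker over the network; the `+` and `#` wildcard semantics sketched here mirror MQTT's topic-filter rules, which are what let one dashboard or phone watch an entire lab's worth of sensors.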

Challenges and Innovations

Dr. Frey discusses the technical challenges they faced in the early days, from wiring up sensors manually to dealing with analog-to-digital conversions. There were no plug-and-play solutions, and much of the work involved bespoke setups built around National Instruments hardware and rudimentary digital systems. Despite these hurdles, the team’s innovative use of MQTT allowed them to overcome many obstacles.

They implemented various sensors, such as Hall effect sensors and thermometers, and connected them through MQTT to record and monitor data. One memorable project involved using MQTT to manage a high-powered laser experiment. The team strung sensors around the lab to monitor conditions, and MQTT enabled them to collect and analyze data efficiently, ensuring safety and precision in their experiments.
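For analog devices like those Hall effect sensors, each reading had to pass through an analog-to-digital conversion before it could be published. A hedged sketch of that step, assuming a 10-bit ADC and a ratiometric Hall sensor with a 2.5 V zero-field output and 1.3 mV/gauss sensitivity (all values are illustrative, not from the episode):

```python
import json

def adc_to_gauss(counts, vref=5.0, bits=10, zero_field_v=2.5, sens_v_per_g=0.0013):
    """Convert raw ADC counts to magnetic field strength in gauss."""
    volts = counts / (2 ** bits - 1) * vref           # counts -> volts
    return (volts - zero_field_v) / sens_v_per_g      # volts -> gauss

def make_payload(sensor_id, counts):
    """Package a reading as the kind of JSON payload a publisher might send."""
    return json.dumps({"sensor": sensor_id, "gauss": round(adc_to_gauss(counts), 2)})

payload = make_payload("hall-bench1", 767)
```

The sensor ID, topic design, and calibration constants would all be specific to the instrument in question; the point is that once a reading is a small self-describing payload like this, MQTT can carry it anywhere in the lab or beyond.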

The Evolution of Lab Technologies

Over the years, technology in the lab has advanced significantly. Dr. Frey highlights the transition from manually wired sensors to modern systems like Raspberry Pi and Arduino, which simplify the process of collecting and transmitting data. The development of more sophisticated sensors that output digital signals directly has streamlined many aspects of lab work.

One fascinating aspect of their work involved using cameras to digitize analog readouts from older equipment. This creative solution allowed them to extend the life of valuable instruments and integrate them into their digital systems. The use of imaging and video analysis to extract data from these instruments showcases the innovative spirit that has driven the evolution of their laboratory practices.
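The vision side of that trick, locating the needle or readout in each frame, is the hard part; the final step is typically just a linear calibration from the detected needle angle to the instrument's printed scale. A sketch of only that last step, with all calibration numbers hypothetical:

```python
def needle_to_reading(angle_deg, angle_min, angle_max, scale_min, scale_max):
    """Map a detected needle angle onto the instrument's printed scale.

    Assumes the scale is linear; angle_min and angle_max are the needle
    angles at the two ends of the scale, found once by calibration.
    """
    frac = (angle_deg - angle_min) / (angle_max - angle_min)
    return scale_min + frac * (scale_max - scale_min)

# Hypothetical gauge: needle sweeps from 45 to 315 degrees across a 0-10 bar scale.
pressure = needle_to_reading(180.0, 45.0, 315.0, 0.0, 10.0)
```

With a function like this, a camera pointed at a decades-old analog gauge becomes just another digital sensor whose readings can be timestamped, logged, and published alongside everything else.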

The Role of AI and Data Management

Dr. Frey also touches on the importance of data curation and the role of AI in modern research. He emphasizes the need to maintain raw data alongside processed information to ensure accuracy and reliability. The ability to manage large volumes of data and extract meaningful insights is crucial in today’s research environment.

The integration of AI and machine learning tools offers new possibilities for data analysis and interpretation. By leveraging these technologies, researchers can identify patterns, predict outcomes, and enhance their experimental designs. Dr. Frey envisions a future where AI not only supports data processing but also assists in the creative aspects of scientific inquiry.


Our conversation with Dr. Jeremy Frey offers a compelling glimpse into the early days of MQTT and its transformative impact on scientific research. His experiences underscore the importance of innovation, collaboration, and the continuous evolution of technology in the lab. As we look to the future, the integration of IoT, AI, and advanced data management techniques promises to further revolutionize the way we conduct experiments and explore new frontiers in science.

Brian Gilmore

Brian Gilmore is VP of Community & Advocacy at HiveMQ. He has spent the past decade driving global initiatives to unify industrial and enterprise IoT with transformative technologies like machine learning and cloud. In leadership roles at InfluxData and Splunk, he led advanced IoT integration projects and cultivated developer relations and communities.
