July 19, 2023

110 - NIST Fire Calorimetry Database with Matt Bundy


Ever wonder how scientists measure the intensity of a fire? Join us on this episode as we invite Dr. Matt Bundy from NIST. We discuss the intricacies of heat release rate, calorimetry, and how NIST is championing open data with their astonishing database.

You can learn more about the database here: https://www.nist.gov/blogs/taking-measure/new-nist-fire-calorimetry-database-available-answer-your-burning-questions

And find the database here: https://www.nist.gov/el/fcd

I've tried to make this conversation for everyone. If you have never seen a calorimeter or have no idea how to measure fire, this episode will teach you probably all you need to know. If you already know everything about calorimetry, there are still a ton of golden nuggets on how NIST runs their experiments that are absolutely worth listening to.

In this podcast episode, we start by deciphering the art of measuring fire characteristics - from heat flux gauges to oxygen and mass loss calorimetry - and the challenges of each. We then explore the NIST Fire Calorimetry Database, unveiling its evolution from a humble set of data to an open-access resource. Dr. Bundy shares invaluable insights into the development of this enriching database, which has grown into an extensive system that records both data and video from fire experiments. He also whets our curiosity about the potential expansion and collaborations lying on the horizon for this unique database.

Rounding off our discussion, we delve into the inner workings of the hosting process of the NIST Calorimetry Database, its data storage methods, and how it facilitates easy access to experiments. We also glimpse the future of fire studies as Dr. Bundy calls on the Fire Science Show community to suggest objects to burn for their calorimetry.

Cover image: frame extracted from this calorimetry supercut video; credit to NIST and Matt Bundy.

This podcast episode is sponsored by OFR Consultants.


Transcript
Speaker 1:

Hello everybody, welcome to the Fire Science Show. Today I am here with another Matt from NIST. A few weeks ago I interviewed Matt Hoehler on visualization in fires, and Matt actually mentioned that he has a colleague in the office who does different types of photography than he does. The name of that colleague was Matt Bundy, and Matt Bundy is my guest today. He's the leader of the National Fire Research Laboratory and he's the person responsible for the development and maintenance of the beautiful fire calorimetry database of NIST research. That is just pure gold for any fire engineer, and if you've never heard of or seen it, your mind will be blown. The link is in the show notes, so you definitely should check it out after the episode, or maybe during the episode when Matt is giving me a walkthrough of the database. The database consists of multiple heat release rate measurements, and in the episode we will discuss a lot about the technicalities of what heat release rate is, how we measure it and how a calorimeter works. So we will find out a lot about how these measurements are done, and you will hear a lot about the database itself.

And one thing that I really want to emphasize before we jump into the episode is a call to action that Matt brought. At the end of the episode, Matt asked whether there is stuff that people need burned. If you have an object, he will tell you how it burns. They are planning the next years of fire tests at the NIST laboratory and they're looking for ideas on what would be the most useful fires that we, as the engineering community, need to have measured, and they're willing to burn them and provide us with data. That's an opportunity I've never heard of in my life. If you have an idea of what would be a great fire experiment to measure, one that could be really useful for you, please let me know or contact Matt directly. I really hope the Fire Science Show community gives some great ideas to NIST on what can be burned. And now, enough of me talking. Let's switch to the interview, because you really want to listen to what Matt has to say about calorimetry and the beautiful database they are building at NIST.

Welcome to the Fire Science Show. My name is Wojciech Węgrzyński and I will be your host. As usual, before we jump into the episode, I would like to thank the sponsor of the podcast, thanks to whom this show is available freely to everyone, everywhere, with no strings attached - and the sponsor is OFR Consultants. OFR Consultants is a multi-award-winning independent consultancy dedicated to addressing fire safety challenges. OFR is the UK's leading fire risk consultancy, and its globally established team has developed a reputation for preeminent fire engineering expertise, with colleagues working across the world to help protect people, property and the environment. Internationally, their work ranges from the Antarctic to the Atacama Desert in Chile and a number of projects across Africa. In 2023, OFR will grow its team and is keen to hear from industry professionals who would like to collaborate on fire safety features this year. Get in touch at ofrconsultants.com. And now, let's learn more about the NIST Calorimetry Database.

Hello everybody, welcome to the Fire Science Show. I'm here today with Dr Matt Bundy. Hello, Matt, hello.
And Matt is with NIST and I was told he's the man behind the beautiful NIST Calorimetry Database, or one of the leaders of this magnificent task, and I've invited him to talk this over because it's a fantastic tool and something that I often go to, and I would love to learn how it came to life. So, matt, maybe first the story of your life as a fire scientist.

Speaker 2:

Well, I think, like most people, I had a not-so-direct path to fire science. My undergraduate and early graduate work was heat transfer and thermodynamics, and then in grad school I was lucky enough to work on a project doing modeling of microgravity fires. That led to a postdoc at NIST where I continued doing experimental research on microgravity fires using the NASA drop tower. So I went from computational combustion and fire dynamics to very small scale measurements, and that was my introduction to NIST. NIST is such a great place, with such a wide variety of fire research interests, and I started to become more interested in measurement methods, large scale fires and calorimetry in particular. Probably my first introduction to calorimetry was back during the early 2000s. There were some projects in our large fire lab and I was working with Dr George Mulholland, who did a lot of work on smoke measurements, but he was trying to get a better understanding of the uncertainty of the large scale calorimetry measurement, and so I helped him with that project, and that was really my introduction to calorimetry.

Speaker 1:

And was the big hood already there at NIST when you joined in the early 2000s?

Speaker 2:

Yeah, so we had introduced a pollution control system in the late 90s and there were two larger calorimeters, so there was a 10 megawatt calorimeter and a 3 megawatt calorimeter when I first started working on that project.

Speaker 1:

And now it's 20 megawatts, if I'm correct. Everyone's praising your fancy new calorimeter.

Speaker 2:

Yeah. So in 2015 we commissioned our new 20 megawatt calorimeter. We also added, or reintroduced, what we had originally called a furniture calorimeter, which will take us up to half a megawatt. So now we have four calorimeters online.

Speaker 1:

So, going into the subject matter - the calorimetry database. First, let's maybe introduce listeners who may be unaware of the concept of measuring the size of a fire. Here we are talking almost exclusively - not only, but primarily - about the heat generation from fires, which you can measure with the devices you have. Of course, you also measure yields of the products along the way, but the size of the fire, expressed as the heat release rate, is the thing we are most interested in. Maybe you can bring listeners up to speed on why this parameter is the one we choose to define fires. What information does it give us?

Speaker 2:

So the heat release rate is directly related to the size of the fire and really drives the hazard. A lot of phenomena are governed by the size of the fire, and it's also a very useful way to characterize the time evolution. So if you want to know how fast the fire is growing, and if you want to know the magnitude of the fire that drives smoke production and smoke transport - how much particulates and CO are generated and how they are driven through an environment - the heat release rate is really a critical parameter in understanding that.

Speaker 1:

When I was an early-career engineer in this field, designing mostly smoke control systems, one thing related to heat release rate that struck me was that some of the equations used the term heat release rate and some of them used convective heat release rate. I figured out the difference, but perhaps there are still people struggling with that. So what would be the difference between the chemical, let's say, heat release rate and the convective or radiative one? Because they're all different numbers. It is, yeah.

Speaker 2:

Yes, so the chemical heat release rate is really the complete amount of energy that is generated by the fire, and that energy goes into different forms. So some of it goes into the convective plume of the fire, which is convective heat release rate. Some people call it convective enthalpy, and that really is the amount of heat that goes into heating up the air. There's also heat that goes back into pyrolyzing the fuel source. There's heat that is conducted to other items. So if you have a fire in a building where there's lots of steel and concrete, some of that heat will be conducted to those members. Then you mentioned radiative heat, which is very important, and the radiative heat is typically somewhere between 10 and 30% of the total chemical heat release rate. So what we're measuring with oxygen consumption calorimetry is the total chemical heat release rate, and then those other forms of energy that you mentioned are some fraction of that.

Speaker 1:

And how can you determine the amount of heat from just knowing the oxygen depletion?

Speaker 2:

It's based on a methodology that looks at the heats of formation of chemicals, and so if you know something about the molecule and the stoichiometric reaction with oxygen, you can get an understanding of what that chemical heat is. So it is a somewhat idealized formulation. Chemical reactions are not always complete - we do get things like soot and CO and other byproducts of combustion - but those contributions to the heat release rate are secondary, and so, to first order, we can get a good approximation of the chemical energy release by knowing how much oxygen is consumed by the fire, to some uncertainty. And so the initial studies that were done, simply looking at the chemical energetics, looked at a wide variety of fuels and realized that, within about plus or minus 5%, the amount of chemical energy per unit of oxygen is the same, and the number that we use in calorimetry is 13.1 megajoules of energy per kilogram of oxygen consumed by the fire.

Speaker 1:

So basically you measure how much oxygen is gone from the air that is going through your apparatus.

Speaker 2:

That's correct. Yeah, we don't measure that directly. We capture all of the smoke from the fire. We measure the amount of oxygen in that exhaust stream before the fire and then during the fire and by knowing the chemical composition of the major species in the exhaust and the temperature and pressure and flow, we can use some algebra to infer what the oxygen depletion is.
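To make the oxygen consumption idea above concrete, here is a minimal sketch in Python of the simplified calculation, assuming only an exhaust mass flow and the drop in oxygen mass fraction are known; the full NIST formulation also corrects for CO, CO2, water vapour and flow expansion, and all numbers below are illustrative.

```python
# Simplified oxygen consumption calorimetry sketch (illustrative values only).
E_O2 = 13.1e6  # J released per kg of O2 consumed (~+/- 5 % across common fuels)

def heat_release_rate(m_dot_exhaust, y_o2_ambient, y_o2_exhaust):
    """Approximate HRR in watts from the drop in O2 mass fraction in the exhaust.

    m_dot_exhaust : exhaust mass flow rate, kg/s
    y_o2_ambient  : O2 mass fraction measured before ignition (baseline)
    y_o2_exhaust  : O2 mass fraction measured during the fire
    """
    m_dot_o2_consumed = m_dot_exhaust * (y_o2_ambient - y_o2_exhaust)
    return E_O2 * m_dot_o2_consumed

# Example: 10 kg/s exhaust flow, O2 mass fraction drops from 0.232 to 0.225
print(f"{heat_release_rate(10.0, 0.232, 0.225) / 1e6:.2f} MW")  # ~0.92 MW
```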

Speaker 1:

What other major species are captured in the calorimeter?

Speaker 2:

CO, CO2? That's correct. Yeah, so of course the most important is oxygen. CO2 also plays an important role, and then CO. You know, some labs don't measure CO. CO is also very important because it's one of the main toxic compounds, and so in its own right you may be very interested in measuring CO. It does make a small contribution to the energetics of the calorimetry, but it's not a critical measurement in the calorimetry. But yes, oxygen, CO, CO2, and then some labs will measure total hydrocarbons. We also measure soot through a light extinction measurement.

Speaker 1:

Tell me a little more about that. Why is soot important for you? Is it just another piece of information, or do you actually use it to narrow the uncertainties of the oxygen measurement?

Speaker 2:

The soot plays a relatively small role in the calorimetry uncertainty. We have actually looked at the stoichiometric impact of that and found it to be negligible. It's just not enough of a contributor in terms of the energetics to include in the formulation, and so really the benefit of measuring the soot is having another combustion product yield. And, of course, it's very important for engineers in terms of understanding visibility, so it's important from an engineering point of view.

Speaker 1:

Now, as you mentioned, you have CO, CO2 and soot. It could be interesting also to map how this production changes as the efficiency of combustion changes, because you would probably also see that in the ratios between CO and CO2. So a very interesting set of measurements. And these types of diagnostics are run every time you run the calorimetry? This is the default set of species that you always measure?

Speaker 2:

Yes, that's correct, and a lot of the time when we're designing an experiment or working with a researcher, they're not necessarily interested in CO. You know, they're looking at ignition or flame spread or some other phenomenon, and CO is not of interest. And so they'll say, oh, we don't really need to measure soot for what we're doing. Or sometimes they don't even care about heat release rate. And you know, I always insist that we do that. For one, I have a very good excuse, in that we have an environmental permit. You know, it says that we have to measure and track everything that we burn, and we also have to report on our pollution emissions, and we do calculations based on the fuel and combustion products. And so, you know, I have a very good excuse. I say, hey, we've got an environmental permit, we have to do that. But I also know that other researchers can take advantage of some of those measurements through this database. Just because one researcher doesn't necessarily care about CO generation for his project, someone else may be trying to mine that database and find it very useful. So I'm careful about that. You know, we want to meet the needs of our customer, but sometimes it doesn't cost more to take additional data that can benefit others.

Speaker 1:

Now, so the listeners can imagine: the calorimeter is basically a room with a giant hood on top and a pipe connected to it, through which you suck out the fumes that are produced by the fire, and then at some point you do all of those measurements. When I was tasked with designing one, or helping design one, for our laboratory, one thing that we worried about was that it takes some time before the fumes reach your measuring stations. So to what extent do you have some sort of time uncertainty in your measurements, and how do you deal with that?

Speaker 2:

Yeah, so that's interesting from a couple of points of view. I was involved in an ASTM committee that looks at the cone calorimeter standard, and the methodology that had been used for determining that time delay - between, you know, where the gases are generated and where they're measured - was not very well defined, and different labs were interpreting it differently. So I was involved with an effort to try to tighten that up a little bit, to specify more clearly how to calculate that time delay. The way that we do it in the large fire calorimetry system: we use a heat flux gauge to measure the reference time of the fire. The heat flux gauge is very fast. We can, you know, aim it at the fire, and we have a very well controlled natural gas burner in which we can introduce step functions. So we'll introduce a step function in the natural gas burner to create a reference time for the change in fire size, and then we'll look at the response of the different gases - oxygen, CO2 - and temperature and pressure, and when those signals have changed by a percentage of their ultimate value - in our case we use 10% - then that is the delay time.
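As a rough illustration of the delay-time procedure described above, here is a minimal Python sketch that finds when an analyser signal has changed by 10% of its ultimate change after a burner step; the function, variable names and synthetic data are assumptions for illustration, not the NIST implementation.

```python
import numpy as np

def delay_time(t, signal, t_step, fraction=0.10):
    """Lag (s) between a burner step at t_step and an analyser response.

    The delay is taken as the first time after the step at which the signal
    has moved by `fraction` of its ultimate change, as described above.
    """
    baseline = signal[t < t_step].mean()        # level before the step
    final = signal[t > t.max() - 10.0].mean()   # settled level near the end
    change = np.abs(signal - baseline)
    target = fraction * abs(final - baseline)
    after = t >= t_step
    first = np.argmax(change[after] >= target)  # first sample past the threshold
    return t[after][first] - t_step

# Synthetic example: a first-order O2 response that only starts 8 s after the step
t = np.linspace(0, 120, 1201)
o2 = 20.95 - 0.5 * (1 - np.exp(-np.clip(t - 38.0, 0, None) / 15.0))  # vol %
print(f"delay ~ {delay_time(t, o2, t_step=30.0):.1f} s")
```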

Speaker 1:

So you would have a calibration burner with steps that you use for calibration. You have a heat flux gauge and the measurements on the same timeline, and you can literally see what the delay is between the step on the heat flux gauge and the oxygen sensor, for example.

Speaker 2:

That's correct, and that will vary based on the flow rate. It doesn't vary a lot, but if we run an experimental test series with a different exhaust flow rate, we'll go through and double check the delay times of all the instruments.

Speaker 1:

And if you have a very tall test subject - so basically the fire is higher above the ground - do you have to recalibrate, or would this be a minor effect, like less than a second, that you would just omit?

Speaker 2:

The transport time from the fire into the hood is a few seconds, and so if you had something higher up, it could have an effect of a couple of seconds, and, depending on what you're doing, that can be important. If you're dealing with a fire that's highly transient - a very fast growing fire, or a very transient phenomenon - then matching those time signals can be very critical, and so you may want to try to be as precise as possible.

Speaker 1:

And to close out the measuring techniques: are there any mass loss rate measurements that you're taking in your setup?

Speaker 2:

Yeah, so we do fairly frequently measure the mass loss rate of fuels. And back to your earlier question about calorimetry measurements and how to make that measurement: we talked about oxygen consumption calorimetry, but there are a lot of different ways to measure the size and growth rate of a fire, and one way is mass loss calorimetry. In some senses it's simpler, because all you have to do is measure the mass of the fuel and know something about the fuel composition and heat of combustion, and for some fuels that's a very precise and easy way to do it. So I mentioned our calibration burner. That's a natural gas fuel. We measure the composition of that with a gas chromatograph, and so we very precisely know the heat of combustion. And then we have a positive displacement flow meter, so we can get a very accurate measurement of the mass flow. With those pieces of information you can get a heat release rate and, assuming the combustion efficiency is good, we can also verify with the combustion products. We can make that measurement to better than 2% or 3%. So that's a mass loss based measurement with a gas flow. It's a little bit tougher with a solid fuel, and I know a lot of people have tried to make mass loss measurements. On the bench scale, there is a mass loss calorimeter that looks very much like a cone calorimeter - it just doesn't include the oxygen consumption measurement - and for some fuels at that scale it's not hard to do. At a very large scale, it's very challenging. One, as the fire grows, it creates an updraft which can influence that mass loss measurement. If things fall, you know, that can create a disturbance. So there have been many times where we've tried to do a mass loss measurement and it's been challenging to interpret the results. Sometimes it's used simply as a confirmation of our oxygen consumption measurement, and it works very well with gas fires. And for liquid fuel fires we've gotten very good mass loss measurements that confirm our heat release rate measurements very well, and it's always nice to have a completely independent measurement technique to confirm your results.
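A minimal sketch of the mass loss calorimetry idea for a gaseous fuel, assuming the fuel mass flow and heat of combustion are known; the numbers are illustrative, not measured NIST values.

```python
# Mass loss calorimetry sketch for a gaseous fuel (illustrative values only).
def hrr_mass_loss(m_dot_fuel, heat_of_combustion, combustion_efficiency=1.0):
    """HRR in watts from fuel mass flow (kg/s) and heat of combustion (J/kg)."""
    return combustion_efficiency * m_dot_fuel * heat_of_combustion

# Natural gas is mostly methane, with a net heat of combustion of roughly 50 MJ/kg,
# so a 0.02 kg/s fuel flow corresponds to about a 1 MW fire.
print(f"{hrr_mass_loss(0.02, 50e6) / 1e6:.1f} MW")  # ~1.0 MW
```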

Speaker 1:

You've mentioned the heat flux gauge, and earlier you mentioned that radiative heat flux can be from 10 to 30% of the chemical heat release rate. So is this also something you try to capture as you burn things, or does it just happen every now and then, not very often?

Speaker 2:

So we do always have at least one radiative heat flux gauge that we use. As I mentioned, it can be important for timing, so if there's ever any question about the delay times, it's a reference signal that we can always go back to. So we always have at least one measure of heat flux, and we document the position of that gauge. Using that single measurement to infer the total radiative energy loss is challenging. One, because, you know, the fires are not always symmetric, so understanding the shape of the fire matters. And the other challenge is that you can have reflections of radiant heat, and so you really have to be careful in setting up that radiative measurement in order to use it to get a radiative heat release rate from which you can determine the radiative fraction. We have done it for some projects. For some pool fires, we've taken the effort to install numerous radiative heat flux gauges and really tried to nail down that measurement. So it can be done, and we have done it for some projects, but it's not something we do for every project.
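One common way to turn a single heat flux gauge reading into a radiative fraction is the point source approximation; the sketch below assumes that model and illustrative numbers, and is not necessarily the procedure NIST uses with its multi-gauge setups.

```python
import math

def radiative_fraction(q_flux, distance, total_hrr):
    """Point-source estimate of the radiative fraction chi_r.

    q_flux    : radiative heat flux at the gauge, W/m2
    distance  : gauge distance from the fire, m
    total_hrr : total chemical heat release rate, W
    Assumes the fire radiates uniformly in all directions: q" = chi_r * Q / (4*pi*R^2).
    """
    return q_flux * 4.0 * math.pi * distance**2 / total_hrr

# Illustrative: a 1 MW fire producing ~0.8 kW/m2 at a gauge 5 m away
print(f"chi_r ~ {radiative_fraction(800.0, 5.0, 1.0e6):.2f}")  # ~0.25
```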

Speaker 1:

Man, I'm so satisfied - this part of the interview was purely for me. It felt like one-on-one consulting on how to build a large-scale calorimeter, and I think anyone who is ever tasked with building one will use this interview as a good reference. And I think everyone enjoyed hearing about how we measure. But the main point of this interview was not just the fact that NIST has an amazing calorimetry device and that you can do these fancy measurements. The point is that you've chosen not just to do them, but also to share them with the community in a very open-access form. It's not just that you share them, it's how you share them, and this resource is absolutely astounding. So tell me, when was the point when you decided that this must become an open database and you wanted to just give this to researchers worldwide?

Speaker 2:

So, like most things, this did have an evolution and it did grow over time, and we started with an internal collection of data that I don't know that I would call a database. It was more of a structured set of data, and maybe it'd be useful to give you a brief history of how that evolved and eventually led to the publicly released fire calorimetry data. So when I started getting involved with fire calorimetry data, doing an uncertainty analysis and then eventually taking the position I have, leading the National Fire Research Lab - I started that work back around 2005-ish - at that point in time there were a lot of things that were very inefficient in doing the calorimetry measurement. We had these ice traps that had to be filled with dry ice. They were constantly clogging. We had a very kludgy data acquisition system where people were taking screenshots to try to capture setup parameters, and we would go back to this data sometimes a month later and it was really hard to recover information. It took half a day of setup to do one calorimetry measurement, and so I started looking at ways to make it more efficient. Some things didn't really change the measurement but changed the efficiency - like removing the ice traps and replacing them with another method to condition the flow that didn't require so much time or manual labor, which was one of the first things I worked on. But eventually I wanted to be a little bit more efficient in the way we collected the data, and so I was fortunate enough to hire somebody who runs our data acquisition and programming system and was very critical to this project, Artur Chernovsky. He helped redesign the data collection, so everything was based off of input files and it was structured in a way that was very consistent, and so this was really the groundwork of the database. I mean, you really need to collect data in a very systematic way in order to have something that can be pushed into a database in a somewhat automated capacity. So this was work that happened, you know, over a period of time, and eventually it got to the point where it was very efficient and we were collecting both data and video in a fairly automated way, and our users were very happy with this. You know, we had structured data, so if they knew when they did the test, they could recover a lot of the information. There were input files and log files and structured data, and, you know, the users were pretty happy in terms of being able to find information about their data and locate it. And so, after we had a fairly decent sized project and we ran, I don't know, 120 experiments, one of our users - oh, I know you've had him on this show, Kevin McGrattan - he said, Matt, you've got all this great data. You know, you've organized it in a very efficient way. The quality control is excellent. You've done a lot with uncertainty. Why not put this into some kind of HTML format so, you know, people can search it and it gives a little bit more utility? And so I started talking to Kevin and we brainstormed some ideas of what that might look like.
We originally created an internal web page for our users and put about 1,000 experiments into that format, and that happened, coincidentally, around the time the lab was shut down for COVID, so I had lots of free time to play around with the data and experiment with web programming. And so, once we had an internal database that we were happy with, we talked to the NIST web development team about publishing it, and at that point they gave us some ideas of formats that would be useful for them to work with. So all of our metadata is in JSON format, and, you know, the video and data and everything that they needed to build their public database was created.

Speaker 1:

What was the scale of operation when you decided that you needed to shift the data acquisition? Like, how many tests per, I don't know, month or year would you be doing that justifies this? Because I assume it was a lot of work to get the acquisition to the level that you did.

Speaker 2:

Yeah, I get that question a lot in terms of what the throughput of the National Fire Research Lab is, and it really does vary so much. I mean, we've had years where we've run, you know, 500-plus experiments, but most of those have been high throughput experiments that aren't so instrumentation intensive. You know, we may be doing a test series looking at furniture flammability, burning lots of chairs and mattresses, and you can turn those around pretty quickly and do five or six tests a day and run quite a few of them, but you don't have a lot of instrumentation. We have other projects where, you know, we'll have a month or two of buildup and instrumentation, with, you know, 400 or 500 channels of thermocouples and heat flux gauges, and for those projects it takes two or three months to run a single test. So, you know, there's just a huge variance in the throughput in the lab, but it's usually one or the other - low instrumentation and high throughput, or high instrumentation and low throughput. In both cases it's a lot of data, and so being able to record that data in a way that is consistent, and so that you don't lose information, is something that we have to pay close attention to. I think at the time that we decided to create the database, there were about 1,500 heat release rate experiments internally that we wanted to include in that database. For this particular database, I did want to only include fires that we had video of as well. And that's one thing that I haven't talked about yet - video collection. When I first started working in the Fire Lab, all the video was collected on tapes, these mini DV tapes, and so you would load mini DV tapes into these cameras, point them at the fire, and then sometimes people would never even look at them. Or if they did, it was very tedious to go and look through the video, or they'd be looking for something in particular. They weren't often digitized, and finding the particular video that went with a particular experiment was challenging. And so that was another thing that I wanted - to have a video collection system that was integrated with the data collection system. And so now we have the ability to use all kinds of different video cameras: webcams that are somewhat disposable - we can put those in a fire and not worry if they burn up - very high quality professional cameras that are linked to a digital video capture card on a computer, ethernet-based cameras. So we have a lot of different types of cameras that can be automatically triggered, and the file name and date and metadata are all included along with those videos. We started collecting data in that format around 2016, and so between 2016 and 2019 we had about 1,500 experiments where we had good data structure, video and really straight data, and that's when we realized it would be very useful to have these in a database format.

Speaker 1:

Does this stretch over all four of your calorimeters, or is this just for the big one - the data acquisition and video?

Speaker 2:

No, it stretches over all four of the calorimeters, and most of the data in the database is from the smaller calorimeters, because that's where most of the throughput is. The big calorimeter obviously takes a lot longer to set up and run those experiments.

Speaker 1:

And perhaps a naive question, but it goes back to the previous discussion: how do you sync time between the data and the cameras?

Speaker 2:

That's a good question. That's something that we work on quite a bit. So we wanted to have a unique reference time to sync the data, and so all the programs that we use query an NTP time server and then convert that time to an epoch time, or UNIX time, which is the number of seconds since 1970. And it is useful to work in a time frame that's absolute. There are issues when you work with time that is human readable - month, day and year. There are leap years and leap seconds, and we did struggle a little bit trying to have these different data systems and different cameras and different ways of recording time. So we wanted a very consistent time signal that we could link up across a number of different setups. Everything's connected to the ethernet, NIST has very good NTP time servers, and so we've taken advantage of that, and that's evolved. In the beginning our time syncing was good to a couple of seconds, and we figured that was good enough. But as time went on we had some projects where things were happening faster, and it would look funny when we would see two different cameras out of sync by a few seconds, and so we wanted to tighten that up, and now we've gotten our time synchronization down to about 50 milliseconds, which is pretty good for data and video.
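A small Python illustration of the epoch-time bookkeeping described above: everything is stored as seconds since 1970-01-01 UTC and only converted to a human-readable form for display. The timestamp is illustrative.

```python
from datetime import datetime, timezone

epoch_seconds = 1_600_000_000.123  # illustrative timestamp with sub-second precision

human = datetime.fromtimestamp(epoch_seconds, tz=timezone.utc)
print(human.isoformat())   # 2020-09-13T12:26:40.123000+00:00
print(human.timestamp())   # back to 1600000000.123
```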

Speaker 1:

These universal time formats are always fun when you paste a large number into Microsoft Excel and it says it's 17 January 1954. I think even in one of the fields of science and medicine it was a source of major data loss - in genetics, I think. I don't remember the exact number, but something like 10% of the papers were compromised by the fact that Excel changed numbers into dates. So there are funny consequences of having these capabilities too.

Speaker 2:

Yeah, we definitely fought with that, and we use LabVIEW-based programs, we use MATLAB for post-processing, we use Excel. We have a lot of different users that use different programs, and we were constantly fighting time conversion issues, and so we went to a time format that was simply an integer of seconds - and microseconds as well. It's not easy to read - we're at 1.6 billion seconds currently - but it is something that's useful to have. It's easy to interpret once you understand it.

Speaker 1:

And since you're gathering so much data across so many differently sized setups, how did you settle on the data structure in general - the format in which you save the data? You mentioned that your input data is in JSON. I know that the outputs are in CSV. So how did you end up with this particular structure? Was it an evolution from how you started, or was there a brainstorm to figure out the best way to save it?

Speaker 2:

I would say internally it was an evolution - basically just the tools that we had available, and it grew over time. Once we decided to publish the database, there were some requirements from security, since we're hosting the data on an Amazon web server. Our security - the NIST security folks - prohibit the use of PDF and Excel; there are certain file formats that are not allowed on that server. They also insisted that the JSON format for the metadata would be critical: if we need to migrate from one host to another, that format makes it very easy to migrate. So this was just working with IT professionals and using their suggestions in terms of data formats.
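To illustrate why JSON metadata makes the records easy to search and migrate, here is a purely hypothetical example of what one experiment's metadata record might look like; the field names below are invented for illustration and are not the actual FCD schema.

```python
import json

# Hypothetical metadata record for one experiment (NOT the real FCD schema).
record = {
    "test_id": "EXAMPLE_2019_042",
    "project": "Example furniture flammability series",
    "calorimeter": "3 MW hood",
    "start_time_epoch_s": 1_560_000_000,
    "peak_hrr_kw": 875.0,
    "total_heat_released_mj": 410.0,
    "data_files": ["EXAMPLE_2019_042.csv"],
    "video_files": ["EXAMPLE_2019_042_overlay.mp4"],
}
print(json.dumps(record, indent=2))
```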

Speaker 1:

And now, the outputs that we can see over the internet - maybe you can give me a walkthrough of what a user can expect to find in the database files? I guess the best way to do that, for anyone listening to the podcast, is to open the database, which is linked in the show notes. But let's try to have a walkthrough of the repository.

Speaker 2:

Sure, maybe I'll just open it up on my screen so I can walk users through it. If you just Google search NIST FCD, it should show up pretty easily for you, or just go to the show notes and click the link, and the database is right up top. There's a brief description of it. There's also a NIST Technical Note which really serves as the basis for this measurement. It was very important for us to get this technical note published, with a very detailed description of our measurement methods and uncertainty. Each of the quantities that are presented here comes along with an uncertainty, and they're all traceable to that NIST Technical Note 2077. So if you're interested in calorimetry, it's a great resource. Anyone that's interested in building a large calorimeter, I would recommend they take a look at that technical note. There's a search box for experiments - if you're curious whether there's anything on trees or rooms or fuel types or whatever, is there polystyrene, you can use that search box to look for a particular experiment. I should also say that right now, in the publicly released database, we have a couple hundred experiments, but soon we're going to be adding another set of data from a project which I believe is 90 more experiments, and we'll be continuing to populate this with experiments. From when we run the experiments in the lab to when all the data is published or released, there can be a little bit of a lag, so we definitely have a lot in the queue that we're hoping to release soon. And then we have a list of projects that have a research overview and a number of experiments.

Speaker 1:

Okay, let's go into that. Let's go and see what's inside. How do you build the interior of that? Because you have a list of different experiments that are grouped by projects. There are different numbers of fires inside, and once someone navigates inside, they finally find the data they've been looking for. So the raw files - let's go through this.

Speaker 2:

Yeah, so you can browse by project and then browse by the experiment name, and quickly take a look at a plot of the heat release rate, a photograph of the setup - which is just a snapshot from the video near the time of ignition - and then some tabulated values such as the peak heat release rate and total heat released. So this is just a tabulated view of all the experiments conducted in that project that can be quickly browsed through.

Speaker 1:

And inside you find the most magical thing of them all, which is the video merged with the heat release rate plots, and I'm a huge fan of that. Matt Hoehler told me it was your idea - you now have a chance to confirm that.

Speaker 2:

Yeah, so it was something that I did on one of the projects I was running, for my own benefit. I've been doing MATLAB programming for reports forever, since before I was a project leader or a supervisor, and those tools that I started developing back 15 years ago just gradually evolved. Then other users said, hey, I'd like to use that for one of my projects - this is great to include in a presentation or to display the data. And so, as more and more people found utility in it, I was happy to share it, and eventually I thought, we've got these tools to overlay data and video - let's go ahead and make these available as part of the database.

Speaker 1:

And how do you build them? Is it a lot of labor, or is it just one click with a program that does it?

Speaker 2:

It's a little bit of setup, and there are a few dozen parameters that are needed to overlay the data on the video, so there's an input file that describes everything that you need. It takes a little bit of time to set that up, but typically you set it up for one test and then you can run them all through. It does take a little bit of time - it's probably not the most efficient program. Basically, it opens a frame of the video, then draws the plot for each one of those frames, and kind of marches through it. So an hour-long video may take five or ten minutes to generate this compressed video-data overlay.
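For a sense of what "opens a frame, draws the plot, marches through" can look like in practice, here is a rough Python sketch of a frame-by-frame overlay using OpenCV and matplotlib; it assumes a placeholder input video and a made-up heat release rate curve, and it illustrates the general approach rather than the NIST MATLAB tool.

```python
import cv2                      # pip install opencv-python
import numpy as np
import matplotlib
matplotlib.use("Agg")           # render plots off-screen
import matplotlib.pyplot as plt

cap = cv2.VideoCapture("input.mp4")   # placeholder experiment video
fps = cap.get(cv2.CAP_PROP_FPS)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter("overlay.mp4", cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height))

t_hrr = np.linspace(0, 1800, 1801)                 # placeholder HRR time series (s)
hrr = 500 * np.exp(-((t_hrr - 600) / 200) ** 2)    # placeholder HRR curve (kW)

frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    t_now = frame_idx / fps
    # redraw the HRR history up to the current video time (slow but simple)
    fig, ax = plt.subplots(figsize=(3, 2), dpi=100)
    ax.plot(t_hrr[t_hrr <= t_now], hrr[t_hrr <= t_now], "r-")
    ax.set_xlim(0, t_hrr[-1]); ax.set_ylim(0, hrr.max() * 1.1)
    ax.set_xlabel("time (s)"); ax.set_ylabel("HRR (kW)")
    fig.tight_layout()
    fig.canvas.draw()
    plot_img = np.asarray(fig.canvas.buffer_rgba())[:, :, :3][:, :, ::-1]  # RGBA -> BGR
    plt.close(fig)
    h, w, _ = plot_img.shape
    frame[10:10 + h, 10:10 + w] = plot_img          # paste the plot in the top-left corner
    out.write(frame)
    frame_idx += 1

cap.release()
out.release()
```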

Speaker 1:

But I assume that now all your work on having a uniform time base pays off, because if you want to make a time lapse, you can make a time lapse and be sure that the frames you pick are correlated with the data that you show. You can speed up, and some videos - I'm sure they came from this - slow back down to real time when something interesting is happening, like quick fire growth.

Speaker 2:

Yeah, so that's one of the options in the program - to slow it to real time, if you choose to do that. And most of these - a lot of the tests are half-hour tests, and I think people don't have the attention span to sit there and watch a full half-hour video of a fire test. It can be very painful.

Speaker 1:

It does sound way more exciting than it really is. And I've found that people's attention span for fire tests, when they just walk into the lab and hear there's going to be a fire test, is about three minutes - that's usually what we get - and then they get bored of the fire, unless it's growing or something's collapsing and stuff is happening. So no one has that kind of time to watch a full video, for sure.

Speaker 2:

Yeah, and if you're the lead researcher, you may be interested in watching a half-hour video of your experiment, but nobody else is. And I'm the same way - I find a lot of value in being able to watch an experiment compressed in a time-lapse fashion, and so most of the videos in the database play in less than a minute. I mean, that's kind of the goal. So if we have a two-hour video, maybe we only extract a frame every five seconds of real time and then play it at 30 frames per second, and that compresses it to something under a minute. Typically there's enough changing that you can see the fire growth and decay and see the overall dynamics in a time period where you can pay attention to it. I find like 30 seconds is kind of the sweet spot for that. And so when you look at the videos in the database, you're not going to find anything that's longer than probably a minute and a half.
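The compression arithmetic behind that is simple; a minimal sketch with the illustrative numbers mentioned above (a two-hour test, one frame kept every five seconds, 30 fps playback):

```python
real_duration_s = 2 * 3600        # 7200 s of experiment
sample_interval_s = 5             # keep one frame every 5 s of real time
playback_fps = 30                 # playback speed of the compressed video

frames_kept = real_duration_s // sample_interval_s   # 1440 frames
playback_s = frames_kept / playback_fps              # 48 s of video
print(f"{frames_kept} frames -> {playback_s:.0f} s of playback")
```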

Speaker 1:

Fantastic. So, for the last topic I wanted to bring up: when you were building the first instance of the database, were you already thinking about potential uses in terms of training artificial intelligence on those data points? Because now you have a very unique set of data with visuals - you literally have the data on heat release rate, you would have data on heat fluxes, and you have video images that are correlated with the data on the same time step - so it's a perfect data set to train on. Was this something that you envisioned from the start, or something that's just catching up with the database now?

Speaker 2:

Yeah, it wasn't something I was necessarily thinking of early on. I mean, my motivation for this was just preserving the data, and what I noticed was that you would have a researcher doing an experiment for one purpose, but the data from that was very useful for some other researcher, and so I would say, hey, so-and-so, go look at this project - they were burning polyurethane foam for this, but you may be interested in some other aspect of the data that was collected. So just being able to share a little more efficiently was my motivation. There were other researchers at NIST that were looking at machine learning, and they saw this database immediately as an opportunity to train models, and so there were a few different researchers here at NIST that started tinkering around with that a little bit and did some preliminary looks at using machine learning to predict heat release rate based on video imagery. I don't believe that work has really gotten very far - I don't think anything's been published on that. It was more just kind of exploratory work. I do know that researchers from outside NIST have contacted me and asked to use this database for machine learning competitions or to train models, and so, you know, I know that is going on. I've had a handful of people contact me, and that's one thing that we probably will do in a future release of the database - allow people to download the whole database - because right now there's not one button you can click to download the entire repository. So, you know, all the data is there and it's available, but I think making it a little bit easier to download all of the files is something that people would benefit from.

Speaker 1:

I also think about using the data that you have collected, especially the visuals, to maybe verify some of the old concepts - like the equations for flame length, or the relation between heat release rate and flame intermittency, or even, you know, concepts of how to measure the flame height. Like, which point in the flame is the flame height? You know, if you watch frame by frame, in every frame the flame is a different height, so how do you average that into some sort of value? So I see an endless world of possibilities to run with the data. So I can just thank you - one, for building this fantastic resource, but also for the openness of NIST and yourself, to not just share but to really put extra effort into making this sharing easier for everyone. It is actually quite astonishing that you care about whether it's easy for people to use it, not just "it's their problem if they want to use it". I appreciate this a lot.

Speaker 2:

I really appreciate that feedback, thank you.

Speaker 1:

And Matt, so what is the next big move for the database? Will it grow? What do you intend it to be like in three, five, ten years?

Speaker 2:

We certainly want to continue to populate the database and have it grow. I also know that there are other laboratories that have a lot of heat release rate data, and we've been talking to researchers at UL and ATF that have large-scale calorimeters, that have a lot of historical data and continue to generate data, and asked if there are ways that we can collaborate - publish a larger database, sort of join forces, or at least have a consistent way of making this data available, you know, to continue to make it easier for the users. I do know that a lot of people have been interested in using the video for quantitative purposes, as you mentioned, and I've gotten inquiries about describing the camera position and camera settings and, you know, length scales and various things that would be important to quantify visual measurements, and that's not something that we've typically done. For a lot of these, we just put a camera down - we don't really document anything about that camera position or about the scale. We've started to do that a little bit more consistently, and actually put a grid up in view of the camera before the test and then describe more metadata about the camera. So that's something that, you know, researchers have asked us about, and we'll try to provide that going forward. And then we also have, in newer versions of the program, the ability to generate composite images. You know, sometimes in our experiments the geometry is complicated and with one camera you can't really see everything, and so we'll have, you know, composite views with three or four different points of view to see the evolution of the fire.

Speaker 1:

This all sounds exciting. Also, any plans for material data alongside those? There's some already. I wonder if, for more complex mockups, you could really share, I don't know, cone calorimeter data from, say, a mattress that you've burned at full scale, so people have a chance of going from the material scale to the item scale. I think that would be very interesting.

Speaker 2:

Sure, so yeah, there is one data set that we should be publishing shortly. I think it's just about ready to go, and Isaac Leventon is the PI on that particular project. All of the materials that are studied at full scale - and these are vertical parallel-panel flame spread experiments - are also documented in the MaCFP project, and so there are very extensive material property data. So I think that particular data set will be very useful for anyone who's trying to look at modeling, or at scaling up results from the different bench-scale apparatus.

Speaker 1:

This is absolutely unique, because it's one of the biggest challenges currently in fire safety engineering: you would like to measure at small scale, because that's cheap and available, and still have a good shot at scaling up, which otherwise means doing large-scale experiments that only a few labs in the world can handle. So if this data sits alongside your data, that's definitely brilliant. However, it's already brilliant as it is.

Speaker 2:

And then another thing I'd like to mention: you know, most of this is really just taking data that NIST was generating for our internal projects and making it available. Another thing that I'd like to do going forward is, you know, ask the fire science and fire engineering community what data would be most useful to them. So we are undertaking an effort in 2024-25, you know, querying SFPE and other practitioners about heat release rate data that would be particularly valuable to their work. And so I'd like to also use this as an advertisement: if there's anyone out there that would really like to see specific data in the database that could be provided, you know, we'd be interested in knowing that. There's no guarantee we'll be able to provide it, and there are limitations in terms of the types of things that we can burn, but we certainly want to know, you know, if there are needs.

Speaker 1:

Absolutely. And do you burn electric vehicles in it?

Speaker 2:

No, we really have not been in that area of electric vehicle fires up to this point. You know, it's certainly something that we have thought about, and there are others that have been working that problem, but to this point that's not an area that NIST has been involved in. But we've certainly been considering it.

Speaker 1:

Well, I'm pretty sure you will be flooded with requests for that, if you ask me. But yeah, people, you heard the man - they are looking for ideas about what types of design fires the engineering community needs, and it's brilliant that NIST is asking for that. So let's give them some great feedback and find out what it actually is that we could use the most to guide our engineering. This is literally an answer to our needs, so let's give them what they need. I certainly appreciate your call for ideas, and I hope some good ideas come from the Fire Science Show audience. Okay, Matt, this was a very insightful, interesting talk, and once again I can just express my gratitude towards NIST and your team for creating this really useful resource. And yeah, thanks for doing that - keep it up, and we appreciate your hard work on this.

Speaker 2:

All right, thank you.

Speaker 1:

And that's it. Just wow - what a guy, what a project. Can you imagine putting in all this effort to build the world's best heat release rate database, on hundreds of experiments that you have in the lab, putting so much work into that, and then worrying that people lack a single button to download it all for their needs, and actually developing that so users have a better experience? Amazing, just amazing. I appreciate it a lot. Matt and NIST, you guys are doing a fantastic job, thank you. And to the Fire Science Show audience: you've heard the man, they're looking for ideas on what to burn in the calorimeter that could be useful for us. I didn't expect this call to action, but it is a stunning one. Imagine - you can just send an email saying you need a particular item that you often lack for your design fires, and they perhaps will burn it and provide you with the best possible data you can have on that fire. I will be very disappointed if the Fire Science Show community does not come up with some good ideas for Matt and NIST on what we need to be burned, and I will be sure to follow up with Matt on this and see what they've burned and what they've learned from it, and report it here on the Fire Science Show. So that would be it for today's episode. Thank you for listening. I hope your summer is going great. Mine is certainly nice, with a lot of traveling and work, but that's quite fun - I enjoy it a lot - and you will perhaps hear some results of these voyages in the podcast, because as I go I meet people, and I always meet someone interesting who is very worthy of being interviewed for the podcast. So good traveling definitely is a good way to find new voices for the show. That's it for today. Thank you so much for being here with me, and next Wednesday, next episode - see you there. Cheers, bye.