Dec. 18, 2024

182 - Bias in fire research


Fire is a highly contextualized problem; therefore, there is no such thing as an unbiased or "objective" fire experiment. This is something many researchers understand, but it is very rarely pointed out. While it is not a problem for fire science (more of a 'feature'), it may become one when the results of scientific experiments are directly applied to real-world engineering cases.

In this episode, I cover biases in research, from general ones to highly specific fire safety engineering biases. The list is long; we cover:

  • selection bias
  • confirmation bias
  • measurement/instrumentation bias
  • publication bias
  • observer bias
  • sampling/data analysis bias
  • conflicts of interest

We also discuss the contextual nature of fire and fire science related to architecture, fuel, ignition, and environmental conditions. We cover experimental design and measurement techniques. While showcasing all those possible sources of uncertainty and error, it is important to highlight that the science is generally very reliable—you just need to know how to use it.

This is the final episode of 2024, so thank you very much for being here with the Fire Science Show, and see you back on January 8th, 2025! Merry Christmas and a Happy New Year to all of you!

----
The Fire Science Show is produced by Fire Science Media in collaboration with OFR Consultants. Thank you to the podcast sponsor for their continuous support of our mission.

Chapters

00:00 - Research Bias in Fire Science

14:12 - Biases in Scientific Research

26:32 - Influential Factors in Fire Experiments

34:21 - Challenges in Fire Science Measurement

43:54 - Future Plans for Fire Science Podcast

Transcript
WEBVTT

00:00:00.221 --> 00:00:02.306
Hello everybody, welcome to the Fire Science Show.

00:00:02.306 --> 00:00:04.030
Merry Christmas, happy New Year.

00:00:04.030 --> 00:00:11.743
It's the last episode of the season of the Fire Science Show, so it's a good time to wish you all the best for 2025.

00:00:11.743 --> 00:00:27.344
I'm going for a hopefully rightfully earned holiday break, so in today's episode we're going to summarize a little bit of this year, but that's going to happen at the end of the episode and first I'm going to give you some technical content that you came here for in the first place.

00:00:27.344 --> 00:00:38.152
In today's episode I am going to talk about research bias and the challenges in carrying out high-level research in fire science overall.

00:00:38.652 --> 00:00:47.640
My mission in the Fire Science Show is to bridge engineers and scientists, and I believe the podcast is actually quite successful at bringing those communities together.

00:00:47.640 --> 00:01:07.081
I see more engineers using science-based methods and I truly believe that science-backed engineering is the only engineering that makes sense, like we need to be deeply embedded within scientific principles, experiments, confirmed theories and so on if we want to do good engineering.

00:01:07.081 --> 00:01:14.828
But I also understand, as a researcher and someone who's practicing engineering, that research is very rarely straightforward.

00:01:14.828 --> 00:01:16.795
Research is very rarely easy.

00:01:16.795 --> 00:01:20.483
It's not something that gives you immediate answers.

00:01:20.483 --> 00:01:37.593
There are things you have to understand if you want to benefit the most from the findings in the scientific literature, and many people would not consider those things, but I think they are very important to consider.

00:01:37.593 --> 00:01:59.933
And those biases, those things that influence how research is being done, what it means for the research itself and what it means for its applicability, all of them will significantly influence the usefulness of the studies that you use in your engineering, and that's what this podcast episode is all about.

00:01:59.933 --> 00:02:04.671
I'm going to talk about some general research biases that are true for any discipline.

00:02:04.671 --> 00:02:10.893
I'm going to talk about specific fire safety engineering and fire science biases that are highly specific to our discipline.

00:02:10.893 --> 00:02:16.192
So, yeah, it's a good one, and it's the last one this year, so you don't want to miss it.

00:02:16.192 --> 00:02:19.028
Let's spin the intro and jump into the episode.

00:02:23.721 --> 00:02:25.266
Welcome to the Fire Science Show.

00:02:25.266 --> 00:02:28.789
My name is Wojciech Węgrzyński and I will be your host.

00:02:28.789 --> 00:02:48.290
This podcast is brought to you in collaboration with OFR Consultants.

00:02:48.290 --> 00:02:51.248
OFR is the UK's leading fire risk consultancy.

00:02:51.248 --> 00:03:02.073
Its globally established team has developed a reputation for preeminent fire engineering expertise, with colleagues working across the world to help protect people, property and environment.

00:03:02.073 --> 00:03:08.526
Established in the UK in 2016 as a startup business of two highly experienced fire engineering consultants,

00:03:08.526 --> 00:03:17.907
the business has grown phenomenally in just seven years, with offices across the country in seven locations, from Edinburgh to Bath, and now employing more than a hundred professionals.

00:03:17.907 --> 00:03:29.540
Colleagues are on a mission to continually explore the challenges that fire creates for clients and society, applying the best research, experience and diligence for effective, tailored fire safety solutions.

00:03:29.540 --> 00:03:40.211
In 2024, OFR will grow its team once more and is always keen to hear from industry professionals who would like to collaborate on fire safety futures.

00:03:40.211 --> 00:03:43.548
This year, get in touch at ofrconsultants.com.

00:03:43.548 --> 00:03:52.539
Okay, it seems that in 2024, there are going to be very limited opportunities to join OFR if you would like to choose that as your career path.

00:03:52.539 --> 00:03:59.561
But worry not, in 2025, OFR is also going to recruit for a ton of positions, so keep your eyes open.

00:03:59.561 --> 00:04:03.668
Thanks, OFR, for supporting the Fire Science Show throughout 2024.

00:04:03.668 --> 00:04:07.435
And I am very happy that this will continue in 2025.

00:04:08.060 --> 00:04:13.473
Now for today's episode on research biases, and where the inspiration came from.

00:04:13.473 --> 00:04:27.886
So a year ago I published a podcast episode here about design fires for car parks, where we talked about how we need to change the paradigm of the design fire for car parks, and when I was doing that podcast episode,

00:04:27.886 --> 00:04:40.242
we were finishing a research paper back then about our literature review of experiments on car park fires, and this paper eventually was written.

00:04:40.242 --> 00:04:59.648
We submitted it to Fire Technology and some time ago, like two weeks ago, we received an acceptance of that paper with some minor corrections, which were implemented, and I hope that the paper will soon be published.

00:05:00.348 --> 00:05:15.591
I've realized that there are a lot of resources out there in the scientific literature that look like something you could directly use in your engineering, but actually when you go deep into them it becomes very, very challenging.

00:05:15.591 --> 00:05:20.632
It's actually very difficult to use them in your everyday practice.

00:05:20.632 --> 00:05:23.168
And where do those challenges come from?

00:05:23.168 --> 00:05:26.389
Why would research not be directly useful?

00:05:26.389 --> 00:05:37.759
So if you're looking for something very simple like, let's say, heat of combustion of a specific material, we have very good, very well established methods to estimate the heat of combustion.

00:05:37.759 --> 00:05:53.021
You can choose your sample very clearly and very obviously to say, okay, I am looking for this type of plastic material or this type of timber at this density, and you can put it into a bomb calorimeter and get your heat of combustion.

00:05:53.021 --> 00:05:56.194
That's not something that would carry a lot of bias.

00:05:56.194 --> 00:05:59.244
But when you think about applying that knowledge in practice?

00:05:59.244 --> 00:06:03.141
What does the heat of combustion number mean to the real world problem?

00:06:03.141 --> 00:06:07.927
I mean, it's the potential heat of combustion; it's the maximum that can be released.

00:06:07.927 --> 00:06:09.766
It cannot go beyond that number.

00:06:09.766 --> 00:06:11.266
But will you reach that number?

00:06:11.266 --> 00:06:13.500
That's another question.

00:06:14.040 --> 00:06:25.745
Now you get into combustion efficiencies and chemical reactions, vitiated or well-ventilated conditions, the temperatures; materials can burn in different ways.

00:06:25.745 --> 00:06:38.630
So with even such a simple thing as heat of combustion, even though you get some unbiased number and some result of an experiment, of a test, it's not a guarantee that you nail it 100% in the real world.
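
The gap the host describes can be written down directly: the standard way to go from the bomb-calorimeter number to a real fire is to scale the complete heat of combustion by a combustion efficiency and the mass loss rate. A minimal sketch in Python; the fuel properties, the mass loss rate and the 0.7 efficiency are illustrative assumptions, not values from any cited test.

```python
# Effective heat release rate: Q = chi * m_dot * delta_Hc.
# All input numbers below are invented, illustrative values.

def effective_hrr_kw(mass_loss_rate_kg_s, heat_of_combustion_mj_kg, combustion_efficiency):
    """Scale the complete heat of combustion by an efficiency chi; result in kW."""
    return combustion_efficiency * mass_loss_rate_kg_s * heat_of_combustion_mj_kg * 1000.0

# A plastic-like fuel: ~25 MJ/kg complete heat of combustion from a bomb
# calorimeter, burning at an assumed 0.04 kg/s mass loss rate.
complete = effective_hrr_kw(0.04, 25.0, 1.0)  # idealised: chi = 1
vitiated = effective_hrr_kw(0.04, 25.0, 0.7)  # assumed chi = 0.7 for under-ventilated burning

print(f"upper bound {complete:.0f} kW vs effective {vitiated:.0f} kW")
```

The bomb-calorimeter value only fixes the upper bound; which efficiency actually applies depends on ventilation and burning conditions, which is exactly the contextual part.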

00:06:38.630 --> 00:06:45.608
Now if you think about more complicated stuff, like you wonder, how does a train burn in a tunnel?

00:06:45.608 --> 00:06:47.473
That's quite a challenge.

00:06:47.473 --> 00:06:54.560
There have been people who've burned trains and yes, we do have some heat release rate curves from that.

00:06:54.560 --> 00:06:56.185
We have some emissions, from that.

00:06:56.185 --> 00:06:58.048
We know what radiation is.

00:06:58.048 --> 00:07:01.543
You can approximate some sort of a fire spread in the train.

00:07:01.543 --> 00:07:05.992
Yes, we have data points like that. Very few, but we have them.

00:07:05.992 --> 00:07:09.831
But how generalizable are those numbers to the real world?

00:07:09.831 --> 00:07:13.807
Like, can you say that all trains in the world will burn in the same way?

00:07:13.807 --> 00:07:14.411
Of course not.

00:07:14.411 --> 00:07:21.973
Every fire is going to be different because there are so many drivers to a fire, that there are countless, countless ways a thing can burn.

00:07:21.973 --> 00:07:32.500
And yet we have this little tiny nugget of information, the only information you have to allow you to choose something you design with.

00:07:32.500 --> 00:07:57.932
Now, when you work with such a scarce data set on a very complicated problem, in the highly specific context of a building, this is when an engineer really has to understand the research biases and understand how the research was performed, to adapt and adjust the outcome of that research, or apply it, in their engineering.

00:07:57.932 --> 00:08:07.584
But when processing the outputs, when considering the results of their study, they need to fully understand the complexities and the uncertainties in those results.

00:08:08.185 --> 00:08:15.869
In the paper I've mentioned we were looking into design fires, obviously for car parks, and looking into the literature.

00:08:15.869 --> 00:08:21.228
We've noticed a lot of electric vehicle experiments in the last 10 years.

00:08:21.228 --> 00:08:22.791
Like a lot of them, a lot of them.

00:08:22.791 --> 00:08:26.670
We really have a lot of data on electric vehicles and fires.

00:08:26.670 --> 00:08:36.440
Now what's interesting in that study is that we've also noticed that most of those tests would be ignited from underneath the battery.

00:08:36.440 --> 00:08:49.303
The fire would attack the battery and you almost have no results from fires in passenger compartments, nothing on engine fires, nothing on electrical cable fires in the car, just batteries.

00:08:49.784 --> 00:09:07.125
Now, if you think about it, if I get a vehicle to burn which costs $100,000 or whatever, and I have this one shot, one opportunity to really get it done, you know, to get my data point and get it published, I want to have a fire.

00:09:07.125 --> 00:09:08.525
That's meaningful, right.

00:09:08.525 --> 00:09:17.556
And if you think about electric vehicle fires, there's battery fire, right, that's what we're interested in, not just, you know, internal compartment fires, right.

00:09:17.556 --> 00:09:23.551
But in the end, because all of the researchers are doing it like that, we have no clue.

00:09:23.551 --> 00:09:26.381
How does fire spread from the passenger compartment into the battery?

00:09:26.381 --> 00:09:34.908
We have no clue, no idea how much the battery is attacked when the fire is in the vehicle and what it takes to ignite the battery.

00:09:34.908 --> 00:09:38.067
And if you talk to the researchers, you know it takes a lot to ignite the battery.

00:09:38.067 --> 00:09:43.846
We've done our own research on electric vehicles and, yes, it takes a lot to ignite the battery.

00:09:43.846 --> 00:09:59.354
So the viewpoint in the literature is kind of skewed towards very specific publishable results, publishable science, which is not really a true, objective image of the reality out there.

00:09:59.354 --> 00:10:04.167
And these are exactly biases that I want to show you in this episode.

00:10:04.567 --> 00:10:06.131
Another one ventilation in tunnels.

00:10:06.131 --> 00:10:07.764
Ah, that's a good one.

00:10:07.764 --> 00:10:16.369
Actually, whenever you see a study on smoke control in tunnels, magically there's no flow in the tunnel; they always happen in quiescent conditions.

00:10:16.369 --> 00:10:34.212
Especially if we're talking about CFD or about modelling, small-scale modelling, you always have this perfect axisymmetric buoyant smoke plume: vertical, touching the ceiling, splitting in half, smoke going in both directions, and this is how they are studied. But what about a real tunnel?

00:10:34.212 --> 00:10:38.605
I've done hundreds of experiments in real tunnels when we've commissioned them.

00:10:38.605 --> 00:10:43.765
Maybe two or three times in my life I've seen quiescent conditions in the tunnel. Like, they don't happen.

00:10:43.765 --> 00:10:55.370
A tunnel is a place where you have immense flow. Always, always there's a flow, and especially if you consider a tunnel under traffic, the traffic will create flow whether you like it or not.

00:10:55.370 --> 00:11:04.149
So actually, the probability that you will start your fire in quiescent conditions, zero meters per second, is, in my opinion, zero.

00:11:04.149 --> 00:11:14.192
The science is to some extent irrelevant, because the bias in the initial conditions prevents you from comparing it to real-world conditions.

00:11:14.192 --> 00:11:20.919
So I'm really triggered by this and, yes, I do my CFD for projects like this because that's the expectation of the market.

00:11:20.919 --> 00:11:24.571
But I understand that it's not the real image of the world.

00:11:24.571 --> 00:11:32.770
I say that especially about model studies, because if you do full-scale tunnel research, you have initial velocities; you cannot escape them.

00:11:32.770 --> 00:11:34.166
That's the real world.

00:11:34.860 --> 00:12:02.653
Anyway, in this episode I wanted to highlight some general biases, true for any science, true for any experiment, and I'll start with those. Then I'll try to nail down why we have some fire-specific biases, what's so special about fire safety, fire science and fire safety engineering that introduces some specific things to our discipline that perhaps are not present in other disciplines, and why you should pay attention to them.

00:12:02.653 --> 00:12:07.496
So for the general research biases, I have a list of like 10 of them.

00:12:07.496 --> 00:12:15.187
Let's go through the first five then. On my list are selection bias, confirmation bias, measurement bias, instrumental bias and publication bias.

00:12:15.187 --> 00:12:17.952
So let's start with this group of biases.

00:12:17.952 --> 00:12:19.916
So first, selection bias.

00:12:19.916 --> 00:12:55.335
This means there's a bias in how the sample we chose for our research, or in general the experiment we carry out, is not representative of the broader population of something. Imagine you're doing studies on batteries: you're burning one specific type of battery chemistry and you claim that all lithium-ion batteries burn the same, when we know that NMC will burn differently from iron phosphate batteries. And the reason for this bias is sometimes the availability of the material.

00:12:55.335 --> 00:13:01.826
So sometimes it's very tough to get the material that you would really want to work with or that would be really representative.

00:13:01.826 --> 00:13:07.001
It can be the sheer breadth of materials in the world.

00:13:07.001 --> 00:13:10.065
There could be hundreds of types of specific material.

00:13:10.065 --> 00:13:16.768
It's very, very difficult to choose a representative sample and you can just have your own preferences.

00:13:16.768 --> 00:13:18.191
Well, that's a different bias.

00:13:18.191 --> 00:13:33.053
But in general, if you select a specific material and it's not representative of generalized real-world conditions, that's a selection bias, and you have to look into the papers to see how the materials were chosen.

00:13:33.664 --> 00:13:36.214
Another one is called confirmation bias.

00:13:36.214 --> 00:13:42.994
That's an interesting one and that comes to the way how we pursue science as scientists.

00:13:42.994 --> 00:13:59.591
So we're humans, we love what we are doing, but this is our life, these are our jobs, and as with every job, you have some job security concerns; you have to do it well to be a scientist.

00:13:59.591 --> 00:14:12.427
And because we're scientists, we're also quite connected with the ideas we come up with in science.

00:14:12.447 --> 00:14:17.538
So sometimes you would have some sort of hypothesis or a belief, going into the science, going into the experiment.

00:14:17.538 --> 00:14:27.851
And the confirmation bias means that you look at the experiment, you look at the data that you have through that hypothesis that you formulated beforehand.

00:14:27.851 --> 00:14:38.339
So it's easier to confirm a pre-existing belief than find something new, and that can lead to overlooking critical findings.

00:14:38.339 --> 00:14:54.595
And this is quite interesting, because many times we look at experiments done by someone else and we start seeing things that those people have not seen, that those people have missed because of this confirmation bias.

00:14:54.595 --> 00:15:03.197
So scientists tend to be very favorable towards their own ideas and sometimes you find that reflected in the experiments.

00:15:03.197 --> 00:15:05.489
It's always best to look at raw data.

00:15:05.489 --> 00:15:12.067
It's always best to try and come up with your own conclusions from the data that you see.

00:15:12.067 --> 00:15:17.726
Of course it's challenging because of the context of the study, so you need to understand that as well.

00:15:17.726 --> 00:15:20.894
But yeah, confirmation bias is a real thing.

00:15:21.455 --> 00:15:25.849
Another two: measurement bias and instrumental bias.

00:15:25.849 --> 00:15:33.907
I would say those are similar and they come from how a technology is used to measure.

00:15:33.907 --> 00:15:47.796
So measurement bias would be some sort of systematic error in measurement tools, let's say the uncertainty of your measurements, and instrumental bias would be what kind of technology was available to you when you pursued an experiment.

00:15:47.796 --> 00:16:00.655
Both will be very important in fire science as well, because we have very limited tool set to measure fires and I'll actually come back to those when we go to the fire specific part of the episode.

00:16:01.157 --> 00:16:03.826
The next I have is publication bias.

00:16:03.826 --> 00:16:10.139
So publication bias is something that we observe in scientific literature as a whole thing.

00:16:10.139 --> 00:16:15.236
So this problem is actually quite deep and very, very challenging.

00:16:15.236 --> 00:16:25.419
The thing is that in modern science, as I said, we're scientists, we have to live our lives, earn our living, and we do that by publishing right.

00:16:25.419 --> 00:16:37.167
So as a scientist, there's kind of this expectation that you will publish papers, and that shapes what you write and the way you do your papers, because creating a paper is a lot of work.

00:16:37.167 --> 00:16:38.591
It's quite challenging.

00:16:38.591 --> 00:16:41.927
Actually, you want to create a paper that's going to get published, right.

00:16:41.927 --> 00:16:52.366
Realistically, you don't want to write a paper that will never be published, and it is actually very difficult to publish experiments that failed, for example.

00:16:52.366 --> 00:17:13.472
So publication bias means that mostly positive or successful experimental results get published within the literature; there are very few inconclusive results and very, very few negative results in the literature. Especially if you had a hypothesis and you'd done an experiment

00:17:13.472 --> 00:17:16.326
that gives you no conclusive answer to that hypothesis,

00:17:16.326 --> 00:17:22.659
that's very challenging to publish. And I would add to that the reproduction crisis.

00:17:22.659 --> 00:17:40.397
So in medicine, when someone publishes a study in a peer-reviewed journal, that's great, but the study is not really used by the community until someone repeats that experiment and publishes a confirmation study that yes, we did the same and it actually worked the same.

00:17:40.397 --> 00:17:42.432
In our science we don't have that.

00:17:42.432 --> 00:17:45.335
We almost have no reproducible research.

00:17:45.335 --> 00:17:53.115
There are very, very few attempts at repeating other people's experiments just to check whether they work or not.

00:17:53.115 --> 00:18:00.617
I would call that a crisis because this is something very fundamental and very needed for modern science.

00:18:00.617 --> 00:18:09.298
But more than that, not having negative outcomes, not having null outcomes in the science, is something extremely challenging.

00:18:09.404 --> 00:18:13.896
And you know I love the story of how the cosmic background radiation was discovered.

00:18:13.896 --> 00:18:31.508
I'm not sure if you know that study, but there were some guys who were trying to build a really precise radio telescope and they wanted to, like, reduce any kind of noise that would be around that telescope, you know, to have the cleanest, sharpest radio images of the universe.

00:18:31.508 --> 00:18:34.815
And they've systematically cleaned everything.

00:18:34.815 --> 00:18:43.605
There was no source of interference with their telescope and they were shooting pictures of the sky and they were getting this noisy, noisy data.

00:18:43.605 --> 00:18:50.550
It was super annoying to them, so you could call that a failed experiment, a negative result.

00:18:50.550 --> 00:19:13.528
Right, they wanted to have a perfect telescope, but they had a lot of noise in their measurements. And then someone pointed them to some other scientists, and they found out that the noise is actually the cosmic background radiation, the remnant of the Big Bang, and they went and got the Nobel Prize for that.

00:19:13.548 --> 00:19:26.972
So a negative result turned into a Nobel Prize, and you never know where your science can go; because of the biases related to your hypothesis, the confirmation bias for example, you cannot really figure that out in advance.

00:19:26.972 --> 00:19:33.271
So even if your result looks negative at first, it could actually be a positive result for someone else.

00:19:33.271 --> 00:19:40.435
So I would highly urge everyone to publish everything you get, even if you cannot publish it in peer-reviewed journals.

00:19:40.435 --> 00:19:52.037
I would highly appreciate it if peer-reviewed journals published this type of paper, and I'm confident that if you write a good, convincing editorial letter, it will be considered for publication.

00:19:52.037 --> 00:20:01.867
But even if not, then perhaps using open access to publish those results could be good, and this is something I truly believe we would need more in fire science.

00:20:01.867 --> 00:20:10.817
Wow, this was a longer one, but it's a big problem in science that people do not really get and it's something that really slows us down.

00:20:11.525 --> 00:20:17.045
Another one is observer bias, so it's kind of related to the confirmation bias previously.

00:20:17.045 --> 00:20:25.059
It means that the observer can unconsciously influence or interpret the results in a way that aligns with their expectations.

00:20:25.059 --> 00:20:36.005
It's basically the systematic error with which the researcher interprets visual data, their observations, and yes, in fire we have a lot of observations.

00:20:36.005 --> 00:20:41.096
So even a simple thing like: did flashover occur in an experiment?

00:20:41.096 --> 00:20:42.286
I've been in those discussions.

00:20:42.286 --> 00:20:49.172
So that's an observer bias, someone introducing some errors to the study based on their experience.

00:20:49.805 --> 00:20:55.292
The next would be stuff that's related to data analysis, so sampling bias, data analysis bias.

00:20:55.292 --> 00:20:59.215
This means that the way you process data can give you different outcomes.

00:20:59.215 --> 00:21:03.713
There were great papers from the University of Edinburgh on how to clean noisy data.

00:21:03.713 --> 00:21:07.236
The data in fire is always noisy because the fires are turbulent in their nature.

00:21:07.236 --> 00:21:14.951
So whatever you measure, it's probably going to be a noisy measurement and you want these nice clean lines to be used in your engineering.

00:21:14.951 --> 00:21:27.390
So you have to clean that data, and the way you sample it, the choice of methodology used to clean the data, will highly, highly influence what outcomes you get.
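
That sensitivity to the cleaning choice can be shown in a few lines. A minimal sketch with an invented noisy heat release rate trace (all numbers are illustrative, not from any experiment): the raw signal and two moving-average windows each report a different peak.

```python
import math

# Synthetic HRR trace: t-squared growth to a 1 MW plateau, plus a deterministic
# high-frequency oscillation standing in for turbulent "noise".
dt = 0.1                                          # sampling interval, s
t = [i * dt for i in range(1200)]                 # 0 .. 119.9 s
clean = [min(1000.0, 0.1 * ti ** 2) for ti in t]  # kW, reaches 1 MW at 100 s
raw = [q + 80.0 * math.sin(7.3 * ti) for q, ti in zip(clean, t)]

def moving_average(x, window):
    """Average over full windows only; the window length is the analyst's choice."""
    return [sum(x[i:i + window]) / window for i in range(len(x) - window + 1)]

peak_raw = max(raw)                       # turbulent noise inflates the peak
peak_5s = max(moving_average(raw, 51))    # 5.1 s window: noise mostly cancels
peak_30s = max(moving_average(raw, 301))  # 30.1 s window: smears the growth phase too

print(f"peak HRR: raw {peak_raw:.0f} kW, 5 s {peak_5s:.0f} kW, 30 s {peak_30s:.0f} kW")
```

None of the three peaks is 'the' answer; the point is that the smoothing choice itself changes the number you report.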

00:21:27.390 --> 00:21:29.454
Another one: environmental bias.

00:21:29.454 --> 00:21:31.849
We're going to come back to that in the fire part as well.

00:21:31.849 --> 00:21:40.438
This is how the environment affects the experiment, and this is critical for fire science, so I'll allow myself to talk about it later.

00:21:41.085 --> 00:21:42.469
One more is repetition bias.

00:21:42.469 --> 00:21:50.755
So people focus on specific results of single experiments, ignoring the inconsistent or divergent results.

00:21:50.755 --> 00:21:57.207
So this is very interesting, and this goes back perhaps to the study by David Morrisset when he was burning chairs.

00:21:57.207 --> 00:21:58.690
He's burned 25 chairs.

00:21:58.690 --> 00:22:07.083
They roughly aligned, but he was interested in why they diverge and while doing that he found some really, really interesting stuff.

00:22:07.083 --> 00:22:08.487
There's a podcast episode about that.

00:22:08.487 --> 00:22:21.115
So really looking into why there is scatter in your results, where it comes from and what it represents, is critical for science, and sometimes scientists would just go over that and not analyze it.

00:22:21.115 --> 00:22:24.434
They would flatten it out, average the results and just publish them.

00:22:24.434 --> 00:22:29.114
It is very important to understand that this can happen.
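
That flattening is easy to sketch with invented numbers: five hypothetical repeats of the same burn, one of which diverges strongly. The mean alone hides the outlier; the spread exposes it.

```python
import statistics

# Peak HRR (kW) from five hypothetical repeats of the same test; the values
# are invented for illustration, with one repeat diverging strongly.
peaks = [380.0, 395.0, 372.0, 388.0, 610.0]

mean = statistics.mean(peaks)        # ~429 kW: the flattened-out summary
stdev = statistics.stdev(peaks)      # large, because of the divergent repeat
spread = max(peaks) - min(peaks)     # 238 kW between the extremes

print(f"mean {mean:.0f} kW, stdev {stdev:.0f} kW, range {spread:.0f} kW")
```

Reporting the scatter alongside the mean is what keeps the divergent repeat visible instead of averaged away.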

00:22:29.875 --> 00:22:36.795
And the final one on my list is conflict of interest or funding bias, and yeah, it's a real thing.

00:22:36.795 --> 00:22:44.546
So if you are funded by someone, there is unfortunately some possibility that the research could have been influenced by these people.

00:22:44.546 --> 00:22:46.250
Usually it's not actually.

00:22:46.250 --> 00:23:01.618
Having worked with a lot of funders, a lot of companies, private companies, private entities I've never been in a situation where they would try to silence me or when they would try to use their power of money to actually do some impact over my work.

00:23:01.618 --> 00:23:05.040
But I'm not sure how generalizable that is.

00:23:05.040 --> 00:23:22.313
Maybe it's just our case, but science knows this problem of funding bias. Especially in medicine there were a lot of conflicts of interest; the tobacco industry, that was a massive one, the oil industry and climate change, silencing some research, pushing different narratives.

00:23:22.313 --> 00:23:24.567
Humanity knows those stories.

00:23:24.567 --> 00:23:29.785
So it's also something to look for when you are dealing with scientific research.

00:23:30.426 --> 00:23:36.884
And as for those biases, funding sources and sponsors should always be named in the papers, so you should know.

00:23:36.884 --> 00:23:47.226
And one thing we do to avoid that bias: we try to go with governmental funding, and whenever we need materials for experiments, we purchase them.

00:23:47.226 --> 00:23:50.851
We very rarely take donations of materials for experiments.

00:23:50.851 --> 00:23:57.770
I prefer to purchase them to have a real market sample and be kind of independent from the sponsors of the study.

00:23:57.770 --> 00:24:02.952
But I know not everyone can have this luxury of being financially independent in their research.

00:24:02.952 --> 00:24:04.846
So yeah, look for that bias.

00:24:04.846 --> 00:24:11.727
So those were 10 biases that are kind of general to the research, general to science.

00:24:11.727 --> 00:24:21.932
Every single branch of science, every single experiment could be burdened with those biases, and I've tried to nail down how they relate to fire science.

00:24:21.932 --> 00:24:26.126
I leave it to you to figure out how relatable they are.

00:24:26.126 --> 00:24:30.664
And there will be some biases that are highly related to fire science and we'll go into those.

00:24:30.779 --> 00:24:32.446
But first let's set up the stage.

00:24:32.446 --> 00:24:37.491
So what would an objective experiment in fire science look like?

00:24:37.491 --> 00:24:41.811
Is it even possible to have an objective study in fire science?

00:24:41.811 --> 00:24:49.431
And if you think about it deeply, a fire problem is an extremely contextualized problem.

00:24:49.431 --> 00:24:54.191
In general, fire behavior and fires are extremely contextualized.

00:24:54.191 --> 00:25:01.354
This context comes from the architecture or the environment in which the fire is occurring.

00:25:01.354 --> 00:25:16.151
So the place, the physical constraints around your fire, the airflow introduced by the geometry that you're studying fire in, will extremely affect the way how the fire behaves.

00:25:16.151 --> 00:25:26.646
If you have a fire in an open space, in quiescent conditions, imagine having an aircraft hangar with stable air, nothing interfering with your fire.

00:25:26.646 --> 00:25:31.631
In there you burn a pile of material and you get this beautiful axisymmetric buoyant smoke plume.

00:25:31.631 --> 00:25:39.413
Yeah, that is something that's perhaps unaffected by architecture, but is this representative of the real world?

00:25:39.413 --> 00:25:44.092
Like in every single building, that fire would be affected by architecture in some way.

00:25:44.092 --> 00:25:54.567
You could have wall interactions, you could have interactions with the ceilings, you will have heat feedbacks, you will have flow feedbacks, you will have changes in oxygen concentration and so on and so forth.

00:25:54.567 --> 00:26:06.601
So the architecture will always influence our fires and it's not a problem, it's a feature, it's a characteristic of the space of engineering that we're in.

00:26:06.601 --> 00:26:20.471
Fires are extremely contextualized and it's difficult to present fire in an objective context. Every fire has its own context, and the difference between those two must be very well understood by the fire engineer.

00:26:32.160 --> 00:26:35.567
Another thing that goes into this is fuel and the ignition location.

00:26:35.567 --> 00:26:43.540
So, again going back to Morissette's work, the way you ignite will highly influence how the fire spreads over your item.

00:26:43.540 --> 00:26:58.388
The location of the ignition, the energy of the ignition, the material properties: all of this will introduce some sort of bias into the study and will set the fire on a very specific course. If you ignite it in a different way, you'll get a different outcome.

00:26:58.388 --> 00:27:06.986
Hell, if you ignite it like in the same place, same way, you may get a different outcome, of course, but if you ignite it in a different way, the outcomes will be different.

00:27:06.986 --> 00:27:18.521
And now let's think: is there an objective way to ignite an item in research? Because imagine I have a car to burn down.

00:27:18.521 --> 00:27:22.431
Is there an objective choice of the location where I put my source of ignition in that car to say, yes, cars burn like this?

00:27:22.431 --> 00:27:23.252
Of course not.

00:27:23.252 --> 00:27:25.329
You can't do it with one sample.

00:27:25.329 --> 00:27:35.643
I probably would have to burn hundreds of them to have some kind of distribution of the causes of fires and then understand how different ignition locations behave.

00:27:35.643 --> 00:27:39.050
But I don't have $100 million to pursue such a study.

00:27:39.050 --> 00:27:39.873
No one has.

00:27:40.400 --> 00:27:50.711
This is why the engineers and researchers who carry out those tests in the research laboratories have to pick something, and what they pick will highly influence the outcomes of their studies.

00:27:50.711 --> 00:27:52.343
You have to pay attention to that.

00:27:52.343 --> 00:27:54.391
That's the case with electric vehicles and fires.

00:27:54.391 --> 00:28:04.205
You will only find experiments in which the battery is directly attacked, whereas in the real world this might not be the most representative scenario.

00:28:04.205 --> 00:28:06.190
Even if the fire starts in the battery,

00:28:06.190 --> 00:28:12.509
it's still a different cause of fire than when you put a two-megawatt heptane fire underneath your battery, right?

00:28:12.509 --> 00:28:17.288
So the fuel and ignition will play an immense role in the fire experiments.

00:28:17.808 --> 00:28:20.163
Last one, the conditions in which the burning is happening.

00:28:20.163 --> 00:28:25.761
And you could have a whole range, from extremely well-controlled fire laboratory conditions.

00:28:25.761 --> 00:28:28.989
You control temperature, humidity, airflow, everything.

00:28:28.989 --> 00:28:37.854
You have the perfect setting, all the way up to real-world studies where we burn buildings outdoors in whatever weather we can find.

00:28:37.854 --> 00:28:41.009
It even matters if it rained the day before.

00:28:41.009 --> 00:28:44.509
Humidity of the air will be a factor in the experiment.

00:28:44.509 --> 00:28:46.465
Wind will be a massive factor.

00:28:50.189 --> 00:28:53.983
I used to think that if you have wind less than, let's say, two meters per second, there's no direct impact, but that's not true.

00:28:53.983 --> 00:29:04.182
Even a very low wind velocity, even one meter per second, can already change the flow paths in your experiment and, because of that, indirectly influence the experimental outcomes.

00:29:04.182 --> 00:29:10.288
We've seen that happen in full-scale experiments and it's something that you need to understand when you look through the study.

00:29:11.029 --> 00:29:40.532
The thing is, I don't want to suggest that it's impossible to carry out good studies, because it is possible. It's about reporting those things, reporting the context in which the study happened. The more of that context is present in the research paper, the easier it is for an engineer, or anyone who wants to use the study, to compare the context in which the study was pursued with the context of the real-world problem they're trying to solve with that data.

00:29:40.532 --> 00:29:48.294
It is critical that we have this connection between how the experiment was carried out and where you want to use it.

00:29:48.294 --> 00:29:57.413
It's not easy to transfer that knowledge, and if you're not a scientist, you probably should not try to fiddle with the data.

00:29:57.413 --> 00:30:04.903
You cannot say, oh yeah, let me add one megawatt because of this and that that's going to be very difficult to justify to your authorities.

00:30:04.903 --> 00:30:33.990
But understanding the context, and being able to say, okay, this represents a very unlikely fire of a battery, which is perhaps the largest fire I can get from this source, and comparing this with an interior car fire of the closest example from a combustion engine vehicle, and saying, okay, this is what a course of fire inside the passenger compartment looks like, and providing another research point: that's an approach to pursue.

00:30:33.990 --> 00:30:40.700
Also, if we go back to the car park problem and the paper we've written, it also matters how the experiments were run.

00:30:40.700 --> 00:30:42.565
Was it open-air calorimetry?

00:30:42.565 --> 00:30:44.569
Was it something that mimicked the car park?

00:30:44.569 --> 00:30:54.790
There are very few studies where the vehicles would be burned down in something that resembles a car park, and yet the feedbacks will influence the course of the fire.

00:30:54.790 --> 00:31:02.534
So now, if we go deeper into fire experiments, some things that will influence the outcomes.

00:31:03.099 --> 00:31:09.548
Experimental design itself is very challenging in fire experiments.

00:31:09.548 --> 00:31:16.528
That means laying out the measurement system in your fire experiment is actually quite tough, to be honest, for multiple reasons.

00:31:16.528 --> 00:31:22.788
First, the measurement systems are quite expensive usually and they're usually rather scarce.

00:31:22.788 --> 00:31:29.823
So you have to be very picky about where to put your thermocouples and other instruments in your compartment fire.

00:31:29.823 --> 00:31:42.891
There were some crazy attempts where people used thousands of thermocouples to map everything inside compartments, but then again it took them years to process that data and it was very challenging to actually work with that amount of data.

00:31:42.891 --> 00:31:44.603
So it brings new challenges.

00:31:45.163 --> 00:32:10.749
It's not a simple solution to use hundreds or thousands of data points. Sometimes we even say that you have to burn an experiment once to see where to put your measurement systems, and then you burn it a second time, hoping it's going to go the same way, because now you know where to put your thermocouples. And countless times I've had this problem where I misplaced a thermocouple, by maybe 20 centimeters.

00:32:10.749 --> 00:32:23.771
We had that with one of the green wall research projects, where I had a beautiful line of plate thermometers running through the geometric middle of my sample, from the bottom of the facade to the top of the facade.

00:32:23.771 --> 00:32:31.690
It was like five continuous meters of measurements, really beautifully outlined in the perfect location, as it seemed.

00:32:31.690 --> 00:32:46.844
And when we had ignition, the fire burned up like 10 centimeters to the left of this line of thermocouples, and I don't have a good outcome from those measurements because the fire did not go over the thermocouples where I thought it would.

00:32:46.844 --> 00:32:57.669
So that's a measurement bias, or rather an instrumentation bias, that I accidentally introduced into my study.

00:32:57.669 --> 00:33:00.253
I report that and I am very open about this.

00:33:00.253 --> 00:33:02.723
But stuff like this happens all the time.

00:33:02.723 --> 00:33:11.328
We had experiments in which the plume going out from a compartment would not be vertical, it would be skewed and in the same way would not hit the thermocouples.

00:33:12.090 --> 00:33:15.465
In general, locating the thermocouples is quite challenging for experiments.

00:33:15.465 --> 00:33:38.294
In fact, the most repeatable one that we have is the Polish test method for facades, where we introduce wind and it's like counterintuitive, because if you introduce the wind the fire is very dynamic, it moves around the facade, but because it's a controllable wind and it's always the same, the movement is very repeatable between experiments.

00:33:38.294 --> 00:33:41.369
So in most of them you get a similar scatter of temperatures.

00:33:41.369 --> 00:33:48.133
There are also some biases in reporting and investigating the outcomes of fires.

00:33:48.133 --> 00:33:51.048
I've mentioned the confirmation bias.

00:33:51.048 --> 00:33:52.787
I've mentioned the publication bias.

00:33:52.787 --> 00:33:53.983
Those are very true.

00:33:53.983 --> 00:33:57.865
In the fire science community, we're not that many people.

00:33:57.865 --> 00:34:00.412
We have to work, we have to publish.

00:34:00.412 --> 00:34:05.807
There are just a few fire journals out there, so we fight for the spots in those journals.

00:34:09.226 --> 00:34:16.773
So yeah, it's possible that there are biases in research related to what counts as publishable science and how you need to write about your experiment to get it published.

00:34:16.773 --> 00:34:21.405
But anyway, I've talked about that previously, so let's move on.

00:34:21.666 --> 00:34:24.481
There are some challenges with the measurement techniques that we use.

00:34:24.481 --> 00:34:27.048
So oxygen calorimetry is very difficult.

00:34:27.048 --> 00:34:36.646
You need to understand there's a lag involved in calorimetry because the air has to reach the oxygen measuring point.

00:34:36.646 --> 00:34:46.152
So that needs some very sensitive calibration and requires you to understand the calorimetry very well and the flow path very well to really get a good measurement.

00:34:46.152 --> 00:34:49.003
You also rely highly on capturing all the gases.

00:34:49.003 --> 00:34:56.228
If you don't capture all the gases and don't get all of them through your oxygen calorimeter, you don't know what the burning rate of your fuel was.

00:34:56.228 --> 00:35:00.405
You also need to account for combustion efficiency to some extent.

00:35:00.405 --> 00:35:08.809
So there are some factors that you include in the analysis, correction factors which are well known, but still they need to be applied.

00:35:08.809 --> 00:35:12.755
So it's not a simple measurement, not an easy measurement.

00:35:12.755 --> 00:35:23.112
You also have to capture the average flow, temperature and mass flow rate very well to apply oxygen calorimetry, and this means you have to measure in a specific point.

00:35:23.112 --> 00:35:27.400
You have to allow the flow to stabilize before you measure the averages, and so on.

00:35:27.400 --> 00:35:35.045
So the way the calorimeter is constructed will also influence the ability to measure O2 and the flow in the correct way.

00:35:35.045 --> 00:35:43.132
So there are a ton of sources of uncertainty that you have to control for. It's not a simple measurement by any means.
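To make the oxygen consumption principle above concrete, here is a minimal sketch in Python. The 13.1 MJ/kg constant is the commonly used engineering value (Huggett's constant); the flow and concentration numbers are purely illustrative assumptions, and a real calorimeter applies the additional corrections mentioned in the episode.

```python
# Minimal sketch of oxygen consumption calorimetry, assuming
# ~13.1 MJ of heat released per kg of O2 consumed (Huggett's
# constant). Real calorimeters add corrections for CO/CO2,
# combustion efficiency and flow measurement.

E_O2 = 13.1e6  # J released per kg of O2 consumed

def heat_release_rate(m_dot_exhaust, x_o2_ambient, x_o2_exhaust):
    """Estimate heat release rate [W] from the drop in O2 mass
    fraction between ambient air and the exhaust duct.
    m_dot_exhaust: exhaust mass flow rate [kg/s]."""
    m_dot_o2_consumed = m_dot_exhaust * (x_o2_ambient - x_o2_exhaust)
    return E_O2 * m_dot_o2_consumed

# Illustrative numbers: 3 kg/s exhaust flow, with the O2 mass
# fraction dropping from 0.232 (ambient) to 0.207, suggests a
# fire of roughly 1 MW.
hrr = heat_release_rate(3.0, 0.232, 0.207)
```

Even this crude sketch shows why capturing all the gases matters: any oxygen-depleted flow that bypasses the analyzer is heat you never count.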

00:35:43.780 --> 00:35:45.445
Temperatures are difficult.

00:35:45.445 --> 00:35:47.710
Actually I had this episode in the podcast.

00:35:47.710 --> 00:35:56.608
Why is temperature so easy to measure but so hard to interpret, where I went for a very long rant on how hard it is to measure temperatures.

00:35:56.608 --> 00:35:59.018
It's one of the early episodes of the podcast.

00:35:59.018 --> 00:36:01.467
It could be quite funny to revisit that one.

00:36:01.467 --> 00:36:04.159
That was a long time ago, but I think it's pretty good.

00:36:04.159 --> 00:36:07.286
People enjoyed it, so I can refer you to that episode.

00:36:07.626 --> 00:36:15.590
In general, the way we measure temperatures in fires is difficult, especially if we talk about fully grown fires or flames.

00:36:15.590 --> 00:36:19.563
You're talking about temperatures of around 1000 degrees Celsius.

00:36:19.563 --> 00:36:26.648
This means that radiation will dominate the heat transfer, and not all thermocouples will respond to that in the same way.

00:36:26.648 --> 00:36:38.630
Equally, thermocouples have their own bulk, so a very thin thermocouple will respond to heat flux differently than a bulky one, and that also influences your outcomes.
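The radiation effect on a thermocouple can be sketched with a simple steady-state energy balance on the bead. This is a back-of-the-envelope illustration: the emissivity and convection coefficient below are assumed values, not numbers from any particular experiment.

```python
# Back-of-the-envelope radiation error on a bare-bead
# thermocouple, assuming a steady-state energy balance:
#   h * (T_gas - T_tc) = eps * sigma * (T_tc**4 - T_surr**4)
# The emissivity (eps) and convection coefficient (h) are
# illustrative assumptions.

SIGMA = 5.67e-8  # Stefan-Boltzmann constant [W/(m^2 K^4)]

def gas_temperature(t_tc, t_surr, eps=0.9, h=100.0):
    """Back out the gas temperature [K] from a thermocouple
    reading t_tc [K] exchanging radiation with surroundings at
    t_surr [K]; h in W/(m^2 K)."""
    return t_tc + eps * SIGMA * (t_tc**4 - t_surr**4) / h

# A bead reading 1000 K while radiating to cooler 600 K
# surroundings sits in gas several hundred kelvin hotter than
# the reading suggests.
t_gas = gas_temperature(1000.0, 600.0)
```

Even this crude balance shows why a bare bead exchanging radiation with cooler surroundings can read far below the true gas temperature, and why bead size (through h) changes the error.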

00:36:38.630 --> 00:36:42.969
If you're looking for more transient phenomena, you have to pick your thermocouples correctly.

00:36:46.452 --> 00:36:50.606
Also, when you embed thermocouples in your samples, it matters if you drilled a hole and put your thermocouple through that hole.

00:36:50.606 --> 00:36:58.992
Or perhaps you put the thermocouple inside your material and then pour the concrete or put your plank of timber over them.

00:36:58.992 --> 00:37:05.039
It matters if it's perpendicular or parallel to the isotherms. Stuff like that matters.

00:37:05.039 --> 00:37:08.563
It matters if you glue it and how much glue you've used actually.

00:37:08.563 --> 00:37:11.090
So there are a lot of biases in that.

00:37:11.532 --> 00:37:21.722
When you measure temperatures with infrared cameras, the emissivity of the surface will also matter, and you usually don't know it very well, and it also changes in time.

00:37:21.722 --> 00:37:34.833
So, for example, if you're measuring the temperature of chromium-coated steel, the chromium will change its color and change the emissivity tremendously at some point.

00:37:34.833 --> 00:37:42.027
You have to understand that this happens, because otherwise your measurement will show a quick peak in temperature that you will not be able to interpret.

00:37:42.027 --> 00:37:45.530
The same goes for materials that have water in them.

00:37:45.530 --> 00:37:50.771
So you're going to see a flattening of the temperature at 100 degrees.

00:37:50.771 --> 00:37:54.811
It doesn't mean that heat transfer has ceased in that sample.

00:37:54.811 --> 00:38:01.213
It means that water is currently evaporating and until it evaporates you're going to see 100 degrees on your thermocouple.

00:38:01.213 --> 00:38:04.710
And still you have to be able to understand and interpret that.

00:38:04.710 --> 00:38:05.943
So a lot of challenges.

00:38:05.943 --> 00:38:15.240
on temperature measurement in fires. An entire episode of the podcast is about that, so I would refer you to that episode.

00:38:15.581 --> 00:38:21.539
I've listed a lot of problems, but that does not mean, I believe, that the science is done wrong or the experiments are done incorrectly.

00:38:21.539 --> 00:38:30.005
They're done right, and most of the people I know who do fire science

00:38:30.005 --> 00:38:32.813
make a lot of careful decisions in planning to get the most out of their experiments.

00:38:32.813 --> 00:38:40.351
There will be limitations and biases in those, but they are usually quite clearly stated in the scope of the study.

00:38:40.351 --> 00:38:47.998
What I mean in this episode is not to tell you, ah, science is unreliable, or you cannot trust science.

00:38:47.998 --> 00:38:49.963
You absolutely can trust those experiments.

00:38:49.963 --> 00:38:59.391
What I mean is that you should take the effort to look at how the science was done before you apply it directly in your calculations or in your design.

00:38:59.391 --> 00:39:13.297
If you take a random paper on EV fires and just take the heat release rate curve from an experiment and drop it into your car park and say this is a real car burning in my car park, this is not a true statement.

00:39:13.297 --> 00:39:20.224
You've applied a study done in a different context into a new context without even trying to comprehend what the differences would be.

00:39:20.224 --> 00:39:21.086
Don't do that.

00:39:21.086 --> 00:39:35.065
But if you read the study, if you read multiple studies, if you understand how they were pursued and you are sure that this is the data point that will work for you in your context, or perhaps you don't have any better data point,

00:39:35.065 --> 00:39:38.340
and this is the only thing you can work with, it comes with these limitations.

00:39:38.340 --> 00:39:45.643
If you understand that and can clearly show that those limitations were accounted for, you're in a good position.

00:39:45.643 --> 00:39:49.623
So please read the studies, not just the results.

00:39:49.623 --> 00:39:52.068
Read the objectives, read the scope.

00:39:52.068 --> 00:39:54.882
Read the material section how they were chosen.

00:39:54.882 --> 00:39:57.728
Read the method section how it was measured.

00:39:57.728 --> 00:40:10.706
Read the conclusions very carefully, because people will tell you how they interpret the data, and from those exercises you will be able to get a lot of success in applying real-world fire science in your engineering.

00:40:10.706 --> 00:40:13.347
So that would be it about the biases.

00:40:13.347 --> 00:40:22.367
Actually, I could go a little longer on them, but I'll stop the rant and let you enjoy your Christmas and just say a few words about the podcast in 2024.

00:40:22.367 --> 00:40:26.567
So thank you all for being here with me in the year 2024.

00:40:26.719 --> 00:40:29.461
It was an exciting and interesting year for the show.

00:40:29.461 --> 00:40:31.387
The show has matured.

00:40:31.387 --> 00:40:33.192
It's my work now.

00:40:33.192 --> 00:40:37.264
It's my real job right now and I am very happy to do this job.

00:40:37.264 --> 00:40:42.860
I probably have the best job in the world. And we've incorporated the podcast this year.

00:40:42.860 --> 00:40:45.144
So the legal structure has changed a bit.

00:40:45.144 --> 00:40:48.208
That was a big jump and took some work, but it's done.

00:40:48.208 --> 00:40:54.463
It's not a hobby project anymore, it's a real job, and I'm very happy to do it.

00:40:54.463 --> 00:41:02.945
We've published, with this episode, 50 episodes of the podcast this year, which means we're at, like, 0.95 episodes per week.

00:41:02.945 --> 00:41:07.380
This is because I'm skipping the episode next week and I think I've missed one over the year.

00:41:07.380 --> 00:41:08.842
I'm not really sure right now.

00:41:08.842 --> 00:41:19.070
Anyway, I think the regularity of the podcast was pretty good and we've nailed it almost every single week this year.

00:41:19.070 --> 00:41:26.793
So that's my promise to you that you're going to find your fire science here on Wednesdays, and I try to deliver that.

00:41:27.143 --> 00:41:29.521
The podcast was listened to in 115 countries.

00:41:29.521 --> 00:41:31.827
That's kind of amazing if you think about it.

00:41:31.827 --> 00:41:39.695
It reaches almost all ends of the world, some ends in a better way, some ends of the world in a little more challenging way.

00:41:39.695 --> 00:41:43.583
I'm really happy with a lot of listeners from Africa.

00:41:43.583 --> 00:41:49.722
The country with the most downloads was the United Kingdom, so there's no change here.

00:41:49.722 --> 00:41:55.233
The UK is still my number one market, and high five to all the listeners from the UK.

00:41:55.233 --> 00:42:01.637
But it's 115 countries, and many of them are chasing the UK and are probably higher per capita.

00:42:01.637 --> 00:42:08.411
We had this funny conversation in New Zealand that perhaps New Zealand is the highest per capita in terms of listenership.

00:42:08.411 --> 00:42:10.215
I also had one listener in the Vatican.

00:42:10.215 --> 00:42:11.697
I really hope that's the Pope.

00:42:11.697 --> 00:42:17.552
I'll never be able to confirm that, but in my biases I choose to interpret this in this way.

00:42:18.012 --> 00:42:24.416
In this year we overall had something like 57,000 downloads of individual episodes.

00:42:24.416 --> 00:42:34.809
That's quite a lot, to be honest, and the most downloaded episode was the new guideline for PV5 safety with Grunder, episode 155.

00:42:34.809 --> 00:42:40.929
So Grunder made a lot of noise in the social media, as you'd expect, and it was just a good episode.

00:42:40.929 --> 00:42:47.168
The next ones from this year were 143, Fire Fundamentals, Part 7, CFD simulations.

00:42:47.168 --> 00:42:52.989
134, Fire Fundamentals, Part 5, the evacuation equation, with David Purser. Ah, that was a good one.

00:42:52.989 --> 00:42:56.800
137, e-mobility fires with Adam Barowy.

00:42:56.800 --> 00:43:03.905
I think we started the year with this one, and perhaps next year we'll also start with experiments on electric vehicles from UL.

00:43:03.905 --> 00:43:07.032
We'll see. The fifth one on the list was

00:43:07.032 --> 00:43:11.505
154, fire Fundamentals, part 8, compartment Fire.

00:43:11.505 --> 00:43:14.248
You guys seem to like the Fire Fundamentals a lot.

00:43:14.440 --> 00:43:19.431
I need to do more episodes like that because it seems to be highly regarded.

00:43:19.431 --> 00:43:24.659
So overall the podcast is somewhere in, like, the top 10% of podcasts worldwide.

00:43:24.659 --> 00:43:28.590
So this, for my ego, is very reassuring.

00:43:28.590 --> 00:43:33.248
We've even reached the top one in physics in some countries for some episodes.

00:43:33.248 --> 00:43:35.146
So that's a nice thing.

00:43:35.146 --> 00:43:36.947
But I'm not doing this for rankings.

00:43:36.947 --> 00:43:37.985
I'm not doing this for money.

00:43:37.985 --> 00:43:53.989
I'm doing this to really have an impact on how fire science is shared and how fire safety engineering is done worldwide, and I hope this impact has been achieved in the year 2024 and hopefully will be achieved in the year 2025.

00:43:54.570 --> 00:44:00.851
So, finishing this year with this podcast episode, I would like to once again thank you for being here with me.

00:44:00.851 --> 00:44:08.958
I am looking forward to coming back with a new episode on January 8th.

00:44:08.958 --> 00:44:10.541
January 8th is a Wednesday, yep.

00:44:10.541 --> 00:44:15.271
So on January 8th, you shall have your next Fire Science Show episode.

00:44:15.271 --> 00:44:17.384
The next Wednesday is Christmas.

00:44:17.384 --> 00:44:32.172
The Wednesday after that is New Year's, so you'll have plenty of opportunities to celebrate and have some fun with your families, have some rest, recharge for the next year, come back here and let's do more fire science in 2025.

00:44:32.172 --> 00:44:36.291
Thank you for being here with me and see you in the next year.

00:44:36.291 --> 00:44:50.983
Bye, thank you.