Transcript
WEBVTT
00:00:00.221 --> 00:00:02.306
Hello everybody, welcome to the Fire Science Show.
00:00:02.306 --> 00:00:04.030
Merry Christmas, happy New Year.
00:00:04.030 --> 00:00:11.743
It's the last episode of the season of the Fire Science Show, so it's a good time to wish you all the best for 2025.
00:00:11.743 --> 00:00:27.344
I'm going for a hopefully rightfully earned holiday break. In today's episode we're going to summarize this year a little bit, but that's going to happen at the end of the episode; first, I'm going to give you some technical content that you came here for in the first place.
00:00:27.344 --> 00:00:38.152
In today's episode, I am going to talk about research bias and the challenges in carrying out high-level research in fire science overall.
00:00:38.652 --> 00:00:47.640
My mission in the Fire Science Show is to bridge engineers and scientists, and I believe the podcast is actually quite successful at bridging those communities together.
00:00:47.640 --> 00:01:07.081
I see more engineers using science-based methods, and I truly believe that science-backed engineering is the only engineering that makes sense. We need to be deeply embedded within scientific principles, experiments, confirmed theories and so on if we want to do good engineering.
00:01:07.081 --> 00:01:14.828
But I also understand, as a researcher and someone who's practicing engineering, that research is very rarely straightforward.
00:01:14.828 --> 00:01:16.795
Research is very rarely easy.
00:01:16.795 --> 00:01:20.483
It's not something that gives you immediate answers.
00:01:20.483 --> 00:01:37.593
There are things you have to understand if you want to benefit the most from the findings in the scientific literature. Many people would not consider those things, but I think they are very important to consider.
00:01:37.593 --> 00:01:59.933
And those things, those biases, the things that influence how research is being done, what they mean for the research itself and for its applicability, all of them will significantly influence the usefulness of the studies that you use in your engineering, and that's what this podcast episode is all about.
00:01:59.933 --> 00:02:04.671
I'm going to talk about some general research biases that are true for any discipline.
00:02:04.671 --> 00:02:10.893
I'm going to talk about specific fire safety engineering and fire science biases that are highly specific to our discipline.
00:02:10.893 --> 00:02:16.192
So, yeah, it's a good one, and it's the last one this year, so you don't want to miss out on this one.
00:02:16.192 --> 00:02:19.028
Let's spin the intro and jump into the episode.
00:02:23.721 --> 00:02:25.266
Welcome to the Fire Science Show.
00:02:25.266 --> 00:02:28.789
My name is Wojciech Węgrzyński and I will be your host.
00:02:28.789 --> 00:02:48.290
This podcast is brought to you in collaboration with OFR Consultants.
00:02:48.290 --> 00:02:51.248
OFR is the UK's leading fire risk consultancy.
00:02:51.248 --> 00:03:02.073
Its globally established team has developed a reputation for preeminent fire engineering expertise, with colleagues working across the world to help protect people, property and the environment.
00:03:02.073 --> 00:03:08.526
Established in the UK in 2016 as a start-up business by two highly experienced fire engineering consultants,
00:03:08.526 --> 00:03:17.907
the business has grown phenomenally in just seven years, with offices across the country in seven locations, from Edinburgh to Bath, and now employs more than a hundred professionals.
00:03:17.907 --> 00:03:29.540
Colleagues are on a mission to continually explore the challenges that fire creates for clients and society, applying the best research experience and diligence for effective, tailored fire safety solutions.
00:03:29.540 --> 00:03:40.211
In 2024, OFR will grow its team once more and is always keen to hear from industry professionals who would like to collaborate on fire safety futures.
00:03:40.211 --> 00:03:43.548
This year, get in touch at ofrconsultants.com.
00:03:43.548 --> 00:03:52.539
Okay, it seems that in 2024 there are going to be very limited opportunities to join OFR, if you would like to choose that as your career path.
00:03:52.539 --> 00:03:59.561
But worry not: in 2025, OFR is also going to recruit for a ton of positions, so keep your eyes open.
00:03:59.561 --> 00:04:03.668
Thanks, OFR, for supporting the Fire Science Show throughout 2024.
00:04:03.668 --> 00:04:07.435
And I am very happy that this will continue in 2025.
00:04:08.060 --> 00:04:13.473
Now for today's episode on research biases, and where the inspiration came from.
00:04:13.473 --> 00:04:27.886
So a year ago, I published a podcast episode here about design fires for car parks, where we talked about how we need to change the paradigm of the design fire for car parks. When I was doing that podcast episode,
00:04:27.886 --> 00:04:40.242
we were writing a research paper, finishing it back then, about our literature review of experiments on car park fires, and this paper eventually was written.
00:04:40.242 --> 00:04:59.648
We've submitted it to Fire Technology and some time ago, like two weeks ago, we received an acceptance of that paper with some minor corrections, which were implemented, and I hope that the paper will soon be published.
00:05:00.348 --> 00:05:15.591
I've realized that there are a lot of resources out there in the scientific literature that look like something you could directly use in your engineering, but actually, when you go deep into them, it becomes very, very challenging.
00:05:15.591 --> 00:05:20.632
It's very difficult to actually use them in your everyday practice.
00:05:20.632 --> 00:05:23.168
And where do those challenges come from?
00:05:23.168 --> 00:05:26.389
Why would research not be directly useful?
00:05:26.389 --> 00:05:37.759
So if you're looking for something very simple like, let's say, heat of combustion of a specific material, we have very good, very well established methods to estimate the heat of combustion.
00:05:37.759 --> 00:05:53.021
You can choose your sample very clearly and very obviously, to say, okay, I am looking for this type of plastic material or this type of timber at this density, and you can put it into a bomb calorimeter and get your heat of combustion.
00:05:53.021 --> 00:05:56.194
That's not something that would carry a lot of bias.
00:05:56.194 --> 00:05:59.244
But when you think about applying that knowledge in practice?
00:05:59.244 --> 00:06:03.141
What does the heat of combustion number mean for a real-world problem?
00:06:03.141 --> 00:06:07.927
I mean, it's the potential heat of combustion; it's the maximum that can be released.
00:06:07.927 --> 00:06:09.766
It cannot go beyond that number.
00:06:09.766 --> 00:06:11.266
But will you reach that number?
00:06:11.266 --> 00:06:13.500
That's another question.
00:06:14.040 --> 00:06:25.745
Now you get into combustion efficiencies and chemical reactions, vitiated or well-ventilated conditions, the temperatures. Materials can burn in different ways.
00:06:25.745 --> 00:06:38.630
So even with such a simple thing as heat of combustion, even though you get some unbiased number as the result of an experiment, of a test, it's not a guarantee that you nail it 100% in the real world.
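To put that in symbols, here is a minimal sketch in standard fire engineering notation (my own illustration, not a formula quoted in the episode): the heat release rate a real fire delivers is the complete, bomb-calorimeter heat of combustion discounted by a combustion efficiency,

$$\dot{Q} = \chi \, \dot{m} \, \Delta h_c, \qquad 0 < \chi \le 1,$$

where $\dot{m}$ is the fuel mass loss rate in kg/s, $\Delta h_c$ is the complete heat of combustion in kJ/kg and $\chi$ is the combustion efficiency. The bomb calorimeter essentially measures the $\chi = 1$ limit; in real fires, especially vitiated ones, $\chi$ is commonly quoted in the rough range of 0.6 to 0.9, so the tabulated value is an upper bound, not a prediction.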
00:06:38.630 --> 00:06:45.608
Now if you think about more complicated stuff, like you wonder, how does a train burn in a tunnel?
00:06:45.608 --> 00:06:47.473
That's quite a challenge.
00:06:47.473 --> 00:06:54.560
There have been people who've burned trains and yes, we do have some heat release rate curves from that.
00:06:54.560 --> 00:06:56.185
We have some emissions from that.
00:06:56.185 --> 00:06:58.048
We know what radiation is.
00:06:58.048 --> 00:07:01.543
You can approximate some sort of a fire spread in the train.
00:07:01.543 --> 00:07:05.992
Yes, we have data points like that. Very few, but we have them.
00:07:05.992 --> 00:07:09.831
But how generalizable are those numbers to the real world?
00:07:09.831 --> 00:07:13.807
Like, can you say that all trains in the world will burn in the same way?
00:07:13.807 --> 00:07:14.411
Of course not.
00:07:14.411 --> 00:07:21.973
Every fire is going to be different, because there are so many drivers of a fire that there are countless, countless ways a thing can burn.
00:07:21.973 --> 00:07:32.500
And yet we have this little tiny nugget of information, the only information you have, to allow you to choose something you design with.
00:07:32.500 --> 00:07:57.932
Now, when you work with such a scarce data set, on a very complicated problem, in the highly specific context of a building, this is when an engineer really has to understand the research biases and understand how the research was performed, to adapt and adjust the outcome of the research, or apply it, in their engineering.
00:07:57.932 --> 00:08:07.584
And when processing the outputs, when considering the results of the study, they have to fully understand the complexities and the uncertainties in those results.
00:08:08.185 --> 00:08:15.869
In the paper I've mentioned, we were looking into design fires, obviously for car parks, and looking into the literature,
00:08:15.869 --> 00:08:21.228
we've noticed a lot of electric vehicle experiments in the last 10 years.
00:08:21.228 --> 00:08:22.791
Like a lot of them, a lot of them.
00:08:22.791 --> 00:08:26.670
We really have a lot of data on electric vehicles and fires.
00:08:26.670 --> 00:08:36.440
Now what's interesting in that study is that we've also noticed that most of those tests would be ignited from underneath the battery.
00:08:36.440 --> 00:08:49.303
The fire would attack the battery and you almost have no results from fires in passenger compartments, nothing on engine fires, nothing on electrical cable fires in the car, just batteries.
00:08:49.784 --> 00:09:07.125
Now, if you think about it, if I get a vehicle to burn which costs $100,000 or whatever, and I have this one shot, one opportunity to really get it done, you know, to get my data point and get it published, I want to have a fire.
00:09:07.125 --> 00:09:08.525
That's meaningful, right.
00:09:08.525 --> 00:09:17.556
And if you think about electric vehicle fires, it's the battery fire, right, that's what we're interested in, not just, you know, internal compartment fires, right.
00:09:17.556 --> 00:09:23.551
But in the end, because all of the researchers are doing it like that, we have no clue.
00:09:23.551 --> 00:09:26.381
How does fire spread from the passenger compartment into the battery?
00:09:26.381 --> 00:09:34.908
We have no clue, no idea how much the battery is attacked when the fire is in the vehicle and what it takes to ignite the battery.
00:09:34.908 --> 00:09:38.067
And if you talk to the researchers, you know it takes a lot to ignite the battery.
00:09:38.067 --> 00:09:43.846
We've done our own research on electric vehicles and, yes, it takes a lot to ignite the battery.
00:09:43.846 --> 00:09:59.354
So the viewpoint in the literature is kind of skewed towards very specific publishable results, publishable science, which is not really the true, objective image of the reality out there.
00:09:59.354 --> 00:10:04.167
And these are exactly the biases that I want to show you in this episode.
00:10:04.567 --> 00:10:06.131
Another one: ventilation in tunnels.
00:10:06.131 --> 00:10:07.764
Ah, that's a good one.
00:10:07.764 --> 00:10:16.369
Actually, whenever you see a study on smoke control in tunnels, magically there's no flow in the tunnel; they always happen in quiescent conditions.
00:10:16.369 --> 00:10:34.212
Especially if we're talking about CFD or about modelling, small-scale modelling, you always have this perfect axisymmetric buoyant smoke plume: vertical, touching the ceiling, splitting in half, smoke going in both directions. And this is how they are studied. Whereas in a real tunnel?
00:10:34.212 --> 00:10:38.605
I've done hundreds of experiments in real tunnels when we've commissioned them.
00:10:38.605 --> 00:10:43.765
Maybe two or three times in my life I've seen quiescent conditions in a tunnel. Like, they don't happen.
00:10:43.765 --> 00:10:55.370
A tunnel is a place where you have immense flow. Always, always there's a flow, and especially if you consider a tunnel under traffic, the traffic will create flow whether you like it or not.
00:10:55.370 --> 00:11:04.149
So actually, the probability that you will start your fire in quiescent conditions, zero meters per second, is, in my opinion, zero.
00:11:04.149 --> 00:11:14.192
The science is to some extent irrelevant, because the bias in the initial conditions prevents you from comparing it to real-world conditions.
00:11:14.192 --> 00:11:20.919
So I'm really triggered by this and, yes, I do my CFD for projects like this because that's the expectation of the market.
00:11:20.919 --> 00:11:24.571
But I understand that it's not the real image of the world.
00:11:24.571 --> 00:11:32.770
I say that especially about model studies, because if you do full-scale tunnel research, you have initial velocities; you cannot escape them.
00:11:32.770 --> 00:11:34.166
That's the real world.
00:11:34.860 --> 00:12:02.653
Anyway, in this episode I wanted to highlight some general biases, true for any science, true for any experiment. I'll start with those, and then I'll try to nail down why we have some fire-specific biases: what's so special about fire safety, fire science, fire safety engineering that introduces some specific things to our discipline that perhaps are not present in other disciplines, and why you should pay attention to them.
00:12:02.653 --> 00:12:07.496
So for the general research biases, I have a list of like 10 of them.
00:12:07.496 --> 00:12:15.187
Let's go through the first five then. On my list are selection bias, confirmation bias, measurement bias, instrumental bias and publication bias.
00:12:15.187 --> 00:12:17.952
So let's start with this group of biases.
00:12:17.952 --> 00:12:19.916
So first, selection bias.
00:12:19.916 --> 00:12:55.335
This means there's a bias in the way the sample we chose for our research, or in general the experiment we carry out, is not representative of the broader population of something. Imagine you're doing studies on batteries: you're burning one specific type of battery chemistry and you claim that all lithium-ion batteries burn the same, when we know that NMC will burn differently from iron phosphate batteries. And the reason for this bias is sometimes the availability of the material.
00:12:55.335 --> 00:13:01.826
So sometimes it's very tough to get the material that you would really want to work with or that would be really representative.
00:13:01.826 --> 00:13:07.001
It can be the breadth of materials in the world.
00:13:07.001 --> 00:13:10.065
There could be hundreds of types of specific material.
00:13:10.065 --> 00:13:16.768
It's very, very difficult to choose a representative sample and you can just have your own preferences.
00:13:16.768 --> 00:13:18.191
Well, that's a different bias.
00:13:18.191 --> 00:13:33.053
But in general, if you select a specific material and it's not representative of generalized real-world conditions, that's a selection bias, and you have to look into the papers to see how the materials were chosen.
00:13:33.664 --> 00:13:36.214
Another one is called confirmation bias.
00:13:36.214 --> 00:13:42.994
That's an interesting one, and it comes down to the way we pursue science as scientists.
00:13:42.994 --> 00:13:59.591
So we're humans, we love what we are doing, but this is our life, these are our jobs, and as with every job, you have some job security. With every job, well, you have to do it
00:13:59.591 --> 00:14:08.739
well to be a scientist. And because we're scientists, we're also quite connected with the ideas we come up with in science.
00:14:12.447 --> 00:14:17.538
So sometimes you would have some sort of hypothesis or a belief, going into the science, going into the experiment.
00:14:17.538 --> 00:14:27.851
And the confirmation bias means that you look at the experiment, you look at the data that you have through that hypothesis that you formulated beforehand.
00:14:27.851 --> 00:14:38.339
So it's easier to confirm a pre-existing belief than find something new, and that can lead to overlooking critical findings.
00:14:38.339 --> 00:14:54.595
And this is quite interesting, because many times we look at old experiments done by someone else and we start seeing things that those people have not seen, that those people have missed because of this confirmation bias.
00:14:54.595 --> 00:15:03.197
So scientists tend to be very favorable towards their own ideas and sometimes you find that reflected in the experiments.
00:15:03.197 --> 00:15:05.489
It's always best to look at raw data.
00:15:05.489 --> 00:15:12.067
It's always best to try and come up with your own conclusions from the data that you see.
00:15:12.067 --> 00:15:17.726
Of course it's challenging because of the context of the study, so you need to understand that as well.
00:15:17.726 --> 00:15:20.894
But yeah, confirmation bias is a real thing.
00:15:21.455 --> 00:15:25.849
Another two: measurement bias and instrumental bias.
00:15:25.849 --> 00:15:33.907
I would say those are similar and they come from how a technology is used to measure.
00:15:33.907 --> 00:15:47.796
So measurement bias would be some sort of systematic error in measurement tools, let's say the uncertainty of your measurements, and instrumental bias would be what kind of technology was available to you when you pursued an experiment.
00:15:47.796 --> 00:16:00.655
Both will be very important in fire science as well, because we have very limited tool set to measure fires and I'll actually come back to those when we go to the fire specific part of the episode.
00:16:01.157 --> 00:16:03.826
The next I have is publication bias.
00:16:03.826 --> 00:16:10.139
So publication bias is something that we observe in scientific literature as a whole thing.
00:16:10.139 --> 00:16:15.236
So this problem is actually quite deep and very, very challenging.
00:16:15.236 --> 00:16:25.419
The thing is that in modern science, as I said, we're scientists, we have to live our lives, earn our living, and we do that by publishing, right?
00:16:25.419 --> 00:16:37.167
So as a scientist, there's kind of this expectation that you will publish papers, and that shapes what you write and the way you do the papers, because creating a paper is a lot of work.
00:16:37.167 --> 00:16:38.591
It's quite challenging.
00:16:38.591 --> 00:16:41.927
Actually, you want to create a paper that's going to get published, right.
00:16:41.927 --> 00:16:52.366
Realistically, you don't want to write a paper that will never be published, and it is very difficult, actually, to publish experiments that failed, for example.
00:16:52.366 --> 00:17:13.472
So the publication bias is that mostly positive experiments or successful experimental results get published within the literature; there are very few inconclusive results published and very, very few negative results in the literature. Especially if you had a hypothesis and you'd done an experiment
00:17:13.472 --> 00:17:16.326
and it gives you no conclusive answer to it,
00:17:16.326 --> 00:17:22.659
that's very challenging to publish. And I would add to that the reproducibility crisis.
00:17:22.659 --> 00:17:40.397
So in medicine, when someone publishes a study in a peer-reviewed journal, that's great, but the study is not really used by the community until someone repeats that experiment and publishes a confirmation study that yes, we did the same and it actually worked the same.
00:17:40.397 --> 00:17:42.432
In our science we don't have that.
00:17:42.432 --> 00:17:45.335
We almost have no reproducible research.
00:17:45.335 --> 00:17:53.115
There are very, very few attempts at people repeating other people's experiments just to check if they work or not.
00:17:53.115 --> 00:18:00.617
I would call that a crisis because this is something very fundamental and very needed for modern science.
00:18:00.617 --> 00:18:09.298
But more than that, not having negative outcomes, not having null outcomes in the science, is something extremely challenging.
00:18:09.404 --> 00:18:13.896
And you know I love the story of how the cosmic background radiation was discovered.
00:18:13.896 --> 00:18:31.508
I'm not sure if you know that study, but there were some guys who were trying to build a really precise radio telescope and they wanted to, like, reduce any kind of noise that would be around that telescope, you know, to have the cleanest, sharpest radio images of the universe.
00:18:31.508 --> 00:18:34.815
And they've systematically cleaned everything.
00:18:34.815 --> 00:18:43.605
There was no source of interference with their telescope and they were shooting pictures of the sky and they were getting this noisy, noisy data.
00:18:43.605 --> 00:18:48.130
It was super annoying to them, so you could call that a failed experiment, a negative result.
00:18:50.550 --> 00:19:13.528
Right, they wanted to have a perfect telescope, but they had a lot of noise in their research. And then someone pointed them to some other scientists, and they found out that the noise is actually the cosmic background radiation, the remnant of the Big Bang, and they went and got the Nobel Prize for that.
00:19:13.548 --> 00:19:26.972
So a negative result turned into a Nobel Prize. And you don't know, you will never know, where your science can go, and because of the biases related to hypotheses, the confirmation bias for example, you cannot really figure that out.
00:19:26.972 --> 00:19:33.271
So even if your result looks negative at first glance, it actually could be a positive result for someone else.
00:19:33.271 --> 00:19:40.435
So I would highly urge everyone to publish everything you get, even if you cannot publish it in peer-reviewed journals.
00:19:40.435 --> 00:19:52.037
I would highly appreciate peer-reviewed journals publishing this type of paper, and I'm confident that if you write a good, convincing editorial letter, it would be considered for publication.
00:19:52.037 --> 00:20:01.867
But even if not, then perhaps using open access to publish those results could be good, and this is something I truly believe we would need more in fire science.
00:20:01.867 --> 00:20:10.817
Wow, this was a longer one, but it's a big problem in science that people do not really get and it's something that really slows us down.
00:20:11.525 --> 00:20:17.045
Another one is observer bias, so it's kind of related to the confirmation bias previously.
00:20:17.045 --> 00:20:25.059
It means that the observer can unconsciously influence or interpret the results in a way that align with their expectations.
00:20:25.059 --> 00:20:36.005
It's basically the error, the systematic error, in the way the researcher interprets visual data, their observations. And yes, in fire we have a lot of observations.
00:20:36.005 --> 00:20:41.096
So take even a simple thing like: did flashover occur in an experiment?
00:20:41.096 --> 00:20:42.286
I've been in those discussions.
00:20:42.286 --> 00:20:49.172
So that's an observer bias, someone introducing some errors to the study based on their experience.
00:20:49.805 --> 00:20:55.292
The next would be stuff that's related to data analysis, so sampling bias, data analysis bias.
00:20:55.292 --> 00:20:59.215
This means that the way how you process data can give you different outcomes.
00:20:59.215 --> 00:21:03.713
There were great papers from the University of Edinburgh on how to clean up noisy data.
00:21:03.713 --> 00:21:07.236
The data in fire is always noisy because the fires are turbulent in their nature.
00:21:07.236 --> 00:21:14.951
So whatever you measure, it's probably going to be a noisy measurement and you want these nice clean lines to be used in your engineering.
00:21:14.951 --> 00:21:27.390
So you have to clean that data, and the way you sample it, the choice of methodology used to clean the data, will highly, highly influence what outcomes you get.
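Here is a minimal sketch of that effect in Python, on made-up data rather than any real test, just to show the mechanism: the reported headline number depends on the smoothing choice.

```python
import numpy as np

# Synthetic "noisy HRR" record: a roughly 1 MW peak fire sampled at 1 Hz,
# with turbulence-like noise added (illustrative values only).
rng = np.random.default_rng(0)
t = np.arange(0.0, 600.0, 1.0)                          # time [s]
hrr_true = 1000.0 * np.exp(-((t - 300.0) / 90.0) ** 2)  # smooth curve [kW]
hrr_meas = hrr_true + rng.normal(0.0, 80.0, t.size)     # noisy measurement

def moving_average(x, window):
    """Centred moving average: one common (not the only) cleaning choice."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

# The headline number, peak HRR, shifts with the smoothing window you pick.
for window in (5, 30, 120):
    peak = moving_average(hrr_meas, window).max()
    print(f"window = {window:3d} s -> reported peak HRR = {peak:6.1f} kW")
```

The wider the averaging window, the lower the apparent peak, so two people cleaning the same raw signal with different, equally defensible choices can report different design values.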
00:21:27.390 --> 00:21:29.454
Another one: environmental bias.
00:21:29.454 --> 00:21:31.849
We're going to come back to that in the fire part as well.
00:21:31.849 --> 00:21:40.438
This is how the environment affects the experiment, and this is critical for fire science, so I'll give myself the liberty to talk about it later.
00:21:41.085 --> 00:21:42.469
One more is repetition bias.
00:21:42.469 --> 00:21:50.755
So people focus on specific results of single experiments, ignoring the inconsistent or divergent results.
00:21:50.755 --> 00:21:57.207
So this is very interesting, and this goes back perhaps to the study of David Morrisset, when he was burning chairs.
00:21:57.207 --> 00:21:58.690
He's burned 25 chairs.
00:21:58.690 --> 00:22:07.083
They roughly aligned, but he was interested in why they diverge and while doing that he found some really, really interesting stuff.
00:22:07.083 --> 00:22:08.487
There's a podcast episode about that.
00:22:08.487 --> 00:22:21.115
So really looking into why there is scatter in your results, where it comes from and what it represents, is critical for science, and sometimes scientists would just skip over that, would not analyze that.
00:22:21.115 --> 00:22:24.434
They would flatten it out, average the stuff and just publish the results.
00:22:24.434 --> 00:22:29.114
It is very important to understand that this can happen.
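As a tiny sketch of the point, with invented numbers rather than the actual chair data: the average alone hides the repeats that diverge.

```python
import numpy as np

# Hypothetical peak HRRs [kW] from repeated burns of "the same" item
# (invented numbers for illustration, not data from the chair study).
peaks = np.array([780.0, 810.0, 795.0, 1050.0, 790.0,
                  805.0, 640.0, 800.0, 815.0, 798.0])

mean = peaks.mean()
cov = peaks.std(ddof=1) / mean  # coefficient of variation of the repeats
print(f"mean peak: {mean:.0f} kW")
print(f"scatter:   {peaks.min():.0f} to {peaks.max():.0f} kW (CoV = {cov:.0%})")

# Reporting only the ~800 kW average flattens out the 640 and 1050 kW
# repeats, and those outliers are where the interesting physics may hide.
```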
00:22:29.875 --> 00:22:36.795
And the final one on my list is conflict of interest or funding bias, and yeah, it's a real thing.
00:22:36.795 --> 00:22:44.546
So if you are funded by someone, there is unfortunately some possibility that the research could have been influenced by these people.
00:22:44.546 --> 00:22:46.250
Usually it's not actually.
00:22:46.250 --> 00:23:01.618
Having worked with a lot of funders, a lot of companies, private companies, private entities I've never been in a situation where they would try to silence me or when they would try to use their power of money to actually do some impact over my work.
00:23:01.618 --> 00:23:05.040
But I'm not sure how generalizable that is.
00:23:05.040 --> 00:23:22.313
Maybe it's just our case, but science knows this problem of funding bias. Especially in medicine there were a lot of conflicts of interest; the tobacco industry, that was a massive one; the oil industry and climate change, silencing some research, pushing different narratives.
00:23:22.313 --> 00:23:24.567
Humanity knows those stories.
00:23:24.567 --> 00:23:29.785
So it's also something to look for when you are dealing with scientific research.
00:23:30.426 --> 00:23:34.239
And as for those biases, the funding sources and sponsors,
00:23:34.239 --> 00:23:36.884
they should always be named in the papers, so you should know.
00:23:36.884 --> 00:23:39.550
And also, there's one thing we do to avoid that:
00:23:39.550 --> 00:23:47.226
we try to go with governmental funding, and whenever we have materials to be used in experiments, we tend to purchase them.
00:23:47.226 --> 00:23:50.851
We very rarely take donations of materials for experiments.
00:23:50.851 --> 00:23:57.770
I prefer to purchase them to have a real market sample and be kind of independent from the sponsors of the study.
00:23:57.770 --> 00:24:02.952
But I know not everyone can have this luxury of being financially independent in their research.
00:24:02.952 --> 00:24:04.846
So yeah, look for that bias.
00:24:04.846 --> 00:24:11.727
So those were 10 biases that are kind of general to research, general to science.
00:24:11.727 --> 00:24:21.932
Every single branch of science, every single experiment could be burdened with those biases, and I tried to nail down how they relate to fire science.
00:24:21.932 --> 00:24:26.126
I leave it to you to figure out how relatable they are.
00:24:26.126 --> 00:24:30.664
And there will be some biases that are highly related to fire science and we'll go into those.
00:24:30.779 --> 00:24:32.446
But first let's set up the stage.
00:24:32.446 --> 00:24:37.491
So what would an objective experiment in fire science look like?
00:24:37.491 --> 00:24:41.811
Is it even possible to have an objective study in fire science?
00:24:41.811 --> 00:24:49.431
And if you think about it deeply, a fire problem is an extremely contextualized problem.
00:24:49.431 --> 00:24:54.191
In general, fire behavior and fires are extremely contextualized.
00:24:54.191 --> 00:25:01.354
This context comes from the architecture or the environment in which the fire is occurring.
00:25:01.354 --> 00:25:16.151
So the place, the physical constraints around your fire, the airflow introduced by the geometry that you're studying fire in, will extremely affect the way how the fire behaves.
00:25:16.151 --> 00:25:26.646
If you have a fire in an open space, in quiescent conditions, imagine having an aircraft hangar with stable air, nothing interfering with your fire.
00:25:26.646 --> 00:25:31.631
In there you burn a pile of material and you get this beautiful axisymmetric buoyant smoke plume.
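For reference, that idealized plume is the one the classic correlations describe. One commonly used form, Heskestad's axisymmetric plume correlation as given in the SFPE Handbook (quoted here as a sketch, so check the source before using it), gives the smoke mass flow above the mean flame height as

$$\dot{m} = 0.071\,\dot{Q}_c^{1/3}\,(z - z_0)^{5/3} + 0.0018\,\dot{Q}_c,$$

with $\dot{Q}_c$ the convective heat release rate in kW, $z$ the height and $z_0$ the virtual origin in m, and $\dot{m}$ in kg/s. Notice its assumptions: still air, no walls, no ceiling, a steady fire. Those are exactly the things a real building takes away from you.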
00:25:31.631 --> 00:25:39.413
Yeah, that is something that's perhaps unaffected by architecture, but is this representative of the real world?
00:25:39.413 --> 00:25:44.092
Like in every single building, that fire would be affected by architecture in some way.
00:25:44.092 --> 00:25:54.567
You could have wall interactions, you could have interactions with the ceilings, you will have heat feedbacks, you will have flow feedbacks, you will have changes in oxygen concentration and so on and so forth.
00:25:54.567 --> 00:26:06.601
So the architecture will always influence our fires and it's not a problem, it's a feature, it's a characteristic of the space of engineering that we're in.
00:26:06.601 --> 00:26:20.471
Fires are extremely contextualized, and it's difficult to present fire in an objective context. The objective context and the experiment's own context, and the difference between those two, must be very well understood by the fire engineer.
00:26:32.160 --> 00:26:35.567
Another thing that goes into this is fuel and the ignition location.
00:26:35.567 --> 00:26:43.540
So, again going back to Morrisset's work, the way how you ignite will highly influence how the fire spreads over your item.
00:26:43.540 --> 00:26:58.388
The location of the ignition, the energy of the ignition, the material properties all of this will introduce some sort of bias into the study and will set the fire on a very specific course If you ignite it in a different way, you'll get a different outcome.
00:26:58.388 --> 00:27:06.986
Hell, if you ignite it like in the same place, same way, you may get a different outcome, of course, but if you ignite it in a different way, the outcomes will be different.
00:27:06.986 --> 00:27:18.521
And now let's think about: is there an objective way to ignite an item in research? Because imagine I have a car to burn down.
00:27:18.521 --> 00:27:22.431
Is there an objective choice of the location where I put my source of ignition in that car to say, yes, cars burn like this?
00:27:22.431 --> 00:27:23.252
Of course not.
00:27:23.252 --> 00:27:25.329
You can't do it with one sample.
00:27:25.329 --> 00:27:35.643
I probably would have to burn hundreds of them to have some kind of distribution of the causes of fires, and then understand how different ignition locations act.
00:27:35.643 --> 00:27:39.050
But I don't have $100 million to pursue such a study.
00:27:39.050 --> 00:27:39.873
No one has.
00:27:40.400 --> 00:27:50.711
This is why engineers and researchers who carry those tests and the research laboratories have to pick something, and what they pick will highly influence the outcomes of their studies.
00:27:50.711 --> 00:27:52.343
You have to pay attention to that.
00:27:52.343 --> 00:27:54.391
That's the case of electric vehicles and fires.
00:27:54.391 --> 00:28:04.205
You will only find experiments in which the battery is directly attacked, whereas in the real world this might not be the main representative scenario.
00:28:04.205 --> 00:28:06.190
Even if the fire starts in the battery,
00:28:06.190 --> 00:28:12.509
it's still a different cause of the fire than when you put a two-megawatt heptane fire underneath your battery, right?
00:28:12.509 --> 00:28:17.288
So the fuel and ignition will play an immense role in the fire experiments.
00:28:17.808 --> 00:28:20.163
Last one, the conditions in which the burning is happening.
00:28:20.163 --> 00:28:25.761
And you could have a whole spectrum, from extremely well-controlled fire laboratory conditions.
00:28:25.761 --> 00:28:28.989
You control temperature, humidity, airflow, everything.
00:28:28.989 --> 00:28:37.854
You have the perfect setting. All the way up to the real-world studies, where we burn buildings outdoors in whatever weather we can find.
00:28:37.854 --> 00:28:41.009
It even matters if it's rained the day before.
00:28:41.009 --> 00:28:44.509
Humidity of the air will be a factor in the experiment.
00:28:44.509 --> 00:28:46.465
Wind will be a massive factor.
00:28:50.189 --> 00:28:53.983
I used to think that if you have wind less than, let's say, two meters per second, there's no direct impact, but that's not true.
00:28:53.983 --> 00:29:04.182
Even a very low wind velocity, even one meter per second, can already change the flow paths in your experiment and, because of that, indirectly influence the experimental outcomes.