Charles Jelen: Anna, ready?
Season three. Yeah. Geez, we've come all this way. I didn't really think we'd make it past season one. Here we are.
Dan Gentry: Alright, ready? Let's go.
Charles Jelen: You are listening to Cool Air Hot Takes.
Welcome, welcome to Cool Air Hot Takes. We are back and better than ever for season three. Uh, we're your hosts. My name is Charlie Jelen, along with my running mate here, Mr. Dan Gentry. And this is a show all about buildings, energy, HVAC. So if any of that is interesting to you, you are in the right place.
Dan Gentry: Thank you to everyone who's listened from seasons one and two, and of course a very warm welcome. If you're a first time listener, we're very happy to have you here popping into the podcast.
Charles Jelen: So what we've got lined up for the episode today, first off, HVAC headlines, we're gonna get you caught up on the latest news in HVAC.
Our guest today, now listener, this guy has the pulse of the largest company in the world. We've got Ali Heydari, the Director of Technical Design from Nvidia. He's gonna join us today. It's gonna be a very, very interesting, uh, interview, and so we're gonna extend it out a little bit. And then after that, we're gonna go right into your stat of the day.
Gonna be a good one. Now Danno, I know you like to jump right into the hot take portion of the episode, but we actually got a good little letter from the inbox. Let me pull it up here. All right. Hi, Cool Air Hot Takes guys. This question is for Dan. I've been working on an air cooled chiller design for a data center, and I keep running into one challenge: sizing the adiabatic cooling system.
Now, if we stopped here, you'd be like, yes, this makes sense. People reaching out for help on sizing things. All right, I'm gonna keep reading. Now, if I had a mustache like Dan's, which, let's be honest, has more surface area than some fin coils, would that reduce the need for adiabatic pre-cooling because of the natural shading and airflow optimization?
Or would the sheer size of your mustache demand even more cooling just to keep everything at peak efficiency? Thanks for taking my question. Sincerely, Cool Air Hot Takes Fan Club member.
Dan Gentry: We love the email and I love the, uh, applause for the mustache. How would we answer that? I mean, a lot of fins per inch, or, if you will, whiskers per inch, you can reject more heat.
Um, I've never heard a question relating heat rejection to mustaches before, so I'm just gonna give props to the writer.
Charles Jelen: Guess who it was?
Dan Gentry: Uh. I'm gonna guess Grant.
Charles Jelen: Good one. Grant from Minnesota, yeah, Grant from Minnesota would've been a good guess. But it was Tony Arango. Okay. Yep.
Dan Gentry: Well, anybody with mustache envy out there, guys, all you gotta do is stick with it and, uh, be strong.
Charles Jelen: All right. What's, what's your hot take for the episode?
Dan Gentry: Season one or season three? Season three. First hot take of season three. So, I'm excited about this one. Carburetors. The take is, we've been talking a lot about electrification, but let's not forget carburetors are out there, something we're gonna live with.
And I want to give a couple examples. So all within the last week, you guys, this is actually really funny. So about a week ago, the wood splitter wasn't running, so I'm like, ah, you know, what's going on? I ordered a new carburetor, slapped it in, had to do a little fuel line stuff, whatever. Couple pulls, adjusted the throttle a bit, and she's running like a dream, splitting wood.
So I felt like a, you know, a mechanic. I was riding high, I was feeling pretty good about that. So we went to ride dirt bikes last weekend. Uh, my son's dirt bike is idling a little low. So I get in there, adjust the air and the fuel, and I screwed it all up. The throttle was, like, dead. I was like, man, what the hell?
I got kind of upset. So I, you know, wallowed around in misery for a little bit and then went back outside and gave it a whirl, gave it a couple turns, and this baby was humming. I'm like, Manav, get out here. Take her down the street. And he was running like a dream. So I'm hot on carburetors because I'm figuring out how to get my stuff running.
Charles Jelen: Okay. What do you got? My hot take is, maybe this isn't so hot, but engineers need some humility. Oh, okay. Mm-hmm. And this one hit me in the face real hard. I was talking to Tony Arango, same guy that sent you the mustache email, about lift versus load. We were talking about data center stuff: why would I select a centrifugal compressor over a screw compressor?
And I was like, well, a lot of it's gonna come down to lift versus load and where the operating map is gonna be, and one of 'em will be better suited for that application than the other. And then I tried to explain lift versus load and I did an awful job. It just didn't come out smooth, you know, like you
Dan Gentry: used to, like, dream about lift versus load in your sleep.
Charles Jelen: I know, I know. That's what I'm saying. You need a little humility. You gotta go back to basics. You gotta remember the fundamentals. You
Dan Gentry: gotta brush up on those. We've been getting a lot of asks from our department to go to offices and be like, Hey, can you guys just like cover the basics? I like to say blocking and tackling.
Yeah. Lift versus load is a big one. Starters versus drives. What's my delta T? Yeah. All that kind of stuff. I think our industry is getting so young, with all these senior people retiring and, uh, all these younger folks coming in, and there's a lot of education that has to be done. And a lot of these basics aren't taught in school if you're not already into HVAC.
So, I, I love it. I think that's hot. Alright.
Charles Jelen: All right, listener. Up next, we got your HVAC headlines.
HVAC Headlines, your news today. Alright, listener, it's three o'clock in Manitowoc. Here's your headlines. Headline number one: Legionella just one of many HVAC risks, microbiologist says. This is from Facilities Dive. If you're paying attention to HVAC news at all, you've probably seen this: there's been a handful of outbreaks of Legionella across the country.
Most notably a couple months back, there was an outbreak in New York City and actually four people died from this, and dozens more got sick from it. I didn't realize that. Yeah. Yeah. That's crazy.
Dan Gentry: Yeah.
Charles Jelen: Brutal. For listeners not familiar with Legionnaires' disease, it's a very serious type of pneumonia caused by a type of bacteria called Legionella.
Legionella was originally discovered after an outbreak in 1976 at a convention of the American Legion, which is where "Legionnaires" comes from. The Legionella bacteria is actually natural; it's present all over in really low concentrations in lakes, rivers, and groundwater.
Where it becomes a problem is when it's allowed to grow uncontrolled. This usually happens in either stagnant water or warm water where the water itself is untreated. In our industry, in the HVAC space, cooling towers can be a spot where Legionella becomes a problem, because you often have elevated temperatures; cooling towers reject heat.
The water is sometimes slow moving, sometimes kind of stagnant, depending on its operation. And then on top of that, you have condenser fans that are evaporating water, so you get these mist droplets that get thrown around, and if you have Legionella, that becomes a problem. To reduce or eliminate this risk, the first thing is good preventative maintenance for your cooling tower and for any of your water systems.
If you need help with that, we do have resources available. We can check your water quality, and we can check and make sure that you have the right preventative maintenance in place to help take care of that. Secondly, if you're looking to design something new, you can actually completely move away from water cooled systems.
There is a trade off there around energy efficiency. Cooling towers are very, very efficient in terms of how they reject heat, and if you move away from that, you end up putting more energy into that system. So there is a trade off there, but one thing to note: Legionnaires' due to cooling towers is not common.
There are millions of cooling tower systems operating all over the world that are properly taken care of and do not pose a big risk to human health. Yeah. Alright, headline number two: Why prices are soaring in the country's largest grid region, explained in five charts. This is from Maryland Matters. I like it.
Nice. Maryland Matters, nice site. Uh, we actually talked about this once before, I think it was in season one. We were talking about the pricing dynamics on PJM. PJM oversees the grid and the electricity markets; the territory stretches from New Jersey-ish to Chicago-ish in that part of the country.
Originally, we brought this up because there was a massive price spike for the PJM capacity auction. It went from about $2 billion in 2024, all the way up to $15 billion in 2025. And these are costs that are ultimately spread across to the users of electricity in that market. Now, there was some speculation at the time that this was gonna be a single event, that there were gonna be some measures put in place and we were gonna add a bunch of generation, and this was gonna kind of go away.
Well, that didn't happen. Uh, the 2026 auction closed a couple months ago and it went up to $16 billion, so it went even higher than last year. The article cites something that's pretty simple,
Dan Gentry: I'm guessing, closing down power plants and not bringing on enough new energy.
Charles Jelen: It's not location, location, location.
This time it's supply and demand. Just supply and demand. So there is, you know, an unprecedented high in demand for electricity. There's one chart that is really kind of indicative of where we're at right now, and it shows a projection of how much future capacity they thought they needed.
So in 2021, they projected out how much power they'd need in 2036: 155 gigawatts. The next year, in 2022, they did the same projection and came up with 156, so one more gigawatt. Okay, seems to be going the wrong way. Pretty close. In 2023, they projected we'd need 165 gigawatts. And this year, in 2025, they projected 212 gigawatts. So originally they estimated we'd add five gigawatts in 15 years. Now, today, they're estimating that in 10 years we're adding 62 instead of five. It just shows that the demand for electricity is skyrocketing. It's from data centers, it's from electrification, it's from industrial build out.
All of those are increasing, data centers no doubt being the biggest part of that. That is interesting. What can people do? Uh, be more efficient. Be more efficient? Yeah, that's a big one. Get efficient. When energy costs are high, energy reduction projects pay back faster. And then the second one: if you weren't interested in demand response programs in the past, check 'em out.
Especially if you already know you have flexible loads and you can turn things off or shift energy around, in that market specifically, in PJM, demand response is gonna be more and more lucrative moving forward. Interesting.
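To put the numbers from that PJM segment in one place, here's a quick, hedged sketch in Python. The auction totals and the 155/156/165/212-gigawatt projections are the figures quoted above; the roughly 150-gigawatt starting point is an assumption inferred from the "five gigawatts added" comment, not a number stated in the episode.

```python
# Rough sketch of the PJM numbers discussed above (illustrative only).

# Capacity auction totals quoted in the episode, in billions of dollars.
auction_cost_billion = {2024: 2, 2025: 15, 2026: 16}
print(f"2024 -> 2025 auction increase: "
      f"{auction_cost_billion[2025] / auction_cost_billion[2024]:.1f}x")

# PJM's projections of future capacity need, in gigawatts, keyed by the year
# the projection was made (figures as quoted in the episode).
projected_need_gw = {2021: 155, 2022: 156, 2023: 165, 2025: 212}

# Assumed starting capacity of roughly 150 GW, inferred from the quoted
# "add five gigawatts" figure; PJM's actual baseline may differ.
baseline_gw = 150

for year, need in projected_need_gw.items():
    print(f"Projection made in {year}: {need} GW needed, "
          f"about {need - baseline_gw} GW of additions over the assumed baseline")
```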
Dan Gentry: It's good headlines.
Charles Jelen: All right. Up next we got Ali Heydari, the Director of Technical Design at Nvidia.
Don't wanna miss this one. Stay tuned.
Let's talk comfort, eh? Dan, what's your favorite kind of comfort?
Dan Gentry: I would say, uh, like sunny South Florida in like July.
Charles Jelen: Okay. Well, mine is the kind of comfort that doesn't make you cry when you see your utility bill.
Dan Gentry: I like that. And that's the thing: introducing Trane's Precedent hybrid rooftop dual fuel heat pump, the system that's basically the HVAC equivalent of having both brains and biceps.
Charles Jelen: Unfortunately, we have neither of those things. In mild weather it runs on electric power; when things get frigid, it flips to gas heat, giving you the best of both worlds.
Dan Gentry: So whether you're heating or cooling, in a mild climate or a cold climate, Trane's got you.
Charles Jelen: Alright, listener. We have a true expert, and there is zero hyperbole in that statement. This man is at the forefront of his industry. He's designing some of the most critical systems that the world will be reliant on for the foreseeable future. Welcome to Cool Air Hot Takes, Mr. Ali Heydari. How you doing today, Ali?
Ali Heydari: I'm great, uh, honored to be here, uh, privileged to have this opportunity to talk with your audience.
Charles Jelen: Well, we're gonna get into the professional side of what you do. Absolutely. But I'd like to give the listener a little more personal introduction. I, I follow you on LinkedIn. You're traveling all over the world.
You're meeting with big companies, you're going to conferences, but what do you do for fun? What does Ali do for fun? What are your hobbies? Give the listener a little bit of personal intro.
Ali Heydari: Personally, I play soccer. I play soccer twice a week, and that's, that's one of my biggest passions. I follow Liverpool.
I love Liverpool for everything they do. Big Liverpool fan. So besides that, uh, personal life, I am a beekeeper. I have multiple, multiple hives of bees and I make a lot of honey. Nice. That means I make a lot of friends with the honey.
Charles Jelen: Love it. What, uh, what position do you play in soccer?
Ali Heydari: Forward. I play forward. I try to score three or four goals, but you can imagine, at my age, you know, people have to leave a little space for me to score.
Charles Jelen: Oh, that's all good. Well, that, that's very good. Thank you for that. Um. So we wanna start a little bit of, of your background. You've worked for many notable companies. Can you give us a little bit of your background and, and how you got to where you're at today?
Ali Heydari: I graduated from Berkeley and I'm a thermal scientist, actually. My work, um, my PhD dissertation, had to do with nuclear reactor cooling. I worked as a professor for 10 years of my life. I taught thermal sciences, uh, in Iran, moved back again to the Bay Area as a visiting professor at Berkeley, and, uh, this was just around the dot-com era, and I got, um, engaged in industry.
I always wanted to explore my passion in building products and technologies outside of my classroom. And, you know, I found ways of, you know, being able to build, um, in industry based on the knowledge I had. I worked at Sun Microsystems for about eight and a half years and moved to Facebook in the early days, 2018.
Built Facebook. After that, four years, I built Twitter. These were all pretty much pre-IPO; it took a lot of, you know, work getting through the companies as they took shape. And I was pioneering, uh, the Open Compute Project; pretty much, you know, one of the first products that was built openly from, uh, Facebook was what I designed, at least on the cooling side of it.
And then, uh, Twitter. After that, I moved to Baidu, a Chinese company. Uh, spent four years there building data center cooling, um, technology. It was a very unique opportunity; it exposed me to liquid cooling. After that, I moved actually in a completely different direction. I moved to build, uh, quantum computers.
So in my life, I've been building everything from air cooling, uh, you know, 30, 40 degrees C, to quantum at four degrees Kelvin, and I moved to Nvidia in 2019 and started, uh, looking into liquid cooling. We knew, with the innovation, my experience, what I had done previously in industry, that air cooling would have its time.
And, uh, Nvidia had produced a great opportunity to bring in widespread application of single-phase liquid cooling. So it settled with PG 25. We changed, basically, the medium of exchange of heat, which was air all around the world, to a unified PG 25, a 25% concentration of propylene glycol with water. And that choice has been the working fluid of record.
And based on that, we built where we are today. We have some of the biggest synergies in the industry moving from air cooling to liquid cooling, something that has been in the plan, I mean, in the build, since the early forties. And, uh, we are completely moving everything to a hundred percent liquid cooling.
Dan Gentry: Can you give our listener a reason why you would go to, uh, 25% propylene glycol instead of pure water?
Ali Heydari: So our biggest challenge is, and anybody who has spas and pools knows this, the maintenance of anything that has to do with water alone, without additives, you know, is bacterial buildup.
And so we wanted to avoid that. Corrosion is one of the biggest issues in liquid cooling. And so PG 25 was a concentration that kept the conductivity and the thermal performance of water at an acceptable level, but brought in a great deal of resiliency, and we've been running with PG 25 pretty successfully, with very few issues, since about 2020, 2021.
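For a rough feel of the thermal trade-off described in that answer, here's a small illustrative calculation. The fluid properties are approximate, assumed values for this sketch, not figures from the episode or any NVIDIA specification; the point is simply that a 25% propylene glycol mixture needs only modestly more flow than pure water to move the same heat.

```python
# Illustrative comparison: flow needed to remove a given heat load with
# pure water vs. an assumed PG 25 mixture, using Q = m_dot * cp * dT.
# Property values are rough, assumed numbers for a ~30-45 C coolant loop.

heat_load_w = 100_000   # 100 kW of heat to remove (example value)
delta_t_k = 10          # coolant temperature rise across the cold plates, K

fluids = {
    # name: (specific heat J/(kg*K), density kg/m^3) -- approximate values
    "water": (4180, 997),
    "PG 25 (assumed properties)": (3900, 1020),
}

for name, (cp, rho) in fluids.items():
    mass_flow = heat_load_w / (cp * delta_t_k)        # kg/s
    volume_flow_lpm = mass_flow / rho * 1000 * 60     # liters per minute
    print(f"{name}: {volume_flow_lpm:.0f} L/min to move {heat_load_w / 1000:.0f} kW")
```

On these assumed numbers, the glycol mix needs roughly five percent more flow for the same load, which is the kind of "acceptable level" of thermal performance the answer points to, while the glycol side addresses corrosion and biological growth.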
Charles Jelen: You are the Director of Technical Design for Nvidia. What are you responsible for in that role?
Ali Heydari: My role is developing data center cooling technology that meets, uh, NVIDIA's, uh, roadmap. So, um, since 2019 I've been working with my team, which is a great team of, uh, engineers, and we created a platform for understanding how liquid cooling can be developed in the IT data center industry, as well as, you know, um, where it's gonna be deployed. So our work basically started with creating what we call emulators. We emulated, uh, thermally, what would be a kind of future design of any rack that has demand for high intensity.
We never used any silicon. Everything that was done was, you know, built up with, you know, these, uh, emulators, thermal test vehicles that, you know, basically emulated the thermophysical behavior of a rack with CPUs and GPUs and everything inside of it. And so, based on that, we were basically able to create our own roadmap, and we looked into what if the racks went from what used to be 40, 50 kilowatts to a couple hundred kilowatts and beyond, with that level of, um, ability to, uh, you know, create heat in the same platform footprint of an actual CPU or GPU.
And working with industry, our partners, uh, mostly from Southeast Asia, some in the US and Europe, to create cold plates. Uh, you know, we created the first one-of-a-kind lab. I call that lab ground zero for liquid cooling development. And so we created the first cluster of these racks. So we had multiple racks with thermal test vehicles, highly instrumented inside and outside of the rack.
We know exactly how much fluid is going through each one of the cold plates, the temperature, everything in between. And so we were able to create the basics of what would become today, uh, a cluster or a part of an AI factory. And so having that ability enabled us to look into the transition from air cooling to liquid cooling, which will take place.
And, uh, actually, an interesting thing that we found out is that thermodynamics did not break the back of air cooling. It was geometrical limitations and volume limitations of the racks, with the density required that we needed for interconnectivity and placing so many silicons next to each other, uh, with the heat sinks that were like eight U or six U in height with fans behind them.
Because, yeah, you could always get a hundred kilowatts plus of air cooling with fans and heat sinks. It would be very loud, over a hundred decibels, potentially very disruptive. But you could still cool. But the moment we replaced that six to eight U high heat sink with fans behind it with the one
cold plate with the skived fin, kind of surgically removing heat from the silicon, with the cold plate attached to the back of the GPU or, you know, CPU emulators, we were able to demonstrate that, wow, there's a huge ability to remove heat from this rack. And we demonstrated that, yeah, actually, multiple hundreds of kilowatts of heat can be removed.
And then, you know, suddenly we started seeing that, yeah, I mean, there's actually a roadmap ahead of us; Nvidia has announced, you know, multi-megawatt, uh, racks. So if you look at Moore's Law and, you know, how Moore's Law flattened, uh, you know, for years, with GPUs and the invention of AI we see that take off, and we are back on the same Moore's Law sort of, um, pathway, and with liquid cooling we are able to
Charles Jelen: capture that.
Yeah, thanks for kind of, like, the transition there. It sounds like what pushed us from air cooling to liquid cooling was physical constraints. Exactly. How did cold plate, or direct-to-chip liquid cooling, how did that win out over other technologies, you think, immersion or two-phase? What made you guys pick direct-to-chip single phase?
Ali Heydari: So, during the years I was at Baidu, um, we had data centers that were, like, in Inner Mongolia, I mean, as far away as you can think of, and I always think the data center has to be managed in ways that anybody, without any tools other than their fingers, without any instructions other than what they're used to with air cooling, has to be able to manage it.
So I started looking at liquid cooling exactly the same way that we treat power and electricity. You never see electrons running in the power lines. You just connect a C19, C20 plug to the back of a server and you are safe. You are safe in operations. We thought about liquid the exact same way.
Liquid can flow inside of the pipes, and you can plug in with a QD, a quick disconnect, to the servers and to the racks, and, you know, basically you would not require any special tool except your own fingers, and the torque and the, you know, force associated with that should be such that anybody, female or male, should be able to handle it.
So the model that I always had in my mind was an air cooled rack, which, you know, for the last, you know, 60, 70 years, everybody has been so used to. Let's build something that follows the norm, that everybody, anywhere in the world, Inner Mongolia or, let's say, downtown Manhattan, is able to operate without any special instruction.
When I put myself in that position and look at, you know, maybe a two or three hundred, you know, pound server that has to be lifted from a tank of liquid with an arm or with robotics, and having to drain that because, you know, you have an immersion tank, it put me in an awkward position. I mean, is this something that can be deployed anywhere in the world,
anytime, and can anybody without any tools deploy it? It became obvious that's not gonna work. Not to say that, you know, there aren't deployments of that, but you can see, naturally, from human behavior, from what we are so used to: any industry, anything that we work with, any generation of car that comes out pretty much follows the same trend as what came before.
I tried to do the same thing with racks going to megawatts, racks keeping the exact same behavior that, you know, all of us, uh, you know, through this industry have been used to as operators. So that eliminated immersion very quickly. And then, looking into immersion, we actually did a study with immersion tanks, and we saw that, you know, there are actually limitations; there should be a roadmap.
And we didn't see any roadmap to the future of that. And we saw a very clear roadmap in the future of direct-to-chip liquid cooling. We started with PG 25 as a single phase. Luckily, in between, the Department of Energy granted us a project, and we got a project for developing a two-phase application, uh, for a megawatt rack.
So now we have a two-phase application with, uh, pumped refrigerant in a cold plate, and we also have a single phase that NVIDIA is producing. So I think, end to end, we have the technology very well covered, and, um, to me, that will basically be the technology that can, you know, serve us for the upcoming, the near future, at least till we see what happens next.
Charles Jelen: That's really interesting. As you think about that ubiquitous kind of concept with the human element, as racks get bigger, what has to give? Because in my mind there's kind of this back and forth, almost like a seesaw effect, between temperatures for the technical loop, which they wanna increase so that we can get rid of mechanical cooling, but then we're packing more and more heat and we're getting denser, and then those temperatures come down, and it's almost like they go back and forth with each iteration of technology. Where is that
Ali Heydari: headed? This is a great question. That's a, you know, challenge that we are facing head on, and I would strongly recommend, uh, your audience to tune into, uh, the GTC event.
Our CEO is gonna present, uh, some of the elements of, uh, AI factories and some of the, uh, releases, uh, that NVIDIA has produced. And you've seen similar gigawatt-scale AI factories. We call it an AI factory, not a data center, because it's mission, you know, oriented; the data center is only built to produce tokens.
Tokens are intelligence. ChatGPT, which OpenAI has, that's one version of that, and that's an outcome based on the platforms that NVIDIA produces for them. Now, going inside of a data center, there's demand for higher computation as we are trying to get as close as possible to, and beyond, the capabilities of the human mind.
Producing as many, you know, tokens as possible, so that, uh, a cluster of our devices can think as fast as a human mind and as efficiently as a human mind. Now, with that, there is a cost of, you know, power consumption. So that, uh, brings a challenge. We need power. We need to have enough power for, you know, the data centers that can create tokens and intelligence and AI, and cooling is probably the biggest hurdle
in front of, you know, an AI factory. If I have a gigawatt AI factory and I'm spending 20% of the power to the IT on auxiliaries, the majority of which is, like, through the chillers and CDUs and the pumps and so on and what have you, that means that I have to keep 200 megawatts reserved for a gigawatt of power allocated for IT.
200 megawatts today is bigger than most data centers that are built. When I was building Facebook data centers, they were like 50, 60 megawatts. So 200 megawatts is not something that we can ignore. If I can cut that 200 megawatts down, that means that I have, say, a hundred megawatts of additional IT, and for a hundred-kilowatt rack, that's a thousand additional racks that they can put inside, a thousand additional
ways of producing tokens and intelligence. Just in terms of the cost, if each one of those racks is multimillion, uh, three to five million dollars, you can think about the cost benefit of not having to lose, you know, power to losses in this heat rejection system, but rather produce tokens. Now, how can we do that efficiently?
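Here's a quick back-of-envelope version of the arithmetic in that answer, using only the round numbers quoted (a one-gigawatt facility, roughly 20 percent going to cooling and auxiliaries, hundred-kilowatt racks, and three to five million dollars per rack); it's an illustration, not an NVIDIA design figure.

```python
# Back-of-envelope version of the numbers quoted above (illustrative only).

facility_mw = 1000          # a 1 GW AI factory
aux_fraction = 0.20         # ~20% of power going to cooling / auxiliaries
aux_mw = facility_mw * aux_fraction                 # 200 MW of overhead

reclaimed_mw = 100          # the "cut it down by ~100 MW" scenario
rack_kw = 100               # assumed per-rack load from the example
extra_racks = reclaimed_mw * 1000 / rack_kw         # 1,000 more racks

rack_cost_m = (3, 5)        # $3-5M per rack, as quoted
extra_value_b = tuple(extra_racks * c / 1000 for c in rack_cost_m)

print(f"Cooling/auxiliary overhead: {aux_mw:.0f} MW")
print(f"Reclaiming {reclaimed_mw} MW supports {extra_racks:.0f} extra racks")
print(f"Roughly ${extra_value_b[0]:.0f}-{extra_value_b[1]:.0f}B of additional compute capacity")
```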
Uh, you know, of course, anything that goes with cooler water to the chip means that we have air cooled or water cooled chillers, which are, you know, vapor compression mechanical systems. There are losses engaged with them. Anytime that we can use dry coolers, not get any mechanical cooling systems engaged, and drive, you know, heat directly from the racks through fans and pumps to ambient, that is a gain. So we have racks that are multi-megawatt that require, let's say, a certain temperature, but at the heat rejection portion of it, we have chillers, and chillers can produce lower temperatures. That's much better for, you know, reliability; it can manage a whole lot of heat removal with a smaller footprint of that, you know, heat rejection system. But there's a cost associated, and the cost is the loss of power. Now, if we increase the temperature and have dry coolers and reject heat without interference from the mechanical systems, then we gain, you know, tokens.
But there are reliability impacts and there is functionality. Can I deploy that anywhere, anytime, and will it work? So we are working with industry first, figuring out what works best for gigawatt-scale data centers. So we are working through the build-up of the blueprints of these AI factories, as we call them, working with partners such as Trane, finding ways that technology, I mean, we can't take the technology of 20, 30 years ago
for AI factories that are, you know, in some estimates, you know, we may need to have 250 gigawatts of, uh, you know, uh, that's a tremendous amount; I don't know where that is. But if you have 250 gigawatts of heat rejected to ambient, you can think about what type of losses we are talking about. So waste heat recovery is one of the biggest challenges that we're looking into, working again with companies that are predominant in this domain. We're working with small modular reactors. We're working with onsite, on-prem power generation, fuel cells, anybody who has any ability to bring in power onsite, and, uh, as well, producing ways that we can recover that waste heat, and that recovered waste heat can be used in absorption chillers that can be used for creating cooling.
There are a lot of ways that can be done. We are looking into partners, into ways that we can work with industry and, you know, be good citizens of the universe, and be able to take all of us through this new revolution, the AI revolution, very smoothly, very efficiently, and at the same time, you know, be able to, you know, not harm the environment.
I think that was a long answer to a small, short question.
Charles Jelen: That was a, that was a long answer, but I think what I picked up there is there's the trade-off with that technical loop temperature: if we go higher, it probably means we don't get as dense and we take up more square footage to use dry coolers, or we use chillers.
Shrink the footprint, cost goes up, energy goes up. So there's trade offs on either side, and where the future goes is probably not predictable at this
Ali Heydari: point. Exactly. You know, one of the tools that we created is, uh, called a digital twin. We have a platform, Omniverse, and through Omniverse we create a data center digital twin.
Part of what we are doing with our digital twin build-up is looking at the impact on the total cost of ownership, the power utilization efficiency. Think of a digital twin as like a realistic 3D version of an AI factory. You can do a walkthrough, you can change a chiller, you can change a, uh, cooling distribution unit, you can change, uh, a CRAH unit.
You can see instantly the impact on the entire data center, including the cost, including the power utilization efficiency. All of that is part of making sure that, you know, we use what we create for our own work. So we create intelligence, and intelligence is helping us, uh, very effectively, to accelerate all of our design work.
So we use AI extensively in our work. Right.
Charles Jelen: So I got one last question for you. Yes. Other than Liverpool winning the Premier League, second time in a row, what are you most excited about in the industry?
Ali Heydari: I see that industries, the industry, uh, they're super excited about AI.
I see a great deal of synergy in companies wanting to work together. And it's not either an Nvidia problem or, uh, Intel or AMD or, you know, this and that. It's everybody's problem, and the fact that everybody's coming together in one room to work as one team and, you know, truly look into this. As you know, probably the biggest event that I can think of that has happened since the Industrial Revolution is the AI revolution, and I'm, I'm really, we,
I'm very privileged to be in this era, working with the teams that I'm working with. And as much as I love Liverpool, I also love our partners.
Charles Jelen: Wow. We're, we're very privileged to have you on today. Oh. Thank you for, for coming on and spending some time with us. That was awesome.
Ali Heydari: Thank you. Thank you for this opportunity.
Alright.
Dan Gentry: We are all about the future of HVAC over here, so we want you to join us at the AHR Expo, the world's largest HVACR marketplace.
Charles Jelen: That was kind of Uncle Sam of you right there. We want you! Trane will be showing their latest innovations in heating and cooling solutions. Discover cutting-edge technology, sustainable practices, and the products that are shaping the future of the industry.
Dan Gentry: Charlie, don't forget about the best part. We'll be there, live at the expo. Stop in at our booth and be a part of the conversation. We'll be diving into the latest trends and sharing hot takes on everything we see. Whether you're an HVAC veteran or just starting out, this is the place to be. Learn, network, and be inspired.
Let's connect. We can't wait to meet you.
Charles Jelen: Here comes your
Dan Gentry: Stat of the Day
Charles Jelen: of the day. Stat of the Day.
All right, listener, it's your stat of the day. This one's brought to you by Construction Physics: which states lost the most power in 2024. The stat that we're looking for is the number of hours per customer of power lost in 2024. Let's start with the best. Which states do you think have the most reliable power?
Dan Gentry: The most reliable? I, yeah. Gimme two. Gimme two. Yes. Like, uh, I'm gonna go with like, Texas is a good one and I think, uh. Uh, like Atlanta, Georgia. That's a state
Charles Jelen: location, location, location. Alright, the best, starting with number five, Rhode Island, they had two and a half hours per customer of power loss in 2024.
Connecticut 2.5 as well. Arizona 1.8, Massachusetts 1.7, and the District of Columbia, 1.6.
Dan Gentry: Hmm.
Charles Jelen: I wouldn't have even had the District of Columbia on my radar. Well, they're not a state, so I can't blame you there, but alright. The worst, what were the worst states? I'm gonna say Washington State and Vermont. Okay. Uh, number five.
Number five, in terms of the worst: West Virginia, 27.2. Texas, 27.4. Florida, not a surprise there, you know, a lot of hurricanes, a lot of storms, 29.4. Big jump to Maine; Maine is 51.7. Hey, Maine's close to Vermont. That is, yeah, you're right. And then the worst state in the union for the most power lost is South Carolina.
Do they have, like, a natural disaster or something? What? Uh, 57.8. That's hours per customer in 2024, and the best is 1.6. Why? That seems like a lot. That's a lot of hours to be down. Wow. Get some generators going there. That's a Generac, uh, prime, prime sales spot. Tell you what, that stat of the day is
Very interesting.
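For context on what those outage hours mean, here's a small conversion of the quoted figures into rough availability percentages; it assumes a standard 8,760-hour year and uses only the numbers read out above.

```python
# Convert the quoted 2024 outage hours per customer into rough availability.
# Assumes a standard 8,760-hour year; figures are the ones read out above.

hours_in_year = 8760
outage_hours = {
    "District of Columbia": 1.6,
    "Maine": 51.7,
    "South Carolina": 57.8,
}

for place, hours in outage_hours.items():
    availability = (1 - hours / hours_in_year) * 100
    print(f"{place}: {hours} h out -> about {availability:.2f}% of the year with power")
```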
All right, listener. Thank you for listening to this episode of Cool Air Hot Takes. We're very happy to be back for season three. Remember, new episodes are released every two weeks on Tuesdays. Leave us a comment on Spotify or YouTube, or a review on Apple.
Dan Gentry: Big news, listener: we've got merch. Send us your hot takes to coolair.hottakes@trane.com.
If we feature it on the show, you might find some of that merch in your mailbox. Until next time, stay cool and keep those takes hot.