Erik: Welcome to the Industrial IoT Spotlight, your number one spot for insight from industrial IoT thought leaders who are transforming businesses today with your host, Erik Walenza.
Welcome back to the Industrial IoT Spotlight podcast. I'm your host, Erik Walenza, CEO of IoT ONE. And our guest today is Troy Harvey, CEO of Passive Logic. Passive Logic has built the first fully autonomous platform for managing the whole building lifecycle. In this talk, we discussed Passive Logic’s approach to applying edge computing based on their hive controller and swarm sensors. And we also explored the application of deep physics models to build a digital twin foundation for building operations optimization.
If you find these conversations valuable, please leave us a comment and a five-star review. And if you'd like to share your company's story or recommend a speaker, please email us at team@IoTone.com. Thank you. Troy, thank you for joining us today.
Troy: Absolutely. I'm glad to be here.
Erik: A lot of technology companies, or industrial IoT companies, are building something that is horizontal enough that, especially for younger companies, they have a little bit of trouble determining who they're serving. So I'm really looking forward to learning who you're working with, what types of problems you're solving, and then the tech stack behind that. But before we get into that detail, I think you personally have a very interesting background. So I'd love to learn a little bit more about what you were doing before you set up Passive Logic, in particular at Heliocentric, because it looks like that kind of directly led into the logic behind Passive Logic. And then why did you make the jump? You were already CEO of Heliocentric; I don't know if you sold that, or if you moved on from that organization, and then set up Passive Logic back in 2016?
Troy: Yeah. So, what we really built at Heliocentric was a lot of expertise in the early 2000s around building simulation and building physics. And what we recognized at the time about buildings was that for hundreds, if not thousands of years, architects and builders had always tried to design buildings and then just build them. And as an electrical engineer, this is sort of a crazy idea, because in most other kinds of engineering, you always beta test things before you make the real one.
And so what we really built was a lot of expertise around simulating virtual buildings, the virtual people inside those buildings, and the virtual systems, so that we could work with architects and developers and engineers on how to build a better building: take their concepts and try different kinds of walls and different kinds of materials and different kinds of [inaudible 03:21] systems, reshaping the buildings in ways that would get the outcome we wanted, going through hundreds of variations of a building. And then we had a lot of guarantees that we could go build that and have some certainty around how that building would operate and work.
And for a lot of the rest of the industry, what we found was that you could design really great buildings. But as we found through the LEED process, when they hit the ground, they rarely lived up to their expectations, even if they were great engineering: good installation, well-engineered and tweaked and tuned and commissioned in all the ways. And so we got involved through Department of Energy contracts to start looking at these buildings.
And we recognized that the real core challenge in buildings was something that maybe the rest of the building industry hadn't quite come to grips with, which is that buildings are the world's most complex systems. They're effectively these huge, giant, stationary robots, but with a level of inputs and outputs (sensors and control points, what we call degrees of freedom in robotics) that's beyond the scale of anything else people make. And that scale of problem just wasn't solvable with the tools we were given to make buildings work. That's the fundamental underlying challenge.
Right about that time, I met my cofounder, who hired my company to help him build a really cool house. And he had just come out of an IPO for his previous company, called Fusion-io. Fusion-io really innovated, and he was the founding engineer who came up with all the core technology of these flash drives that we all use today. Before that, there were flash chips and there were hard drives, but nobody had figured out how to use commodity flash chips in a really durable, really, really fast way, and make these hard drives that were a million times faster.
And that's what Fusion-io did: a really successful IPO, and they were selling themselves to SanDisk at the time. And he was building his house, and he hired my firm, and he's like, where's all the good controls? Where are the good automation platforms? And I was like, well, you're looking at all of it. So I started showing him what we were working on. He came on as an investor in that, and then within a couple of months, he's like, wow, this is so much more interesting than Fusion-io ever was. I'd like to come on as a cofounder, and let's go build this company. So that's really the start of Passive Logic, four years ago.
Erik: You've mentioned two big tracks here. One is that when you have a better understanding of how this complex asset will function, you can identify potential issues, you can optimize efficiencies, and so forth. So there's this optimization angle and this sense-check aspect. And then there's also potentially an innovation angle. If you're going to build this complex asset and you can't really test it in advance, you probably have to do what you've done before. I imagine now, if you can really do a deep digital twin, you have a lot more flexibility around testing out new concepts and more confidence that they might actually work in the real world.
Troy: And I think even with what Heliocentric was doing, in terms of helping the design and engineering process make buildings better, what we found, and what the rest of the industry finds, is that that's sort of the end of the road. You can design and engineer things. But whatever you come up with, once it hits the ground, once it starts getting built, it's just paper. That's paper in the field, held by somebody else who is expected to implement it.
And so one of the big challenges was that whatever we design or make, or think about, or however we engineer buildings or other systems, it ends at our CAD system, goes out our printer, and that's the end of the road. There's no connectivity for that engineering or design or architectural intent to transfer to the next steps of the process: how do we build and create digital work sites, and operate and maintain and manage those same assets?
Erik: So if we go now a bit deeper into the business itself, I suppose it's fairly clear to listeners that the market here is buildings. Is this primarily commercial buildings? Are you covering industrial, residential? What's the 80% rule right now?
Troy: Yeah. So our core market is really commercial buildings and industrial buildings. But that includes multifamily residential, and some of our investors are multifamily residential REITs. More broadly, what we're doing will apply to process control in industrial settings. So there's kind of a big umbrella, but you could say the outer skin of that umbrella is what happens in buildings.
Erik: There are a lot of challenges along the lifecycle of a building. So where do you tend to be involved? Are you full lifecycle? Are you more in the planning phase or the operations phase?
Troy: So we're full lifecycle, and that's one of the core values of Passive Logic: what you do at the design stage is really all you have to do, and then that lives through the lifecycle of that building. But in our marketplace, you can really pick up Passive Logic without dependencies at any stage of the building's lifecycle.
So whether you're pre-design or early design, or you're in construction and you want to apply automation to buildings, or you're doing retrofits and you're thinking about how to maintain and manage buildings, on that end of the spectrum. Backing up more broadly, if you understand buildings as these big, complex robots, that's the framework for understanding Passive Logic as an autonomous platform for buildings.
And if you look at how autonomous systems changed the way we think about vehicles, we think autonomous technology applied to buildings has far deeper-ranging implications than it ever had for vehicles, with larger economic and systems integration impact. Look at buildings, for instance: in the first world, buildings have almost twice the energy impact of vehicles. The cost of integration, the footprint of these systems, and the economies around them are far bigger than anything else.
So in simple terms, you can think of Passive Logic as: what if you took the brain from the [inaudible 10:49] vehicle and taught it not about driving, but about buildings and building systems and the underlying physics, and how to operate all those things as a whole, unified, complete system?
Erik: There have been BMS systems around for decades. But those BMS systems were not intelligent in terms of decision making; they were kind of platforms for visualizing. So can we view Passive Logic as a next-generation BMS system that now has this AI capacity built in to make better decisions?
Troy: Yeah, I think that's probably how you can think of it. You can think of it as the BMS, or what we would more broadly call control. Because this is fundamental to every building: you need a control system. It is really a requirement to make a building go. And whether that's at the “high end of BMS” or at the low end, just simple controllers, what we've seen over time is this evolution.
So the big automation companies today all came out of the 1850s with the breakthrough innovation of the mercury-switch thermostat. That was the beginning of the controls industry. And what we saw is the evolution of that: it went from thermostats to fixed-function controllers to the PID control loop to these BMS control systems, or digital control systems, in the 80s.
But largely, what we have today is virtually no different from what we had in the 1800s. It is just digitized, and given some tooling around how to organize lots of thermostatic control or lots of PID control in a unified box. But inside that box, the context is really 1970s- or 1980s-style programming. And you could go through a series of proofs in computer science terms. You could say, well, given the complexity of a building, how many different if-then-elses, for instance, would I have to write? And you would find that it would take 100 engineers thousands of years to ever write a complete set of if-then-elses for any given building.
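The combinatorics Troy is gesturing at can be sketched with a quick back-of-envelope calculation; the figures below are illustrative assumptions, not numbers from the interview:

```python
def states(n_points: int, levels: int = 2) -> int:
    """Distinct system states for n_points control points, each with `levels` values."""
    return levels ** n_points

# Even if every sensor and control point is crudely simplified to just two
# states, a building with only 100 points already has ~1.27e30 possible
# states, far beyond what hand-written if-then-else rules could enumerate.
print(f"{states(100):.3e}")
```

Real buildings expose thousands of points, many of them continuous rather than binary, so the true state space is larger still.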
So that's where the end of the road was for BMS, and why we're not getting to where we need to be: it's really the end outcome of where we started in controls in the 1800s. And we're just ready as an industry to say, okay, what does that new foundation look like, what does that new path look like, because we're out of steam on the old path.
Erik: I'm looking at one of the videos that's visualizing how you would use this in a complete building, to visualize the skin, or the operations under the skin, of the building. Who are the buyers and who are the end users? Because you have this product that is managing an extremely complex asset that's being passed from hand to hand. So there are the architects, the developers, the construction team; then there's building operations, there's maintenance. And my feeling is the tool has functionality for quite a range of different users. And through the lifecycle, there might even be different buyers. I guess the building owner might be the buyer here. But can you share a little bit: who are the key stakeholders here? And how does this evolve over the lifecycle of the construction?
Troy: So you're talking very tangibly about our Autonomy Studio software, [inaudible 14:38] to design the autonomous system for their building. We can talk about why that is and why that's important. But yes, there's a sophisticated value chain in buildings. And one of the things Passive Logic does is really address problems along that whole path, for any one of those value chain stakeholders.
But what we focus on the most, and I think this is rare in what we see today in IoT and AI in the buildings control industry, is who the fundamental customer is. And generally speaking, that is the installer of automation equipment, or an HVAC installer, who more broadly is typically the organization doing control in buildings: they're kind of the de facto control user, because they have to control their systems.
And we focused on building a user experience around the workflow of how buildings actually go together, and how those installers have to interact with that whole business process, from getting a project to getting in and out of the building. And part of that viewpoint is that while Passive Logic provides all these value-adds, whether on the maintenance side, or the energy side, or the analytics and insight side, we decided as a company really not to monetize those things. We decided to give those away as a side effect of the core need, control, and really sell our control platform at a price competitive with what you could buy in the market, but then make our platform really scale from the little coffee shop on the corner up to a skyscraper, and make the pricing work across that band. And that's another thing that's really unique about how we're handling technology.
What I think we often see in industrial systems is these kinds of old, crufty systems that just aren't really good in the first place, but they're the flagship of the automation provider. And then they try to hit a lower-end market. And what do they do? They take an old, crufty system and cut half the features out, or 90% of the features. And then you've got a cruftier system, and nobody really wants to buy it.
So when we look at mobile phones, you don't get a different iPhone than the CEO of Apple; he doesn't get a better one, we can all buy the same one, right? And Passive Logic has the same theory about the building landscape: we need to have the same scale, the same technologies, these new advanced technologies, for every building, not segmenting the market unnecessarily. So we really focus on that installer base, that contractor base, and build a user experience and workflow around how their businesses work. That's really how we sell ourselves into the market, and then we give away the rest.
Erik: So let's say it's the contractor base; then that's going to be, I guess, depending on the project, a 6-to-24-month time period. Is this a SaaS-based solution, paid per period of time, and then based on the building size or the number of users on the contractor side? What would that look like from a business standpoint?
Troy: No, and this is another differentiator in how Passive Logic thinks about the market. First, we really think that investors love SaaS; it's an investor-driven decision point that drives companies toward a SaaS model. Customers generally don't like SaaS. And in this market, when you're talking about core infrastructure and you're saying to somebody, well, pay my monthly bill, or else your building doesn't work, that's not a really strong position to enter a market, and you'll probably lose 90% of the marketplace right at the front door.
So one of the core concepts here is that when we say autonomous system, we really mean that. This is an autonomous system; all the AI, all the decision making is in your building when you buy our controls. When you set up our software, when you put it in there, it's yours, it's in your building, and it's a core infrastructural technology platform for your building. That doesn't mean we don't have some SaaS services, but they're opt-in.
So those services are along the lines of: how do I manage my portfolio, or how do I get remote access to my buildings? These are natural places to have the cloud intersection, where you say, well, from my office, I want to access all my buildings. So this is core to how we think about markets, how we think about how this market is structured, and working within its structure.
Erik: And that means, I suppose, that the software is running on servers in the building? Or, even though it's owned, is it also deployed on the cloud? What would that look like?
Troy: When we talk about an autonomous system, the core product is our hive controllers. This is a control system that replaces what is really about a dozen products today that you'd have to integrate, in one box you can install in the building. It does all the control. It has the control engine, the physics engine, and the decision-making in the box, and it provides the interfaces to all your IoT devices, all your sensor devices, all these controllable devices, in a single box, without having to do any integration.
So, again, back to the mobile phone analogy: why did the iPhone outsell, in four years, all of desktop computers in 40 years? We think there are a lot of reasons for that, but one of the key reasons is pre-integration. In this market, in the IoT and building automation space, we generally call the installers integrators, because that's the core of their work. And we think this isn't what people should be doing with their time; they should buy things that just work, and they should spend their time designing, installing, and getting new jobs. So that control system has it all inside of it. It's not a server; it's got high-performance processors right in the control system, so that when you plug in one of them, even if it's in a coffee shop, it's doing all the decision making right on that control box.
Erik: I suppose there are going to be significant differences in requirements between a coffee shop and a skyscraper. So is it just that you have your hive controller, and based on the requirements of the building, they might require one or more? Is that it?
Troy: So that's part of it. Part of the integration story is that the AI, the underlying technology, the software, enabled us to really integrate the hardware, to make the hardware very software-defined. And this allowed us to replace what is today a catalog of different controllers, automation devices, nodes, and network devices with one box. And then it really becomes a question of: if you're in a coffee shop, do you need one? And if you're in a skyscraper, maybe you need hundreds of them throughout the building, each one of them doing its own role.
But when you daisy-chain them together, when you connect them together in a network, they work out those roles together in a swarm computing cluster. They are each taking care of their own jobs as far as I/O, talking to sensors, talking to IoT devices, talking to control valves, but then all of their compute capacity gets shared across that whole network. And so as your building gets bigger, you have more controllers, and you have more compute capacity right there in the building.
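The swarm idea Troy describes, controllers pooling their compute as the network grows, can be sketched very roughly as a proportional work split. This is a toy scheduler invented for illustration; Passive Logic's actual clustering scheme is not described in the interview:

```python
def share_work(total_tasks: int, capacities: list) -> list:
    """Split tasks across nodes in proportion to capacity; remainder goes to the largest node."""
    total_cap = sum(capacities)
    shares = [total_tasks * c // total_cap for c in capacities]
    shares[capacities.index(max(capacities))] += total_tasks - sum(shares)
    return shares

# Three controllers in one network: the biggest box picks up the most work,
# and adding controllers to a growing building adds capacity automatically.
print(share_work(100, [4, 2, 2]))   # [50, 25, 25]
```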
Erik: Yeah, we've been doing a fair amount of work on edge computing lately. I mean, it's a topic that's very high interest right now. But this seems like a very practical approach to enabling fairly sophisticated edge computing. Where's the business today? I know you just raised $16 million, and you're coming out of stealth mode. Are you already selling into the market? I imagine you've already done some projects. Are those more pilot projects, or do you already have a commercial product out there that's scaling?
Troy: Right now, we've had private pilots out in the field for two years, with a variety of different systems, a lot of different complexity, that we were brought into those projects to make go. We just raised this $16 million Series A, and it was fun to see both a top-tier lead investor as well as a lot of strategics from a variety of different building points of view. And we're going to be rolling out our private pilots with some strategics over the next few months, and you'll see us go to the general market about summer of next year.
Erik: Are you able to share any of the investors, or do they prefer not to be named?
Troy: Sure. So our lead investors were Addition and [inaudible 24:41]. We were actually Addition's first investment in their new fund, which was the second-largest debut fund of all time. Among our strategics, we have strategics from both the US market and the European market: building owners in commercial and multifamily residential, as well as HVAC component manufacturers, one of which is based in Switzerland.
That raise actually ended up about three times oversubscribed. So there were quite a lot of other investors that we really wanted to invite in, but we ran out of room for collaboration at this point. But we'll see more of that as we get into next year and our next raise, when we can really offer more room for more strategics in the next round.
Erik: One of the most interesting technology trends right now is these industries that have not evolved so much in the past several decades suddenly having the opportunity to adopt best-in-class communications and computing technology. Why don't we go into the tech stack a little bit? So you have your server, and then behind that you have a management platform and this deep learning, or AI, solution. We've already talked about a couple of the things you're doing, but could you just give us the tech stack walkthrough: what is the hardware and software, and also, if there are any services you're providing, just the full perspective here?
Troy: Yeah, absolutely. So you brought up this notion of edge, and edge AI in particular. We think Passive Logic, not just in this industry, but when we look across multiple industries, is really leading what that notion of edge-based AI looks like. And I think you're certainly aware why this is super important.
There are just certain things you could never do from the cloud, from a point of durability or resiliency. Think about autonomous vehicles: at the speed of light, when your vision system sees that you have to brake because you're going to hit a crowd of people, by the time you go up to the server and back down, you've traveled 40 feet, and you're going to run into that crowd of people. It just can't be done from the cloud, because of speed, but also because of resiliency. What happens if you lose your connection for 100 milliseconds? This happens all the time.
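The latency arithmetic behind Troy's example is simple distance-equals-speed-times-time; the vehicle speed and round-trip figures below are illustrative assumptions, not measurements from the interview:

```python
def distance_traveled_m(speed_mps: float, latency_s: float) -> float:
    """Distance covered while a control decision waits on a network round trip."""
    return speed_mps * latency_s

# Illustrative numbers: a vehicle at roughly highway speed (~29 m/s, about
# 65 mph) with a few hundred milliseconds of cloud round trip plus processing.
meters = distance_traveled_m(29.0, 0.40)
print(meters, meters * 3.28)   # ~11.6 m, roughly 38 feet
```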
Well, building systems aren't quite as fast as that; we're generally not driving our systems as quickly. But it's the same problem. We can't really do things from the cloud in real time in the way you need for some of this real-time AI. So our AI system is in the building; it is in each controller. As you spread those controllers around, they're working together as a swarm team.
Now, a couple of things are really unique here. We're built on what we call deep digital twins. You may have heard this notion of digital twins in the industry, and usually what people mean is something very thin. They mean it's CAD, or what we'd call in the buildings industry BIM: 3D CAD with names on everything, and maybe you can attach a sensor point to it. There are a lot of companies looking for a value point in this idea. It's a solution in search of a problem.
But when we talk about the deep digital twins that are core to what Passive Logic does, we're talking about digital twins that are the AI. And this is a really important notion, because if you're doing building systems, or campus systems, or energy systems, or maybe process control, the thing that's different from autonomous vehicles is that every building, every system, is completely different. There are no two buildings that are exactly the same.
Whereas if I'm Tesla, I can train on 10 million miles of test track, and then I can train up a neural net and sell that to 100,000 people. It's a static model, exactly the same for 100,000 people. But in building systems and process control, we need something where the average user can define their own AI system, their own autonomous platform, for their own unique building, unique architecture, unique system topologies, and so forth.
So at the core of what we enable, instead of programming (and we've already established that controlling these systems very well is somewhat of an impossibility with conventional means), we give people an environment where they can draw what their building looks like, draw what their systems look like, and drag and drop different components: pumps and fans and valves and boilers, and mobile robots that move shelves of boxes around inside a warehouse, like an Amazon logistics center, or whatever that componentry is. And those individual components that they drag and drop are these digital twins that understand their own physics, giving a notion of what a pump fundamentally is.
And then a number of things become possible. First, when people draw their system, what they're actually drawing is their AI topology, but to them it just looks like a simple drawing of their system. So we translate it into the world they know, and then we get a system topology that matches their building.
The next thing digital twins enable is this new kind of AI called deep physics. Basically, deep physics gets rid of all the training time we have in deep learning, and allows you to draw what your system looks like and start controlling it right away, without millions and millions of pieces of training data. And then it learns right in your building, about your building and its behavior. And that, again, is unique among most AI systems. Most AI systems have already been trained, and they're given to you statically.
And this digital twin then has another face, which is: how do we talk to all these IoT devices? How do we talk to all these protocols? The one thing I know about protocols is that people keep on making more. So the challenge today is being able to bring those together into some kind of structured meaning about the whole building, all the systems together, that anybody can query. These digital twins provide that structure. And they enable something we call semantic lowering. When you understand your physics, that's the most meta thing you can know about yourself. So if you're a pump, you understand what a pump does, and what it does in a system. And knowing that I want to turn on a pump, in physics terms, is more meta than saying I want to turn a pump on in BACnet or KNX or LON or Modbus or whatever it might be.
And that allows me to do what we call semantic lowering: take that meta notion of turning on a pump, in physics, and lower it down to a simpler semantic, whether that's BACnet, or KNX, or English or Chinese. So there are these multiple facets to what these deep digital twins enable. And then the final one is: how do you provide a high-level API for all those services that want to plug in? So that's kind of a big walkthrough at the top level. But there are a bunch of tunnels we could go down, to talk about the basics or whatever grabs your attention.
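Semantic lowering, as Troy describes it, could be sketched like this: a physics-level intent is translated down to a simpler protocol-level command. All names and payload shapes below are hypothetical placeholders, not real BACnet or Modbus encodings and not Passive Logic's actual API:

```python
from dataclasses import dataclass

@dataclass
class PhysicsIntent:
    component: str   # e.g. "pump_1"
    quantity: str    # the physical quantity, e.g. "flow"
    value: float     # desired value in physical units

def lower(intent: PhysicsIntent, protocol: str) -> dict:
    """Lower a meta-level physics intent to a simpler protocol-level command."""
    if protocol == "bacnet":
        # Placeholder shape only; not a real BACnet encoding.
        return {"object": intent.component, "property": "present-value",
                "value": intent.value}
    if protocol == "modbus":
        # Placeholder register; real mappings would come from a device profile.
        return {"register": 40001, "value": int(intent.value * 10)}
    raise ValueError(f"no lowering defined for {protocol}")

cmd = lower(PhysicsIntent("pump_1", "flow", 3.5), "bacnet")
```

The point of the sketch is the direction of translation: one meta-level intent, many protocol-specific renderings below it.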
Erik: How mature is this technology domain? Are there other applications, that maybe our listeners would be familiar with, that are already using deep physics in consumer or industrial products?
Troy: It's totally new territory. Over the last 10 years, most of your listeners will have heard about the revolution in deep learning. So the question is: what is that revolution really about? When you ask that question, even of AI researchers, usually people say, well, it's about neural nets. But it's not about neural nets; neural nets have been around since the 1950s. So then what is it about? And the next answer is often, well, it's about CPU and GPU power, or the amount of data we have available. But it's actually not about those either. Those are maybe an order-of-magnitude leg up.
At the bottom of the deep learning stack, when you pull out TensorFlow and you start running training runs and data, there's something at the very bottom of that stack of code that makes it all possible, and that's a thing called differentiable programming. And differentiable programming, to date, has been locked up in the bottom of deep learning libraries. So what is differentiable programming? Let's take what a neural net, or deep learning, is.
If I wanted to train a neural net with a million neurons in it, prior to differentiable programming, I would have to take my computer and figure out, with some kind of algorithm, some kind of heuristic, how to tweak and tune a million parameters for those million neurons to get the right answer: say, recognizing an image, and recognizing that it's a dog or a cat or a horse or a person. And that would have taken hundreds or thousands of years on the world's largest computer to tweak and tune all those variables. So that made neural nets not really possible to tune properly; there was never enough compute to really train them.
So what does differentiable programming do? Well, in all of the last decades of programming, you put inputs into functions and you get the outputs. Differentiable programming is: what if your compiler could start with the outputs and give you the inputs? This means you could start with the answer, the one output, and you can reverse the code and get the million inputs. There are a couple more processes to it, but essentially that's what differentiable programming does, and it takes what would have taken hundreds or thousands of years and turns it into minutes or hours or days.
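The "run the code backwards" idea can be illustrated with a tiny example: given a target output, use derivatives to gradient-descend toward the input that produces it. Central differences stand in here for a real autodiff compiler, and the function and numbers are arbitrary illustrations:

```python
def grad(f, x, eps=1e-6):
    """Central-difference derivative; a stand-in for real autodiff."""
    return (f(x + eps) - f(x - eps)) / (2 * eps)

def solve_for_input(f, target, x=0.0, lr=1e-3, steps=5000):
    """Gradient-descend the squared error between f(x) and the target output."""
    for _ in range(steps):
        x -= lr * 2 * (f(x) - target) * grad(f, x)
    return x

f = lambda x: x ** 3 + x            # some "forward" program
x_star = solve_for_input(f, 10.0)   # recover the input that yields output 10
# x_star converges to 2.0, since 2**3 + 2 == 10
```

Scaled from one input to millions of parameters, this is exactly the loop that trains a neural net, which is why making arbitrary programs differentiable generalizes the trick.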
So this core technology had been locked up. And we've been working with Google Brain over the last couple of years on generalized differentiable programming: what if you could write any program and run it both forwards and backwards? And then what if you could build heterogeneous neural nets, which is the library we've been working on at Passive Logic, where not every neuron is the same, but every neuron is different and has a specialty? And you can make those specialties, in our case, physics, but in another case it could be maybe economic systems, or a variety of different industry interests.
So with Passive Logic, what we've been able to do is build these physics networks. If you break them down, they look a lot like a neural network. But they represent the physics of a pump and a boiler and a fan and a window and the building and humans and human comfort. And you can train them just like you train deep learning. But there are a few really big advantages.
The first thing to understand about deep learning, and what's really problematic about it, is that when you go through the steps of making a deep learning system, the first step is you have to have data scientists figure out how to shape the neural net. The second step is that it knows nothing; that neural net is completely blank, and you have to train it on a ton of data.
But in physics, what are you training on? You're training on stuff that Newton already invented 200 years ago. Like, why spend three years and gobs of data to learn something we already know? And those two steps are 99% of the work. The last step is optimizing your network. So what this allows us to do is basically build these heterogeneous neural networks of physics, represented by components, so that when you connect those components together, you're building that network. Now you have the structure of the models, and you're 99% of the way done. It's already usable, and now you're just doing an optimization, which can be done locally, many orders of magnitude faster than regular deep learning. That's how deep physics works at a top level.
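The "99% done" point can be sketched with a single physics component: when the model's structure comes from a known law (here, the fan affinity law, power proportional to speed cubed, chosen purely as an illustrative stand-in), learning collapses to fitting one coefficient from a handful of field measurements rather than training a blank network on mountains of data. The numbers are invented:

```python
def fan_power(speed: float, c: float) -> float:
    """Physics supplies the model's shape: power scales with speed cubed."""
    return c * speed ** 3

def fit_coefficient(speeds, powers) -> float:
    """Closed-form least squares for the single free parameter c."""
    num = sum(p * s ** 3 for s, p in zip(speeds, powers))
    den = sum(s ** 6 for s in speeds)
    return num / den

# Three noisy on-site samples are enough to calibrate the component model.
c = fit_coefficient([0.5, 0.8, 1.0], [0.13, 0.51, 1.02])
```

Contrast this with a blank neural net, which would first have to rediscover the cubic relationship itself before it could fit the coefficient.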
Erik: And you said you're working with Google Brain on this. Is this something that was funded? I forget which government agency you mentioned was funding this before. Did this come out of a funded research program, or how did you identify this as a potential breakthrough?
Troy: So we had been building this autonomous platform at Passive Logic and identified that you really needed this performance speed-up to make it a real-time process. Before Passive Logic, the next best thing you could find on the market was from the Department of Energy: they have a physics simulator, and they have some optimization software. And after a week of all sorts of computer simulations, you can get an answer about what the next step would be.
But we need an answer in a second. So we started working with the Department of Energy. We've had several contracts with them, and they just gave us a follow-on contract to make a standard around these computable deep physics digital twins. The goal there is multifold. One, in our systems we need to make those decisions second by second: what is the next operation in the building? Given all the dynamics, we're solving the whole building as a system, all of the heating, cooling, ventilation, and energy systems, the internal process-control-type systems, all as one question of what is the right solution for right now. And basically, we can digitally simulate and optimize that millions of times faster than real time. That allows us to make these decisions, one after another, in the building, and do it on a relatively modest processor.
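The second-by-second decision loop described here is, in spirit, receding-horizon control: simulate candidate actions forward against a fast model, apply only the best first move, and repeat at the next step. A toy sketch, where the plant model, cost weights, and candidate set are all invented and much simpler than a whole-building simulation:

```python
# Receding-horizon sketch: every control step, roll each candidate heating
# level forward against a fast toy zone model, score comfort plus energy,
# and apply the winner. Repeat next second with fresh measurements.

def simulate_cost(action, state, horizon=60):
    """Toy plant: track a 21 C setpoint while penalizing energy use."""
    T, cost = state, 0.0
    for _ in range(horizon):
        T += 0.05 * action - 0.02 * (T - 10.0)  # heating vs. losses to 10 C outside
        cost += (T - 21.0) ** 2 + 0.1 * action  # comfort error + energy price
    return cost

def next_action(state, candidates=(0.0, 1.0, 2.0, 3.0, 4.0)):
    """Evaluate every candidate over the horizon; return the cheapest."""
    return min(candidates, key=lambda a: simulate_cost(a, state))

action = next_action(18.0)  # the zone is cold, so a high heating level wins
```

The real system plans over hours and many coupled subsystems; the point of the sketch is only the loop shape: fast forward simulation, pick the best next operation, re-solve continuously.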
The digital twin interest for the Department of Energy is, once we go beyond the shell of the building, how do we make energy networks that are peer-to-peer and can communicate back and forth, so that demand response, utility providers, energy storage, distributed energy, and buildings can all work together in real time? You need some currency of proof for this, and the digital twin satisfies that: all these entities can work in a peer-to-peer way while proving that whatever they say is true is actually true, and nobody can game the system. That's one aspect of their interest.
Erik: So you've already mentioned that data is not really the bottleneck here. But I can imagine that in some circumstances, you still need high quality data in order for these systems to operate. Is it your experience that the buildings already have, in general, sufficient sensor networks and connectivity? Or when you're being deployed in a building, do you have particular requirements or recommendations for the architects or the developers for what is required in order to make proper use of your system?
Troy: That actually segues really well into deep physics. Because one of the things the deep physics network does is provide automatic data fusion, or sensor fusion, of the information you have, and it also provides this greater context of a physics-based network that represents all the things happening in the building.
For example, say I have 100 zones, but only 50 of them have sensors for temperature, or humidity, or whatever it is. It's going to automatically sensor-fuse, based on the thermodynamic model of that building, what the rest of those rooms' temperatures are. And it can also automatically estimate the quality of that synthesized information, so it can tell you whether it's 90% or 70% confident. That quality can be used internally for learning, and it can also be used externally if you're trying to query the building, say as a provider of some sort of service on our app store. But it fills in this challenge that we see around IoT devices, which is maybe multifold.
First, there's really no site intelligence for IoT devices today. Most IoT devices are going up to the manufacturer's private cloud. There are middleware solutions where you can go to data pools on the web, but data pools are just data. Data is not all that valuable; what people pay for is action. And so this provides an umbrella that fuses that data together.
But essentially, imagine you had this neural net of all these physics nodes, this complete network of the thermodynamics of the building and the systems that are operating. You're basically plugging sensors into points in that network that would, on their own, synthesize their own information for what happens at that node relative to everywhere else in the building. You're plugging in a higher quality of data by saying, I've got this IoT device, I'm going to drag and drop it into this zone. Now let's say it's a humidity sensor: that humidity plugs in with a higher quality of data, based on that sensor's accuracy, that then propagates through the network around it. So it ends up solving this really interesting IoT problem as a side effect of how we control buildings in the first place.
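A tiny sketch of the infill idea just described: zones form a thermal graph, measured zones pin their nodes, and unmeasured zones are solved from a steady-state heat balance with their neighbors. The graph, conductances, and the crude "confidence decays with distance from a sensor" rule are all made up for illustration, not the actual fusion method.

```python
from collections import deque

# zone -> list of (neighbor, conductance in W/K); a tiny 4-zone corridor
graph = {
    "lobby":  [("hall", 30.0)],
    "hall":   [("lobby", 30.0), ("office", 20.0)],
    "office": [("hall", 20.0), ("store", 10.0)],
    "store":  [("office", 10.0)],
}
measured = {"lobby": 21.0, "store": 17.0}  # only 2 of 4 zones have sensors

def infill(graph, measured, sweeps=200):
    """Gauss-Seidel relaxation: each unmeasured zone takes the
    conductance-weighted mean of its neighbors' temperatures."""
    T = {z: measured.get(z, 20.0) for z in graph}
    for _ in range(sweeps):
        for z, nbrs in graph.items():
            if z in measured:
                continue
            T[z] = sum(u * T[n] for n, u in nbrs) / sum(u for _, u in nbrs)
    return T

def confidence(graph, measured, decay=0.8):
    """BFS out from sensored zones; each hop multiplies confidence by decay."""
    conf = {z: (1.0 if z in measured else 0.0) for z in graph}
    q = deque(measured)
    while q:
        z = q.popleft()
        for n, _ in graph[z]:
            if conf[n] < conf[z] * decay:
                conf[n] = conf[z] * decay
                q.append(n)
    return conf

temps = infill(graph, measured)
conf = confidence(graph, measured)
```

The synthesized zones land between their measured neighbors, and each estimate carries a quality score that a learner or an external service could query, mirroring the 90%-versus-70% point above.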
Erik: And then the other question here is around integration or interoperability with legacy systems. Especially in an industrial building, there's a set of companies that have built quite profitable businesses on keeping competitors out and owning the stack. Maybe we can look at the elevator unit, potentially, or in an industrial building, at different controls. Is this a challenge? Do you have to come up with clever ways to gain access to this data? Or is this, for whatever reason, not a challenge?
Troy: So we have a domain that we're focusing on right now, which is really what we call the functional systems: the HVAC, the energy systems, lighting systems, these process-control-type systems. Things like the elevator are places we'll head, but right now they're a little bit peripheral to us. We do connect into those systems, because occupants moving around the building are corridors of flow that are interesting to us as a system.
This is somewhat regional. I think the US probably has the most open networks, and there's been a lot of piling on, particularly to BACnet. Europe is maybe not as well aligned around a singular set of protocols; there's a little bit more of a split between BACnet, Modbus, and KNX. In Asia, I can't speak to the Chinese market as well as you could. But I know in Japan, from working with some customers there, things are a little bit more isolated and have been more proprietary.
But we think that's actually a big advantage for Passive Logic, because we provide not only this very deep AI stack, but as a user, you don't ever have to care. So it's really a democratizing force: we provide this really easy-to-use software environment, and now the engineer or the installer can just sort of bypass those experts who have largely locked things up around proprietary stacks. And then we're providing a lot of connectivity via broad sets of protocols. We get a lot of coverage with a few protocols, and we'll be adding to those over time.
But one of the advantages we have is those digital twins and their ability to do this semantic lowering. Up until now, we've always had to do this one-to-one mapping from language to language, and this provides a lot of acceleration.
Erik: Could you just give us a typical path of use for Passive Logic? Maybe there's one of these pilot cases that comes to mind. Give us as much detail as possible about who is the buyer, what was the situation, who are the users, and how the use of this system evolved as the building went through construction and operations.
Troy: Yeah, I'm going to pick one of our [inaudible 46:21] projects because I think it highlights the challenges that controls face. So this is a project owned by a fellow named Kurt [inaudible 46:33]. He was building out a property with multiple different buildings and different use cases, residential and some utility buildings, on a 100-acre property in southern Utah, right off Capitol Reef National Park. This is one of the more remote parts of the US. And it was all off grid.
So this whole property is powered by a solar system with battery storage and generator backup. It also had to maintain all these buildings in a fairly sophisticated way, including heating, cooling, ventilation, air quality, lighting systems, occupancy systems. And one of the more interesting aspects of this project was the solar system: if you're off grid, whatever power you don't use just doesn't ever go anywhere; the solar panels only generate what's being pulled from them. And the flip side of that is you don't know how much power potential is in those panels until you draw that power.
And so what Passive Logic's controller does is run this 60-stage electric boiler that goes from 500 watts to 25,000 watts. It slowly uses more and more power until it reaches the peak of what that solar system can produce, then backs off, then draws more and backs off again. It does what's called peak power tracking: taking the residual energy that's not being used by the property and turning it into heat. And then the question becomes, well, where do you put the heat?
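The "draw more, back off" behavior just described can be sketched in the spirit of perturb-and-observe peak-power tracking: step a staged load up while the off-grid array can fully supply it, back off when it can't, and remember the largest fully supplied draw. The panel model and stage sizes below are invented for illustration, not the controller's real algorithm.

```python
def panel_output(demand_w, capacity_w=18000.0):
    """Off-grid array: it supplies exactly what is drawn, up to capacity."""
    return min(demand_w, capacity_w)

def track_peak(stages, steps=100):
    """Hunt up and down the boiler's power stages, perturb-and-observe style."""
    i, best = 0, 0.0
    for _ in range(steps):
        supplied = panel_output(stages[i])
        if supplied + 1.0 < stages[i]:  # array couldn't keep up: back off a stage
            if i > 0:
                i -= 1
        else:                           # fully supplied: record it, probe higher
            best = max(best, supplied)
            if i < len(stages) - 1:
                i += 1
    return best

# e.g. a staged boiler from 500 W to 25 kW in 500 W increments
stages = [500.0 * k for k in range(1, 51)]
peak_w = track_peak(stages)
```

With a fixed capacity the tracker settles into hunting around the 18 kW stage; in the real system the available solar power moves with the sun, so the hunting never stops.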
And so the system in real time has to figure out: should I put it into the building through the radiant floor? The radiant floor has half a million pounds of concrete; that's a storage system. But you can only do that up to the limit of the occupants' thermal comfort. Should I put it in the domestic hot water, which might have value, but has a limit of maybe 100 gallons? And is that going to be used readily? Is somebody going to use that hot water soon, or is it going to wait until tomorrow and incur some heat loss? Or should it go into the banked seasonal storage tanks?
And so it has these large 5,000-gallon insulated seasonal storage tanks buried in the ground that it can choose to put the heat into. But if you're doing that in July, there's going to be a lot of heat loss before you might need it in December; if you're doing it in January, you might use it later that day. So there are all these real-time decisions of where to best optimize the global system between energy use, the storage of energy, the real-time needs, the occupants, the air quality, and so forth.
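One simple way to frame the "where do we put the heat?" decision is as a value-per-kilowatt-hour allocation: each sink has a remaining capacity and an expected value, discounted by the standby losses it will suffer before the heat is likely to be used, and the surplus fills the highest-value sinks first. The capacities, values, and loss fractions below are invented, and a greedy fill is a deliberate simplification of what the real optimizer would weigh.

```python
def allocate(surplus_kwh, sinks):
    """Greedily send surplus heat to sinks in order of effective value,
    where effective value = value_per_kwh * (1 - loss_frac)."""
    plan = {}
    order = sorted(sinks,
                   key=lambda s: s["value_per_kwh"] * (1 - s["loss_frac"]),
                   reverse=True)
    for s in order:
        take = min(surplus_kwh, s["capacity_kwh"])
        if take > 0:
            plan[s["name"]] = take
            surplus_kwh -= take
    return plan

sinks = [
    {"name": "radiant_slab",       "capacity_kwh": 12.0,  "value_per_kwh": 0.10, "loss_frac": 0.05},
    {"name": "domestic_hot_water", "capacity_kwh": 8.0,   "value_per_kwh": 0.12, "loss_frac": 0.10},
    {"name": "seasonal_tank",      "capacity_kwh": 200.0, "value_per_kwh": 0.09, "loss_frac": 0.40},
]
plan = allocate(25.0, sinks)
```

Note how the July-versus-January point above maps onto `loss_frac`: heat banked months ahead of use is discounted far more heavily than heat used the same day.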
So here's what we saw in that system, just tuning into one of those components, a geothermal heat pump. Passive Logic improved the efficiency of that system 40% over conventional geothermal heat pump operation, per a study from the Department of Energy, just in how it could predict what would be used and how comfort would play out for the day, and think ahead about the right control sequence, as it saw in its simulations the best navigation path for that control from right now to 12 hours from now.
And so that's a really tangible case on the energy, consumption, and maintenance side of the building. But maybe more important, we saw a 90% reduction in the amount of effort to deploy Passive Logic versus a conventional control system, just from the programming through the commissioning step of getting into the building. So there's a win versus conventional controls right out of the gate.
Erik: So when you're looking at the ROI, you can say, just on the installation cost alone, because I imagine what…?
Troy: [inaudible 50:50] And by the way, you've got 40% energy savings just from better control.
Erik: I know with a lot of systems, you look at the software and you run an RFP, and then you get quotes on the software, and you deploy something. And then you find out that the integration cost is actually three times higher than whatever the technology was, and that wasn't part of the initial quotation process, right?
Troy: Right. And that's the real core value. When we talked about autonomous buildings, we looked at the autonomous vehicles market. And I think we struggled with this question of how smart is smart when people talked about smart buildings; it became kind of a pet peeve of ours. Because when people say smart, they really don't mean smart, they mean connected.
And so what we realized a year ago is we could largely leverage the autonomous vehicles industry's nomenclature, from level zero to level five autonomy, and it fits very well with the levels that we see in buildings. Level zero is manual control: in a car that's cruise control, and in a building that's thermostatic control. Level one is adaptive cruise control in a car; level one in the building is adaptive conventional control, where you have AI systems just tweaking conventional dumb systems.
Getting to level five automation or autonomy in a building is a really good goal and will be the end goal of all systems. But in buildings, there's something far more valuable, which is automating the installation so that you can broaden its usage. And that's savings right at the front door: you can install Passive Logic for a tenth the effort of something conventional. We all care that you have better energy efficiency, but that's no longer a value-add cost-equation decision; you're making the decision on just getting a building going in the first place.
Erik: So I know this depends on the building and its size and so forth. But can you give us a rough estimate of what the cost might look like for, say, a medium-sized building, or a couple of different situations?
Troy: Yeah, so if you're looking at automation today, a BMS system, we price-compare right across the board. So if a building install is something like 50,000 euros or $50,000, it's going to be about the same with Passive Logic. The real distinction is that current BMS systems have really had a price floor of about $30,000 to $40,000, and we keep on scaling down to about a $1,500 entry point.
So when we talk about that coffee shop on the corner, that $1,500 entry point is where we start, and it scales up based on the complexity of the controllables, and the different sensors and IoT devices, how many points you're interfacing with. In general, we see across the automation market that it works out to around $1 per square foot, or you could think of it as roughly 10 euros per square meter, plus or minus $0.50 per square foot. So big-scale projects, million-square-foot or 100,000-square-meter projects, tend to get down toward $0.50 per square foot, and smaller ones go higher. But on average we see about that dollar per square foot throughout the automation industry.
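The rule of thumb just quoted is simple enough to write down as arithmetic. This is only the quoted numbers turned into a function, not a published price sheet; the rate argument is an assumption the caller supplies based on project scale.

```python
def estimate_cost_usd(area_sqft, rate_per_sqft=1.00):
    """Back-of-envelope: ~$1/sq ft on average (roughly $0.50 on very large
    projects, higher on small ones), with a $1,500 entry-level floor."""
    return max(1500.0, area_sqft * rate_per_sqft)
```

So an 800-square-foot corner coffee shop hits the $1,500 floor, a 50,000-square-foot building lands around $50,000 at the average rate, and a million-square-foot project at $0.50 per square foot comes in around $500,000, in line with the ranges above.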
Erik: Let's say we're in this coffee shop scenario, your simplest potential user. Is it feasible for a small-town architect to deploy this and get it up and running, and then for the coffee shop proprietor to operate the system? What kind of training would be required there?
Troy: Yeah, I mean, that's one of our real advantages. In those buildings, it would probably not be driven by the architect or the owner; it would often just be driven by the HVAC company. They get hired to come make that building work, install whatever kind of HVAC system it might be, air systems, radiant systems, whatever. And they've got to put in the control system.
And what we really focused on in that user experience is that, in that coffee shop situation, they can get from start to finish in our Autonomy Studio software in less than 20 minutes: setting up a coffee shop with a full control system that is more valuable and more detailed than any BMS system you can buy, but at a price point where they would otherwise be putting in thermostats and [inaudible 455:29] controllers. And then the owner gets the benefits of that platform being in place. But we're really making that sale to the HVAC installer who's going to do it instead of something else.
Erik: Troy, is there anything that we're missing here?
Troy: No, I think that's been a really good conversation. I would like to hit upon this notion of levels of autonomy when we're talking about smart buildings. I think people need to be cognizant of what kind of smart building technology we're talking about. Are we talking about level zero, level one, level two? What are those levels of autonomy that we're discussing when we see "smart building"? And then start thinking down the road about what a smart building looks like as opposed to a connected building.
Erik: I completely align with your point. It always frustrates me when somebody kind of bottlenecks the IoT concept into connectivity. Connectivity is critical, but by itself it's not valuable, so it's really the system perspective.
Troy: That's right. I mean, data is a whole lot less valuable than most people act like it is. There's this pyramid of value, and we start with data. If you can make information out of it, if you can add structure, say digital twins, that's a lot more valuable. But what customers are buying is action. They're buying that their building works, or that their process control system is running the line properly. Very few people say, well, give me a bunch of data and I'll figure out what to do with it and see if I can find some action in here.
Erik: So Troy, we'll put your website into the show notes, along with a contact from the company there. But what's the best way for a listener who's interested in getting in touch with Passive Logic? Is it to go to the website, or do you have a preferred contact point?
Troy: Go to the website, and they can find a contact point there.
Erik: So Troy, thank you so much for the conversation today. Super interesting.
Troy: Yeah, absolutely. Thanks, Erik. Good to hear your questions and thanks for all the thoughtful pathways to go explore on this topic.
Erik: Thanks for tuning in to another edition of the Industrial IoT Spotlight. Don't forget to follow us on Twitter at IotoneHQ, and to check out our database of case studies on IoTONE.com. If you have unique insight or a project deployment story to share, we'd love to feature you on a future edition. Write us at erik.walenza@IoTone.com.