Erik: Welcome to the Industrial IoT Spotlight, your number one spot for insight from industrial IoT thought leaders who are transforming businesses today with your host, Erik Walenza.
Welcome back to the Industrial IoT Spotlight podcast. I'm your host, Erik Walenza, CEO of IoT ONE. And our guest today will be Ed Kuzemchak, CTO and Director of IoT at Software Design Solutions. Software Design Solutions specializes in embedded software, system development, and software process improvement consulting in the medical, industrial, transportation, and defense sectors. Together, we discuss trends in network connectivity and edge computing that are reshaping the IoT landscape. We touched on improving connectivity, storage, and processing cost structures, why 5G is not relevant for many IIoT situations, and many more topics.
If you find these conversations valuable, please leave us a comment and a five-star review. And if you'd like to share your company's story or recommend a speaker, please email us at team@IoTone.com. Thank you. Ed, thank you so much for joining us today,
Ed: Erik, thank you for having me on.
Erik: So we'll be focusing today on trends in network connectivity and edge computing and discuss how those two are reshaping the IoT landscape. But before we really dive into that topic, we last spoke in Q2 of 2018, and I think probably a lot of our listeners are still hearing you for the first time, so I'd love to take a few minutes and go into your background and also Software Design Solutions. Maybe before we get to Software Design Solutions, if you can just share a little bit about where you're coming from personally.
Ed: Sure. I have a pretty long history in embedded systems software development, about 32 years now. I grew up here in rural Pennsylvania, then took a position at Raytheon, then with a startup that got purchased by Texas Instruments, and later founded Software Design Solutions. But throughout my career, my focus has been on embedded systems, low power designs, and software development tools; I did a good bit of that work. And it's always been about building solutions for people who are not necessarily building end consumer devices, but industrial devices, or sometimes military or medical devices.
Software Design Solutions, we founded that 16 years ago, and continued our focus on embedded systems and machine-to-machine (it wasn't called IoT back then), but grew into IoT as that term emerged. Three years ago, Software Design Solutions was joined together with another company, Applied Visions. They are on Long Island, New York, and focused on enterprise-level cloud applications and mobile applications. And so we had these two companies: one focused on embedded systems, all the way from designing and building sensor boards to building gateways, and another focused on cloud applications. And so when we put these two together, we figured we'd have a really strong player in the IoT space.
Erik: That merger certainly makes sense. Was this a bit more opportunistic, or did you hit 2016 and think, now's the time to look at how we can combine your embedded systems expertise with a cloud expert?
Ed: Frank and I found each other while we were looking to grow both companies. I was looking to take Software Design Solutions into a larger space and needed to make some decisions about how to do that. Frank liked the Pittsburgh area, and also wanted to get back into embedded, which he had some experience with in the past. And so we got to talking, and it was like, well, this really makes a lot of sense, let's do this together.
Erik: And I want to go into Software Design Solutions. So one of your first jobs, I think, was working for Raytheon Missile Systems. And I've got to think, when we're talking about embedded software, you were probably at the cutting edge when this was all starting to get developed back in the late 80s. What were you working on back then? What was the state of the technology in 1988 as compared to today?
Ed: My primary role with Raytheon was leading the team that developed the internal compiler and toolset for the Patriot missile program. And that was prior to anyone knowing about Patriot, because it was prior to either Gulf War. So at Raytheon, I had a chance to participate in not only that program but other new military programs. And so that was a great deal of fun.
But being from Pittsburgh, it seems like people always move back to Pittsburgh. So moving back from Boston to Pittsburgh, I was able to continue in the military space by joining a company called Tartan Labs, a Carnegie Mellon spin-off, and that company built high performance Ada compilers for the military space. And so there we got very broad exposure to not only military but also avionics programs that were using Ada in the late 80s and early 90s. Are you familiar with Ada, Erik?
Erik: No, I'm not.
Ed: So it was a language developed and funded by the US military with the intention of enabling software programming in the large. At the time, everything was either in FORTRAN or some early C, and Ada was really ahead of its time in the capabilities that it offered.
Erik: Has it evolved into something else, or did C kind of become the standard?
Ed: In the military space, C++ and even Java nudged it out, because there were just a lot of trained programmers out there. I still like it a great deal, and it's still in these programs. And just like we see in industrial equipment today, when you get on an airplane, it's very likely that airplane, or at least that model of airplane, might be 40 years old. And so you have these long-range programs, and there are plenty of systems out there still running and flying with Ada in them.
Erik: This theme of combining very old technology with cutting-edge technology is really on point today, because we're going to be focusing on industrial IoT systems. Let's talk a little bit about Software Design Solutions and the types of problems that you typically solve. Are there focus industries, use cases, technology demands? How do you define the scope of Software Design Solutions?
Ed: So we work in a lot of domains. But primarily, we're in industrial control, and that includes manufacturing, transportation, and oil and gas. But we're also in medical devices and a little bit of defense. A lot of it is folks who are building systems, either from scratch (that's not as often) or, more commonly, retrofitting or building the next generation of their product and saying, well, let's put some intelligence into this product. And that's where they're going to be coming to us, and we're going to be working with them to get to a prototype and do some use case and return-on-investment analysis with them.
Because I think that's one of the hardest things about starting an IoT project: you have the C-levels banging on the next level down at the company and saying, we've got to have IoT because I was at a conference and everyone has IoT, and then those folks scratching their heads saying, well, but we make door hinges, where can we use IoT to build our door hinges more efficiently? And so that's where it really helps to take a broad look across their company and talk to their domain experts.
I mean, one of the most important things about SDS is we like to say that we're not experts in anything, because we have this broad expertise that lets us join projects and work with their domain experts. Because there's no one who knows more about making door hinges than the shop foreman who has been stamping them out for 25 years. And he or she knows how this machine acts a week before it's going to fly apart and be down for two weeks. And so, being able to instrument that machine, for example, being able to put instrumentation on that machine, it's not about going out to a catalog and leafing through it. It's about listening to that person and asking, what do they know about their domain that we can help them build into a system?
Erik: Are you typically focused on products, or equipment, or contained systems? Or are you often brought into a factory where you're working across the entire asset base of that factory and looking at how processes flow through it? Do you do these larger systemic projects, or are you typically looking at a specific piece of equipment and asking, for example, how we can understand the maintenance schedule of this equipment using some sensor data?
Ed: We're often a little bit in between, more toward the specific piece of equipment, but often working with the folks at that company to understand how it fits into the larger system. We're not the type of folks who come in with clipboards to analyze your whole supply chain and then, six months later, deliver you a three-ring binder report. We're more likely to say, let's talk to the specific people who have the problem and let's get some prototypes together, because that's the most important thing. Often people have a good sense of what they want to build and what they're trying to achieve before they contact us, and we just help them refine it.
Erik: So then, our topic today is network connectivity and edge computing. Where's a good place to start? Before we get into architecture and so forth, what are we talking about? If we can just lay the foundation: when we're talking about changes in network connectivity and the development of edge computing, how would you define the key technologies involved and the factors that are driving the development of these areas?
Ed: In 2019 and 2020, when people hear about changes in network connectivity, they think of cellular 5G. I want to acknowledge that cellular 5G is a big deal, but it's not a big deal to me, because that's at the high end of the spectrum in terms of bandwidth. This is the thing that's going to get us tons and tons of bandwidth to our phones and into our homes and all of these things.
But for the things that I care about, we are talking about advances down at the low bandwidth, low cost, low power end of the cellular spectrum, and that is LTE Cat-M1 and NB-IoT. And so here we have this cellular connectivity landscape being built out in two directions. And 5G is getting a lot of the press and a lot of the advertisements and all of that, because there's a lot of meat there. But what we need is a low cost, low power way to get cellular connectivity, and that is these other technologies.
Paired with that, and at the same time these things are happening, we have a lot of advances being made in hardware, and even software, for computing out at the edge, and that is out almost at the sensors.
Erik: 5G, there's such an avalanche of marketing behind this. And I talk to a lot of people who are really quite convinced this is transformational, and I think you're probably right that it is transformational for some use cases; connected vehicles, for example, could be quite revolutionary. But how do you determine whether this is relevant to a particular use case? So if there's a company, and maybe the CEO starts talking about 5G and how important it is that they start adopting this in manufacturing, for example, which I hear increasingly frequently, how do you determine whether this is actually going to be useful, or whether it's just a distraction, and maybe NB-IoT or Cat-M1 is an alternative that might be a better fit for the solution? What would be the major decision criteria that you would bring into that conversation?
Ed: So it's really going to be based around two things, and they are related. And that is: how much data are you trying to move, and of course, how quickly? And then the cost structure of it. Pound for pound, 5G is going to be inexpensive, but you're going to be moving a lot of data, so the bill is still going to be pretty big.
When it comes to an industrial system, let's say a meter out on a pipeline, or a leak detector out on a pipeline, that system is not likely to need or want 5G, because it doesn't have to send a lot of data. A connected car, on the other hand, sends and receives a lot of data: traffic data, weather data, information that it's sensing about its environment so that it can be this distributed compute platform. It may even be communicating with other cars and negotiating traffic together.
But going back to the pipeline meter example, we're just talking about how much oil or fuel or whatever has flowed through this pipeline, what the temperature is, and what the flow rate is. That's just a few little bits of data. And the important thing is that the NB-IoT modems need to have a price point that's going to allow us to put them on each of the meters we want to put them on.
Doing cellular data is not new. You've had cellular systems for a while: the alarm system in your house is likely to have a cellular backhaul on it, and many other things have a cellular modem built in. But those are classic LTE cellular modems. The modem itself, the component cost, may be in the $40-50 range. The data plan for it may be in the $5-10 a month range. And that's just not going to work on a sprinkler system on a large agricultural farm, where you're going to have hundreds of these things. You just can't make that math work.
But NB-IoT is built to send much less data, and as a result, you're using the network much less, so they can provide that service at a much lower cost. We're talking about NB-IoT plans that are on the order of $6 a year for a device instead of $10 a month. So our first investigation is: how much data are you sending, and what's the cost price point of the end node? Those two things will help us determine which end of the spectrum, or maybe the middle, classic LTE, is the right choice.
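The cost math here is simple but decisive. As a rough sketch using only the ballpark figures mentioned above (classic LTE at roughly $10 a month per device, NB-IoT at roughly $6 a year; real carrier pricing will vary), the fleet-level difference looks like this:

```python
# Rough annual connectivity cost model for a fleet of devices,
# using the ballpark plan prices from the conversation above.
PLAN_COST_PER_YEAR = {
    "classic_lte": 10.0 * 12,  # ~$10/month per device
    "nb_iot": 6.0,             # ~$6/year per device
}

def fleet_annual_cost(num_devices: int, plan: str) -> float:
    """Total annual connectivity cost in dollars for the whole fleet."""
    return num_devices * PLAN_COST_PER_YEAR[plan]

# The farm sprinkler example: 200 end nodes.
print(fleet_annual_cost(200, "classic_lte"))  # 24000.0
print(fleet_annual_cost(200, "nb_iot"))       # 1200.0
```

At a few hundred nodes, the classic LTE plan alone dwarfs the hardware cost, which is why the low end of the cellular spectrum is what makes these deployments viable.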
Erik: And I suppose there's a third variable that comes up fairly often, which is battery capacity, because batteries haven't progressed quite as rapidly as a lot of the other technologies we're working with here. We interviewed a company called Everactive a few months ago, and they build sensors that derive energy from ambient light, from vibration, and so forth. And they can just sit there for 10-20 years, passively generating electricity and sending small amounts of data, because the cost of taking that system apart, getting inside, and changing the battery of the sensor is prohibitive. This, I think, is a factor that's a bit overlooked in a lot of cases.
Ed: Oh, that's exactly right, Erik. A lot of the choices for building low power designs aren't about the fact that the batteries are going to be an expense; it's the fact that you can't afford to pay the person to go out and change all of them. Because if you have something like a water meter in your house, it's not a big deal that the battery costs $0.75 in that water meter. It's a big deal that if you have a town of 50,000 people, you may have one employee who does nothing but change batteries every day as each of those wears out.
And so you really need the kind of energy harvesting that the company you mentioned does; it's very important in these designs. And to do energy harvesting, you have to start with a low power design, because energy harvesting is only going to get you so much power, and you've got to match those two things together, asking: what do we have? Do we have motion to work with? Do we have light to work with? Do we have vibration to work with? Maybe we have fluids going by that we can put a turbine on, those kinds of things. I think that's a very important part of IoT: matching the power profile of what you're trying to build with what you're trying to do. And the battery life really is driven by, when would you go visit this thing anyway?
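Matching a harvester to a duty-cycled design, as Ed describes, usually starts with simple average-power arithmetic. A minimal sketch (all the numbers here are illustrative assumptions, not figures from the conversation):

```python
# Average-power budget check for a duty-cycled, energy-harvesting sensor node.
# The node sleeps almost all the time and wakes briefly to sample and transmit;
# the harvester only needs to cover the *average* draw, not the peak.
def average_draw_mw(sleep_mw: float, active_mw: float,
                    active_s_per_hour: float) -> float:
    """Average power draw in milliwatts over one hour."""
    active_fraction = active_s_per_hour / 3600.0
    return active_mw * active_fraction + sleep_mw * (1.0 - active_fraction)

# Illustrative node: ~5 microwatts asleep, 150 mW while sampling/transmitting,
# awake 2 seconds per hour.
avg = average_draw_mw(sleep_mw=0.005, active_mw=150.0, active_s_per_hour=2.0)
harvester_mw = 0.2  # assumed average output of a small indoor light harvester
print(round(avg, 4), avg <= harvester_mw)  # 0.0883 True
```

Even a 150 mW radio burst averages out to well under a milliwatt at this duty cycle, which is what makes harvesting feasible; the same math run the other way tells you how often the node can afford to wake up.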
Erik: So then we have this first axis, which is network connectivity, and the second is architecture. I think people are generally familiar with the fairly classic edge-gateway-cloud architecture. And now you're saying we're moving towards a fog and edge computing architecture. What does that architecture look like, and what differentiates it from the more classical architectures people are familiar with? Well, maybe before we go into any details, can you just help illustrate the difference between fog and edge computing? What do we really mean when we're talking about those two as separate architectures?
Ed: As you mentioned, most folks understand IoT as having this three-tiered architecture. You have the sensors that are sensing the physical world, and that might be temperature or vibration or movement or light or whatever. Those sensors are out attached to something physical. And at the other end of the spectrum, you have the cloud, which is where lots of fancy data analytics happens. It's where your dashboards are: you have a map of the United States and here are all your trucks out on the road, or here are all your meters along this pipeline, those kinds of things.
Then in the middle, you have the gateways that get the data from the sensors to the cloud: they receive the data from the sensors and send it up to the cloud. And there are really three places where you have the opportunity to do processing: the sensor nodes out at the edge, the gateway, and the cloud. Those three places have very different properties in terms of how much compute power they can have and whether battery power is a requirement.
And so what we've found is it's great to do processing up in the cloud; you have tons and tons of resources, as much as you want to pay for. Just write the check to Azure, or AWS, or Google Cloud, or whoever, and they'll give you as many compute servers as you want. But that's awfully far away to be making some decisions, to have the data go all the way from the sensor to the gateway to the cloud, and sometimes back. And folks said, well, let's do the processing at that next level down, at the gateway, because that seems like a good sweet spot.
And when we talk about fog processing, it really is bringing some of the processing, some of the analytics, and even some simple machine learning down from the cloud onto the gateway. That's what fog computing really is.
I was on a podcast recently where we had a lively discussion about whether fog computing is here to stay, or whether it's just a transitory thing until the edge becomes capable enough to do all the computing. I'll make the claim it probably never will be, because the difference between a gateway and an edge device usually is that the gateway is plugged into the wall, so I can have as much horsepower as I want.
Most gateways in most systems are way overpowered for what they're trying to do. I mean, you might have an industrial computer with a quad-core ARM running at a gigahertz, and all it's doing is receiving some data from a sensor, over a wire or wirelessly, packing it up, and sending it to an MQTT broker. So that system has tons of leftover capability where it could be doing interesting things: maybe some image analysis, maybe some pattern classification, work where you take information from several sensors and correlate it before it's sent to the cloud, and maybe even make local decisions.
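That leftover gateway capability is easy to picture in code. Here is a hypothetical sketch of fog-style processing, where the gateway correlates a window of readings from two sensors and makes a local decision before forwarding only a compact summary (the sensor names, thresholds, and rule are invented for illustration, not taken from any real deployment):

```python
from statistics import mean

def summarize(window: dict[str, list[float]]) -> dict[str, float]:
    """Collapse a window of raw samples to one average per sensor."""
    return {sensor: mean(samples) for sensor, samples in window.items()}

def needs_attention(summary: dict[str, float]) -> bool:
    """Invented local rule: flag the machine only when heat AND vibration
    are both elevated, something no single sensor can decide alone."""
    return summary["temperature_c"] > 80.0 and summary["vibration_g"] > 1.5

# One window of readings received from two wired/wireless sensors:
window = {
    "temperature_c": [82.1, 83.4, 81.9],
    "vibration_g": [1.7, 1.6, 1.8],
}
summary = summarize(window)        # the compact result that would go upstream
alert = needs_attention(summary)   # local decision, no cloud round trip
print(alert)  # True
```

Only `summary` and `alert` would need to travel upstream (for example, as a small MQTT payload), instead of every raw sample, which is exactly the correlate-then-send pattern described above.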
If you're familiar with the old SCADA systems for industrial control, those were really the first fog processing systems. These were usually desktop computers or embedded computers connected to sensors on an industrial production line, doing control and sensing. So with this idea of a smarter edge, I think there are plenty of things that can be done out at the edge. But I think we're always going to run into the sticking point that edge devices are often going to be battery powered and often going to live in very harsh environments. The more compute power you put out there, the more heat you have to dissipate, and all those things. And there's an order of magnitude, sometimes two or three orders of magnitude, difference in how many edge devices there are versus gateway devices. You may have one gateway device receiving data from 100 sensors. So do you build up that gateway device a little bit and do some compute there? Or do you try to build up 100 sensors with more capability and do the compute there?
Erik: Well, you might need the data from those 100 sensors in one place in order to do the compute, right?
Ed: That's exactly right. You may be trying to correlate a temperature sensor and a vibration sensor. The folks who love mesh networking will say, well, they'll just talk to each other. And that's just a complication we can usually do without.
Erik: We've basically squeezed supercomputers to fit on our phones in the past, whatever, 40 years. But at the same time, supercomputers have gotten a trillion times more powerful, so that seems to be the way things work. We will have supercomputers on the edge.
Ed: Yeah, software guys will always find a way to use up however much compute you hand them.
Erik: So how do you look at a more typical on-premise environment where you have servers? Especially in a factory where maybe you do want to have access to the cloud, you want to be able to move data, but you're probably not moving it in real time: you're maybe sending it in batches, and you have some security concerns and so forth, and you want to be able to do some edge computing. So then you might have a server on-premise somewhere that's doing a little bit heavier computing than a gateway might be capable of. Do you see gateways developing and replacing the servers in these on-premise environments? Or do you think there's going to be a durable need for servers in factories and other industrial environments?
Ed: I think that factories specifically are not ready, for whatever reason, to move their long-term data to the cloud. I think there definitely is a role for a centralized place where data, historical data, and all the analytics you can do with it are kept. Now, if the industry has embraced the cloud and has moved there and is comfortable there, that's great. If, for whatever reason, they're not comfortable there, it really doesn't have to be the cloud.
The cloud is just a server that is in someone else's server room. And if you're comfortable having the servers in your own server room, with the IT and the security and all of that that goes with it, that's a perfectly fine place to be. What you can't do is expect that the data is always going to live on the half a dozen or a dozen gateways around your factory, where you're going to have to go visit them with a USB drive and pull it off. That's just not a recipe for success, because what you want the gateways to give you is local compute: local image classification, pattern matching, that kind of stuff. But you still need that centralized place where you can ask, well, how did this machine perform last week compared to the same week two years ago? Those kinds of analytics belong in a centralized place.
Erik: So do you see the change in architecture more as an evolution where we're getting a bit heavier compute power on the edge and maybe on-premise servers becoming less important and more action is migrating to the cloud but primarily the architecture remains similar but somewhat modified? Or do you see really radical change occurring in terms of how these architectures are being structured?
Ed: I don't think there's really radical change in how the architectures are structured. I think it's more of an evolution. What's going to be a little more radical is the amount of data that gets saved and analyzed, those types of things. I mean, some industries, especially heavily regulated industries like transportation and medical, are used to saving a lot of data already and using it for quality assurance. But a lot of other industries, classic manufacturing, what they have been learning throughout the years is that to do process improvement and quality improvement, you have to save data.
And I think that that act is going to become more and more important as we start to get so much more data. It's not just going to be what was the final measurements from the final QA check on this part. It's going to be which exact stamping mill put the pieces together for this part and which operators were there? And was it the Monday after the Super Bowl? And all of those things are going to come into play as you start to look at historical data.
And so you're going to have an order of magnitude, two orders of magnitude, three orders of magnitude more data to deal with. And so for folks who haven't been used to storing and sorting through that much data, the questions become: what data do you save today, and what could we look at? Sometimes the answer is, well, we don't really have any data. And then sometimes they surprise us and say, well, we've got these notebooks of information going back to 1930, when this machine first started stamping out door hinges.
Erik: But as you start to aggregate all of this data, you start to build up data lakes that are a little more similar to an ecommerce environment, where you really do have tons of real-time data that's fairly rich and well structured. And then, of course, in ecommerce, people have figured out pretty effectively how to monetize their data. They've figured out, okay, this data has value for different parties, internal stakeholders and external stakeholders, and they have the contractual mechanisms to monetize it.
On the industrial side, that's hardly ever happened. There are still a lot of legal, regulatory, and just trust challenges. But it seems like there's a huge amount of value there somehow. Have you worked with any clients or seen any situations where an industrial company has been able to monetize or create value out of their data for another stakeholder, one that's not a direct driver or owner of that use case?
Ed: We have in a couple of places. First of all, there's the classic problem of data silos, where the folks who are doing materials ordering have their ERP system, the folks who are doing quality have their quality management system, and the folks who are doing sales have their system. And while this doesn't sound like an IoT problem, what we're talking about is having sensors feed into these data lakes. But the issue now is, well, which silo do I put this data in? Because each of these people could get value out of it.
And that's where you really start to have interesting conversations with people about why there were silos to start with. Often it was procedural: well, we use this system for ERP and this other system for quality management, and those don't talk to each other. And so helping people through that is a good job for our folks over on Long Island.
But then on top of that, you also have the interesting security and privacy concerns that consumer devices have now grappled with. And we have a whole other set of people who didn't really think they had to worry about those security and privacy concerns. Because now maybe you're building a piece of industrial equipment, or a piece of industrial equipment that also ends up getting sold to consumers.
And when you go to your local rental store and rent a piece of equipment to tear up your yard, you've got to realize they're tracking you. They know where that piece of equipment is. They know where you live now. And there are all of those things that consumer devices have been forced to deal with. At least that means there's a platform for dealing with those concerns. But you have another set of people who have to be educated about security and privacy, which is why security is a big part of our business as well, because of our domains. But we also have to educate them about privacy.
Erik: I mean, certainly on creating value from data, the low-hanging fruit seems to be just breaking down some of those barriers internally; then at least you're just dealing with politics and structural issues, not client-related issues. But on that note, it's quite interesting that, probably when you were starting your career, industrial was really leading the way in terms of technology and consumer was following. And now it really seems to be the opposite: consumer is charging ahead and figuring it out as it goes.
One of the things we encourage companies to do is to pay attention to some of the better run consumer technology companies to see what they can learn from them in terms of how they structure their business models, and how they solve some of these privacy, security and business model challenges.
Ed: I completely agree. I think it has turned around that we used to see industrial systems get consumerized and now you see the opposite.
Erik: Coming back to the topic of architecture, there's one area where Gartner has been pushing a new architecture, which is event-driven architecture for real-time computing using Kubernetes and containers, basically trying to figure out how to do real, real-time computing with minimal time lag. Coming back to this topic of radical differences in architecture, do you see this as being more of a radical change in architecture, where you're communicating between different sensors or different nodes without any intermediary? Or do you see this as just a modification that has figured out some tweaks to be a bit more responsive than the architectures that have a more stable server in the center?
Ed: It's interesting. Event-driven architecture at the scale they're talking about, I think, is an interesting new idea for that level of system. At the micro level, deep down in embedded system software, event-driven architectures are very common. But I thought it was very interesting to see these ideas of event routers. And I think that even these message brokers, these kinds of things, are a good maturing factor in these new large-scale architectures, which is going to allow them to scale more easily and allow folks to get their heads wrapped around them much more easily.
I applaud that move. We certainly like using event brokers and message brokers, MQTT is our particular favorite, and that just makes the whole system easier, like I say, to get your head wrapped around, and easier to say, oh, we're going to add this new client for this data. Well, that's great, just [inaudible 38:28] broker and pick up data. It just makes planning for the unknown a little easier.
Erik: Maybe we can step a little bit outside of the core technology and talk more from a business perspective about the use cases that you see picking up now. I know that this year, asset tracking is a priority for you. What is it about asset tracking that makes it particularly relevant as a focus area in 2020?
Ed: So I think the two things we've been talking about, this idea of low cost, low bandwidth network connectivity, and a bit more capability and compute at the edge, both make asset tracking a lot more capable in 2020 than it would have been in 2015. Because we can now build a reasonably small, power-efficient, battery-operated asset tracking device that we can afford to put on lower cost assets.
Asset tracking has always been this balancing act of, well, what is the value of the thing you want to track? If it's a semi-tractor trailer, it's a few hundred thousand dollars. I can afford to put a pretty hefty tracker on that thing with a satellite modem; an add-on there can be $1,000 and no big deal. But if I'm trying to asset track a shovel, that's a different animal; it's a $22 shovel, and I really can't spend a lot of money to figure out where my shovel is today. And so, [inaudible 40:13] saying, okay, depending on what the importance of these things is, we can place asset trackers on much lower valued items now, and get a great win out of being able to know where they're at, where they've been, and sometimes just how to find them.
We've had some interesting use cases where, and I'll try to generalize, a very large manufacturing company has these dies or molds that they don't use very often. So they park them somewhere in a warehouse, and they might only use them every couple of years; being able to find those things a couple of years from now is useful. And then being able to buzz down the warehouse in your forklift and know, oh, it's down here and to the right, that's a lot better. I've seen some very neat asset tracking examples at trade shows, even for high-value medicines: being able to pull them out of the cabinet when they expire, to know what the stock is, to make sure that they're stocked, and you [inaudible 41:23]. All of those things have a lot of added value.
Erik: I remember reading a case study about the French postal service maybe two years ago, and they were using Sigfox. It was a solution where the sensor is about the size of a postage stamp, and it basically determined when you opened the mail. But what is the cost structure that you're seeing? I know there are a couple of different variables: there's the hardware and the connectivity. You could have something that actually needs a cellular connection, or it could just register that it's crossed a particular point or moved in proximity to a sensor. But what would be a reasonable cost structure to track an asset now? Are we talking about a few dollars of fixed cost and a couple of dollars a year of connectivity cost? Is that the low threshold now?
Ed: Yeah, I think it is. I think that for an active asset tag that actually has transmission capability, you're going to be in the $10 to $20 range, which is where we'd like to be, plus a few dollars a year. For a more passive tag, something that maybe uses RFID but also has some sensors built into it, you could be down to a few dollars, with no per-tag connectivity cost; that cost sits at the gateway level. And so that's one of the things we work with people on: understanding what their system requirements are, what they're trying to do with it, and what the cost point can be.
Erik: And do you see particular customer segments that are really focusing on this now? We had a European global SIM card provider that we were working with a while back, and we were helping them identify which segments in China might be ready for this. And it was difficult. Because you talk to the heavy equipment providers, and on paper it seems to make a ton of sense because they have these big, heavy assets that generate data, but then they're quite traditional and slow-moving. So it's a little bit difficult to talk to their R&D and understand where they are in the decision-making process and so forth.
And then you have the medical device companies. It's not always just a logical decision of which asset makes sense to track; it's also the organizations and their mindset about this. Are there particular industries or types of companies right now that you see really driving hard to get a better sense of where their assets are and the condition of those assets?
Ed: We get surprised: there may be an industry that you think of as being pretty stagnant, but then there'll be a player in that industry that is very forward-thinking, and they're going to shake up that industry. We have a customer in the handheld tool [inaudible 44:30]; they are very forward-thinking about putting asset tracking and other types of sensors on their handheld tools. And these are not something as simple as a hammer or a wrench, but something that's pneumatic.
And so, you don't think of these industries as ones that embrace technology, but you'll end up finding someone in that industry who's very forward-thinking. So I think it really comes down to finding someone with that vision and having them say, look, we can make a difference, and for ourselves, differentiate our product by adding this capability to it.
Erik: The handheld tool category is perfect, because where's the tool when I need it? I think you always carry something like an extra 50% capacity just to account for tools that are misplaced, and then the other question is whether the tool is being used properly, and the power supply and so forth. Are there any use cases that are perennially on the top-10 list of people's IoT use cases that you feel are not quite ready for mass adoption and are maybe a little bit overhyped right now? Are there any use cases that you tend to advise your clients not to prioritize in the near term? Or is it always just case by case?
Ed: In terms of specific use cases, there's nothing where we say wholesale, well, that's not ready for primetime. I will say that there are some where we tend to look really hard to see if there's an easier, better way to do it. For example, anything requiring image recognition is still very tough in a non-controlled environment. Image recognition works great on a production line, where you've got your choice of lighting and all that good stuff. We've had folks come and say, boy, if we could just do image recognition, and they've been watching a lot of Google I/O videos that show how great this stuff works in a controlled environment. And often we can find a different, better way to do it.
Another thing we tend to do is approach machine learning very sparingly. In other words, there are places where it should be used, and there are places where just a plain old heuristic is going to work just as well, run on a lot less hardware, and be easier to understand and easier to change. There are a lot of magical things that machine learning can do, but in the end it looks like black magic when you try to pick apart why it chose x versus y. And if all you're trying to do is measure the temperature of something and alarm when it goes above a value, that's a simple heuristic.
Even if you're trying to do something like a middle-of-the-road vibration analysis for predictive maintenance, that's one where we would approach it with: either of these could work, let's try them both and see. Maybe machine learning is going to work if we can gather enough data; maybe a simple heuristic will tell us that this bearing is ready to fly apart just from the G-forces. So I wouldn't say there are use cases we avoid. But we avoid shiny-new-nickel technologies if a standard technology is going to do the job.
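[Editor's note: the "plain old heuristic" alternative Ed describes for vibration-based predictive maintenance could be as small as an RMS G-force threshold over a window of accelerometer samples. This is a hypothetical sketch; the threshold and sample values below are made up for illustration.]

```python
import math

def rms_g(samples):
    """Root-mean-square acceleration (in g) over a window of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def bearing_alarm(samples, threshold_g=2.5):
    # Plain heuristic: alarm when vibration energy exceeds a fixed threshold.
    # Easy to understand, easy to tune, and it runs on tiny hardware.
    return rms_g(samples) > threshold_g

# Hypothetical accelerometer windows, in g:
healthy = [0.1, -0.2, 0.15, -0.1, 0.05]
failing = [3.0, -2.8, 3.2, -3.1, 2.9]

print(bearing_alarm(healthy))  # False: vibration well under threshold
print(bearing_alarm(failing))  # True: bearing shaking hard enough to alarm
```

Unlike a trained model, there is nothing black-box to pick apart here: if the heuristic fires, you can see exactly which term crossed which threshold, and changing the behavior is a one-line edit.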
Erik: Well, avoiding complexity when simplicity does the job, that's a good rule of thumb, especially when you're dealing with new technology and pilots and a lot of unforeseen costs. Ed, is there anything that we didn't cover that you'd like to spend some time on?
Ed: I think this has been a great conversation, Erik. I appreciate you having me on. This has been just as fun as the last time, as your first repeat guest. I hope that happens with all your guests, because I enjoy listening to your podcasts and would like to hear from some of those folks again.
Erik: Well, it's a pleasure to talk to you again. And maybe in two or three years, some topic will come up and we'll have a chance again.
Erik: Great. Thank you, Ed.
Ed: Thank you very much.
Erik: Thanks for tuning in to another edition of the Industrial IoT Spotlight. Don't forget to follow us on Twitter at IoTONEHQ and to check out our database of case studies on IoTone.com. If you have unique insight or a project deployment story to share, we'd love to feature you on a future edition. Write us at erik.walenza@IoTone.com.