Erik: Daniel, thanks for joining us on the podcast today.
Daniel: Hi. Thanks, Erik. I'm happy to be here.
Erik: Daniel, I'm really looking forward to the conversation. Maybe I'm looking forward to this first part of the conversation where I get to ask you for a status update on the Industry IoT Consortium, the IIC. We were chatting before we kicked off here. I was just telling you that I used to be quite involved in the IIC. Now I've been trapped in China for three years, so I haven't been as involved as I would like. But I know that you're on the steering committee. I would love to hear a little bit more about yourself, and also hear the work that you're doing with the IIC these days.
Daniel: Yeah, well, great. Well, first off, I'm glad you caught the name change. When you were involved, it was probably the Industrial Internet Consortium. Now the name has changed to the Industry IoT Consortium. That reflects a broader scope of business objectives and vertical markets. As for me, I've been with Toshiba for over 20 years now. My background is in electrical engineering. I've worked in manufacturing, test equipment, design, software, and quality assurance. For the past 10 years or so, I've been involved in different standards organizations and consortia, such as the IIC.
Currently, I work with Toshiba's Digital Innovation Technology Center. What we do is promote best practices — whether that's standards or organizations like the IIC — promoting industrial IoT systems and our reference architecture, which is based on the IIC's IIRA, the Industrial Internet Reference Architecture. A lot of acronyms.
Erik: How would you classify the IIC in terms of — I guess, to some extent, it's a standard-setting body. But it's not setting the technical standards that are dictating how data is transferred, for example. It's more setting of these are business architecture concepts for best practices, and so forth. How do you look at the IIC's role in this ecosystem?
Daniel: That's a great way to put it, yeah. We're not a standard-setting body, but we promote best practices, including standards. We do that through a variety of things. We have core foundational documents like the reference architecture. We have the connectivity framework document that was recently updated, a networking framework, and a security framework. So, we produce those core documents and promote different standards that are internationally accepted. We also work with standards bodies where there are gaps. Maybe there's not a standard for a particular area. So, we might work with a standards body — whether it's OMG, which is the IIC's parent organization, or the ITU, et cetera — to see if there's a larger need for that.
The other thing we do is test beds and test drives — real-world situations. Currently, we have over-the-air updates for automotive, or any type of transportation vehicle, whether it's a robot in a factory or a car. Then we work on best practices for digital transformation. That's one of our new, up-and-coming areas, where we're looking at companies — how they're transforming themselves, what they're doing, and what is helping them achieve those goals. There are a lot of standards out there that are helpful with that.
Erik: And yourself, are you more focused internally on Toshiba, the organization, helping the different functions and business units understand the standards and best practices and comply with them? Or are you more focused on representing Toshiba's interests to the standards bodies and participating in steering the standards of the future?
Daniel: A little bit of both, but more of the latter — promoting certain standards and helping develop them. Then, like what the IIC does: promoting the ones we use internally that we find useful, and promoting them outside our organization.
Erik: Okay. Got you. Maybe this is a good jump off point to start talking about the state of the industry today. I mean, standards is maybe not the sexiest topic. But it's an important one, right?
Erik: Where are we with standards? Because obviously, we're coming from a legacy where everybody had their own proprietary protocols. We're, hopefully, moving in a direction that's a little bit more similar to the consumer internet, where you have very universally adopted standards. We probably won't end up quite there. But where would you say we are today in terms of having standards that allow companies to play nicely with each other when they're integrating solutions?
Daniel: I think there are a lot of standards that exist that help with that. I would note that OPC UA is very well-known. It is helping in our UPS business. OPC UA is being used to communicate between different UPS vendors' equipment. So, services that originally worked only with our UPS can now work with competitors' because of a standard like that. Where things are lacking — and that's a big topic with all types of organizations, including the IIC — is accurately representing certain types of data models and having an ontology of equipment capabilities. That's going to take a lot of time. It's spread out between these different vertical markets, whether it's smart buildings, the energy domain, a factory, or an electric generator. But there are a lot of common parts to these. It's just going to take some time to see where that overlap is.
Erik: Help me understand that in a little bit more depth. Is it like if I'm talking about steel, for example, then we can very precisely define different grades of steel and what differentiates one from another? But if I'm talking about two pieces of equipment, and they both have some kind of predictive maintenance solution, it's hard to compare those two and say why one might be superior. Are we looking in that perspective? How would you view the challenge here?
Daniel: Yeah, a little bit, or not understanding the full set of capabilities that each device has. Maybe one vendor calls it something different than another vendor does. I'll try to use a good example: a temperature sensor. It could be a sensor that just reports temperature. But most temperature sensors also measure humidity. Do you need the humidity reading or not? Some might, some might not. But you need to have that as part of the capability description. That's a really simple example, but extend it out to other types of equipment where you have those very small, seemingly insignificant differences — they actually cause big problems when, say, a vendor wants to change from one set of software to another. Those types of differences create incompatibilities that you then have to deal with. That gets into system integration issues. One of the things Toshiba is trying to do is minimize that amount of system integration. Standards play a huge role in that.
Erik: Is software also included here when you're talking about aligning descriptions of capabilities? Because if I look at product categories, the IoT platform is probably the most challenging product category to compare — to take three platforms and say, here's why A, B, and C are different from each other. You have marketing teams that are just throwing out bullet-point lists of functionality. But understanding what they're really capable of seems very difficult until you actually do a pilot. By that point, you've invested a tremendous amount of effort into the assessment.
Daniel: Right. Yeah, that's a huge challenge today. I think, especially for cloud adoption, it's getting a little easier as we start to have newer tools that can dynamically switch between cloud providers. For example, we use Terraform scripts that allow us to move from one cloud provider — say it's Azure — over to AWS. It used to be Google as well, but now that's gone. Those types of tools help. You can automate some of that system integration, but you can't automate all of it. So, there are still some pain points on that front.
Erik: Yeah, I've seen some stats. I mean, you never know how much to trust them. The one I saw put system integration at something like 30% to 40% of the cost of a typical solution. So, that's a lot of, let's say, fat to trim. It's work that needs to get done. But in the end, it's not necessarily value-added work if you can accomplish the goal without it.
Daniel: Right. I think this newer trend is maybe not new to IoT, but it is to industrial. It's the use of microservices that are allowing this more modularization of components so that system integration still exists, but it's a lot easier than it used to be. Because now you can fit these modular components together instead of trying to rebuild a monolithic piece of software.
Erik: Before we get more into detail around technology deployments and so forth, I'd like to pick your brain a bit on the state of manufacturing in the US. Because I'm sitting here in China — I'm American, originally — it's a topic I'm always reading about. Now I have the chance to pick the brain of an engineer sitting in — I think you're in Nashville. Is that right?
Daniel: That's right.
Erik: Okay. All right. We'd love to understand. Obviously, we have, on the one hand, a lot of labor shortages and some challenges here, which are also an impetus to invest in automation and technology. You also have some degree of reshoring. How would you characterize the state of manufacturing in the US today?
Daniel: I think it's the wild, wild west in a lot of ways. The big challenge is the brownfield scenarios. If you're building a brand-new plant, then you can use all these latest tools and technologies. No problem, because you have nothing. You're starting from scratch. But it's the existing plants and operational facilities that are struggling to update or even see the need to update. The biggest challenge we see right now, especially with cloud adoption, is just data privacy — who controls that data, trustworthiness issues of the vendor or the cloud service provider. Those are really big challenges.
Erik: This data privacy issue, is that more of an individual privacy issue, as it might be more in Europe? Is it more of which company owns the data and more of a leverage position, relative to vendors and manufacturers?
Daniel: What I'm seeing in our US businesses is the latter. The business is very reticent to share their data, not necessarily with a company like Toshiba. If we have a good working relationship, that's not the issue. But it's delivering that data to the cloud. That's where a lot of that reticence comes in. Especially dealing with — if we get into like artificial intelligence models, we really have to do a better job of communicating exactly what it is we're sharing to the cloud to allow these AI engines to work. Maybe we're not transmitting all of your raw data. We're just transmitting some of the parameters that the AI model needs. So, it comes down to, it's not a technology roadblock. It's human understanding. We just all need to do a better job of communicating what it is we're trying to do and what type of data is needed to do that job.
Erik: This trust issue is interesting. We were doing a project for Siemens recently. We're helping one of their businesses figure out their AI strategy for manufacturing. One of the things we're looking at is that you have all these technology startups with pretty interesting solutions. They're trying to go into the market. But then they're trying to convince a customer, "You're going to be our customer number 6." Nobody wants to be customer number 6. They want to be customer number 600, because they're going to be giving you a lot of sensitive operational data.
You're a company that has 50 employees and has been around for three years. It could be bought out. It could go bankrupt. It could be whatever. And so, looking at the opportunities for a large company like Toshiba to help bring some of these technologies to market, because you have the established trusted relationships — is that part of your portfolio? Do you play that role? It's not really a gatekeeper role; it's more of a coordinator role or an integrator role. Is that something that Toshiba also does?
Daniel: Yeah, definitely. I think that's one of our key strengths, as we have these existing relationships and as we change our technology portfolio. We're in this process of changing from a product-oriented, hardware-based company into an as-a-service, services-based one. That helps us immensely. Because we can go to existing customers, talk about what it is we want to do, prove it out with a proof of concept, and then get that engagement. Maybe there are changes that they want to make. Those are enabled by a very close business relationship.
Erik: For the dynamic in the US right now, you said it's a bit wild west. But there's a very prominent labor crunch that I read about every day in the Wall Street Journal. How is that impacting the adoption of technology? I guess, on the one hand, it means you only have so many engineers available to actually deploy new solutions. Right? So, you need, actually, IT talent. Maybe a lot of mid-sized factories just don't have that level of talent to manage and deploy. On the other hand, you have shortages on the factory floor that are probably incentivizing more automation. How do you see that dynamic interacting with technology investments?
Daniel: So, there are several fronts on that. Working back from the labor shortage on the manufacturing side, that's where we're seeing our technology being used to augment capabilities — not necessarily to make things completely automated, but to enable those workers to do the types of tasks that only a human can do really well. We're also seeing that we can turn around and offer those capabilities as training tools. So, we can do training with new employees that requires much less of the veteran staff, because you have these new digital tools that can look at the real-world situation going on in the plant and help train the new staff.
Then on the technology front, as far as the engineers and such, that's always a concern. I know, for Toshiba, we've been hiring hundreds of AI engineers over the past few years. So, hopefully, we're keeping ahead of the curve in that game, at least.
Erik: I got it. Yeah, it's a bit of an arms race there in accumulating talent right now. Maybe we just look at this from Toshiba's perspective. What are the technologies that you're seeing? I guess, industrial never really hits that hockey stick where something just takes off. But what are the ones where you see starting to get enough broad adoption, where people are moving beyond pilots and are quite comfortable now scaling into a national or global deployments?
Daniel: I see two things. One is the remote maintenance and predictive maintenance tools. That spreads out into basic alarm-type stuff, but is getting more into what's actually causing a problem and whether we can predict it in advance. Can we predict a motor failure, such that I could tell you, "Hey, your motor is going to fail within a year's time. So, go ahead and plan to have it maintained or replaced"? We see that as a huge benefit and something that customers really like.
The other is visual inspection processes. Automated visual inspection is something Toshiba has been working on very hard. We're seeing a huge number of applications for it. For example, monitoring meters in an existing plant: you've got all these dials and indicators that a human usually has to walk around and read. We're seeing applications where we can use cameras — and robots with cameras on them — to monitor those using image recognition. I have a case study I can talk about in the steel inspection industry, where we're using more advanced image acquisition and processing to actually measure particles in steel samples.
Erik: Oh, interesting. Let's get into that. Particles in steel samples, what is that? Is that measuring impurities?
Daniel: Right, measuring impurities. Think of the scale that comes off when you're melting metal — you have to clean that off. Some of that microscopic material still gets into the samples. So, there are different regulatory and standards processes for measuring those little bits — they're called nonmetallic inclusions. We measure those, and they can be down at the submicron level. Measuring them allows a grading of the steel for its quality. Think of structural steel, for building bridges and other structures — you want a much stronger steel. That grading process in the past has been very manual. You have a very well-trained inspector looking through a microscope. They do have tools — image processing tools to help them autofocus and such. But at the end of the day, they're looking with their own eyes, measuring these very, very small inclusions, and then grading the quality of the steel.
Erik: Okay. Interesting. Both of these are interesting cases, because they are applications of machine learning where you have a training cost, right?
Erik: Each situation is a little bit different. So, if you have to do the training for each unique situation, then you have the high upfront cost. But as the algorithms improve, the data tagging, processes improve, and so forth, then you can bring those costs down. I'm actually just personally very curious in this, because we're working on a similar project. Also, machine vision in automotive. We're trying to figure out what the cost structure might look like, because we're at the very early stages. We're trying to think, are we going to be able to get to a reasonable accuracy level within a reasonable investment? Or we can find that by the time we are starting to approach that, we're already way over budget. Maybe you can talk about where we are. How do you optimize that process or assess that to make sure that you're addressing a use case that's really financially viable?
Daniel: Yeah. So, in this case, this combines everything we were talking about with labor shortages and such. Steel inspection requires very highly trained people. So, you can imagine, with labor shortages, you don't have a lot of trained inspectors to do this. But on the other hand, for these types of measurements and the regulatory requirements, you need a human inspector to give that judgment. So, you can't just automate the process away.
What we looked at doing is, well, okay — we know that we're going to have a human inspector performing this grading. So, what can we do to assist them? What can we do to take away all the rote work? Let's automate the microscope, the autofocusing. Let's do all the things the inspector usually had to do that take a lot of time. This inspection can take upwards of four to five minutes per sample.
I think a good analogy is a camera. In the old days, you had to focus the camera and change the f-stop and all that. Now you can just pull out your phone and click. Is the point of taking a picture changing all those parameters, or is it to capture the image you want to capture? In this case, we focused on getting the inspector the image, ready for them to make these measurements — and maybe doing some of the measurements for them.
So, for the AI tool, the training part is looking at these nonmetallic inclusions. We got that up to about 90% accuracy on measurement sizes — again, down to the submicron level. What that does is take a lot of the burden away from the inspector for, let's say, the easy inclusions, and lets them concentrate more on the very difficult ones. We ended up taking a five-minute process down to one minute just by doing those things — automating the rote work of the inspection process. But at the end of the day, it's still the human inspector who provides the final judgment.
Erik: Got you. Okay. That sounds like a smart approach. You're dealing with the same existing process, existing infrastructure, but automating its more labor-intensive steps. What about in terms of robotics, 3D printing? Are you seeing any other maybe IoT platforms? I guess you've touched on it a little bit already. Digital twins, edge computing, anything else that you're seeing starting to kind of move beyond this pilot phase more towards wider deployment in the industry?
Daniel: I've seen research in our Bristol labs on swarm robotics — taking factory floor space and robots that can move equipment around, or move stock around. When the stock gets too big, the robots can merge together, pick up larger objects, and move them around unassisted. That's maybe more of a greenfield application, but I think it's going to be very powerful in the future.
Going back to machine vision, I think there's a huge question. It used to be: where is the edge located? We had that idea of fog computing. Now we're seeing the same question in the AI realm: where is the AI? Is it in the cloud? Is it at the edge, or is it on the device? We have a video processing chip, Visconti, that's being used by one of our partners, Labforge, out of Canada. They're using the deep neural network capabilities of that chip in their smart cameras. They can put multiple cameras out in a location and map an entire area using very low power — solar power, in many cases. All the AI is running on the camera itself; it's not running in the cloud. All that's transmitted to the cloud are the mapping data points. So, very low power, very low network bandwidth needed. I think that's really interesting. It's going to open up a lot of use cases for drone security, mapping, securing physical locations, and such.
Erik: Yeah, I know there's an incredible amount of innovation going into this space and at different layers. I think we had, maybe six months or so, an Israeli company on that was designing chips, specifically for machine vision applications. It's a small company and just making their contribution. But you see at each level of the tech stack, they're moving this forward because it's such a practical application.
You mentioned earlier, dark factories. I'm curious, because this is very much a standards topic — if you're going to have a dark factory, that requires some pretty good standards. It's a topic I've been hearing a lot more about in China lately. Sometimes I feel like China maybe reads McKinsey reports a little too diligently and gets excited a few years before it should. Do you see that as a viable manufacturing approach, aside from maybe a few extreme cases, in the coming years? Or do you think it still makes more sense to pick and choose your use cases but not disrupt processes so much? Where do you land on that particular topic?
Daniel: I think as we look at factories, we need to be more mindful of what it is they're trying to do. It should come from a more user-focused mindset instead of a product-focused one — not "what can I put in your factory to make it better?" but "what are your existing processes and existing problems, and how can I facilitate those?" And if that means completely automating some tasks away, then okay. But that's not necessarily applicable for every factory. I really don't have a good answer for that. It's a big topic.
Erik: Well, I think that is probably the right approach. It's a difficult approach for somebody who's selling the technology, right? Because your goal is to sell the technology.
Daniel: Yeah. So, to come in and tell someone: well, your equipment is fast enough. You don't need a bigger processor and more memory. What you need is a better algorithm. That's what we're seeing today in a lot of cases. We can improve algorithms. We can use AI models to do things that in the past would have been, "Well, we have to shut your plant down for two weeks and install a bunch of new equipment." The benefit now is that I don't have to shut your equipment down. We can upload new software. If it's in the cloud, then there's no downtime whatsoever. So, that's a huge benefit for customers once they understand it.
Erik: Yeah, absolutely. Especially in the US, a lot of the factories and a lot of the equipment are a couple of decades old. So, you're dealing with these legacy environments. Maybe we can touch quickly — as a kind of wrap-up here — on some of the challenges that you see in terms of deployment, and any insights you have on how to address them. You've already discussed one, related to data ownership and data access.
If you look at it from a technical standpoint — organizational, defining the business case, setting the standards — are there any others that jump out at you as really common challenges that still lack great solutions in industrial automation or digitalization? Where would you say you're sweating the most when you're planning a deployment?
Daniel: Sweating the most? I think one of the challenges a company like Toshiba has is that we have a very strong technical focus. In predictive maintenance applications, for instance, we go to customers and show them what we have, what we can do. They say, "That's great, but it's overkill. I don't need to know that a bearing is failing in the motor. I need to know that the motor is failing, and when I should replace it."
So, what we're trying to do is back off and match the granularity to what the customer actually needs. For some plants that don't have enough staff to really get down and say, "Well, I can replace a bearing in a motor" — okay, we just need to let them know a motor is failing. Another customer that's maybe much more advanced might want a more detailed analysis of why a piece of equipment is failing. Then we can get even more granular and start predicting when some part of a piece of equipment is going to fail, so that customer can plan ahead. That's a challenge for us: learning what the customer actually needs, how granular they need it, and how much information they need to know. Because sometimes they don't need to know as much as we think they do.
Erik: Yeah, that's a great thought. We also do a lot of work in Germany. You have this trend there where these companies have teams of people who have worked for them for 20 years and have a very intuitive understanding of the process and a tremendous amount of experience. This generation is, to some extent, retiring. Then you're maybe bringing in somebody from Turkey, from Poland — somebody relatively young who lacks that experience. They need to be brought up to speed quickly. So, do you have the right interface? Are things simple enough, at the right level of complexity, that somebody can plug themselves into the system?
Daniel: Going back to the metal industry case, as we deployed this new update of our product, we realized partway through — or toward the end — that these people still had a problem with trained inspectors. Although we were alleviating the workload on the veteran inspectors, we weren't doing anything to help new inspectors get up to speed. So, we changed. We looked at the AI model we had developed and said, hey, this is a training model. We can turn it around, take real-world data, and train new inspectors on it.
Because the AI model can always be updated, and we always have new datasets, we can continually train these people with real-world, up-to-date image samples, in this case. So, that's another revelation we had: we have the power to train new inspectors. We have the data available. We just need to think about the best way to train the new staff.
Erik: Yeah, that's a very interesting way of looking at this asset that you've built up, that the work can flow in both directions, to some extent.
Erik: Cool. Well, anything else? I guess, we've touched on a bit about the situation in the US and technology adoption challenges. Anything else that you're seeing that you think is particularly interesting in manufacturing in the US today?
Daniel: I think there's another part. I'm not in the weeds on the costs of this stuff, but from talking with people, the big benefit of a lot of these industrial IoT changes is that your capital equipment costs and your operating costs get flipped. You don't have a lot of CapEx upfront with some of these new deployments. You can start at a very low initial cost, and then it's a recurring cost — monthly, as a service. It's better accounting-wise, but it also lets you evaluate the benefits as you're getting them, on down the road. So, now you don't have a huge million-dollar capital equipment project. You have a $20,000 project. Proof of concept is a lot easier in that case. You can monitor it as it goes and make decisions on a month-to-month or year-to-year basis. So, I think that's allowing more proof-of-concept trials for these types of services.
Erik: Yeah, that's a great perspective. The way I've heard this communicated by companies running these solutions is that, as a technology business, you have to prove your value to your customer every month. Whereas previously, you had some lump-sum sale. Then the customer was pretty much stuck with it until it depreciated, and then they'd decide to go —
Daniel: Then they go to another customer. They go to someone else. Now we have to prove our worth with the services month to month. That's a great way to put it.
Erik: It sounds like you're a bit more on the technology, IIoT side of Toshiba. But when it comes to selling equipment, do you see adoption of new business models there, where the physical equipment is also being sold more as a service? Then, if you have the predictive maintenance and machine intelligence, you're able to say — well, it's a little bit like the jet engine model: we'll manage it for you, and you pay us for thrust. Are you seeing much adoption of those models?
Daniel: We're seeing that more with very, very high-value hardware — something that is really, really expensive, like a jet engine. We are seeing those business models roll out there, but not so much with less expensive equipment. A UPS for a single building might still be a fixed purchase. But if it's a UPS system for an entire area — say, several buildings in a big manufacturing plant — then running it as a service is a viable option. The service contract covers it. So, when there's an equipment outage or something, it gets dealt with immediately.
Erik: Yeah, fascinating. This is, for me, one of the more interesting parts of this whole journey. You really get to rethink how the businesses operate, how organizations interact with each other. I remember the IIC. I don't know if this has made progress. But I think they had a test bed between Fraunhofer and — I forget. It was an organization in Korea, which was the — I forget the name here. But it was this test bed where you have suppliers around the world. You have visibility into their capabilities. Then you're able to create an online platform where you can say, I want to buy this. You know exactly who can produce it, who has capacity, who has the right capabilities. Do you remember that one?
Daniel: I do. It's still active. It's manufacturing as a service, where you can design a product and then go find different manufacturers with different capabilities. Maybe one manufacturer does CNC work, and another does 3D printing. That's very active. Fraunhofer has actually made a demo of that service. The Korean Industry 4.0 initiative is trying to mirror what Germany is doing. It's still very active. They just published something recently. If I had the link, I would send it to you. But yeah — very active, very interesting.
Erik: Yeah. Great. Well, Daniel, thanks for taking time to share with us. I think you're doing great work at Toshiba, and also at the IIC. I really appreciate you taking time out of your morning to share your experience with us.
Daniel: I appreciate it. Thank you very much, Erik.