About Brian Sathianathan - co-founder and CTO of Iterate.ai
Brian started his career at Apple, where for six years he led iPhone and Mac initiatives in the New Product Introductions Group. He left Apple to found Avot Media, a software platform used to transcode video for mobile devices. Avot was acquired by Smith Micro [NASDAQ: SMSI]. After Avot, Brian joined the seed-stage investment team at Turner Media. He is now the Co-Founder and Chief Technology Officer of Iterate.ai.
“Every technology follows an S-curve. Depending on where a technology is on the curve, there may be a mismatch between expectations and reality. I think that blockchain is going to become very important for internal tools related to process efficiency, security, reporting, and compliance.”
About Iterate.ai - www.iterate.ai
We develop AI-powered low-code software designed to accelerate innovation projects within large enterprises. Our platform speeds up the development and deployment of AI-centric enterprise applications, and its low-code environment lets enterprises build digital solutions and bring them to market faster.
ITERATE.AI FOUNDING STORY
Erik: What was the impetus behind the setup of the low-code AI application development platform Iterate.ai?
Brian: Interesting story. I'm a serial entrepreneur but at the time I was in corporate venture capital and was sitting on a couple of boards. That was the time when AWS was gaining traction among startups. In 2007, it could cost seven figures for a startup just to buy servers. Infrastructure and compute were very expensive. But by 2012 costs came down to the point that someone could set up a startup for $5,000. We saw that a great explosion was going to happen so I asked my co-founder, the angel investor Jon Nordmark, "How do we help these startups get traction with corporates?"
At the time, there wasn't a creative, formal mechanism for executives in large companies to work with startups. There were venture funds. There were a couple of accelerators. But there was no feed mechanism. So, we ended up building Iterate Studio as a clearing house for startups. Today, we catalog 17 million startups and 78 million patents through our platform.
Very quickly we realized that a clearing house platform was not enough. The platform could provide information so that when somebody talks about an AI trend in a board meeting, the C-level can follow the conversation. It's educational. But it doesn't help them build solutions. The problem is that an innovation leader's tenure in a big organization is typically just two years. Then they become VP of innovation, head of a new product, or a venture investor.
This was the 2017-18 timeframe, and with the current AI technology two years was not enough time for a corporate innovation leader to bring anything to market. At that point, we had relationships with 80-odd AI startups. So we thought, “Why don't we take the capabilities of these startups and put them into drag-and-drop software blocks?” Low-code wasn't mature yet. Microsoft and others had solutions but they were essentially enterprise RPA. They were not targeting the innovation market. So we created digital building blocks, a development canvas, and a runtime environment with middleware where organizations could quickly build full stack digital solutions. This became Iterate.ai.
Our business boomed. Customers said, "We love it. Let's bring some products to market." First we built a conversational commerce AI application with one customer. That blossomed into other AI-based applications with retailers. Before we knew it, we had double-digit millions in revenue and grew from 20 people in 2020 to 70 people today.
Erik: On your website you have the tagline: "We help companies build AI applications 17x faster." What is the process of developing an AI application at a large corporate, and why is it slower than how a lean tech startup might work? Could you break down the process from ideation through deployment and highlight the stages with the highest risk of cost overrun, missed deadlines, or product failure?
Brian: The traditional enterprise process starts with business analysis and identification of a business need. Often certain projects are earmarked for implementation as part of strategic planning. This is the top-down starting point. The bottom-up starting point is when an AI project is already recognized by the business as creating value for them. It's a push-and-pull mechanism. The pull is the business and its needs. The push is when the innovation team says, "Based on our strategic goals, here are five things we could do."
Let's assume that the product is earmarked and there's budget for it. Now it's time to build a prototype. At large organizations, it's not only about building the product but also about building the snowball around it. At a typical startup, you need three roles: the designer who defines the specifications, the hacker who builds the product, and the hustler who sells it. It's the same at a corporate. You need the design, product development, and sales and marketing teams to buy in. Without budget, getting everyone aligned is impossible.
Modern prototypes are complicated. If you just want to build a mobile app, you can go to an agency and get it built. Simple. But everybody is overrun with apps. There are over 3.6 million on Apple's App Store, yet the average consumer uses nine apps a day. The situation for enterprise applications is not much better. How will you get someone to adopt your solution? Nowadays, you need to provide something far deeper. For example, rather than building an app to find a gas station, the app has to be able to automatically unlock the pump, specify the volume, and pay. This gets complicated. You need cameras at every gas station to read your license plate. The app needs to talk to the pumps, which are mostly legacy systems. And you have to integrate with payment and accounting systems.
A lot of projects fail at big companies when that complexity is unearthed. You can easily build a prototype or an MVP that fakes it. But when you drive your vehicle to a gas station and the camera has to recognize your license plate, that requires real computer vision.
And when you unlock pumps, you're dealing with IoT connectivity. Building the infrastructure is complex and expensive. Suddenly the head of innovation says, "Oh, my God. This is not what we planned as an MVP. This is far bigger." Dealing with this technical complexity is where the "building blocks" approach becomes useful.
On top of that, you also have legacy tech and business complexities. The legacy complexity comes from existing enterprise systems; typically, you need IT to integrate these systems because they are owned and operated by IT. The business complexity is confirming that all use cases are satisfied. If you try to solve all of this complexity up front, you will never release a product. So you need to test and learn, which requires aligning with IT and the business on a phased approach. The low-code approach makes this much simpler.
AI COMPLEXITY MANAGEMENT
Erik: I like your framework. You touched on technical, business, and organizational complexity. Where does a low-code platform reduce this complexity?
Brian: Let’s start with technology. The best analogy is building a house. In California, with all the city permissions it'll take you 19 months to build a home. Alternatively, you could customize a modular home online, have it delivered in a truck, and assemble it rapidly on site. This same philosophy applies to low-code development. The AI and IoT building blocks are already pre-built. You only have to configure them.
That creates a few benefits. First, it enables upskilling. Even in larger organizations, you might only have a few people on your data science team, and they are probably overloaded with projects. If you bring in components that an application engineer can configure, you don't need data scientists. This enables the innovation team or the business to build solutions themselves. Across traditional organizations (companies whose core is not digital technology), there are roughly 25 million web and application engineers, because the internet has been around for about 25 years. But there are only about 160,000 ML and AI experts. Upskilling with low code therefore increases the size of your potential labor force by roughly 160x.
Low code cannot do everything technically, but it is a strong application of the 80/20 rule. Most problems can be solved by connecting pre-built components. We focus on abstracting five forces of innovation: AI, IoT, blockchain, databases, and startup applications. This creates a can-do spirit that gives the innovation team the confidence to say, "We're going to go build this quickly."
The other thing about organizations is that the leader who builds first wins. If you build something quickly, then you own that project because everybody else is pitching PowerPoints. And a lot of times the sponsor for projects is marketing or the business. These are visual people. If they see things working, you can convince them a lot faster. Low code enables that process.
In summary, low code does three things. First, it abstracts the five forces of innovation. Second, it enables web engineers to build IoT and AI applications. Third, it creates an iterative, rapid development process.
CLOUD VS EDGE COMPUTING
Erik: When you apply AI to problems that are touching the physical world you face a lot of complexity related to messy human behavior, hardware management, and compute constraints. What are the differences of using a low-code platform for edge versus cloud applications?
Brian: The IoT is everywhere today. With this propagation, the data that comes out of these IoT devices can be harnessed in a much smarter way when you apply AI locally. Edge deployments are critical because you often need to process data in real time. But edge computing is complex. There is a lot of processing ability in the cloud and everything auto-scales based on demand. That luxury is not available when you're running on the edge. Most edge devices are very limited in compute power and storage so support for your AI model on that chipset may be limited depending on the type of computations you're trying to run.
Brian: Our approach is to build our platform in the cloud but to make sure that it scales vertically. One of the challenges is that many engineers coming out of school were taught high-level languages like Python and NodeJS. They don't know what happens underneath; they have no idea how application code really executes. I don't blame the engineers. It's just that coding is moving up the stack. We focus a lot on making sure the runtime can match, or come very close to, the benchmark of the underlying NodeJS, even when running on Docker. On each system, if you're running on a single node, you can scale vertically on that node. In fact, we have benchmarks showing that we can run 35,000 simultaneous sessions on a four-core CPU. When you start thinking that way, you begin to optimize and figure out what your overheads are.
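As an illustration of the single-node concurrency Brian describes (this is a generic sketch, not Iterate's actual runtime), one event loop can multiplex thousands of I/O-bound sessions on a single core, and a runtime can pin one such loop per core to scale vertically:

```python
import asyncio

async def handle_session(session_id: int) -> int:
    # Simulate a lightweight, I/O-bound session (e.g. awaiting a device message).
    await asyncio.sleep(0.01)
    return session_id

async def run_sessions(n: int) -> int:
    # One event loop multiplexes all n sessions concurrently on a single core;
    # pinning one loop per core is a simple way to scale vertically on a node.
    results = await asyncio.gather(*(handle_session(i) for i in range(n)))
    return len(results)

completed = asyncio.run(run_sessions(1000))
print(completed)  # 1000
```

The point is that session capacity is bounded by memory and I/O rather than thread count: 1,000 sleeping sessions complete in roughly one 10 ms tick instead of 10 seconds of serial time, which is how high session counts fit on a small CPU.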
The other thing to think about is storage. AI engineers tend to be very liberal with storage because it’s cheap in the cloud. With an edge device you don't have that luxury. So we think carefully about topics like versioning and the duration of data storage.
The last thing is that there are a lot of corner cases in edge computing. For example, people may assume that they can always update or refresh an application. But in an industrial environment sometimes the connection gets cut. There is no phoning home so you need the system to recover automatically.
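A common pattern for that kind of automatic recovery is an atomic write plus a known-good fallback. This is a minimal sketch under stated assumptions (the file name and config fields are hypothetical, not Iterate's platform):

```python
import json
import os
import tempfile

def atomic_write(path: str, payload: dict) -> None:
    """Write JSON atomically: readers see the old version or the new one,
    never a half-written file, even if power or connectivity is cut mid-write."""
    directory = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=directory)
    with os.fdopen(fd, "w") as f:
        json.dump(payload, f)
        f.flush()
        os.fsync(f.fileno())  # force bytes to disk before the rename
    os.replace(tmp, path)  # atomic rename on POSIX and Windows

def load_with_fallback(path: str, fallback: dict) -> dict:
    """Recover automatically: if the last update never landed or is corrupt,
    fall back to the known-good default instead of waiting to phone home."""
    try:
        with open(path) as f:
            return json.load(f)
    except (FileNotFoundError, json.JSONDecodeError):
        return fallback

config_path = os.path.join(tempfile.gettempdir(), "edge_config.json")
atomic_write(config_path, {"model_version": "v2"})
active = load_with_fallback(config_path, {"model_version": "v1"})
```

Because the rename either happens entirely or not at all, an interrupted update leaves the previous working configuration in place, and the fallback path covers the remaining corner cases.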
Erik: Your platform is oriented around five forces of innovation: AI, IoT, blockchain, databases, and startup applications. Four of these seem intuitive. But I get pitched a lot on blockchain concepts that seem to be high on hype and low on adoption. What has inspired you to include blockchain in your portfolio of building blocks?
Brian: Every technology follows an S-curve. Depending on where a technology is on the curve, there may be a mismatch between expectations and reality. I was on the team that worked on the iPhone at Apple in 2005. People forget that Motorola put out a mobile phone in 1984; the category went through an evolution of over 20 years before the market took off. AI and IoT are two technologies that seem to be propagating a lot faster than blockchain. But they have also been in development since the 1970s, so we are much earlier on the S-curve with blockchain.
I think that blockchain is going to become very important for internal tools related to process efficiency, security, reporting, and compliance. Consumer adoption will lag. But even on the consumer side, there are a couple of viable use cases. For example, companies have to be very careful when using third-party data because of privacy regulations. Companies are working on blockchain-based approaches that enable consumers to sell their data to retailers or to share their health data with healthcare providers. This could work very similarly to Facebook Connect, but with the consumer in control.
Consumer-initiated data sharing is a very powerful blockchain play because activity is logged in the blockchain. You only share the portion of the data that you choose. Recently we helped a medical company acquire asthma data so they could run AI models on top of it. But we realized the relevant patient visit and healthcare data are stored across several providers. In the traditional process, you need to hire third-party anonymization companies to remove personally identifiable information. By the time you can access the data to run AI models, months or years have passed.
With blockchain, sharing can be done very gracefully. All you have to do is create an app with an authorization model. The user reviews data requests and shares the data they are comfortable with. You put the power back into the hands of the user. It's authenticated. It's distributed. It's protected by better tech. I think users will care about that. So there are consumer use cases where blockchain will be very impactful.
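To make the idea concrete, here is a minimal, hypothetical sketch of such an authorization model as an append-only, hash-chained consent log. A production system would anchor entries to an actual blockchain and add signatures, but the tamper-evidence property it illustrates is the same:

```python
import hashlib
import json

class ConsentLog:
    """Append-only, hash-chained log of data-sharing grants. Each entry
    commits to the previous one, so altering any past grant breaks
    verification of the whole chain (hypothetical illustration)."""

    def __init__(self):
        self.entries = []

    def grant(self, user: str, recipient: str, fields: list) -> str:
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"user": user, "recipient": recipient,
                "fields": sorted(fields), "prev": prev}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append({**body, "hash": digest})
        return digest

    def verify(self) -> bool:
        # Re-walk the chain: every entry must link to its predecessor
        # and hash to exactly the digest that was recorded.
        prev = "0" * 64
        for e in self.entries:
            body = {k: e[k] for k in ("user", "recipient", "fields", "prev")}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True

log = ConsentLog()
log.grant("alice", "retailer_x", ["purchase_history"])
log.grant("alice", "clinic_y", ["asthma_visits"])
```

Changing any past grant invalidates every later hash, so an auditor can detect tampering simply by re-walking the chain; that is the "authenticated and protected" property Brian describes, reduced to its smallest form.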
Then of course, there is a lot of noise from metaverse and NFTs. The market goes up and down. You need to look beyond the noise. I think the valuable use cases are going to slowly propagate.
MACHINE VISION USE CASE
Erik: Great point. We often forget that people were working on IoT systems back in the 1970s. It takes a lot of sweat to prepare a technology for the mass market. Let’s wrap up with a case study that illustrates the value of low-code platforms. Which one do you have in mind?
Brian: We recently worked with a body shop in the collision damage estimation industry. When your car gets hit, they coordinate with insurance and repair it – repainting, body work, whatever needs to be done. Their problem was that the customer would get a price estimate by submitting a photo of the damage online via mobile. Then, when they arrived at the shop, the price could be significantly different from the estimate. For example, you might have a little dent in your car. But depending on the model of your car, it might impact the entire backplate. Their previous solution would estimate a little dent at $100. But in reality the body shop has to change the backplate, and that's a $1,500 job. For customers, that surprise is a terrible experience.
They decided to build a machine vision solution to improve estimate accuracy. The problem was that they didn't have enough images of damaged vehicles to train the algorithm. So they worked with a partner in a completely different part of the world to acquire a set of images from different accident scenarios. Using machine learning and transfer-learning techniques, they built a database of about 200,000 images. Auto-tagging enabled them to tag the images in a few days. Finally, they analyzed a couple of million transaction records and merged the records and vision data together to build an ML pipeline.
By looking at transactions, they were able to match vehicle damage with specific scenarios. The fusion of machine vision data, historical image data, and transactional records enabled them to reach a very high accuracy level. Today, it's running in hundreds of shops. Customers are happy because they receive accurate quotes. The company can better forecast workloads. And the shop managers no longer have to argue with angry customers.
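The fusion step can be illustrated with a toy join (every name and number here is hypothetical, not the company's data): tagged images supply a damage class per claim, transaction records supply the actual repair cost, and joining the two yields a cost lookup per damage scenario instead of the naive "small dent = cheap" rule:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical tagged images: one damage class and affected part per claim.
image_tags = [
    {"claim": 1, "damage": "rear_dent", "part": "backplate"},
    {"claim": 2, "damage": "door_scratch", "part": "door_panel"},
    {"claim": 3, "damage": "rear_dent", "part": "backplate"},
]

# Hypothetical transaction records: what each repair actually cost.
transactions = [
    {"claim": 1, "cost": 1500},
    {"claim": 2, "cost": 220},
    {"claim": 3, "cost": 1400},
]

# Join the two sources on the claim ID, then aggregate cost per scenario.
cost_by_claim = {t["claim"]: t["cost"] for t in transactions}
costs = defaultdict(list)
for tag in image_tags:
    if tag["claim"] in cost_by_claim:
        costs[(tag["damage"], tag["part"])].append(cost_by_claim[tag["claim"]])

estimates = {scenario: mean(values) for scenario, values in costs.items()}
```

A real pipeline would feed these joined records into a trained model rather than a simple average, but the structure is the same: the vision output labels the scenario, and the transaction history tells you what that scenario actually costs.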