Apple and IBM take their partnership to the next level with Artificial Intelligence (AI)
Apple and IBM revealed an enhancement to their existing collaboration at the Think 2018 conference: Watson Services for Core ML and the newly designed IBM Cloud Developer Console for Apple. The offering marries Apple’s Core ML developer tools with IBM’s Watson to make it easier for companies to add artificial intelligence to mobile applications. The companies already have a longstanding history of partnering to create contextual mobile business applications through IBM’s MobileFirst for iOS. While not widely discussed, the two companies have built hundreds of mobile apps for customers, including Amica Insurance and Japan Airlines, designed from the ground up to work in a connected world. Combining Watson and Core ML should open up a wide range of machine-learning-enhanced features for applications in areas such as image, text and speech recognition on Apple’s mobile devices.
Apple announced Core ML last June as an API that makes machine learning run more efficiently on devices such as iPhones. Core ML supports a number of essential machine learning model types, including neural networks (deep, recurrent and convolutional), as well as linear models and tree ensembles. It runs across Apple’s platforms, including macOS, watchOS, tvOS and iOS. Core ML performs on-device processing, meaning the data used to run and improve models stays on customers’ phones and tablets. But while Core ML lets a developer run a trained model on an Apple product by adding only a few lines of code to an application, it doesn’t create the model.
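To illustrate the “few lines of code” claim, here is a minimal sketch of on-device image classification using Core ML with Apple’s Vision framework. It assumes a trained model has already been converted and added to an Xcode project as a hypothetical `FlowerClassifier.mlmodel`, for which Xcode generates a Swift class of the same name:

```swift
import CoreML
import Vision

// Hypothetical: `FlowerClassifier` is the class Xcode auto-generates
// from a FlowerClassifier.mlmodel file added to the project.
func classify(_ image: CGImage) {
    guard let model = try? VNCoreMLModel(for: FlowerClassifier().model) else { return }

    // Vision scales and crops the image to match the model's expected input size.
    let request = VNCoreMLRequest(model: model) { request, _ in
        if let top = (request.results as? [VNClassificationObservation])?.first {
            print("\(top.identifier) (confidence: \(top.confidence))")
        }
    }
    try? VNImageRequestHandler(cgImage: image).perform([request])
}
```

Note that everything here executes on the device itself; no image data has to leave the phone for the prediction to run.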
A model must be trained for specific tasks, such as recognizing an object or understanding what a customer is asking a chatbot, by analyzing large volumes of tagged data on powerful computing infrastructure. That infrastructure could reside within a corporation’s data center or be purchased as a machine learning/AI cloud analytics service. Once trained, a model can be converted to the Core ML format so it runs efficiently on products such as an iPhone or iPad. When Apple made its Core ML announcement, it provided some basic models to help developers kickstart their machine learning efforts, and it hosts some of the more popular models on its developer page. However, these weren’t designed to help an enterprise create AI-enabled business processes.
Companies struggle to build AI-enabled apps
Artificial intelligence in applications promises to deliver Right-time Experiences that offer the right information at the point of need. The challenge for companies is understanding where and how to build artificial intelligence models that improve a business process. Enter IBM, with a combination of cloud computing infrastructure for data processing, cloud-resident mobile app development tools and AI know-how with Watson. IBM’s Watson has been trained with industry-specific data in areas such as healthcare and financial services.
IBM’s services can help companies build models to support functions such as image recognition, speech processing and sentiment analysis. IBM has also purchased data sources, such as Weather.com, that offer information to enhance applications. For example, weather data combined with AI can be used to predict potential shipping delays, or to help a retailer anticipate regional changes in clothing demand driven by events such as unseasonably cold weather.
A partnership that makes sense for the enterprise
Why Core ML? There are several benefits to running these models directly on the device. First, it removes the lag time (also known as latency) associated with sending data to the cloud, analyzing it and receiving an answer. It also means the models can run whether or not the employee’s device is connected to the Internet. Finally, with companies focusing on data loss prevention, IT will appreciate the ability to keep data on the device and define where data goes to train models. Basically, data privacy for the enterprise.

Why IBM? IBM has spent years designing AI solutions for business and breaking these solutions into consumable parts (e.g., microservices) that can run in the cloud. Additionally, IBM has developed cloud infrastructure services that support both AI workloads and mobile app development. The new IBM Cloud Developer Console for Apple puts all the resources a developer needs to design Swift apps into one portal.
Of course, there’s competition for both Apple and IBM. Google’s Android camp and others offer their own tools to help developers add machine learning to apps, and other cloud providers, such as Amazon’s AWS, Google Cloud ML and Microsoft’s Azure Machine Learning, also offer cloud-resident machine learning and cognitive services. The trend of vendors adding AI to products is a win for enterprises. In my mind, one of the main differences in the IBM/Apple relationship is how the partnership supports enterprise app development, not only through a set of enterprise-focused tools but also by leveraging IBM’s services organization. A majority of the heavy lifting in designing AI apps will fall to systems integrators for the next several years while companies search for AI talent that simply isn’t available today.
Companies are interested in AI that solves real-world problems
In a demo with IBM, I saw how a field service representative could use the camera on an iPhone to quickly identify various electronics parts and assess the cause of a part’s failure. The rep could verify whether the Watson algorithms had correctly identified the component and the issue, and could add more information to the record. Data from these records can be stored and uploaded at a later time to improve the quality of the model. Overall, an application like this field service example highlights how a company can improve employee productivity and consistency of service with applied machine learning. It also offers the opportunity to capture more information and learn more from it to create data-driven business processes. Apparently, Coca-Cola agrees, as it’s testing this use case today with Apple and IBM.
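The article doesn’t describe how an improved model gets back onto the rep’s phone, but one plausible way to close that feedback loop is to download the retrained model from the cloud and compile it on the device at runtime, which Core ML supports directly. A minimal sketch (the download location and error handling are hypothetical):

```swift
import CoreML

// Sketch: after a raw .mlmodel file retrained in the cloud has been
// downloaded to `downloadedURL`, compile and load it for immediate use.
func loadUpdatedModel(from downloadedURL: URL) -> MLModel? {
    do {
        // Core ML compiles the raw .mlmodel into an optimized .mlmodelc bundle.
        let compiledURL = try MLModel.compileModel(at: downloadedURL)
        return try MLModel(contentsOf: compiledURL)
    } catch {
        print("Model update failed: \(error)")
        return nil
    }
}
```

With a pattern like this, the corrections field reps feed back into Watson can periodically flow out to every device as an updated model, without shipping a new version of the app.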
While there’s no “easy button” for AI, IBM’s services combined with Apple’s Core ML make it easier for a company to construct a mobile app and run machine learning models within it. It’s a huge step in showcasing the art of the possible for apps that are contextual, intelligent, learning and predictive. The combination of mobile and artificial intelligence will finally usher in the era of “Right-time Experiences” that deliver the right information, to the right person, at the right time, using the best device for the task.