Ruiz: Democratizing artificial intelligence and deep learning

Armand Ruiz, Product Manager at IBM Watson

Years ago, when I was taking my first steps in computer programming, coding was for geeks and computer programs had limited use. Development tools were very crude, writing code was hard (remember Assembly, C, Pascal?), compiling and linking were a nightmare (makefiles, anyone?), and debugging was even worse. Long story short, programming was not for the faint of heart. You needed nerves of steel and had to patiently fail over and over before you got the hang of writing good code.

Borland Turbo C++: what programming environments looked like in the early 1990s

But as software gradually rose in prominence, the entry barrier to programming lowered. Today, computer software plays an important role in almost everything we do, from shopping to applying for university courses to communicating with friends and family. Accordingly, the tools to create software, the programming languages and Integrated Development Environments (IDEs), have become more intuitive and easier to learn. If in my day it took an average of two jam-packed years to become a decent developer, today the same can be achieved in a fraction of the time.

We’re seeing the same trend happen in artificial intelligence. With the advent of machine learning and deep learning, AI has changed from a sci-fi fantasy to an inherent part of everyday life, from reading the news to fighting cancer to detecting fraud.

However, as in the early days of software, developing artificial intelligence and machine learning applications remains out of reach for most organizations and people, and only large tech companies can make productive use of the technology.

A number of experts and companies are leading efforts to address this issue. Among them is Armand Ruiz, Product Manager at IBM Watson, whose team is working on tools that not only make data scientists more productive, but also make data science, AI and deep learning more accessible to the enterprise. In an interview with TechTalks, Ruiz shared insights and experience on the challenges of the AI industry and solutions to democratize it.

The challenges of developing AI applications

A recent Gartner survey of CIOs ranked artificial intelligence as one of the hardest technologies to implement. “While the level of difficulty varies with the type of AI technology being implemented and the process it is being deployed into, there are several key barriers to adoption: skills, standardization, complexity and a lack of collaboration,” says Ruiz, adding that many challenges are specific to deep learning, which is a relatively recent innovation that increases the amount and type of data that can be tapped by an AI system.

When creating deep learning applications, developers build and train “neural networks,” software structures roughly inspired by the human brain. These networks can accomplish a variety of tasks, such as identifying the contents of images, performing face recognition or analyzing the meaning and intent of human language.
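
To make this concrete, here is a minimal sketch of what “building and training a neural network” looks like in code, written in PyTorch (one of the frameworks mentioned later in this piece). The layer sizes and the randomly generated stand-in data are illustrative assumptions, not anything drawn from Ruiz’s work:

```python
# A minimal sketch of defining and training a small neural network in PyTorch.
# Layer sizes and the random "images" below are illustrative placeholders only.
import torch
import torch.nn as nn

# A tiny feed-forward classifier: 28x28 grayscale "images" -> 10 classes
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Stand-in data: in practice this would be a real labeled dataset
images = torch.randn(64, 1, 28, 28)
labels = torch.randint(0, 10, (64,))

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)   # forward pass
    loss.backward()                         # compute gradients
    optimizer.step()                        # update weights
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```

Real projects add data pipelines, GPU management, hyperparameter tuning, evaluation and deployment on top of this core loop, which is where much of the complexity Ruiz describes comes in.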

But developing deep learning models is a painstaking and expensive process. “First, deep learning is a computationally intensive and highly specialized field,” Ruiz says. “It requires a highly tuned system with the right combination of software, drivers, compute, memory, network, and storage resources,” combinations that can cost anywhere from thousands to millions of dollars.

In fact, deep learning as a concept has been around for a long while, but it only became a reality with the explosion in the availability of storage and compute resources, most of which are held by big cloud providers. While many of these companies make their AI tools available to developers and companies, they remain the exclusive gatekeepers to those platforms.

Ruiz also points to the lack of skilled engineers to develop AI applications. “While formalized education in deep learning is growing, the talent pool is still limited,” he says.

The acquisition of scarce AI talent has become an arms race between large tech companies, which sometimes pay their engineers salaries that reach millions of dollars per year. This makes it very difficult for smaller companies and organizations to gain access to talent and innovation and to compete in the space.

“There is also a lack of standardization across deep learning teams, with different data scientists preferring different open source frameworks to build their models,” Ruiz says. This can make it increasingly challenging for data scientists to share and re-use models within their own teams.

Another fact worth considering is that neural network design is just one stage of a much larger workflow, Ruiz notes, which also encompasses the training, evaluation, deployment, monitoring and enhancement of deep learning models. “Data scientists must understand many functional areas beyond design. This includes being familiar with, and able to work on, various infrastructures and architectures that differ widely in their use and application,” Ruiz says.

Finally, Ruiz points to the disconnect in many organizations between IT (those with the technical expertise to analyze the data) and domain experts (those able to glean insights from it) as a barrier to the adoption of AI across organizations. “These teams often work in silos, with differing tools and little visibility into each other’s work. The result is AI that falls short in its promise to augment people’s expertise,” Ruiz says. “This is an issue I’ve seen personally in my experience in data science and is a challenge I am passionate about working with my team to overcome.”

There’s an analogy worth recalling here. In its early days, plain-old programming was a deeply technical endeavor that only highly skilled engineers and developers could engage in. A large part of their time had to go into understanding the problem space they were developing for and bridging the gap between domain experts and computer code. But as programming tools became easier to use and Application Programming Interfaces (APIs) became more capable, software development was democratized, and more and more people from diverse backgrounds became able to transform their domain expertise into software. AI needs to go through a similar process.

These elements can make it challenging for organizations to deploy deep learning on an enterprise-wide basis, hindering them from maximizing the business value they get from their data, Ruiz says.

How do you democratize AI and deep learning?

“To realize the full potential of AI, we need tools that make it easier for today’s data science teams to develop AI systems that can help glean insights from data,” Ruiz says.

For this to happen, deep learning solutions and frameworks must enable data professionals to more quickly and precisely design new neural networks, optimize their training models, understand, re-use and enhance the networks that their peers have already created, and collaborate across the organization. “At IBM, we are already doing this by offering tools that make deep learning accessible to individuals of varying skill levels,” Ruiz says.

Ruiz works with a team of product managers who were formerly data scientists themselves, so they understand the challenges that various users face when working with data and AI. They are focused on building tools that make data science and other complex technologies accessible to everyone, from scientists to the average enterprise user.

One of their notable efforts is Watson Studio’s Deep Learning as a Service solution, which recently made its debut. Deep Learning as a Service draws from advances made at IBM Research to enable organizations to overcome the common barriers to deep learning deployment, including skills, standardization, and complexity. Among the features of Deep Learning as a Service is a Neural Network Modeler, an intuitive drag-and-drop interface that enables non-programmers to build models by visually selecting, configuring, designing and auto-coding their neural network using popular deep learning frameworks such as TensorFlow, Caffe and PyTorch.

IBM Watson’s Neural Network Modeler (source: medium.com)

“Our goal is to make it much easier for enterprises, along with data scientists and developers, to cost-effectively build, optimize, and train neural networks at scale, using well-known frameworks with minimal code,” Ruiz says.
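
To give a sense of what “minimal code” means in practice, the snippet below is a hypothetical example of the kind of TensorFlow/Keras code a visual tool such as the Neural Network Modeler might generate behind the scenes. The architecture and layer sizes are assumptions for illustration, not IBM’s actual output format:

```python
# Hypothetical example of auto-generated framework code from a visual modeler.
# The network layout here is an illustrative assumption, not IBM's output.
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),          # 28x28 grayscale input
    layers.Conv2D(32, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),      # 10 output classes
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

The point of such tooling is that a user assembles the layers visually and the framework-specific boilerplate above is produced for them, ready to be trained at scale.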

The future of AI depends on democratizing it

As data scientist Doug Rose pointed out in this column not long ago, AI needs greater representation of the humanities in order to overcome its hurdles and challenges. And as an acclaimed historian rightly pointed out at the annual World Economic Forum in Davos, the future of humanity might depend on how distributed AI and big data become. So the efforts of Ruiz and his colleagues to democratize AI might be more significant than they seem.

“AI holds enormous power to transform our world and the businesses leading the future, including how we consume information, how we shape communication, and how we engage with technology,” Ruiz notes. “However, AI can only have this impact if it is easily accessible and can be applied with purpose. The democratization of AI technology, combined with improved widespread understanding of its usage and growing consumer trust, will help to break down current barriers that make it difficult to integrate AI into enterprise workflows. Organizations deploying AI technologies at scale will see significant increases in productivity and efficiency, allowing employees to focus on more complex, creative, and higher-impact tasks.”
