The Role of Cloud Computing in Artificial Intelligence

Though artificial intelligence predates cloud computing by decades, cloud computing and its technologies have advanced AI significantly.

Kivanc Uslu
Towards Data Science
5 min read · Apr 13, 2021


Photo by Markus Winkler on Unsplash

The term Artificial Intelligence (AI) was first used by John McCarthy at a workshop held at Dartmouth College in 1956, although the first AI programs, for playing checkers and chess, had already been developed in 1951. From the 1950s until the 2010s, AI went through repeated cycles of rise and fall. Over the years, vendors, universities, and institutions made investments in AI; sometimes hopes were high, and sometimes they were low. The periods when industry showed too little interest in AI came to be known as artificial intelligence winters.

For the last ten years, AI, and especially Deep Learning (a subset of AI), has been on the rise following two landmark events. First, IBM's Watson system defeated two of Jeopardy!'s most successful champions, Brad Rutter and Ken Jennings, on February 14–15, 2011. Second, AlexNet, a convolutional neural network (CNN), competed in the ImageNet Large Scale Visual Recognition Challenge on September 30, 2012, and won it. Based on various market research, institutions across industries have started investing in AI and will increase those investments for different use cases in the coming years. According to the World Economic Forum's "Future of Jobs" report (October 2020), demand for AI-related jobs will grow further. Moreover, according to the online employment company Glassdoor, data scientist has been ranked among the best jobs in the USA for the last three years.

References to the phrase "cloud computing" appeared as early as 1996, with the first known mention in a Compaq internal document; the cloud metaphor itself was in use even before 1993. Modern cloud computing, however, was created and popularized by Amazon in 2006 with its Elastic Compute Cloud service. The first cloud delivery model was Infrastructure as a Service (IaaS), which provides pre-packaged IT resources to users. Later, the PaaS (Platform as a Service) and SaaS (Software as a Service) delivery models were introduced. All of these delivery models are used for different workloads, including AI workloads. There are three cloud deployment models, namely Public, Private, and Hybrid, and they matter when deciding where to locate an AI workload based on functional and non-functional requirements and constraints. According to the World Economic Forum's "The Future of Jobs" report, cloud computing is the highest priority for business leaders, and by 2025 it will be the technology most widely adopted by companies.

Though artificial intelligence predates cloud computing by decades, cloud computing and its technologies have advanced AI significantly. Cloud computing has been an effective catalyst.

Image by Kivanc Uslu, inspired by "The Dynamic Forces Shaping AI" (see Sources and further reading)

Four dynamic forces have shaped AI: data/data sets, processing capability (including GPUs), models/algorithms, and talent/skills. Let's look at how cloud computing has helped advance and enrich each of these AI ingredients:

Cloud delivery models

o IaaS (Infrastructure as a Service) gives AI practitioners an infrastructure environment (CPU, memory, disk, network, O/S) quickly, so a practitioner doesn't lose time waiting for an infrastructure team to prepare it. Cloud providers later started to offer GPU resources as well.

o PaaS (Platform as a Service) lets AI practitioners use AI and data science services, including Jupyter notebooks and data catalog services, to develop new-generation applications easily.

o SaaS (Software as a Service) lets users consume AI services from within an application (e.g. CRM or payment applications) to produce results efficiently.

Cloud technologies

o Containers: By isolating applications from the underlying computing environment, containers give every data scientist the same interface and environment. Moreover, data science teams can run their containers on whichever cloud provider they prefer, including providers with GPU capabilities.

o Kubernetes: Kubernetes is an open-source system for automating the deployment, scaling, and management of containerized applications. It helped data scientists who wanted to run containerized data science platforms in a scalable way, and it makes efficient use of computing resources, including GPUs. Kubernetes also lets containerized data science applications and platforms run on different cloud providers without worrying about the compute environment.

o Data set consumption: Data is the most important ingredient of AI; you need rich data sets to run your algorithms and models. Whether data sets are shared from a cloud environment or not, once you obtain them you can store them in public or private cloud environments to work on them easily.
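To make the Kubernetes GPU point above concrete, here is a minimal sketch (my own illustration, not from the article) of how a GPU request is expressed in a Pod manifest, built as a plain Python dict. It assumes a cluster running the NVIDIA device plugin, which exposes GPUs to the scheduler under the extended resource name `nvidia.com/gpu`; the Pod and image names are just examples.

```python
import json

def gpu_notebook_pod(name: str, image: str, gpus: int = 1) -> dict:
    """Build a minimal Kubernetes Pod manifest that requests GPUs.

    Assumes the NVIDIA device plugin, which advertises GPUs under the
    extended resource name "nvidia.com/gpu".
    """
    return {
        "apiVersion": "v1",
        "kind": "Pod",
        "metadata": {"name": name},
        "spec": {
            "containers": [
                {
                    "name": "notebook",
                    "image": image,
                    # GPU requests are set as resource limits; Kubernetes
                    # then schedules the Pod onto a node with free GPUs.
                    "resources": {"limits": {"nvidia.com/gpu": gpus}},
                }
            ]
        },
    }

# Serialize to JSON (the Kubernetes API accepts JSON as well as YAML)
manifest = gpu_notebook_pod("ds-notebook", "jupyter/scipy-notebook", gpus=2)
print(json.dumps(manifest, indent=2))
```

This is the mechanism that lets a data science platform ask for GPU capacity declaratively, independent of which cloud provider runs the cluster.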

Talent/Skills availability

o Although classical AI courses were traditionally offered in universities, especially within computer engineering, computer science, and applied mathematics, independent AI engineering bachelor's and data science master's programs are now available at many universities. Moreover, specialized data science courses are available from well-known universities, and platforms such as Coursera and Udemy provide very successful deep learning and machine learning courses on the cloud.

o Competition platforms like Kaggle and CrowdANALYTIX run in cloud environments where data experts collaborate and compete to build and optimize AI, ML, NLP, and deep learning algorithms. These platforms are open to everyone, and some data sets are freely available on them. Through these platforms, new talent can be raised.

DevOps

o DevOps is a set of practices that combines software development (Dev) and IT operations (Ops); the term was first used at a conference of the same name. Although a DevOps approach can be applied in any development environment, the microservices architecture style in particular requires it. However, DevOps addresses the application development lifecycle, not the data science lifecycle, so Gartner coined the term ModelOps in 2018. Before ModelOps, MLOps (Machine Learning Operations) was used as an extension of DevOps.
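To make the ModelOps/MLOps idea concrete, here is a small illustrative sketch (my own, not from the article) of one step such pipelines automate: registering a trained model version with a content hash, its evaluation metrics, and a timestamp, so that deployments stay traceable. Real platforms, such as MLflow or the cloud providers' model registries, do far more than this.

```python
import hashlib
import json
import pickle
from datetime import datetime, timezone

def register_model(model, metrics: dict) -> dict:
    """Record the metadata an MLOps pipeline typically tracks per model
    version: a content hash of the serialized model, evaluation metrics,
    and a registration timestamp."""
    blob = pickle.dumps(model)
    return {
        # The hash identifies this exact model version's bytes
        "model_hash": hashlib.sha256(blob).hexdigest(),
        "metrics": metrics,
        "registered_at": datetime.now(timezone.utc).isoformat(),
    }

# A toy "model" (a dict of weights) stands in for a trained estimator
entry = register_model({"weights": [0.1, 0.2, 0.7]}, {"accuracy": 0.93})
print(json.dumps(entry, indent=2))
```

Tracking model versions this way is what distinguishes the data science lifecycle from the application lifecycle that DevOps already covers: the artifact being deployed is a trained model plus its provenance, not just code.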

Almost all cloud providers (AWS, Azure, GCP, IBM, Oracle, and so on) already provide extensive data science and machine learning platforms and API services (e.g. NLP, vision, automated machine learning) in their cloud environments. Moreover, strong analytics firms (IBM, SAS, RapidMiner) offer their data science and machine learning platforms on different cloud providers. Many data science platforms and AI API services are also available on premises in cloud-native (Kubernetes) environments. Cloud providers and analytics firms will likely continue to enrich these services on their cloud platforms in the future.

Conclusion

Different analysts and technology companies predict future AI usage across industries and use cases. Cloud delivery and deployment models will do much to shape the effective use of AI. Moreover, edge computing, which extends cloud capabilities to on-premises environments with low latency and even offline operation, will enable more use cases (e.g. video analytics) as organizations gain large amounts of data and processing capability on premises. Quantum computing is also expected to advance AI, especially machine learning.

Sources and further reading

The Dynamic Forces shaping AI

https://www.oreilly.com/radar/the-four-dynamic-forces-shaping-ai/

IBM’s new tool lets developers add quantum-computing power to machine learning

https://www.zdnet.com/article/ibms-new-tool-lets-developers-add-quantum-computing-power-to-machine-learning/

Gartner Magic Quadrant for Data Science and Machine Learning Platforms


Technical and business thought leader on customer solutions. Open Group Distinguished Certified IT Architect.