Artificial Intelligence Predictions for 2020

Artificial intelligence (AI) has become integral to practically every segment of the technology industry. It’s having an impact on applications, development tools, computing platforms, database management systems, middleware, management and monitoring tools—almost everything in IT. AI is even being used to improve AI. What changes in core AI uses, tools, techniques, platforms, and standards are in store for the coming year? Here is what we’re already starting to see in 2020.

GPUs will Continue to Dominate AI Acceleration

AI hardware accelerators have become a principal competitive battlefront in high tech. Even as rival hardware AI chipset technologies—such as CPUs, FPGAs, and neural network processing units—grab share in edge devices, GPUs will stay in the game thanks to their pivotal role in cloud-to-edge application environments, such as autonomous vehicles and industrial supply chains. Nvidia’s market-leading GPU-based offerings appear poised for further growth and adoption in 2020 and beyond. However, over the coming decade, various non-GPU technologies—including CPUs, ASICs, FPGAs, and neural network processing units—will increase their performance, cost, and power efficiency advantages for various edge applications. With each passing year, Nvidia will draw more competition.

Industry-standard AI Benchmarks will Become a Competitive Battlefront

As the AI market matures and computing platforms vie for the distinction of being fastest, most scalable, and lowest cost in handling these workloads, industry-standard benchmarks will rise in importance. In the past year, the MLPerf benchmarks took on greater competitive significance, as everybody from Nvidia to Google boasted of superior performance on these tests. In 2020, AI benchmarks will become a critically important element of go-to-market strategies in a segment that will only grow more commoditized over time. As the decade wears on, MLPerf benchmark results will figure into solution providers’ positioning strategies wherever high-performance AI-driven capabilities are essential.

AI Modeling Frameworks will Converge on a Two-horse Race

AI modeling frameworks are the core environments within which data scientists build and train statistically driven computational graphs. In 2020, most working data scientists will probably use some blend of TensorFlow and PyTorch in most projects, and these two frameworks will be available in most commercial data scientist workbenches. As the decade proceeds, the differences between these frameworks will diminish as data scientists and other users value feature parity over strong functional differentiation.

By the same token, more AI tool vendors will provide framework-agnostic modeling platforms, which may offer a new lease on life for older frameworks in danger of dying out. Accelerating the spread of open AI modeling platforms is industry adoption of several abstraction layers—such as Keras and ONNX—that will enable a model built in one framework’s front-end to be executed in any other supported framework’s back-end. By the decade’s end, it will become next to irrelevant which front-end modeling tool you use to build and train your machine learning model. No matter where you build your AI, the end-to-end data science pipeline will automatically format, compile, containerize, and otherwise serve it out for optimal execution anywhere from cloud to edge.
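To make the interoperability point concrete, here is a minimal sketch, assuming PyTorch as the front-end and ONNX Runtime as the back-end, of how a model can be exported through the ONNX abstraction layer and then executed outside the framework that built it. The tiny model, file name, and tensor shapes are illustrative only.

```python
# Minimal sketch: export a PyTorch model to ONNX, then run it with ONNX Runtime.
# The model, file name, and tensor shapes here are illustrative assumptions.
import numpy as np
import torch
import torch.nn as nn
import onnxruntime as ort

# A trivial "front-end" model defined (and normally trained) in PyTorch.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
model.eval()

# Export the computational graph to the framework-agnostic ONNX format.
dummy_input = torch.randn(1, 4)
torch.onnx.export(model, dummy_input, "model.onnx",
                  input_names=["features"], output_names=["scores"])

# Any ONNX-capable "back-end" can now execute the model; ONNX Runtime is one example.
session = ort.InferenceSession("model.onnx")
scores = session.run(None, {"features": np.random.randn(1, 4).astype(np.float32)})
print(scores[0])  # inference results produced without touching the PyTorch code
```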

SaaS-based AI will Reduce Enterprise Demand for Data Scientists

This past year saw the maturation of machine-learning-as-a-service offerings from AWS, Microsoft, Google, IBM, and others. As this trend intensifies, more business users will rely on cloud providers such as these to supply more of their AI requirements without the need to maintain in-house data science teams. By the end of 2020, SaaS providers will become the predominant suppliers of natural language processing, predictive analytics, and other AI applications, as well as platform services and devops tooling. Those enterprises that maintain in-house AI initiatives will automate data scientist jobs to a greater degree, thereby reducing the need to hire new machine learning modelers, data engineers, and ancillary staff. Over the decade, most data scientists will find gainful employment primarily with SaaS and other cloud providers.
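To illustrate this consumption model, the sketch below calls a managed NLP service (AWS Comprehend via the boto3 SDK) instead of training anything in-house; the region, credential setup, and sample text are assumptions made for the example.

```python
# Minimal sketch: consuming natural language processing as a managed cloud service
# (AWS Comprehend via boto3) rather than building and hosting an in-house model.
# Region, credentials, and the sample text are illustrative assumptions.
import boto3

comprehend = boto3.client("comprehend", region_name="us-east-1")

review = "The new dashboard is fast, but the setup documentation was confusing."

sentiment = comprehend.detect_sentiment(Text=review, LanguageCode="en")
entities = comprehend.detect_entities(Text=review, LanguageCode="en")

print(sentiment["Sentiment"])                      # e.g. MIXED, POSITIVE, NEGATIVE
print([e["Text"] for e in entities["Entities"]])   # phrases the service recognized
```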

Enterprise AI will Shift Toward Continual Real-world Experimentation

Every digital business transformation initiative hinges on leveraging the best-fit machine learning models. This requires real-world experimentation in which AI-based processes test alternative machine learning models and automatically promote those that achieve the desired result. By the end of 2020, most enterprises will implement real-world experiments in every customer-facing and back-end business process. As business users flock to cloud providers for AI tooling, capabilities such as those recently launched by AWS—model-iteration studios, multi-model experiment tracking tools, and model-monitoring leaderboards—will become standard in every 24×7 AI-based business application environment. Over the decade, AI-based automation and devops capabilities will spawn a universal best practice of lights-out AI-based business process optimization.
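The experiment-and-promote pattern itself is simple to express. The following is a minimal champion/challenger sketch in plain Python, not a depiction of AWS’s actual tooling: alternative models are scored on held-out data and the better performer is automatically promoted. The dataset and candidate models are illustrative assumptions.

```python
# Minimal sketch of the experiment-and-promote pattern: train alternative candidate
# models, score them on held-out data, and automatically promote the best performer.
# The dataset and model choices are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

candidates = {
    "champion_logreg": LogisticRegression(max_iter=1000),
    "challenger_forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

# Track each candidate's score, as an experiment-tracking leaderboard would.
leaderboard = {}
for name, model in candidates.items():
    model.fit(X_train, y_train)
    leaderboard[name] = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

# Promote the winner; in production this step would update a model registry or router.
promoted = max(leaderboard, key=leaderboard.get)
print(leaderboard, "-> promoted:", promoted)
```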

AI will Automate AI Developers’ Core Modeling Function

Neural networks are the heart of modern AI. In 2020, an AI-driven methodology called neural architecture search will come into enterprise data scientists’ workbenches to automate the practice of building and optimizing neural networks for their intended purposes. As neural architecture search gains adoption and improves, it will boost data scientists’ productivity by guiding their decisions on whether to build their models on established machine learning algorithms, such as linear regression and random forests, or on any of the newer, more advanced neural network algorithms. As the decade proceeds, this and related approaches will enable continuous AI devops through end-to-end pipeline automation.
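Full neural architecture search is beyond the scope of a short example, but the underlying idea, automatically searching over candidate network architectures and keeping the best one, can be sketched with an ordinary randomized search over small multilayer-perceptron configurations. The dataset and search space below are illustrative assumptions; production NAS systems explore far richer spaces with learned search strategies.

```python
# Drastically simplified sketch of the idea behind neural architecture search:
# sample candidate network architectures, evaluate each, and keep the best.
# The dataset and search space are illustrative assumptions.
from sklearn.datasets import load_digits
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)

# Candidate "architectures": different depths and widths of a small MLP.
search_space = {
    "hidden_layer_sizes": [(32,), (64,), (64, 32), (128, 64), (128, 64, 32)],
    "activation": ["relu", "tanh"],
}

search = RandomizedSearchCV(
    MLPClassifier(max_iter=300, random_state=0),
    param_distributions=search_space,
    n_iter=8,          # number of candidate architectures to evaluate
    cv=3,
    random_state=0,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```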

AI-driven Conversational User Interfaces will Eliminate the Need for Hands-on Input in Most Apps

AI-based natural language understanding has become astonishingly accurate. People are rapidly going hands-free on their mobiles and other devices. As conversational user interfaces gain adoption, users will generate more text through voice inputs. By the end of 2020, more user texts, tweets, and other verbal inputs will be rendered through AI-driven voice assistants embedded in devices of every sort. Throughout the decade, voice assistants and conversational UIs will become a standard feature of products in every segment of the global economy, with keyboards, keypads, and even on-screen touch interfaces diminishing in usage.

Chief Legal Officers will Mandate end-to-end AI Transparency

AI is becoming a more salient risk factor in enterprise applications. As enterprises confront a surge of lawsuits over the socioeconomic biases, privacy violations, and other unfortunate impacts of AI-driven applications, chief legal officers will demand a complete audit trail that reveals how the machine learning models used in enterprise apps were built, trained, and governed. By the end of 2020, chief legal officers in most enterprises will require that their data science teams automatically log every step in the machine learning pipeline while also generating a plain-language explanation of how each model drives automated inferencing. As the decade proceeds, a lack of built-in transparency will become a predominant factor in denying AI project funding.

Finally, we can safely assume that calls for regulation of AI-based capabilities in all products—especially those that use personally identifiable information—will grow in the coming years. Apart from the growing emphasis on AI devops transparency, it’s too early to say what impact these future mandates will have on the evolution of the underlying platforms, tools, and technologies. But it appears likely that these regulatory initiatives will only intensify, regardless of who wins the US presidential election this coming November.

The original version of this article was first published on InfoWorld.
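As a minimal illustration of the audit-trail requirement described in this prediction, the sketch below records each pipeline step as structured metadata and derives a plain-language summary of what drives the model’s predictions. The log format, file name, and dataset are assumptions for the example; real governance tooling would capture far more.

```python
# Minimal sketch of an ML audit trail: record each pipeline step as structured
# metadata and emit a plain-language summary of what drives the model.
# The JSON layout, file name, and dataset are illustrative assumptions.
import json
from datetime import datetime, timezone

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier

audit_log = []

def log_step(step, **details):
    """Append a timestamped record of one pipeline step to the audit trail."""
    audit_log.append({"step": step,
                      "timestamp": datetime.now(timezone.utc).isoformat(),
                      **details})

data = load_breast_cancer()
log_step("load_data", rows=int(data.data.shape[0]), source="sklearn breast_cancer")

model = RandomForestClassifier(n_estimators=100, random_state=0)
log_step("train_model", algorithm="RandomForestClassifier", params=model.get_params())
model.fit(data.data, data.target)

# A simple plain-language explanation: which inputs most influence predictions.
top = sorted(zip(data.feature_names, model.feature_importances_),
             key=lambda pair: pair[1], reverse=True)[:3]
explanation = ("Predictions are driven mainly by: "
               + ", ".join(f"{name} ({weight:.2f})" for name, weight in top))
log_step("explain_model", explanation=explanation)

with open("audit_trail.json", "w") as f:
    json.dump(audit_log, f, indent=2, default=str)
print(explanation)
```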

Author Information

James has held analyst and consulting positions at SiliconANGLE/Wikibon, Forrester Research, Current Analysis and the Burton Group. He is an industry veteran, having held marketing and product management positions at IBM, Exostar, and LCC. He is a widely published business technology author, has published several books on enterprise technology, and contributes regularly to InformationWeek, InfoWorld, Datanami, Dataversity, and other publications.
