AI Hype Debunked in 5 Simple Visuals
“AI will replace the people who think it will.” — Naval Ravikant
The world is going nuts about AI! Tech bros and Silicon Valley titans give fancy speeches that call a baby crawl in feature engineering a “massive breakthrough”, a burp in a model’s output “game-changing”, or a tiny algorithm optimization “a monumental invention of mankind”.
It’s really hard to sift through all this car-salesman language and glamorized presentation to get a true sense of what’s happening. The noise is borderline deafening. But if you know the context of what they’re actually talking about and have a framework for where things really stand, you can comfortably pick the signal out of the noise.
First of all, the majority of the hype about AI, and AGI in particular, is artificially inflated. There’s no digital brain or sentient AI on the horizon. These self-serving soundbites are mostly aimed at investors who have bought into the dream, poured billions of dollars (or will pour more) into tech startups, and are thirsty for some assurance that delivery is just around the corner. The reality couldn’t be farther from those promises. Let me break down the AI frenzy in a few simple visuals so you can gauge the level of nonsense out there.
What is & is not AI?
There’s a good chance that the fancy data dashboard at work that is sold to you as AI-built or AI-powered is just doing basic data science work: ingesting, processing and visualizing data with ZERO AI intervention. Because of this overlap between AI and data science, there’s some confusion among the general public about what is actually machine intelligence versus good old statistics applied at an industrial scale.
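To make that distinction concrete, here is a minimal, hypothetical sketch of what many “AI-powered” dashboards actually do under the hood: ingest, aggregate, visualize. The dataset and column names are invented for illustration; nothing in this pipeline trains or queries a model.

```python
# A hypothetical "AI-powered" sales dashboard that is really just data plumbing:
# ingest, aggregate, visualize. No model is trained or queried anywhere.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# 1. Ingest: in real life this would be pd.read_csv(...) or a warehouse query.
#    Here we fabricate a tiny illustrative dataset so the sketch runs on its own.
rng = np.random.default_rng(0)
sales = pd.DataFrame({
    "order_date": pd.date_range("2024-01-01", periods=365, freq="D"),
    "revenue": rng.gamma(shape=2.0, scale=500.0, size=365),
})

# 2. Process: plain old aggregation, i.e. statistics, not machine intelligence.
monthly = sales.groupby(sales["order_date"].dt.to_period("M"))["revenue"].sum()

# 3. Visualize: the "dashboard" part.
monthly.plot(kind="bar", title="Monthly revenue")
plt.tight_layout()
plt.show()
```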
There are obviously some bright spots here, with machine learning algorithms used for natural language understanding and processing (core AI domains) getting really good at text-to-text applications, e.g. ChatGPT or Gemini. Despite that, we are still far away from conquering the complexity and nuance of human-like language in terms of the speed and accuracy of both ingestion and production.
So why the hype?
Gartner’s hype cycle is a good proxy for understanding how long it takes a new technology to meet expectations and deliver significant benefits to its intended beneficiaries. With AI, we are arguably going through the peak of inflated expectations and might be hitting the trough of disillusionment.
Billions of dollars have been poured into this domain, with little to no realized value to show for it: a few fancy PowerPoint slides, some LLM-assisted Excel audits, and recycled, reconstructed images with some mind-bending creativity. Smart robots are not smart enough to avoid harming others or themselves. Self-driving cars are miles away from actually “self-driving”, still failing to track moving objects in real time. And LLMs still suffer from bias and variance in their data, so they just make things up, like your drunk uncle at Thanksgiving dinner.
Where are we, then?
While the Gartner curve is ideal for a big-picture understanding of the AI timeline, I believe AI and its major sub-disciplines should be examined on a maturity spectrum, from Realized to Theoretical AI.
The journey of AI began with Narrow AI: performing well-defined, deterministic tasks based on labeled data, e.g. mortgage approval, fraud detection and industrial robots, all of which are still doing phenomenally well with the advent of big data and robust cloud computing. The same goes for mid-range ML models doing predictive analytics, e.g. recommendation engines. We have also made impressive strides in machine vision, in both the software and hardware domains, e.g. face recognition, assembly-line inspection and product quality checks.
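For a rough sense of what “narrow AI on labeled data” looks like in practice, here is a minimal sketch of a fraud-detection-style classifier: a model learns from labeled historical examples and scores new ones. The data is synthetic and the setup is an assumption for illustration only, not how any real bank or vendor does it.

```python
# Minimal sketch of narrow AI: a supervised classifier for a fraud-detection-style
# task, trained on labeled examples. The data is synthetic, purely for illustration.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic "transactions": 10 numeric features, label 1 = fraud (rare), 0 = legitimate.
X, y = make_classification(n_samples=5000, n_features=10,
                           weights=[0.97, 0.03], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=42)

# Fit a boosted-tree model on the labeled history, then score unseen transactions.
model = GradientBoostingClassifier().fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]
print(f"ROC AUC on held-out data: {roc_auc_score(y_test, scores):.3f}")
```

Narrow, well-scoped, and useful when the task and labels are well defined; nothing about it understands context or reasons about the world.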
However, we are far, far away from the other end of the maturity curve: Cognitive AI and AGI. These ambitions are still in the realm of theoretical discussion. In that territory, machines need to understand context, nuance and reasoning, make complex decisions, and master many other capabilities that are mostly shaped by a real-time feedback loop with the physical environment.
All of that is nearly impossible to translate into code, parameters, nodes and tokens. We simply don’t know exactly how it works in our own brains; we don’t even know what’s happening inside a single neuron. It’s still a big black box of mystery!
And Finally: The Cost Factor!
Aside from all those technicalities, one thing that almost no one talks about in public, for a myriad of reasons, is the cost of AI at industrial scale. It’s not cheap!
The tech stack above is just one of the many architectural templates big firms need to adopt to get an AI solution up and running across the enterprise. Each layer has a cost associated with it: the size and complexity of the use case determine the core compute cost, plus the marginal cost of the resources and systems on the periphery. The more parts, the higher the bill. The range could run from a few million to hundreds of millions of dollars over time.

AI does not simply come out of the box; we have to build it from the ground up: scalable cloud infrastructure, a suitable model layer, the right tools, a robust data layer and, finally, a user-friendly application layer. Some companies in manufacturing, healthcare and eCommerce have implemented this correctly, and the value it generates far outweighs the costs, e.g. Amazon’s Rufus and Alexa chatbots, Tesla’s 3D road assistance, and Netflix’s advanced recommendation engine.
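To make the “more parts, higher bill” point tangible, here is a toy back-of-envelope model of an annual enterprise AI budget. Every figure is a placeholder assumption invented for illustration; none of them comes from the article, a vendor, or any real deployment.

```python
# Toy back-of-envelope model of an enterprise AI stack's annual running cost.
# Every number below is a placeholder assumption, for illustration only.
annual_cost_usd = {
    "cloud_infrastructure": 2_000_000,  # compute, storage, networking
    "data_layer":           1_200_000,  # pipelines, governance, quality
    "model_layer":          1_500_000,  # training, fine-tuning, inference
    "tooling_and_mlops":      600_000,  # orchestration, monitoring
    "application_layer":      900_000,  # user-facing apps and integration
    "people":               3_000_000,  # engineers, data scientists, support
}

total = sum(annual_cost_usd.values())
for layer, cost in annual_cost_usd.items():
    print(f"{layer:<22} ${cost:>12,}")
print(f"{'TOTAL':<22} ${total:>12,}")  # every extra layer only pushes this up
```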
So the next time you are listening to a podcast or watching a video loaded with jargon, superlatives and tech-speak, filter through the noise by using these visuals as a guide for context. The AI glamour might just be a mirage unless they can show you the gains… the Benjamins. Literally!
Masoud Hashime is a digital portfolio strategist & product manager with over a decade of experience transforming ecosystems in Education, Telecom & CPG industries. He’s a graduate of Johns Hopkins University, a design-thinking evangelist, and a passionate advocate for a healthy marriage between humans & AI.