Embracing Digital Transformation – Episode #278
Title: From Hype to Impact: Building Scalable AI Solutions for the Enterprise
Date: July 15, 2025
Host: Dr. Darren Pulsipher
Guests: Lyn Kampf (Intel), Russell Fishman (NetApp)
Episode Overview
In this episode, Dr. Darren Pulsipher gathers AI leaders Lyn Kampf (Intel) and Russell Fishman (NetApp) to move beyond the AI hype and discuss the real-world impact of scalable, enterprise-grade AI solutions. The conversation explores getting from “science experiment” to business value, the practical challenges of AI deployment, and how Intel and NetApp are making open-source AI and enterprise data integration accessible and secure for public sector and enterprise organizations.
Key Topics & Insights
1. The Journey to Enterprise AI
- Defining “Enterprise AI”: Russell frames the current phase as the era of “Enterprise AI,” distinguishing experimental applications from solutions that truly deliver business value at scale.
  - “Enterprise AI is where the rubber meets the road. It’s the difference between trying something out in the wild west, and actually delivering value.” (Russell, 09:02)
- High Failure Rates: Most AI initiatives never make it past proof of concept.
  - “85% of AI projects never make it to production.” (Russell, 10:47)
2. Critical Public Sector AI Challenges
- Security & Data Leakage: Dr. Pulsipher notes rampant use of public GenAI tools even for sensitive work, raising alarms over intellectual property and compliance.
  - “They’re sending classified data home, running on ChatGPT, generating stuff, and coming back. It’s a huge problem.” (Darren, 14:20)
- Vendor Lock-In and Cost: The guests challenge the impression that generative AI requires massive GPU investments.
  - “Nvidia told me I have to spend $50 million to buy a bunch of GPU clusters…which…the three of us know is a total lie.” (Darren, 07:17)
3. The Intel–NetApp Partnership: Enterprise AI for All
- Open Source First: Both companies align on supporting open-source AI stacks, citing NetApp’s role in storage interfaces for Kubernetes and Intel’s founding membership in OPEA (the Open Platform for Enterprise AI).
  - “Open source is winning in AI…anyone seriously doing work in AI realizes that very quickly.” (Russell, 15:26)
- Bringing It Together with the “AI Pod Mini”: The “NetApp AI Pod Mini with Intel” (delivered through channel partners) unifies data, compute, and open-source AI into a turnkey, enterprise-ready offering built on:
  - NetApp ONTAP storage OS
  - Intel Xeon 6 processors
  - A pre-integrated Open Platform for Enterprise AI (OPEA) stack
  - “We produced this thing…NetApp AI Pod Mini with Intel…[channel] partners deliver NetApp storage, Intel Xeon 6 processor, and a version of OPEA.” (Russell, 20:59)
-
4. Enterprise Use Case: Retrieval-Augmented Generation (RAG)
- What Sets It Apart: The solution goes beyond DIY RAG, with security and scalability baked in, addressing permissions, compliance, data updates, support, and data integration in complex environments.
  - “Just the ability to flow through permissions from underlying file environments through to the RAG environment…is what enterprise means in this case.” (Russell, 24:11)
  - “Some early AI implementations…using RAG, employees were able to get access to the CEO’s emails.” (Lyn, 25:30)
  - “Europe…is kind of a nightmare. Outside of the US, there’s very significant requirements for compliance.” (Lyn, 25:48)
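The permission flow-through Russell describes can be illustrated with a minimal sketch. This is not NetApp’s implementation; the `Chunk` type and `retrieve` helper are hypothetical, showing only the core idea that a file’s inherited access-control groups are enforced on retrieval results *before* anything reaches the LLM:

```python
from dataclasses import dataclass, field

@dataclass
class Chunk:
    """A vectorized document chunk that carries its source file's ACL."""
    text: str
    source: str
    allowed_groups: set = field(default_factory=set)
    score: float = 0.0  # similarity score from the vector search

def retrieve(chunks, user_groups, top_k=3):
    """Return the top-k chunks the querying user may actually read.

    Filtering on the inherited file permissions happens before results
    are handed to the LLM, so a clever prompt can never surface text
    (e.g. the CEO's inbox) the user couldn't open directly.
    """
    visible = [c for c in chunks if c.allowed_groups & user_groups]
    return sorted(visible, key=lambda c: c.score, reverse=True)[:top_k]

chunks = [
    Chunk("Q3 roadmap summary", "wiki/roadmap.md", {"staff"}, 0.90),
    Chunk("CEO inbox excerpt", "mail/ceo.eml", {"execs"}, 0.95),
]
hits = retrieve(chunks, user_groups={"staff"})
# The higher-scoring CEO chunk is excluded: the user isn't in "execs".
```

The design point is that the ACL check is a hard pre-filter, not a post-hoc redaction step, which is exactly the gap behind the leaked-email anecdote above.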
- Dynamic Data Updates: With “SnapDiff” technology, only updated data is re-indexed, ensuring current results and supporting use cases like clinical chatbots in hospitals, where timely, accurate data is life-critical.
  - “SnapDiff detects when something has changed. It triggers the re-vectorization of that data entity.” (Russell, 32:16)
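SnapDiff itself is NetApp’s snapshot-comparison technology; as a rough illustration of the general pattern (hypothetical function names, not NetApp’s API), a content-hash diff can drive the same “re-vectorize only what changed” behavior:

```python
import hashlib

def detect_changes(previous_hashes, documents):
    """Compare a stored content hash per document against the current
    contents; return the paths that are new or modified, plus the
    fresh hash snapshot to use as the baseline for the next diff."""
    changed, current = [], {}
    for path, text in documents.items():
        digest = hashlib.sha256(text.encode()).hexdigest()
        current[path] = digest
        if previous_hashes.get(path) != digest:
            changed.append(path)
    return changed, current

def refresh_index(index, embed, previous_hashes, documents):
    """Re-vectorize only the changed documents, not the whole corpus."""
    changed, current = detect_changes(previous_hashes, documents)
    for path in changed:
        index[path] = embed(documents[path])  # re-embed just this entity
    return current

# First pass embeds everything; later passes touch only edited files.
index, snapshot = {}, {}
docs = {"protocols/sepsis.md": "v1 guidance", "protocols/stroke.md": "v1 guidance"}
snapshot = refresh_index(index, lambda t: [len(t)], snapshot, docs)
docs["protocols/sepsis.md"] = "v2 guidance, updated dosing"
snapshot = refresh_index(index, lambda t: [len(t)], snapshot, docs)
```

For a clinical chatbot, this incremental refresh is what keeps answers current without the cost (and staleness window) of periodically rebuilding the entire vector index.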
5. Practical Deployment and Support
- Buying & Support Model: Customers don’t go to Intel or NetApp directly, but through a network of partner integrators who offer hands-on training and tailored deployment.
  - “We wanted to work through these guys—they’re best placed to understand customer requirements. We produced this [AI Pod Mini] made available through our channel partners.” (Russell, 20:46)
  - “A lot of that channel…they have some incredible training programs…super hands-on.” (Lyn, 34:39)
- Open Source + Enterprise-Grade Backing: The combination of open-source AI and partner-based enterprise support means organizations avoid the “bus factor” (or “trucking factor”) risk of relying on a single open-source maintainer. (27:50)
Notable Quotes & Memorable Moments
- On the State of AI Deployment:
  “It’s not enough just to ask, can I do AI? Now it’s: can I do AI in a way that’s actually going to deliver some value?”
  – Russell Fishman [09:13]
- On Open Source Complexity:
  “Open source and components have been a lot like Home Depot—here’s your box of nails, a pile of wood, and your blueprints. Good luck, DIY.”
  – Lyn Kampf [18:36]
- On Real-World Security Risks:
  “Some early implementations of RAG…employees were able to get access to the CEO’s emails.”
  – Lyn Kampf [25:30]
- Compliance Is Not Optional:
  “Outside of the US, there’s very significant regulations. If you’re a multinational, you’ve got to keep tabs on all those regulations and be compliant based on your role.”
  – Lyn Kampf [25:50]
- On Dynamic Indexing:
  “SnapDiff detects when something has changed. It triggers the re-vectorization of that data entity…that’s what sets enterprise-class RAG apart from something that runs on your laptop.”
  – Russell Fishman [32:16]
- Channel-Driven Adoption:
  “You type in ‘NetApp AI Pod Mini Intel’—right at the top you’ll see all the technical documentation…But go to your favorite channel partner that works with NetApp and OEMs of Intel and ask them for it.”
  – Russell Fishman [33:44]
Key Timestamps
| Time  | Topic / Quote |
|-------|---------------|
| 09:01 | Defining “Enterprise AI”; moving from science experiment to value |
| 10:47 | “85% of AI projects never make it to production.” |
| 14:20 | Security risks of public GenAI in government |
| 15:26 | Why open source is foundational in modern AI |
| 18:36 | The Home Depot vs. IKEA analogy for AI solutions |
| 20:59 | What makes up the NetApp AI Pod Mini with Intel |
| 24:11 | Security, permissions, and enterprise requirements for RAG |
| 25:30 | “Early RAG could leak CEO emails”; the need for compliance |
| 27:50 | The “bus factor” of open source and why enterprise support matters |
| 32:16 | SnapDiff, dynamic data updates, and real enterprise requirements |
| 33:44 | How and where to get the solution; training and adoption |
Conclusion
This episode pulls back the curtain on the transition from AI hype to truly enterprise-grade, scalable solutions. Pulsipher, Kampf, and Fishman underscore that sustainable, impactful AI hinges on open-source foundations, robust data integration, and enterprise-ready support for compliance, reliability, and security. The NetApp/Intel partnership, embodied in the “AI Pod Mini,” is presented as a way to bridge technical innovation with the realities of public sector and enterprise deployment, making the promise of AI both practical and safe at scale.
