Hosted by Pure Storage

This week we welcome Nihal Mirashi, Sr. Product Marketing Manager for Database Solutions at Everpure, to unpack the key findings of the new 2026 State of Database Infrastructure report, which surveyed over 500 infrastructure and data professionals globally. Our discussion starts with a look at Nihal’s professional journey, including his transition from focusing on the SaaS vertical to building out the go-to-market strategy for open-source databases, and his current work helping customers with their database environments and exploring AI tools. Our conversation frames the context for modernization initiatives, noting that while organizations understand they must modernize (with platforms like Oracle 26ai), they are often slowed by outdated approaches. A central theme of the report is that modernization is failing because the traditional, event-based model is broken, creating a disruption paradox. The report data shows that 85% of professionals are concerned about modernization initiatives because they associate them with disruption, and 70% find even planned downtime unacceptable. Success is no longer defined solely by performance—which is now considered table stakes with the move to all-flash solutions—but by operational assurance and efficiency. This focus on operations is critical given the complexities of hybrid environments: 93% of organizations operate one, and 73% find them harder to manage today than three years ago due to data silos and multiple workflows. Operational friction translates directly into significant hidden costs and missed innovation opportunities. A staggering 80% of database administrators (DBAs) spend more time on re-validation than on innovation, with 55% of their time consumed by maintenance, preventing them from becoming data strategists. Furthermore, 60% of respondents are not confident in predicting their three-year infrastructure costs, and 71% have received budget surprises from vendors.
Our episode concludes by drawing an analogy between database modernization and the ability of Rock and Roll Hall of Fame inductees to adapt their sound, emphasizing that continuous evolution and learning from peers—not disruptive, rip-and-replace events—are the way forward. To learn more, visit: https://www.everpuredata.com/solutions/databases.html?utm_campaign=db Check out the new Everpure digital customer community to join the conversation with peers and Pure experts: https://purecommunity.purestorage.com/ 00:00 Intro and Nihal’s Career Journey 06:06 Exploring AI for Marketing 08:40 State of Database Infrastructure Report 11:30 Concerns about DB Modernization 16:20 Hybrid Database Environments 19:15 How Data Silos Hold Back Innovation 21:30 Disruption Paradox 27:15 Cost Uncertainty of DB Operations 33:33 Hot Takes Segment

In this episode of the Pure Report, we sit down with longtime colleague Robert Quimbey, Consulting Field Solution Architect, known to many as "Q." Q shares how his role has evolved from a back-end problem solver focusing on Microsoft integrations to a customer-facing strategist dedicated to understanding the entire solution set for customers, not just fixing siloed problems. Drawing on his deep history with Microsoft technologies, including his time on the Exchange team, Q discusses Everpure’s strategic moves to simplify the modern data center. Our conversation dives into two major capability developments: ActiveCluster for File and Azure Local. ActiveCluster for File is designed to provide high availability for file services, inheriting the core benefits of the original Everpure ActiveCluster—ease of setup and no extra licensing costs—while solving hard problems in the file space. Q explains how ActiveCluster for File implements file storage at a cousin layer to block storage, avoiding the performance and scalability issues common in competing solutions. The new capability includes continuous availability to ensure persistent sessions for VMs on SMB shares, even during non-disruptive upgrades, and is launching right as the NAS market is projected to nearly double, driven largely by AI-related unstructured data growth. Next, we explore Azure Local, formerly Azure Stack HCI, a project Q has championed for years. This initiative is key for customers looking to modernize their virtualization strategy. The new integration, which is near GA, allows customers to connect virtual machines and containers directly to FlashArray via Fibre Channel (FC), avoiding the complexities of HCI storage. Ultimately, the focus remains on the power of the core Purity foundation to deliver agility, predictable costs (like Evergreen//One), and superior performance for all hybrid-cloud workloads.
To learn more, visit: https://blog.purestorage.com/products/microsoft-azure-local-and-flash-array/ and https://blog.purestorage.com/products/introducing-activecluster-for-file/ Check out the new Everpure digital customer community to join the conversation with peers and Pure experts: https://purecommunity.purestorage.com/ 00:00 Intro and Q’s Career Journey 07:17 Key Development Projects 09:59 Stat of the Episode on NAS Storage 12:51 ActiveCluster for File Discussion 27:45 Protocols and Arrays Supported 34:19 Intro to Azure Local 53:04 Hot Takes Segment

The Pure Report welcomes Andrea Moccia, VP of AI and Data at Options Technology, and Robert Alvarez, AI Solution Architect at Everpure, to discuss the cutting edge of AI deployment right after the energy of NVIDIA GTC. We dive into sobering statistics that show a high failure rate for generative AI pilots—95% fail to scale to production—and discuss how the root cause is a fundamental data strategy problem. Our discussion then shifts to the unique, high-stakes challenges faced by the financial services industry (FSI), which contends daily with massive data volume (tens of petabytes of market data), strict global compliance and regulatory requirements, and the need for near real-time, low-latency answers from AI models. Andrea explains how the power of simplicity is an operational advantage, following the mantra: "Simplicity is what lets you be brave." He details how Options is addressing issues like data leakage and data sovereignty with their Private Mind offering—a private, sovereign AI platform where they control the entire stack, from model to metal. Robert and Andrea connect this innovation to the Everpure partnership, specifically how solutions like Data Stream and Everpure KVA (which Robert co-developed) are vital in reducing implementation complexity and accelerating real-world use cases, such as efficiently building a powerful knowledge graph on hundreds of thousands of SEC filings. We close with our Hot Takes segment to dispel common AI misconceptions: companies should stop obsessively chasing the latest frontier models or GPUs for every task, as open-source alternatives and smaller, distilled models are perfectly capable for a majority of use cases. Ultimately, hear how the true key to AI maturity and growth lies not in chasing technological hype, but in removing data silos, fixing the foundational data strategy, and using the rapidly maturing AI ecosystem to streamline business processes.
To learn more, visit: https://www.purestorage.com/customers/options.html Check out the new Everpure digital customer community to join the conversation with peers and Pure experts: https://purecommunity.purestorage.com/ 00:00 Intro and Welcome 01:50 Recap of NVIDIA GTC 02:40 Overview on Options Technology 03:55 Andrea’s Career Journey 07:14 Robert Alvarez Intro 09:13 Stat of the Episode on AI Pilots 12:20 AI Challenges for FSIs 15:05 Simplicity Lets You Be Brave 21:05 AI and KVA in Action at Options 23:45 Data Sovereignty and Compliance 30:35 Hot Takes Segment 35:37 Summary and a Look Forward

In this episode we welcome Naveen Neelakantam, Chief Architect of Everpure’s Digital Experience Business Unit. Naveen dives into the origin and evolution of Everpure's Digital Experience (DX), detailing how DX revolutionized storage management by moving beyond reactive support. The foundation of this lies in the phone-home telemetry data collected from storage arrays, which first enabled the Cloud Assist capability. This data powers the ability to proactively identify and prevent issues, non-disruptively upgrade systems, and ensure a first-class support experience for every customer. Naveen explains how the intelligence gathered through telemetry propelled innovations like Pure1 and Evergreen//One. Pure1, the cloud-based platform, uses machine learning to offer predictive recommendations—such as projecting capacity needs to avoid unexpected overages. This predictive power is central to Evergreen//One, the consumption-based storage-as-a-service offering. By managing the physical appliance using telemetry, Everpure allows customers to consume logical storage connected to SLAs, simplifying the procurement process and eliminating the complexity of managing hardware specifics. This subscription model provides predictability and isolates customers from pricing pressures on components. Our discussion shifts to the future of storage and the transformative power of Artificial Intelligence. Naveen details AI Co-pilot, an agentic AI interface that helps users triangulate performance issues and orchestrate complex operations, such as migrating VMs, through conversational language using the Model Context Protocol (MCP). This move to active management is further realized through Pure1 Edge, allowing fleet-level data management and cloud-based upgrades. We then touch on Everpure Protect, a crucial cloud-based Disaster Recovery as a Service (DRaaS) solution.
Ultimately, Naveen advises IT leaders to embrace AI as a powerful tool—like the domestication of the horse—that will make people more effective and accelerate innovation. To learn more, visit: https://www.purestorage.com/products/monitoring-fleet-management.html Check out the new Everpure digital customer community to join the conversation with peers and Pure experts: https://purecommunity.purestorage.com/ 00:00 Intro and Welcome 01:45 Naveen’s Career Journey 09:29 Origin of Digital Experience 12:45 Proactive Recommendations 16:05 Cloud Management and Subscriptions 18:12 Stat of the Episode on Storage Capacity Growth 21:30 AI Co-Pilot and Automation 32:14 Telemetry 37:01 Pure1 and Subscription Management 39:51 Everpure Protect DRaaS 45:44 Hot Takes Segment

The Pure Report podcast went on location to the Pittsburgh Pure User Group, set against the backdrop of the Ohio River, to celebrate the GA launch of the Everpure and Nutanix solution. We interviewed a lineup of guests—including David Stevens, Expedient’s Rob McCafferty, Don Poorman, and Systems Engineer Adam Hill—who all gathered with customers and partners to discuss the excitement surrounding the new offering. Stevens emphasized the highly integrated nature of the solution, which simplifies setup to just a few operations on the array and in Nutanix. The momentum around the solution is growing: a month after the GA announcement, this event is the perfect forum to answer customer questions and showcase the solution's ease of use and ability to replicate the operational experience virtualization administrators prefer. Our discussion shifts to the business value of the solution, specifically addressing customer challenges like finding alternatives to existing virtualization platforms, reducing costs, and hedging bets against recent industry changes. Guests note that the new architecture helps organizations keep data on-premises to meet regulatory requirements, while still enabling them to burst new workloads into the cloud. The episode features Rob McCafferty, Chief Solutions Officer at Expedient, who details their role as a beta customer and launch partner for the Pure and Nutanix offering. Expedient, a long-time customer of both companies, is thrilled to provide clients with the flexibility and the optionality unlocked by bringing together two industry leaders, with clients already in the queue for deployment. Expedient focuses on delivering reduced risk, cost control, and stability in their platforms for clients. The episode concludes by focusing on the power of Pure user groups, which are described as crucial venues for peer-to-peer interaction and sharing knowledge about topics like cyber, AI, and virtualization.
Technical Evangelist Don Poorman points out that the success of the joint solution is due to the similar customer-focused cultures of Pure and Nutanix. Poorman advises customers to view the new virtualization optionality as a bigger exercise than just cost savings, recommending they consider the long-term effects on automation and cyber security. He also advocates for Pure’s forward-looking technology investment in NVMe over TCP, which he sees as more robust than Fibre Channel for the next 15 years. The team encourages customers to step up and lead future user group events to continue building the community, both physically and on the Pure Community digital platform. To learn more, visit: https://purecommunity.purestorage.com/category/pure-user-groups Check out the new Everpure digital customer community to join the conversation with peers and Pure experts: https://purecommunity.purestorage.com/ 00:00 Intro and Welcome 00:49 Everpure and Nutanix with David Stevens 06:01 Value of User Groups 09:15 Rob McCafferty from Expedient 17:01 Don Poorman from Everpure 20:42 Community Momentum 26:09 Pittsburgh SE Adam Hill

This episode of the Pure Report podcast features Solutions Director Andrew Sillifant, as we dive into the implications of Oracle 26ai, focusing on what enterprises must do to prepare for this major long-term support release. Our discussion positions Oracle as part of the database industrial complex, noting its enduring dominance alongside Microsoft SQL Server, together accounting for over 50% of the enterprise market. Oracle 26ai is presented as the latest phase in the database lifecycle—following G for Grid and C for Cloud—which capitalizes on the momentum of artificial intelligence by repositioning the database as a full system. The new version embeds AI vector store capabilities and machine learning models, allowing organizations to combine structured data from legacy systems with unstructured data (like S3 tables) for better context awareness. Our conversation shifts to a look at the complexity and risk of database upgrades, which extend far beyond the DBA team to considerations around capacity planning, application integration, and platform choices. Andrew notes that while Oracle is mature and its upgrade cycle is well-known, infrastructure modernization, including decisions on virtualization and containers, is now taking precedence due to economic and regulatory forces. The addition of new capabilities in 26ai—including OLTP, analytics, and vector data types—means increased storage consumption and introduces new workload patterns that stress the compute layer. Enterprises face decision paralysis when considering the cost and multifaceted nature of these changes, making a simplified, reliable infrastructure foundation critical. We close with a look at how the Everpure platform serves as an essential risk-reduction element, simplifying the storage layer to remove complexity from the upgrade process.
Key benefits discussed include de-risking capacity bloat through metadata-only snapshots for development, test, and QA copies, and offering extremely fast recovery speeds, with examples citing a two-node Oracle RAC database restore at 68 terabytes per hour. Non-disruptive upgrade (NDU) capabilities and the Evergreen service model are emphasized as a significant moat against competitors, providing a low-risk platform that allows teams to pivot their focus to the database upgrade itself, rather than the underlying infrastructure. To learn more, visit: https://www.purestorage.com/solutions/databases/oracle.html Check out the new Everpure digital customer community to join the conversation with peers and Everpure experts: https://purecommunity.purestorage.com/ 00:00 Intro and Welcome 02:43 Background on Oracle Database 08:35 Upgrade Considerations 12:17 Maturing Oracle Features 15:16 Database Upgrade Process 21:17 Complex Factors to Consider 25:12 Handling Different Data Types 30:24 Everpure Value for Oracle Operations 36:30 Key Steps for Successful Upgrades

We welcome back Chad Kenney, VP Product Management, to explore the new definition of "Enterprise-ready" in data infrastructure. We discuss how reliability and uptime are more critical than ever, with downtime costs frequently exceeding $300,000 per hour for a majority of companies according to a recent survey. Kenney notes that while hardware fails less often today, the major causes of downtime—namely security issues and human error—still persist, accounting for a large portion of outages. Our discussion pivots to how Everpure addresses this by simplifying architecture to reduce points of failure by using proprietary data devices instead of traditional SSDs. The new standard for enterprise readiness requires a system built on first principles with a software-centric view of resilience. We then explore how a key differentiator for modern enterprise-ready infrastructure is proactive support, which Everpure has always delivered through extensive telemetry and "fingerprints" that predict issues before they cause an outage. Our predictive technology is so effective that 70% to 75% of support calls are for problems that have not yet occurred. Architecturally, simplicity is critical, including the use of stateless controllers, which eliminates complex manual data management during upgrades, and limited error paths. Furthermore, our software-first mentality abstracts complexity, such as automatic management of RAID groups, to deliver autonomous systems. This frees storage administrators to become strategists and simultaneously reduces the potential for human error, enhancing overall system resiliency. We also dive into cyber resiliency, performance, and the future of data management driven by AI. Everpure builds layered cyber protection through capabilities like SafeMode, native encryption, and partner integrations. This approach prioritizes near-instantaneous recovery from local snapshots, which is faster than recovering from backups. 
Looking forward, Enterprise-ready means a unified platform (the Enterprise Data Cloud) where diverse workloads, including AI inference processing, can run simultaneously with consistency, enabling "data as a supply chain". This vision is achieved through the API-first and Fusion-enabled platform, ensuring that new feature functionality is immediately available for automation, all while maintaining the simplicity and continuous innovation provided by Evergreen. To learn more, visit: https://www.purestorage.com/platform.html Check out the new Pure Storage digital customer community to join the conversation with peers and Pure experts: https://purecommunity.purestorage.com/ 00:00 Intro and Welcome 02:15 Chad’s Update on 2026 at Everpure 06:11 Stat of the Episode on Downtime 08:58 Proactive Support 12:32 Component Count Relative to Uptime 17:30 The Meaning Behind 6 9’s 23:28 Designing for Cyber Resilience 26:35 Consistent Performance for Tier 1 Apps 29:55 API First and Fusion First

In this episode, we sit down with Technical Evangelist Don Poorman for a deep dive into the most engaging and eye-opening questions from the past year of the customer-focused Ask Us Everything (AUE) webinar series. The AUE forum has proven to be an invaluable resource for the Everpure community, driving real-time feedback and high-quality, practical discussions directly with experts. Tune in as we revisit the most pertinent topics and customer use cases, revealing how these community interactions are shaping the Everpure roadmap and delivering tremendous value. The conversation recaps the biggest AUE sessions, starting with Fusion, where customers were focused on the operational reality of managing fleets, automating data placement across data centers, and multi-tenancy. Next, we discuss the highly attended session on Purity Upgrades and the success of the self-support upgrade model, emphasizing Everpure’s commitment to building confidence and providing tools like AI Copilot to make storage OS upgrades a non-event. The review moves into Cyber Resilience, highlighting the shift from prevention to recovery, the role of SafeMode snapshots, and the importance of ecosystem integration with partners like Rubrik and Superna to address ransomware attacks holistically. Finally, our discussion covers the rapid evolution of FlashArray File, including the much-anticipated ActiveCluster for File use case, and a look at the comprehensive value delivered by the Evergreen portfolio—from the included features in Evergreen//One to the Cyber Resiliency SLA add-on and its role in hybrid-cloud environments. The episode wraps up with the highly relevant session on the Nutanix integration, exploring how the Everpure Platform helps decouple storage growth from hypervisor licensing and enables modern container-based workloads with features like NVMe/TCP.
This recap provides a high-level overview of the technical and strategic conversations defining the Everpure platform today and what’s coming next. To learn more, visit https://purecommunity.purestorage.com/category/events/events/webinars Check out the new Pure Storage digital customer community to join the conversation with peers and Pure experts: https://purecommunity.purestorage.com/ 00:00 Intro and Welcome 02:25 Ask Us Everything Webinars 06:25 Fusion 10:05 Self Service Upgrades 16:19 Cyber Resilience 24:29 File Services 29:23 Evergreen//One 38:42 Nutanix and Everpure 47:45 Observations on the AUE Program

It’s the Pure Report annual predictions episode! We welcome Shawn Rosemarin to dive deep into the world of tech in 2026, including a look back at 2025 predictions on AI becoming a strategist, Multi-Cloud 2.0 requiring a unified data platform, and end-to-end security ramping up. Shawn holds himself accountable for last year’s bets, particularly noting that the expected "operating model transformation" driven by AI has yet to fully materialize, arguing that many organizations are still grappling with the hard changes to people, process, and technology required for true transformation. Our conversation pivots to what’s next, starting with the evolution of AI from simple co-pilots to autonomous agents that will soon become mature process owners capable of completing end-to-end workflows. This shift will require a greater emphasis on verification, changing the industry's focus from "time to answer" to "time to trust" (or "time to truth") as enterprises build verification stacks to ensure AI accuracy, recognizing that every mistake costs money and customer satisfaction. Finally, Rosemarin forecasts that growing energy scarcity will drive new AI economics, forcing serious programs to run AI like a business system by routing queries to the most efficient models. Furthermore, he predicts that data stops being an asset and evolves into a supply chain, necessitating a manufacturing-like process to refine structured, semi-structured, and unstructured data for uniform consumption by training systems. This new landscape will ultimately punish infrastructure complexity and reward the platform mindset that simplifies operations and removes friction through automation and orchestration.
To learn more, visit https://blog.purestorage.com/perspectives/2026-ai-predictions-data-storage/ Check out the new Pure Storage digital customer community to join the conversation with peers and Pure experts: https://purecommunity.purestorage.com/ 00:00 Intro and Welcome 09:30 Look back at 2025 Predictions 17:33 William Gibson Quote on the Future 22:20 2026 Predictions - Copilots Become Agents 26:48 Verification and Time to Trust 30:30 Energy Scarcity and AI Economics 34:13 Data as a Supply Chain 38:50 Relevance Engines 42:10 Platform Mindset 45:43 Content Authenticity 49:37 Cyber as an Executive Imperative 52:35 Workforce Productivity 55:21 Summary of 2026 Predictions

We welcome back Andrew Sillifant, Solution Director at Pure Storage, for a deep dive into the concept of data gravity. We start with the traditional 2010 definition coined by Dave McCrory—that data accumulates, making it harder to move, and forcing dependent systems to cluster nearby. However, Andrew presents his core thesis, arguing that this foundational principle is no longer sufficient in a world of exploding complexity. Our conversation emphasizes the need to re-examine data gravity through a modern lens, acknowledging the massive shift to cloud computing and the proliferation of interconnected systems over the last decade. Andrew introduces five crucial dimensions that now describe data's impact: Volume, redefined by context and classification; Dependency, now accelerated by API calls, integration points, and AI agents; Criticality, which includes regulations, security, and implicit SLAs; Velocity, measured by how many functions data is used for; and Latency, complicated by geographic requirements that skew response times. These dimensions highlight how non-physical constraints, like egress fees and data sovereignty laws, create artificial friction that compounds the problem beyond sheer data size. Our discussion concludes with a new framework of five sources of data gravity that IT leaders must address: Technical Gravity (the physical component and mobility), Economic Gravity (the costs of hosting and moving data, like egress fees), Regulatory Gravity (compliance and legal restrictions), Institutional Gravity (the dependency on a small number of people who know how to manage old systems), and Measurement Gravity (budgeting and decision-making risks). Finally, Andrew connects these challenges to Pure Storage, noting how platform features like deduplication and continuous innovation are actively working to lessen the effects of data gravity for customers. 
To learn more, visit https://blog.purestorage.com/purely-technical/the-economics-of-data-gravity/ Check out the new Pure Storage digital customer community to join the conversation with peers and Pure experts: https://purecommunity.purestorage.com/ 00:00 Intro and Welcome 01:05 Andrew’s Observations About the USA 04:19 Defining Data Gravity 07:30 Challenges Caused By Data Gravity 09:01 Real World Data Gravity Examples 17:15 Data Gravity Impact Vectors 33:02 New Dimensions of Data Gravity 40:30 Where Pure Helps with Data Gravity