With us today is Maryanne Baines, an authority in cloud technology with deep experience evaluating the various cloud providers, their tech stacks, and product applications across industries. We’re here to discuss a major development in the cloud storage space: Wasabi Technologies’ recent $70 million funding round, which positions it as a formidable challenger to the industry’s giants. This conversation will explore the strategic deployment of this new capital for AI infrastructure, the company’s unique value proposition against hyperscalers, and its technological innovations designed for the AI era. We will also examine the significance of key partnerships and the impressive growth trajectory that has defined its journey.
With your recent $70 million funding round, what are the specific first steps for expanding your AI infrastructure and broadening your global footprint? Can you detail the key priorities and expected milestones you aim to achieve within the next year?
This funding is a massive seal of approval and acts as a powerful accelerant. The first priority is to immediately ramp up the AI infrastructure expansion. This isn’t just about buying more servers; it’s about strategically deploying specialized hardware, like the technology behind our Wasabi Fire offering, into our 16 global regions to meet the intense demand from AI developers. We’re not just adding capacity; we’re enhancing our capabilities to power these next-generation workloads. Within the next year, the key milestone will be to significantly increase our managed data, moving well beyond the current three exabytes, and to establish a physical presence in new, high-demand international markets that are currently underserved by predictable, low-cost cloud storage.
You position your company as a “more predictable alternative” to the hyperscalers. Beyond eliminating egress fees, what specific complexities do enterprises face with larger providers? Please share an example of how your Hot Cloud Storage model directly addresses those challenges.
Enterprises are often caught in a web of complexity with hyperscalers that goes far beyond just egress fees. They face tiered pricing models that are incredibly difficult to forecast, API call charges, and a constant threat of vendor lock-in that makes migrating data both costly and technically challenging. It creates a sense of being trapped. Our Hot Cloud Storage model cuts through all of that noise. By offering a single, high-performance tier for frequently accessed data with zero egress fees, we make budgeting simple and predictable. For example, a media company that constantly needs to access and distribute large video files would see unpredictable, spiraling costs with a hyperscaler. With us, they pay for the storage they use, and that’s it. It’s a fundamental shift that gives control back to the customer.
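The budgeting contrast described above can be sketched in a few lines of arithmetic. The rates and volumes below are made-up placeholders chosen only to illustrate the shape of the two models, not actual prices from Wasabi or any hyperscaler:

```python
# Hypothetical illustration: flat per-TB pricing vs. tiered pricing with
# egress fees. All rates are invented placeholders, not real vendor prices.

def flat_cost(stored_tb, rate_per_tb=7.0):
    """Flat model: pay only for what is stored, regardless of downloads."""
    return stored_tb * rate_per_tb

def tiered_cost(stored_tb, egress_tb, storage_rate=21.0, egress_rate=90.0):
    """Tiered model: storage charge plus a per-TB fee on every download."""
    return stored_tb * storage_rate + egress_tb * egress_rate

# A media library of 100 TB that distributes 50 TB of video per month:
stored, egress = 100, 50
print(f"flat:   ${flat_cost(stored):,.2f}/mo")            # unchanged as egress grows
print(f"tiered: ${tiered_cost(stored, egress):,.2f}/mo")  # scales with downloads
```

The point of the sketch is that the flat bill depends on one variable the customer controls (stored volume), while the tiered bill also depends on access patterns, which for a media distributor are inherently hard to forecast.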
The rise of generative AI has created significant storage bottlenecks. How does your Wasabi Fire NVMe offering specifically address the intense demands of AI training workloads, and in what key ways does it differ from your standard storage solutions for customers?
The demands of AI training are a different beast entirely. It’s all about speed and low latency; models are constantly reading and re-reading massive datasets, and any delay can dramatically increase training time and cost. That’s where Wasabi Fire comes in. It utilizes NVMe storage, which is a significant leap from our standard, highly performant Hot Cloud Storage. Think of it as the difference between a sports car and a rocket ship. While our standard offering is perfect for high-frequency access in most data-intensive applications, Wasabi Fire is purpose-built to eliminate the I/O bottlenecks that plague AI training. It provides the raw, unthrottled performance needed to feed hungry GPUs, ensuring these complex workloads run efficiently and without interruption.
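A rough back-of-envelope model makes the bottleneck concrete. The throughput figures below are illustrative assumptions, not published specifications for Wasabi Fire or any other product; the sketch only shows why read bandwidth directly bounds how fast each training epoch can stream its dataset:

```python
# Hypothetical back-of-envelope: how storage read throughput bounds the time
# spent streaming a training dataset once per epoch. Numbers are assumptions
# for illustration, not measured or published figures.

def epoch_read_time_s(dataset_gb, throughput_gb_s):
    """Seconds spent just reading the full dataset at a given throughput."""
    return dataset_gb / throughput_gb_s

dataset_gb = 10_000   # a 10 TB training set (assumed)
tiers = {
    "standard object storage": 1.0,   # GB/s over the network (assumed)
    "NVMe-backed tier":        10.0,  # GB/s (assumed)
}

for name, bandwidth in tiers.items():
    hours = epoch_read_time_s(dataset_gb, bandwidth) / 3600
    print(f"{name}: {hours:.1f} h per epoch on I/O alone")
```

Under these assumed numbers, a tenfold throughput gain cuts per-epoch I/O wait tenfold, which is the mechanism behind the "feeding hungry GPUs" point: any time the GPUs spend waiting on reads is paid compute sitting idle.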
Flash storage provider Pure Storage is now an investor. How does this partnership reflect a shared vision for next-generation AI infrastructure? Could you elaborate on the practical, day-to-day benefits this collaboration brings to your enterprise customers?
Having Pure Storage invest in us is incredibly validating and speaks volumes about our shared vision. They are a leader in high-performance flash storage, and they recognize that the future of AI infrastructure requires a new approach—one that is both powerful and simple. As their VP for strategy, Krishna Gidwani, noted, we are both focused on building AI-ready environments without complexity or unpredictable costs. For our enterprise customers, this collaboration provides immense confidence. They see two best-in-class innovators working together, which translates into tighter product integrations, a more seamless experience between on-premises and cloud environments, and the assurance that the underlying infrastructure is built for the future of AI.
Your company now manages over three exabytes of data. What key factors have driven this rapid growth since your 2017 launch? Please describe the strategy that allowed you to scale so effectively while challenging established industry giants.
Reaching over three exabytes of data since our 2017 launch has been a whirlwind, and it really comes down to a simple, disruptive strategy: radical simplicity and predictability in a complex market. We identified the biggest pain points customers had with the industry giants—confusing pricing and punitive egress fees—and we eliminated them. Our growth wasn’t just about offering a lower price; it was about offering a better, more transparent model. This approach, combined with our singular focus on delivering high-performance “Hot Cloud Storage,” created a groundswell of support. We didn’t try to be everything to everyone. Instead, we did one thing exceptionally well, and that clarity resonated deeply with a market that was tired of being nickel-and-dimed.
What is your forecast for the cloud storage industry as data-intensive workloads from generative AI and autonomous systems become the new standard?
The future of cloud storage will be defined by a clear split between utility and specialty. The hyperscalers will continue to be the massive, all-encompassing utilities, but the explosive growth of AI and autonomous systems is creating a critical need for specialized, high-performance, and cost-predictable storage solutions. We’re going to see a permanent shift away from complex, multi-tiered storage models. Customers will demand simplicity and performance without hidden fees, as their data becomes more active and essential than ever. The old “cold” storage paradigm is fading; in an AI-driven world, all data is potentially “hot,” and the storage providers who can deliver that data quickly, reliably, and affordably will be the ones who thrive.
