Q&A with Quantum CEO, Jamie Lerner: The Future of Data Management and Storage
Release date: 06 March 2025
—
In our latest Q&A, we sit down with Jamie Lerner, CEO of Quantum, to discuss the evolving role of data storage, the impact of AI, and how businesses are rethinking their approach to archives, cloud strategies, and high-performance storage. We explore the challenges IT leaders face in managing the full data lifecycle and how Quantum is shaping its product portfolio to meet these demands.
What challenges are standing out the most to your customers today? Have any of them surprised you?
Jamie Lerner: The biggest thing people are struggling with or trying to implement is a full data lifecycle strategy. People are used to doing the standard IT things like backups and providing storage for databases and corporate file systems.
But AI has changed that—or even just the prospect of AI. Companies are realising that AI is real, and it produces results that are valuable. Those using advanced AI have an advantage over those that don’t.
So, there’s pressure to ask: are we ready for AI? And to be ready, you need to organise your data. Just having databases and file systems isn’t enough. You need a high-speed area to work with data: where you do analytics, run inferences, train models.
That needs to be backed up, sure. But then you need an archive of all your data—all the data your company has ever touched.
Companies used to think some data wasn’t valuable. Take proposals, for example. Every company makes quotes or proposals for customers. Maybe you have ones from 30 or 40 years ago. Most people thought those weren’t worth keeping. But if you want to analyse what makes a winning proposal versus a losing one, you need as many as possible to train the model. If you threw them away, you can’t do that.
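To make the proposal example concrete, here is a minimal sketch of that kind of win/loss model in Python, assuming a toy corpus and scikit-learn; the texts, labels, and model choice are purely illustrative, not a description of any real customer's pipeline:

```python
# Minimal sketch: training a win/loss classifier on archived proposals.
# The corpus, labels, and model choice below are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical archive: proposal text paired with a won/lost outcome.
proposals = [
    "Fixed-price bid with phased delivery and a named support team",
    "Hourly rate card, scope to be determined during the engagement",
    "Outcome-based pricing tied to agreed service-level targets",
    "Open-ended time and materials quote with no delivery milestones",
]
outcomes = [1, 0, 1, 0]  # 1 = won, 0 = lost

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(proposals, outcomes)

# Score a new draft against patterns learned from the archive.
draft = ["Fixed-price bid with agreed service-level targets"]
print(model.predict_proba(draft))  # columns: [P(lost), P(won)]
```

The point is less the model than the data: with the archive deleted, there is nothing to fit.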
So there’s a lot of pressure to build “forever archives.” And when you have a forever archive, you can’t just throw everything in. It’s like storing items in an attic: you organise it—these are Christmas decorations, these are my kids’ toys, this is sports equipment. You have boxes, labels. You don’t just throw stuff in randomly.
Archives need the same organisation. But a lot of companies have disorganised archives—no labels, no categorisation. Even if they have the data, they can’t find it or organise it in a meaningful way, which means they can’t leverage it for insights and analysis.
The leading IT organisations are now saying: “We need a full lifecycle strategy, from the data we’re working on now to data we may keep for 20, 50, or even 100 years.” That’s a big shift from just doing backups and running file servers.
Would you say the rise of AI has made stored data more of a dynamic resource?
Jamie Lerner: Absolutely. IT organisations used to be back-office groups, often seen as risk centres rather than business drivers.
If the ERP went down, you were considered a risk. If backups failed, you were considered a risk. If there was a cyberattack and you couldn’t recover, you were considered a risk. The perception was that IT could only subtract value; if everything went well, no one noticed.
Now, with AI, IT can study the company’s data and actually provide insights that guide the business: which activities are most profitable, which decisions pay off, and how to retain customers and increase satisfaction through analytics.
All of a sudden, IT becomes a revenue generator. It becomes a value creator.
And you can see it in stock prices. Banks, airlines, life sciences companies doing AI well have valuations drastically higher than those that don’t. IT has moved much closer to the front office—at least for companies that are serious about AI.
Are companies shifting their focus from working storage to archives? How is that changing storage strategies?
Jamie Lerner: I think of it like a barbell. The two extremes—extremely high speed and very large, long-lasting, inexpensive storage—are getting the most attention. The mid-range? Not so much.
For the extreme high-speed working area, traditional all-flash file systems don’t cut it anymore. Modern systems are designed from the ground up for flash, using parallel agents for speed, GPUDirect, and techniques like running the file system inside the container with the workload. There’s no separate storage and workload area—the storage is inside the workload. It’s a completely different architecture.
At the archival end, people are layering their storage in interesting ways. Some data in the archive is cold—maybe it won’t be touched for 20 or 30 years. That goes into very inexpensive, secure storage.
But other archived data is accessed often, so that might be on a flash tier. Maybe a cheap flash tier using commodity flash, designed to last 20–30 years. There are ways to treat flash very gently so it has a long life cycle.
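A minimal sketch of how such a tiering decision might look, assuming a simple two-tier policy driven by access frequency and recency; the thresholds and record fields here are illustrative, not Quantum's actual placement logic:

```python
# Minimal sketch of an archive tiering policy, assuming two tiers:
# commodity flash for warm data, cold storage for everything else.
# Thresholds and fields are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ArchiveObject:
    name: str
    last_access: datetime
    accesses_last_year: int

def place(obj: ArchiveObject, now: datetime) -> str:
    """Route frequently or recently touched data to flash, the rest to cold."""
    if obj.accesses_last_year >= 12 or now - obj.last_access < timedelta(days=90):
        return "warm-flash-tier"
    return "cold-archive-tier"

now = datetime.now()
objects = [
    ArchiveObject("genome-2024.bam", now - timedelta(days=10), 40),
    ArchiveObject("quote-1987.pdf", now - timedelta(days=4000), 0),
]
for obj in objects:
    print(obj.name, "->", place(obj, now))
```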
There’s also software layering over archives now. File systems are SQL-searchable. You can run a SELECT statement against metadata: “Find all femur X-rays for women over 45 with a family history of osteoporosis.”
That’s a huge shift. AI is enriching metadata so you’re not just looking at filenames. You’re dealing with structured information at a completely different level than the old-school Word docs and spreadsheets people stored in file systems before.
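To illustrate the idea of SQL-searchable archive metadata, here is a minimal sketch using SQLite in memory; the schema and enrichment fields are hypothetical stand-ins for the AI-enriched metadata described above:

```python
# Minimal sketch of SQL-searchable archive metadata, using SQLite in memory.
# The schema and fields (body part, patient age, family history) are
# hypothetical stand-ins for AI-enriched metadata on archived files.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE xray_metadata (
        path TEXT, body_part TEXT, sex TEXT,
        patient_age INTEGER, family_history TEXT
    )
""")
con.executemany(
    "INSERT INTO xray_metadata VALUES (?, ?, ?, ?, ?)",
    [
        ("/archive/scan-001.dcm", "femur", "F", 52, "osteoporosis"),
        ("/archive/scan-002.dcm", "skull", "M", 30, "none"),
        ("/archive/scan-003.dcm", "femur", "F", 47, "osteoporosis"),
    ],
)

# The query from the example: femur X-rays for women over 45
# with a family history of osteoporosis.
rows = con.execute("""
    SELECT path FROM xray_metadata
    WHERE body_part = 'femur' AND sex = 'F'
      AND patient_age > 45 AND family_history = 'osteoporosis'
""").fetchall()
print(rows)
```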
Some customers have over 60 billion files in their archives. Everything from genomes—which are incredibly difficult to categorise—to movie clips, sound clips, and surveillance footage from massive physical security infrastructures. If you don’t have an intelligent archive, you’ve just got a giant pile of data you can’t use.
Are customers realising the potential of multimodal data?
Jamie Lerner: Oh, absolutely. And it’s causing a lot of regret.
CIOs used to pride themselves on aggressive data cleansing strategies. If data was over 10 years old, purge it. If surveillance footage shows nothing happening, delete it.
But what if that data captured every customer who ever shopped in your store? Every product they picked up and put back? Every moment of delight, frustration, hesitation?
Companies are realising they threw away invaluable behavioural data.
The same goes for sales quotes. Companies that kept decades of old quotes can now train AI models to recognise patterns in winning deals. Those that deleted them? They lost that insight forever.
A lot of people thought they were being smart by keeping only “necessary” data. Now they’re watching competitors gain an edge by analysing data they no longer have.
Has cloud repatriation changed how customers think about storage?
Jamie Lerner: Some IT leaders still think they can just say, “Our strategy is to move to the cloud.” Five years ago, that seemed viable, and people thought they were smart for committing to a full cloud transition.
Now, many larger customers realise that this was, in some cases, naive.
There are workloads that make sense in the cloud—like spinning up a system quickly for a one-time event. IT departments aren’t as agile at setting things up fast and tearing them down, but cloud providers do that really well.
But if you’re setting up a 100-year archive, does it make sense to pay a cloud provider a monthly bill forever? Do you want your most valuable data locked into a proprietary API and billing model?
A lot of IT teams are now saying, “I need a copy of my data under my control—my security, my costs, my infrastructure.” Because that data is going to determine whether they win or lose in the long term.
Some companies that moved everything to the cloud are realising their bills are insane. A large media and entertainment company we work with shifted everything to the cloud. Once the initial discounts expired, their bill hit $1.3 billion a year. Suddenly, their internal IT costs looked cheap.
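The arithmetic behind that question is easy to sketch. A back-of-the-envelope comparison, where every price below is an illustrative placeholder rather than a real figure from any provider or from the interview:

```python
# Back-of-the-envelope comparison of long-term archive costs.
# Every figure below is an illustrative placeholder, not a real price
# from any cloud provider or from Quantum.
CLOUD_PER_TB_MONTH = 10.0        # assumed $/TB/month, cold cloud tier
ONPREM_PER_TB_UPFRONT = 50.0     # assumed $/TB hardware + install
ONPREM_PER_TB_YEAR = 4.0         # assumed $/TB/year power, space, refresh

def cumulative_cost(tb: float, years: int) -> tuple[float, float]:
    """Total spend on a tb-sized archive after a given number of years."""
    cloud = tb * CLOUD_PER_TB_MONTH * 12 * years
    onprem = tb * (ONPREM_PER_TB_UPFRONT + ONPREM_PER_TB_YEAR * years)
    return cloud, onprem

for years in (1, 5, 20, 100):
    cloud, onprem = cumulative_cost(tb=1000, years=years)
    print(f"{years:>3} yrs: cloud ${cloud:>12,.0f}  on-prem ${onprem:>12,.0f}")
```

Under these assumed prices the recurring cloud bill overtakes the upfront on-prem spend within a few years and keeps compounding; the crossover point shifts with real prices, but the shape of the curves is the point.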
Security is another reason for repatriation. That same company was producing episodic TV content, and every week, their content was stolen, ransomed, and leaked. They moved back on-prem not just for cost control, but to protect their intellectual property.
The conclusion many businesses are coming to is: not all data belongs on-prem, and not all data belongs in the cloud. It’s a workload-specific decision.
How has Quantum shaped its product roadmap in response to these changes?
Jamie Lerner: Most storage companies think about their platform in silos: file systems, block storage, backup systems. When I joined Quantum five years ago, we had solid products, but they were independent.
We reimagined our portfolio as an integrated data lifecycle. Instead of isolated products, we built a system where data moves seamlessly across different storage types:
- A high-speed working area for real-time processing
- Flash-based backup for rapid recovery
- Long-term archival storage with multi-site redundancy
Backup has been a huge shift. It used to be about cost—disk, then tape. Now, most customers want all-flash backup because they need instant restores.
For archival, we’ve focused on durability—multi-site erasure coding, compression, deduplication. We’ve also introduced a single namespace, so businesses can see their entire data lifecycle in one place.
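For readers unfamiliar with erasure coding, here is a minimal sketch of the core idea using a single XOR parity shard; real multi-site systems use Reed-Solomon codes across many shards and sites, so this shows the concept only, not Quantum's implementation:

```python
# Minimal sketch of erasure coding with one XOR parity shard (RAID-5
# style). Real multi-site erasure coding uses Reed-Solomon codes over
# many shards; this only shows the core idea: any one lost shard can
# be rebuilt from the survivors.
def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encode(data: bytes, k: int) -> list[bytes]:
    """Split data into k equal shards plus one parity shard."""
    data = data.ljust(-(-len(data) // k) * k, b"\0")  # pad to a multiple of k
    size = len(data) // k
    shards = [data[i * size:(i + 1) * size] for i in range(k)]
    parity = shards[0]
    for shard in shards[1:]:
        parity = xor_bytes(parity, shard)
    return shards + [parity]

def rebuild(shards: list) -> list:
    """Recover the single missing shard by XOR-ing the survivors."""
    missing = shards.index(None)
    survivors = [s for s in shards if s is not None]
    rebuilt = survivors[0]
    for s in survivors[1:]:
        rebuilt = xor_bytes(rebuilt, s)
    shards[missing] = rebuilt
    return shards

shards = encode(b"long-term archive block", k=3)
shards[1] = None                 # simulate losing one site's shard
print(rebuild(shards)[1])        # the lost shard is reconstructed
```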
We’re still innovating, but the foundation is now software-defined, cloud-native, and flash-native—built for modern workloads.
What excites you about the future of storage—and what keeps you up at night?
Jamie Lerner: Data storage today is too siloed. It sits in databases, file systems, cloud buckets. That doesn’t work long-term.
Storage has to be distributed. Data should exist everywhere it needs to be at the right time. That’s why we built Myriad, our high-performance all-flash file system, inside Kubernetes—so high-speed storage can be spread across enterprise, cloud, and partner systems.
It’s a long road, but in five years, I think we’ll see data move dynamically across the world—shifting automatically to where it’s needed. Companies that master that will unlock massive value.
Are there any customer stories that stand out to you?
Jamie Lerner: One of our biggest surprises was a European grocery chain.
We started out just helping them with basic backups. Nothing complex. But we completely misjudged them.
They didn’t think of themselves as a grocery store. They thought of themselves as a data company.
They haven’t thrown away a single piece of data since the 1950s. Every sale, every inventory record, every transaction—it’s all still there. And they wanted a multi-hundred-year strategy to keep it.
Then they told us: “We’re not just doing this for our stores. We’re going to back up Europe.”
They wanted to build an infrastructure that could handle hundreds of customers—including governments. They weren’t just thinking about their business; they were thinking about the future of European cloud and managed services.
Today, they operate one of the largest clouds in Europe. They’re buying high-tech companies and investing in AI and supercomputing. Their data business is now far more valuable than their grocery stores.
Had we viewed them as just a retailer, we would have completely missed the opportunity.
A lot of salespeople think, “Oh, I’m selling a backup appliance” or “I’m selling a high-speed file system.” But are you? If you just respond to a request for a certain amount of storage, you’re probably missing the bigger opportunity of what that company is actually trying to do.
I like to spend as much as 60 to 70 percent of my time with customers, because otherwise I’m missing the real opportunities. My views are rapidly becoming outdated—things are changing fast. The greatest insights are happening in the field, not in the laboratory.
The customers I love working with the most are the ones pushing toward something that almost seems crazy—because in five or ten years, those are the companies that will have completely transformed their industries.
If you’re rethinking your data strategy or preparing for AI-driven storage demands, get in touch and our experts can consult on the best configurations for your needs.