AI Revolutionizes Enterprise Storage, Driving Performance, Governance, and Efficiency

Marwan Kenawy, CEO and Co-Founder of Dsquares, highlights how AI is redefining enterprise storage—from powering real-time loyalty programs for millions of customers to enabling intelligent tiering, generative content management, and energy-efficient operations—forcing organizations to treat storage as a strategic asset rather than an afterthought.
How is AI changing enterprise storage needs and priorities?
AI has completely changed how businesses think about data infrastructure. Storage is no longer a cost to be kept to a minimum; it is a strategic asset that directly shapes competitive advantage. Data quality at the storage layer is equally critical: our anomaly detection and advanced analytics only deliver value when data is clean and consistently available. Beyond performance, we help clients monetise their data assets through AI models.
What challenges arise when scaling storage for high-volume, fast-moving AI data?
The industry underestimates how difficult it is to scale AI-driven loyalty workloads. Customer behaviour data moves quickly, and AI models must respond to it in near real time. Any latency at the storage layer translates directly into missed interaction opportunities.
Variety is the second challenge. Loyalty data is highly unstructured: receipt photos, transaction metadata, geolocation signals, gamification events, and a growing volume of F&B operational data. Storage must be optimised for each type of data. Governance at scale is the third challenge, and it is often overlooked. Expanding across the Middle East, Africa, and Europe means navigating different data residency and privacy laws in each region. Regional compliance must be built into the storage architecture from the start rather than bolted on later.
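Building residency into the architecture from the start can be as simple as a placement rule in the write path. The sketch below illustrates the idea; the country codes, bucket names, and rule table are illustrative assumptions, not Dsquares' actual configuration.

```python
# Sketch: residency-aware placement baked into the write path.
# RESIDENCY_RULES maps a customer's country to a compliant storage location.
# All names here are illustrative, not a real deployment.
RESIDENCY_RULES = {
    "EG": "af-cairo-bucket",      # keep Egyptian data in-country
    "SA": "me-riyadh-bucket",     # Saudi residency requirement
    "DE": "eu-frankfurt-bucket",  # EU / GDPR residency
}
DEFAULT_BUCKET = "global-bucket"

def placement_for(record: dict) -> str:
    """Pick a storage bucket from the customer's residency country code."""
    return RESIDENCY_RULES.get(record.get("country", ""), DEFAULT_BUCKET)
```

Because the rule table is data rather than code, adding a new region becomes a configuration change instead of an architectural one.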
How are high-capacity solutions evolving to support AI workloads and unstructured data?
The evolution has been significant and is moving in the right direction for companies like ours. The key shift we have observed is the emergence of storage solutions that are intelligence-aware, designed to coexist with vector databases and real-time streaming architectures rather than simply holding data passively.
We run dozens of client loyalty programmes on shared infrastructure while maintaining strict data separation, both for security and regulatory compliance. Modern storage solutions increasingly accommodate this requirement natively, which reduces our architectural complexity significantly.
The intelligence layer is moving closer to where data lives, and that architectural shift is enabling faster, smarter loyalty experiences for end consumers. This is particularly evident in our Customer Value Management capabilities, where we can now execute AI-driven campaigns with response times measured in milliseconds rather than minutes.
How do hybrid storage architectures balance cost, performance, and accessibility for AI?
At our scale, hybrid architecture is a commercial requirement rather than merely a technical choice. Not every piece of data carries the same time sensitivity. Active customer profiles, live point balances, and real-time campaign rules require high-performance, low-latency environments, while historical transaction data and compliance records sit in economical warm or cold storage layers.
AI-driven intelligent tiering is crucial. Making placement decisions from observed usage patterns rather than fixed policies significantly improves both performance and cost efficiency. This approach has produced tangible results: our data architecture enables genuinely tailored engagement at every touchpoint.
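The core of usage-driven tiering is deciding an object's tier from when it was last accessed rather than from a fixed age policy. A minimal sketch, with illustrative thresholds:

```python
import time

# Sketch of usage-driven tiering: the tier follows observed access recency.
# The windows below are illustrative assumptions, not recommended values.
HOT_WINDOW = 24 * 3600        # accessed within the last day  -> hot
WARM_WINDOW = 30 * 24 * 3600  # accessed within the last month -> warm

def choose_tier(last_access_ts, now=None):
    """Return the storage tier for an object given its last access time."""
    age = (now if now is not None else time.time()) - last_access_ts
    if age < HOT_WINDOW:
        return "hot"    # low-latency flash
    if age < WARM_WINDOW:
        return "warm"   # standard object storage
    return "cold"       # energy-efficient archive
```

In a real system the AI layer would also weigh access frequency and predicted future demand, but the recency rule above captures the basic mechanism.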
Accessibility is just as important. A hybrid design with clean API abstraction means clients, analysts, and partners never have to think about where data physically lives.
What impact do AI-native or intelligent storage systems have on managing generative content?
Generative AI is introducing a new class of data that most enterprises aren't prepared for. Our campaign management and customisation layers incorporate generative capabilities, and content volume grows rapidly when AI creates dynamic reward offers and campaign variations at scale. Financial services and telecom regulations require that all outputs be kept, versioned, auditable, and retrievable.
AI-native storage systems apply autonomous lifecycle policies without human intervention: caching active content, archiving served content, and deleting stale content. Without this kind of infrastructure, the operational burden of managing AI-generated content would outweigh its commercial benefits.
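The cache-archive-delete lifecycle, with versioning and an audit trail, can be sketched as follows. The retention periods and field names are illustrative assumptions, not regulatory guidance.

```python
import time
import uuid

# Sketch: autonomous lifecycle for AI-generated campaign content.
# Every item is identified, state-tracked, and logged for auditability.
# Retention periods (in days) are illustrative assumptions.
RETENTION = {"cached": 7, "archived": 365 * 7}

class GeneratedContentStore:
    def __init__(self):
        self.items = {}   # content id -> record
        self.audit = []   # append-only audit log of lifecycle events

    def put(self, body, campaign):
        cid = str(uuid.uuid4())
        self.items[cid] = {"body": body, "campaign": campaign,
                           "state": "cached", "created": time.time()}
        self.audit.append(("put", cid, campaign))
        return cid

    def run_lifecycle(self, now=None):
        """Move items cached -> archived -> deleted, recording each step."""
        now = time.time() if now is None else now
        for cid, rec in list(self.items.items()):
            age_days = (now - rec["created"]) / 86400
            if rec["state"] == "cached" and age_days > RETENTION["cached"]:
                rec["state"] = "archived"
                self.audit.append(("archive", cid, rec["campaign"]))
            elif rec["state"] == "archived" and age_days > RETENTION["archived"]:
                del self.items[cid]
                self.audit.append(("delete", cid, rec["campaign"]))
```

The append-only audit log is what makes every served variant retrievable and traceable long after the content itself has been archived or deleted.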
How can efficient storage strategies help reduce energy use and operational costs from AI growth?
Storage plays a major role in the real energy footprint of AI growth. The most effective approach we have adopted is data minimisation at the architecture level: storing what matters, at the correct fidelity, for the right amount of time. Where appropriate, summary models replace complete transaction histories, and aggregated behavioural signals replace raw event logs.
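Replacing raw event logs with aggregated behavioural signals means collapsing many rows into one compact summary per customer. A minimal sketch, with hypothetical field names:

```python
from collections import Counter

# Sketch: data minimisation by aggregation. Instead of retaining every raw
# event, keep one behavioural summary per customer. Field names are
# illustrative assumptions, not a production schema.
def summarise(events):
    """Collapse raw events into per-customer action counts and last-seen time."""
    summary = {}
    for ev in events:
        s = summary.setdefault(ev["customer"],
                               {"actions": Counter(), "last_seen": 0})
        s["actions"][ev["action"]] += 1
        s["last_seen"] = max(s["last_seen"], ev["ts"])
    return summary

events = [
    {"customer": "c1", "action": "redeem", "ts": 100},
    {"customer": "c1", "action": "view", "ts": 120},
    {"customer": "c2", "action": "view", "ts": 90},
]
summary = summarise(events)
```

The summary preserves the signals the AI models actually consume while the raw log, and the energy cost of keeping it hot, can be dropped or archived.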
These benefits are further enhanced by intelligent tiering. Instead of maintaining everything on power-intensive systems, moving cold data to energy-efficient archiving infrastructure results in quantifiable cost savings and strengthens our dedication to sustainable, ethical operations at scale.
What steps should IT leaders take to prepare storage for expanding AI-generated data?
The first step is to stop treating storage as a trailing decision. Too many organisations design their AI stack (models, pipelines, interfaces) and then address storage as an afterthought. That approach breaks down rapidly at scale. Storage architecture must be a first-class consideration alongside compute and model selection from day one.
Second, invest in metadata infrastructure. AI-generated data is only as useful as your ability to find, filter, and contextualise it. Finally, put governance and auditability at the centre of your strategy. Tracing exactly what data was used, when, and how is not just a compliance requirement. It is a trust requirement.
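Metadata infrastructure and auditability meet in lineage tagging: every AI-generated artifact carries which model produced it, which data it was derived from, and a content hash for integrity. A minimal sketch, with hypothetical field names:

```python
import hashlib
import json
import time

# Sketch: lineage metadata attached to an AI-generated artifact so it can be
# found, filtered, and audited later. The field names are illustrative
# assumptions, not an established metadata standard.
def with_lineage(payload, model, source_ids):
    """Wrap a generated payload with model, source, and integrity metadata."""
    body = json.dumps(payload, sort_keys=True)
    return {
        "payload": payload,
        "meta": {
            "model": model,                 # which model produced it
            "sources": sorted(source_ids),  # which data it was derived from
            "sha256": hashlib.sha256(body.encode()).hexdigest(),
            "created": time.time(),
        },
    }

rec = with_lineage({"offer": "2x points this weekend"},
                   model="campaign-gen-v3",
                   source_ids=["txn-batch-9", "profile-c1"])
```

With this envelope in place, "what data was used, when, and how" becomes a metadata query rather than a forensic exercise.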
Which emerging storage trend will most improve handling the AI data surge?
One of the most important trends shaping how organisations manage the surge of AI data is the rise of lakehouse architectures built on scalable object storage. Modern AI systems generate and consume enormous volumes of behavioural, transactional, and interaction data, and traditional storage systems struggle to keep up with both the scale and the speed required.
Lakehouse platforms allow organisations to store massive datasets in open formats while running analytics, machine learning, and real-time processing directly on top of the same data layer, dramatically reducing the need for complex data movement.
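The defining trait of the lakehouse is that ingest and analytics share one open-format data layer, so queries run where the data already lives. The toy sketch below uses partitioned JSON Lines files standing in for an open columnar format like Parquet; paths and schemas are illustrative.

```python
import json
import tempfile
from pathlib import Path

# Minimal sketch of the lakehouse idea: the ingest path and the analytics
# path read and write the same open-format files, with no copy into a
# separate warehouse. JSON Lines stands in for a columnar format here.
root = Path(tempfile.mkdtemp()) / "events"

def write_partition(day, rows):
    """Ingest: append a date-partitioned file of rows in an open format."""
    part = root / f"day={day}"
    part.mkdir(parents=True, exist_ok=True)
    with open(part / "part-0.jsonl", "w") as f:
        for r in rows:
            f.write(json.dumps(r) + "\n")

def scan(predicate):
    """Analytics: filter directly over the same files the ingest path wrote."""
    for path in root.glob("day=*/*.jsonl"):
        with open(path) as f:
            for line in f:
                row = json.loads(line)
                if predicate(row):
                    yield row

write_partition("2024-05-01", [{"customer": "c1", "points": 40}])
write_partition("2024-05-02", [{"customer": "c2", "points": 90}])
high_value = list(scan(lambda r: r["points"] > 50))
```

Real lakehouse platforms add transactions, schema evolution, and indexing on top, but the "one storage layer, many engines" shape is the same.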
Closely related is the rise of vector storage. As generative AI embeds deeper into loyalty personalisation, matching customers to rewards through preference embeddings rather than rigid rules, storing and querying high-dimensional vector data efficiently becomes foundational. The platforms investing in this capability now will lead the next wave of AI personalisation.
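Matching customers to rewards through preference embeddings reduces, at its core, to a nearest-neighbour search over high-dimensional vectors. The sketch below uses toy 3-dimensional vectors and brute-force cosine similarity; in production the embeddings would come from a model and the search from a vector index.

```python
import math

# Sketch: reward matching via cosine similarity over preference embeddings,
# instead of rigid rules. The vectors and reward names are toy examples.
def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical reward embeddings (dimensions: coffee / entertainment / travel).
rewards = {
    "free_coffee": [0.9, 0.1, 0.0],
    "cinema_ticket": [0.1, 0.8, 0.3],
    "fuel_discount": [0.0, 0.2, 0.9],
}

def best_reward(customer_vec):
    """Pick the reward whose embedding is closest to the customer's."""
    return max(rewards, key=lambda r: cosine(customer_vec, rewards[r]))
```

A dedicated vector store replaces the brute-force `max` with an approximate nearest-neighbour index, which is what makes this viable across millions of customers and thousands of reward variants.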



