In this report, Eric Burgener, Research Vice President at IDC, provides an overview of key Big Memory market dynamics. He proposes that as real-time analytics workloads become more prevalent, the gap in the memory/storage hierarchy highlights a significant market opportunity that is addressed by Big Memory Computing.
Penguin Computing is a leader in providing servers for scientific and business high-performance computing. For the fast-emerging class of real-time analytics and machine learning applications with massive data sets, running completely in memory is critical to meeting their performance requirements.
In this episode of the “GreyBeards on Storage” podcast, Ray Lucchesi and Matt Leib talk to MemVerge CEO Charles Fan about Optane persistent memory and Memory Machine software, including its suite of data services.
For the first time, a major industry analyst defines the new category called Big Memory. During our Opening the Door for Big Memory Webinar, Eric Burgener, Research Vice President at IDC, describes the drivers for the Big Memory market, a forecast for byte-addressable Persistent Memory revenue, and a concise definition of the Big Memory category.
Join MemVerge at Global STAC Live. STAC Summits bring together CTOs and other industry leaders responsible for solution architecture, infrastructure engineering, application development, machine learning/deep learning engineering, data engineering, and operational intelligence to discuss important technical challenges in trading and investment.
Join MemVerge at Intel Partner Connect. Virtual Intel® Partner Connect will provide access to Intel’s executive leadership and subject matter experts to learn about product priorities, information about key technology trends, and the programs and tools that can help grow your business.
A webinar hosted by MemVerge, Intel, NetApp, IDC, and Penguin Computing that introduces a new wave of computing: Big Memory Computing. Big Memory hardware and software together transform scarcity and volatility into abundance, persistence, and high availability.
On NFL draft day, MemVerge and Intel hosted a Tech Talk about Big Memory in Financial Services. The webinar provided an overview of Optane persistent memory, the category called Big Memory, Memory Machine software, and use cases in financial services.
Today’s storage systems cannot keep up with the capacity and performance requirements of real-time data collection and analytics. A more memory-centric architecture will be needed. What if every application could run in memory? What if memory could be abundant, persistent, and highly available?
In the history of computing, memory and storage have been two different concepts. Memory is used to place an application’s running state, accessed by load/store operations, while storage is used to reliably persist data that can survive power cycles. The introduction of persistent memory opens up the possibility of a new infrastructure that unifies memory and storage.
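The distinction above — memory accessed by load/store operations versus storage that survives power cycles — can be sketched in a few lines. The example below is a hypothetical illustration using an ordinary memory-mapped file as a stand-in for persistent memory (real persistent memory, such as Intel Optane PMem, is typically exposed through a DAX-mounted filesystem but is accessed with the same byte-addressable load/store pattern); the file path is an assumption for the demo.

```python
import mmap
import os
import tempfile

# Hypothetical stand-in for persistent memory: a memory-mapped file.
# On real PMem, the mapping would target a file on a DAX filesystem.
path = os.path.join(tempfile.gettempdir(), "pmem_demo.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)  # back the mapping with 4 KiB of storage

with open(path, "r+b") as f:
    buf = mmap.mmap(f.fileno(), 4096)
    buf[0:5] = b"state"      # a "store": byte-addressable write, no I/O call
    buf.flush()              # persist the running state to the medium
    buf.close()

# After a simulated power cycle, the data survives: read it back as storage.
with open(path, "rb") as f:
    recovered = f.read(5)
os.remove(path)
print(recovered)  # the application state persisted across the "restart"
```

The point of the sketch is that a single region serves both roles: while mapped, it behaves like memory (direct reads and writes); after a flush, its contents behave like storage (they survive the process ending).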