MemVerge Emerges from Stealth Fueled by $24.5 Million Series A to Bring Memory-speed Processing to Enterprises as AI Workloads and Machine-Generated Data Proliferate

Gaorong Capital, Jerusalem Venture Partners and Lightspeed Venture Partners Invest in Inventors of First-ever Memory-Converged Infrastructure Who Hail from EMC, VMware and XtremIO

SAN JOSE, CA — April 2, 2019 — MemVerge, the inventor of Memory-Converged Infrastructure (MCI), today emerged from stealth with $24.5 million in Series A funding. The investment from Gaorong Capital, Jerusalem Venture Partners, LDV Partners, Lightspeed Venture Partners, and Northern Light Venture Capital will be used by MemVerge to significantly expand its engineering, sales and marketing teams in Silicon Valley and to accelerate R&D on its MCI technology. In a separate release today, MemVerge announced the Beta launch of the world’s first Memory-Converged system, built to power applications in an era where machine-generated data dominates enterprise IT workloads. Built on Intel®’s new Optane™ DC persistent memory, the MemVerge system for the first time collapses the memory-storage barrier so AI, IoT and real-time analytics applications run flawlessly at memory speed, without the crashes and other challenges common today. MemVerge was founded by Charles Fan, a former VMware and EMC executive; Shuki Bruck, co-founder of XtremIO and Rainfinity and Moore Professor at Caltech; and Yue Li, a senior postdoctoral scholar at Caltech.

“As more IT organizations look to leverage next generation big data analytics in a more real-time manner to drive better business optimization and planning, it’s clear that existing storage technologies are falling short of the performance requirements,” said Eric Burgener, research vice president, Infrastructure Systems, Platforms and Technologies Group, IDC. “With burgeoning persistent memory technologies now becoming available from vendors like Intel, it was inevitable that someone would come up with a storage operating environment specifically designed to run in persistent memory, and that is what storage startup MemVerge has done. The vendor’s Distributed Memory Objects technology promises to meet the requirements of next generation analytics workloads with lower latencies, more predictable performance at scale, expanded capacity, improved efficiencies and better reliability.”

Every day 2.5 quintillion bytes of data are created, at an accelerating pace, driven by the deluge of machine-generated data from AI/ML, IoT and analytics applications. These high-performance workloads require an infrastructure that can process the constant flood of data produced by the on-demand economy. This presents a compounding challenge for enterprise IT teams, as well as for data scientists at web giants and large enterprises, who until now were forced to choose between speed and higher memory capacity when trying to run the world’s most demanding data-centric workloads. MemVerge’s MCI system solves these challenges by delivering memory and storage from a single distributed platform while integrating seamlessly with existing applications. It provides 10X the memory size and 10X the data I/O speed of state-of-the-art storage and compute solutions.

“The explosion of machine generated data has created a massive opportunity for enterprises to gather actionable real-time insights and streamline business processes. However, AI, machine learning, IoT and data science applications place extreme demands on current IT infrastructure – workloads are prone to slowing or failing. Until now, most enterprises had to choose between more speed or more capacity, which didn’t solve the problem,” said Bin Yue, founding partner at Gaorong Capital. “MemVerge has taken this need head-on. The company has a massive opportunity in being the first to market with a commercial Memory-Converged infrastructure system.”

“The transformation of the data center is long overdue,” said Charles Fan, MemVerge CEO and co-founder. “By eliminating the boundaries between memory and storage, our breakthrough architecture will power the most demanding AI and data science workloads today and in the future at memory speed, opening up new possibilities for data intensive computing for the enterprise.”

About MemVerge
MemVerge, the inventor of Memory-Converged Infrastructure (MCI), is the first to eliminate all boundaries between memory and storage to power the world’s most demanding data-centric enterprise workloads. Leveraging Intel® Optane™ DC persistent memory and architected to integrate seamlessly with existing applications, the MemVerge MCI system offers 10X the memory size and 10X the data I/O speed of current state-of-the-art computing and storage solutions. Its unique distributed memory objects (DMO) technology provides a logical convergence layer that harnesses Intel’s new memory-storage medium to let data-intensive workloads such as AI, machine learning (ML), big data analytics, IoT and data warehousing run flawlessly at memory speed with guaranteed data consistency across multiple systems. Offering large-scale memory and sub-microsecond response time, MemVerge solves a massive problem in the era of machine-generated data: how to process and derive insights from an enormous amount and variety of data in real time, handling small and large files with equal ease. Enterprises using MemVerge no longer contend with failed or painfully slow jobs caused by performance bottlenecks, system crashes or worn-out flash drives; they can now train AI models faster, analyze bigger data sets, complete more queries in less time and run complex workloads more predictably with fewer resources. Based in San Jose and backed by Gaorong Capital, Jerusalem Venture Partners, LDV Partners, Lightspeed Venture Partners and Northern Light Venture Capital, MemVerge is used for AI and data science workloads by leading innovators globally, including LinkedIn, Tencent Cloud and JD.com.
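For readers unfamiliar with the persistent-memory programming model that systems like this build on, the sketch below shows the general mechanism on Linux: memory-mapping a file that lives on a DAX-enabled filesystem so ordinary loads and stores reach the byte-addressable medium directly, with no block-storage I/O path in between. It is illustrative only; the mount path is an assumption, and it does not depict MemVerge’s DMO software or any MemVerge API.

/*
 * Minimal sketch, assuming a DAX filesystem is mounted at /mnt/pmem
 * (hypothetical path). Shows plain POSIX memory-mapping of a
 * persistent-memory-backed file; not MemVerge's DMO interface.
 */
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void)
{
    const size_t len = 4096;

    /* Create or open a file backed by persistent memory. */
    int fd = open("/mnt/pmem/example.dat", O_CREAT | O_RDWR, 0644);
    if (fd < 0) { perror("open"); return EXIT_FAILURE; }
    if (ftruncate(fd, (off_t)len) != 0) { perror("ftruncate"); return EXIT_FAILURE; }

    /* Map the file so CPU loads and stores touch the medium directly. */
    char *pmem = mmap(NULL, len, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (pmem == MAP_FAILED) { perror("mmap"); return EXIT_FAILURE; }

    /* An ordinary in-memory write; no separate storage write path. */
    strcpy(pmem, "data written at memory speed, persisted in place");

    /* Ask the kernel to make the mapped writes durable. */
    if (msync(pmem, len, MS_SYNC) != 0) { perror("msync"); return EXIT_FAILURE; }

    munmap(pmem, len);
    close(fd);
    return EXIT_SUCCESS;
}

The point of the example is the design idea the release describes: once data lives in a byte-addressable, persistent medium, the application no longer crosses a separate storage stack to make its working set durable.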

Media Contact:
Steve Sturgeon
MemVerge
Steve.sturgeon@memverge.com
858.472.5669