Pliops raises $100 million to accelerate analytics in data centers


Analyzing the data generated within an organization – for example, sales and purchase data – can yield insights that improve operations. But some organizations struggle to efficiently process, store and use large volumes of data. According to an IDC survey commissioned by Seagate, organizations collect only 56% of the data available across all lines of business, and of that 56%, they use only 57%.
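Multiplying the two survey figures out shows how little of the available data actually gets used. A quick back-of-the-envelope calculation (illustrative only, using the percentages the survey reports):

```python
collected = 0.56   # share of available data that organizations collect (IDC/Seagate survey)
used = 0.57        # share of collected data that is actually put to use

# Effective utilization of all available data
effective = collected * used
print(f"{effective:.0%} of available data is actually used")
```

In other words, roughly a third of the data available to these organizations is ever put to work.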

Part of the problem is that data-intensive workloads are resource-intensive, and adding the necessary compute and storage infrastructure is often expensive. For companies moving to the cloud specifically, IDG reports that they plan to spend $78 million on infrastructure this year, and 36 percent cited controlling costs as their top challenge.

That’s why Uri Beitler launched Pliops, a startup developing what he calls a “data processor” for enterprise and cloud data centers. The Pliops processor is engineered to boost the performance of databases and other applications running on flash memory, saving money in the long run, he said.

“It’s becoming clear that today’s data needs are incompatible with yesterday’s data center architecture. Massive data growth has collided with legacy compute and storage inefficiencies, creating computing slowdowns, storage bottlenecks and diminished networking efficiency,” Beitler told TechCrunch in an email interview. “While CPU performance is increasing, it’s not keeping pace, especially where accelerated performance is critical. Adding more infrastructure often proves cost-prohibitive and difficult to manage. As a result, organizations are looking for solutions that free up CPUs from computationally intensive storage tasks.”

Pliops isn’t the first to market with a processor for data analytics. Nvidia sells the BlueField-3 data processing unit (DPU). Marvell has its Octeon technology. Oracle’s SPARC M7 chip includes a coprocessor that accelerates data analytics. And among startups, Blueshift Memory and Speedata are building hardware they claim can perform analytics tasks faster than standard processors.

Image Credits: Pliops

But Pliops says it’s further along than most, with engaged customers and pilots (though it declined to name them) including fintechs, “mid-sized” communications service providers, data center operators and government labs. That early traction appears to have won over investors: the startup raised $100 million in a Series D round that closed today.

Koch Disruptive Technologies led the round, with participation from SK hynix and Walden International’s Lip-Bu Tan, bringing Pliops’ total capital raised to date to over $200 million. Beitler said the funding will go toward building out the company’s hardware and software roadmap, strengthening Pliops’ relationships with partners and expanding its global footprint.

“Many of our customers saw significant growth during the Covid-19 pandemic, thanks in part to their ability to respond quickly to a new operating environment and uncertainty. Pliops certainly did. Some customers were affected by supply chain issues, but we were not,” Beitler said. “We see no slowdown in the growth of data, or in the need to use it. Pliops was strong before this latest funding round and is even stronger now.”

Speeding up data processing

Beitler, formerly director of advanced memory solutions at Samsung’s Israel Research Center, co-founded Pliops in 2017 with Moshe Twitto and Aryeh Mergi. Twitto was a research scientist at Samsung developing signal processing technologies for flash memory, while Mergi launched several startups before Pliops, including two that were acquired by EMC and SanDisk.

The Pliops processor offers drive failure protection for solid-state drives (SSDs) as well as in-line compression, a technique that reduces data size by identifying similar data sequences and then storing only the first of them. Beitler says the company’s technology shrinks drive space requirements and expands capacity by mapping “variable-sized” compressed objects into storage, reducing wasted space.
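The space savings from this kind of compression are easy to demonstrate in software. The sketch below is purely illustrative (it uses zlib, a general-purpose software compressor, not Pliops’ hardware implementation), but it shows the same principle: repeated sequences are replaced with references to their first occurrence, so repetitive data shrinks dramatically.

```python
import zlib

# Highly repetitive data, as often found in telemetry or log workloads.
payload = b"sensor_reading=42;" * 1000   # 18,000 bytes

# In-line compression replaces repeated sequences with back-references
# to the first occurrence, so this compresses to a tiny fraction.
compressed = zlib.compress(payload, level=6)

print(len(payload))      # 18000
print(len(compressed))   # far smaller
```

Hardware compression engines apply the same idea transparently on the write path, so applications see the full logical capacity without paying the CPU cost of compressing in software.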

At the core of the Pliops processor is a hardware-accelerated key-value storage engine. In key-value databases (databases where data is stored in a “key-value” format and optimized for fast reads and writes), key-value engines directly manage all persistent data. Beitler says CPUs are typically overutilized when running these engines, with the result that applications can’t take full advantage of the SSD’s capabilities.

“Organizations are looking for solutions that free up CPUs from computationally intensive storage tasks. Our hardware leverages a new generation of hardware-accelerated data processing and storage management technology to help create modern data center architectures, delivering orders-of-magnitude improvements in performance, reliability and scalability,” Beitler said. “In short, Pliops can get more out of existing infrastructure investments.”

The Pliops processor launched last July. The development team’s current focus is on speeding up data ingestion for machine learning use cases, Beitler says, a workload that has grown among Pliops’ current and potential customers.

Image Credits: Pliops

The road ahead

To be sure, Pliops has its work cut out for it. Nvidia has spent years developing its BlueField lineup and is a formidable contender in the data processing accelerator space. And AMD acquired DPU supplier Pensando for $1.9 billion, signaling its own ambitions in the market.

A step that could pay off for Pliops is joining the Open Programmable Infrastructure (OPI) Project under the Linux Foundation, which aims to create standards around data acceleration hardware. While Pliops isn’t yet a member (current members include Intel, Nvidia, Marvell, F5, Red Hat, Dell and Keysight Technologies), becoming one could conceivably expose its technology to a larger customer base.

Beitler demurred when asked about OPI, but noted that the data acceleration market is nascent and growing.

“We continue to see both infrastructure and application teams become overwhelmed by inefficient storage and overloaded applications,” Beitler said. “The overall impression is that our processor is a game-changing product, and that without it, companies would have to invest years of software and hardware engineering to solve the same problem.”
