Businesses are hoarding more data than ever to fuel their AI ambitions, but at the same time, they are concerned about who has access to this data, which is often of a very private nature. PVML offers an interesting solution by combining ChatGPT-like data analysis tools with the security guarantees of differential privacy. Using Retrieval-Augmented Generation (RAG), PVML can access a company's data without moving it, eliminating another security consideration.

The Tel Aviv-based company recently announced it has raised $8 million in a seed round led by NFX, with participation from FJ Labs and Gefen Capital.

Image Source: PVML

The company was founded by husband-and-wife team Shachar Schnapp (CEO) and Rina Galperin (CTO). Schnapp earned a PhD in computer science, specializing in differential privacy, and then worked on computer vision at General Motors, while Galperin earned a master's degree in computer science, specializing in artificial intelligence and natural language processing, and worked on machine learning projects at Microsoft.

“A lot of our experience in this area comes from our work at large companies and big corporations, where we found that things were not as efficient as we had hoped, perhaps as naive students,” Galperin said. “The main value we want to bring to organizations with PVML is data democratization. This can only happen if you protect this very sensitive data on the one hand, but allow easy access to it on the other hand, which today is synonymous with artificial intelligence. Everyone wants to use free text to analyze data more easily, quickly and efficiently – and our secret weapon, differential privacy, makes this integration very easy.”

Differential privacy is far from a new concept. The core idea is to ensure the privacy of individual users in large data sets while providing mathematical guarantees for this. One of the most common ways to achieve this is to introduce a degree of randomness into query results, calibrated so that it does not materially alter the analysis.
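The article does not describe PVML's specific algorithms, but the textbook version of this idea is the Laplace mechanism: add noise whose scale is calibrated to how much one individual can change the result. A minimal sketch for a count query (function names are illustrative, not PVML's API):

```python
import math
import random


def laplace_noise(scale: float) -> float:
    """Sample from a Laplace(0, scale) distribution via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))


def private_count(records, predicate, epsilon: float = 1.0) -> float:
    """Count matching records with epsilon-differential privacy.

    A counting query has sensitivity 1: adding or removing one person's
    record changes the true count by at most 1, so Laplace noise with
    scale 1/epsilon suffices. Smaller epsilon = stronger privacy, more noise.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

Because the noise is added to the query answer rather than to the stored records, the underlying data stays unmodified, which is the property the article highlights.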

The team believes that today’s data access solutions are inefficient and create significant overhead. Oftentimes, for example, large amounts of data must be redacted in the process of enabling employees to access it securely, but this can be counterproductive: the redacted data may no longer be usable for certain tasks, and the additional lead time in accessing the data means real-time use cases are often impossible.

Image Source: PVML

The promise of using differential privacy means that users of PVML do not have to make changes to the original data. This avoids almost all overhead and securely unlocks this information for AI use cases.

Almost all big tech companies now use differential privacy in one form or another and make their tools and libraries available to developers. The PVML team believes that most of the data community has yet to truly put this into practice.

“Current knowledge about differential privacy is more theoretical than practical,” Schnapp said. “We decided to take it from theory to practice. That’s exactly what we do: we develop practical algorithms that best fit the data in real-world scenarios.”

Differential privacy work is meaningless if PVML’s actual data analysis tools and platforms are not useful. The most obvious use case here is the ability to chat with your data, all while guaranteeing that no sensitive data can leak into the chat. Using RAG, PVML can reduce hallucinations to almost zero, with minimal overhead since the data remains in place.

But there are other use cases. Differential privacy now also allows companies to share data between business units, Schnapp and Galperin noted. Additionally, it may allow some companies to profit from third-party access to their data.

“In today’s stock market, 70% of trading is done by artificial intelligence,” said Gigi Levy-Weiss, general partner and co-founder of NFX. “This is the wave of the future, and organizations that embrace artificial intelligence today will be a step ahead tomorrow. But companies are afraid to connect their data to AI because they fear exposure—and for good reason. PVML’s unique technology creates an invisible layer of protection and democratizes access to data, enabling monetization use cases today and paving the way for tomorrow.”

