
Neural Networks are being Streamlined to make them more Adept at Computing on Encrypted Data by a Team of Analysts

Massive volumes of confidential data are stored and managed in the cloud or on networked servers. Encryption, a core cybersecurity tool, protects this digital data as it travels over the internet and through computer systems, guarding against brute-force attempts and cyber-attacks such as malware and ransomware.

Researchers from the NYU Tandon School of Engineering’s NYU Center for Cyber Security are sharing new insights into the underlying processes that govern neural networks’ capacity to generate inferences on encrypted data at the 38th International Conference on Machine Learning (ICML) in July 2021.

In the paper "DeepReDuce: ReLU Reduction for Fast Private Inference," the team focuses on linear and non-linear operators, core building blocks of neural network frameworks that, depending on the operation, exact a heavy toll in time and computing resources. When neural networks compute on encrypted data, much of that cost comes from the rectified linear activation function (ReLU), a non-linear operation.
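
For reference, ReLU itself is a very simple elementwise non-linearity; the expense arises only when it must be evaluated on encrypted values. A minimal sketch (the function name and toy values below are illustrative, not from the paper):

```python
def relu(x):
    """Rectified linear unit: passes positive values through, zeros out negatives."""
    return max(0.0, x)

# Applied elementwise to a layer's pre-activations:
pre_activations = [-2.0, -0.5, 0.0, 1.5, 3.0]
activations = [relu(v) for v in pre_activations]
print(activations)  # [0.0, 0.0, 0.0, 1.5, 3.0]
```

On plaintext this is one comparison per element; on ciphertext, that comparison is what drives up the cost.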

Ciphertext refers to encrypted data, whereas plaintext refers to data that has not been encrypted. Encryption is currently one of the most common and effective data protection technologies in use by businesses. Asymmetric encryption, often known as public-key encryption, and symmetric encryption are the two major kinds of data encryption.

(Figure: example of encrypted data)

To safeguard data, modern encryption methods have superseded the old Data Encryption Standard. Integrity, authentication, and non-repudiation are just a few of the security objectives that these algorithms protect.

Working with Siddharth Garg, Brandon Reagen, professor of computer science and engineering and of electrical and computer engineering, and a team of colleagues, including Nandan Kumar Jha, a Ph.D. student, and Zahra Ghodsi, a former doctoral student, created the DeepReDuce framework. It provides a solution by rearranging and reducing the number of ReLUs in neural networks.

This change, according to Reagen, necessitates a fundamental rethinking of where and how many components are placed in neural network systems.

“What we are trying to do is rethink how neural nets are designed in the first place,” he explained. “You can skip a lot of these time- and computationally-expensive ReLU operations and still get high performing networks at 2 to 4 times faster run time.”

Compared with the state of the art for private inference, the researchers report that DeepReDuce improved accuracy by up to 3.5 percent at an equal ReLU count, and reduced the ReLU count by up to 3.5 times at equal accuracy.

This isn’t a purely academic investigation. Neural networks are increasingly performing calculations on encrypted data as the usage of AI develops in tandem with worries about the security of personal, business, and government data.

In scenarios where neural networks make private inferences (PI) on concealed data without exposing inputs, non-linear functions carry the largest cost in time and power. Because these costs increase the complexity and runtime of PI, researchers have sought to lessen the load that ReLUs place on such computations.
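
The asymmetry between linear and non-linear operations is easiest to see with a deliberately simplified additive-masking scheme (a toy illustration, not a real homomorphic encryption system): linear operations compose cleanly on "ciphertexts," but a comparison like max(0, x), the heart of ReLU, does not, which is why it requires costlier cryptographic protocols.

```python
import random

# Toy additive masking "encryption": ciphertext = plaintext + secret mask.
# Illustrative only -- real private inference uses homomorphic encryption
# and secure multi-party computation, not this.
def encrypt(x, mask):
    return x + mask

def decrypt(c, mask):
    return c - mask

m1, m2 = random.randint(1, 100), random.randint(1, 100)
a, b = 5, -3
ca, cb = encrypt(a, m1), encrypt(b, m2)

# A linear operation on ciphertexts works: the masks compose additively.
assert decrypt(ca + cb, m1 + m2) == a + b

# A non-linear operation does not: taking max on the masked value
# gives a result that no longer decrypts to ReLU(b), which should be 0.
wrong = decrypt(max(0, cb), m2)
assert wrong != max(0, b)
```

The linear path never needs to "see" the hidden value; the ReLU path does, and bridging that gap securely is what makes non-linear operators expensive.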

Asymmetric encryption, commonly known as public-key cryptography, encrypts and decrypts data with a pair of mathematically related keys, referred to as the "public key" and the "private key." Symmetric encryption instead uses a single secret key both to encrypt the plaintext and to decrypt the ciphertext. It is worth remembering that while encryption strength is directly related to key size, the resources required to perform the computation grow with it as well.
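
The symmetric case can be illustrated with a deliberately insecure toy XOR cipher (for illustration of "one shared key" only; real systems use vetted algorithms such as AES): the same key both encrypts and decrypts.

```python
import itertools

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte with a repeating key.
    NOT secure -- shown only to illustrate the single-shared-key idea."""
    return bytes(b ^ k for b, k in zip(data, itertools.cycle(key)))

key = b"shared-secret"                       # one key, known to both parties
plaintext = b"confidential data"
ciphertext = xor_cipher(plaintext, key)      # encrypt
recovered = xor_cipher(ciphertext, key)      # decrypt with the SAME key
assert recovered == plaintext
```

In the asymmetric case, by contrast, the encryption key can be published while only the holder of the private key can decrypt.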

The team's technique builds on an earlier advance called CryptoNAS. Described in a prior paper co-authored by Ghodsi and another Ph.D. student, Akshaj Veldanda, CryptoNAS optimizes the use of ReLUs much as rearranging boulders in a stream can improve water flow: it rebalances the distribution of ReLUs in the network and eliminates superfluous ones.
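
To see why ReLU placement matters, one can count ReLU invocations per stage of a convolutional network: each ReLU fires once per element of a feature map, so early, high-resolution stages dominate the total. The layer shapes below are a hypothetical ResNet-style example, not figures from the paper.

```python
# Hypothetical feature-map shapes (channels, height, width) per network stage.
stages = {
    "stage1": (64, 32, 32),
    "stage2": (128, 16, 16),
    "stage3": (256, 8, 8),
    "stage4": (512, 4, 4),
}

# One ReLU per feature-map element in each stage.
relu_counts = {name: c * h * w for name, (c, h, w) in stages.items()}
total = sum(relu_counts.values())
for name, count in relu_counts.items():
    print(f"{name}: {count:6d} ReLUs ({100 * count / total:.0f}% of total)")
```

In this toy tally, the first stage alone accounts for more than half of all ReLUs, which is why rebalancing and pruning them where they contribute least can pay off so heavily in private inference.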

DeepReDuce improves on CryptoNAS by taking the process further. It consists of a set of optimizations for the careful removal of ReLUs after the CryptoNAS rearrangement step. When the researchers put DeepReDuce to the test, they found that removing ReLUs from conventional networks drastically cut inference latency while retaining high accuracy.

Reagen is also working with Mihalis Maniatakos, a research assistant professor of electrical and computer engineering, on a novel microprocessor that will perform computation on fully encrypted data, as part of a collaboration with data-security startup Duality.

The ReLU research was funded by the Center for Applications Driving Architectures and by the Data Protection in Virtual Environments (DPRIVE) program at the US Defense Advanced Research Projects Agency (DARPA).

As employees increasingly use external devices, portable media, and web applications in their everyday business operations, companies and organizations face the challenge of securing data and preventing data loss. And as more companies move to hybrid and multi-cloud environments, concerns about public cloud security and data protection across complex settings continue to rise.

Enterprise-wide data encryption and encryption-key management, both on-premises and in the cloud, can help safeguard data. The best data loss prevention solutions automatically alert, block, and encrypt sensitive information based on message content and context, such as the user, data class, and recipient. An organization's sensitive data must be protected while still allowing authorized users to carry out their duties. And while data encryption may seem a difficult, time-consuming operation, data loss prevention software handles it routinely.