Team streamlines neural networks to be more adept at computing on encrypted data – ScienceDaily / Illinoisnewstoday.com

This week, at the 38th International Conference on Machine Learning (ICML 2021), researchers at the NYU Center for Cybersecurity at the NYU Tandon School of Engineering unveiled new insights into the functions that drive the ability of neural networks to make inferences on encrypted data.

In the paper "DeepReDuce: ReLU Reduction for Fast Private Inference," the team focuses on linear and nonlinear operators, key features of neural network frameworks that, depending on the operation, impose a heavy cost in time and computational resources. When neural networks compute on encrypted data, many of these costs are incurred by the rectified linear activation function (ReLU), a nonlinear operation.
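To see why ReLU in particular is the bottleneck, here is a minimal sketch of the function itself. This is illustrative only and not drawn from the paper: linear layers map cleanly onto homomorphic operations over encrypted values, while each nonlinear ReLU typically requires a far more expensive protocol step.

```python
def relu(x):
    """Rectified linear activation: max(0, x)."""
    return x if x > 0 else 0.0

# ReLU is nonlinear: relu(a + b) != relu(a) + relu(b) in general.
# That nonlinearity is what makes it costly under encryption, since
# encrypted computation handles additions and multiplications by
# constants cheaply but not branching on a hidden value.
print(relu(-2.0))                               # 0.0
print(relu(3.0))                                # 3.0
print(relu(-1.0 + 2.0), relu(-1.0) + relu(2.0)) # 1.0 vs 2.0
```

Reducing the number of ReLU evaluations therefore directly reduces the dominant cost of private inference.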

A team of collaborators, including Brandon Reagen, assistant professor of computer science and engineering and of electrical and computer engineering; PhD student Nandan Kumar Jha; and Zahra Ghodsi, a former PhD student advised by Siddharth Garg, developed the framework, called DeepReDuce. It offers a solution by rearranging and reducing the ReLUs in neural networks.

Reagen explained that this shift requires a radical reassessment of where, and how many, of these components are distributed in neural network systems.

"What we're trying to do is rethink how neural networks are designed in the first place," he explained. "You can skip a lot of these time-consuming and computationally expensive ReLU operations and still get high-performing networks that run 2 to 4 times faster."

The team found that, compared to the state of the art for private inference, DeepReDuce improved accuracy by up to 3.5% and reduced ReLU counts by up to 3.5 times.

The inquiry is not merely academic. As the use of AI grows in tandem with concerns about the security of personal, corporate, and government data, neural networks are increasingly being asked to compute on encrypted data. In such scenarios, in which a neural network performs private inference (PI) on hidden data without disclosing its inputs, it is the nonlinear functions that exact the highest cost in time and power. Because these costs increase the difficulty and time it takes machine learning systems to perform PI, researchers have struggled to lighten the load that ReLUs place on such calculations.

The team's work builds on an innovative technology called CryptoNAS. Described in an earlier paper whose authors include Ghodsi and a third PhD student, Akshaj Veldanda, CryptoNAS optimizes the use of ReLUs much as one might rearrange rocks in a stream to optimize the flow of water: it rebalances the distribution of ReLUs in the network and removes redundant ReLUs.

DeepReDuce extends CryptoNAS by streamlining the process further. It comprises a set of optimizations for the judicious removal of ReLUs after CryptoNAS has reorganized the network. The researchers tested DeepReDuce by using it to remove ReLUs from classic networks and found that it could significantly reduce inference latency while maintaining high accuracy.
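The idea of judiciously removing ReLUs can be sketched as a budgeted selection problem. The toy below is not the authors' algorithm, and the stage names, ReLU counts, and accuracy-cost figures are invented for illustration: given how many ReLUs each network stage contains and an assumed accuracy penalty for stripping that stage, it greedily removes the stages that cost the least accuracy per ReLU eliminated, subject to an accuracy budget.

```python
# Hypothetical per-stage ReLU counts and accuracy penalties (illustrative only).
relu_counts = {"stage1": 200_000, "stage2": 100_000, "stage3": 50_000}
acc_drop = {"stage1": 0.4, "stage2": 0.9, "stage3": 2.0}  # assumed % accuracy lost

def plan_relu_removal(counts, drops, max_acc_drop):
    """Choose stages to strip of ReLUs, cheapest accuracy cost per ReLU
    first, while staying within a total accuracy-loss budget."""
    removed, total_drop, relus_saved = [], 0.0, 0
    for stage in sorted(counts, key=lambda s: drops[s] / counts[s]):
        if total_drop + drops[stage] <= max_acc_drop:
            removed.append(stage)
            total_drop += drops[stage]
            relus_saved += counts[stage]
    return removed, relus_saved, total_drop
```

With a 1.5% accuracy budget, this sketch would strip stage1 and stage2, eliminating 300,000 ReLUs for a 1.3% assumed accuracy cost, conveying the flavor of trading a small accuracy loss for a large reduction in ReLU count.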

Reagen is also part of a collaboration with Mihalis Maniatakos, research assistant professor of electrical and computer engineering, and the data security company Duality to design new microchips for handling computation on fully encrypted data.

The research on ReLUs was supported by the Center for Applications Driving Architectures (ADA) and the DARPA Data Protection in Virtual Environments (DPRIVE) program.

Story source:

Materials provided by the NYU Tandon School of Engineering. Note: Content may be edited for style and length.
