
Prof. Noga Alon and Prof. Yossi Matias of Tel Aviv University have won the ACM's prestigious Kanellakis Award

The Tel Aviv University researchers won the prize for laying the foundations of streaming algorithms in the 1990s * The theoretical framework the researchers proposed determines what can and cannot be computed over massive data

The Association for Computing Machinery (ACM) will present the 2019 ACM Paris Kanellakis Theory and Practice Award to Prof. Noga Alon of Tel Aviv University and Princeton University and to Prof. Yossi Matias of Tel Aviv University and Google.

The prestigious ACM Kanellakis Award honors theoretical accomplishments that have led to significant, proven applications in computing. Prof. Alon and Prof. Matias won the award together with their collaborators, Prof. Phillip Gibbons of Carnegie Mellon University and Prof. Mario Szegedy of Rutgers University, for their groundbreaking work laying the foundations of streaming algorithms and developing applications for big data analysis.

The Tel Aviv University researchers won the ACM Kanellakis Award, along with Gibbons and Szegedy, for a series of studies, including a seminal article published in 1996 in the Journal of Computer and System Sciences (JCSS), entitled "The Space Complexity of Approximating the Frequency Moments", in which they proposed a method for handling large quantities of streaming data. For this article they also received the prestigious Gödel Prize in 2005.

The award committee explained in its citation: "They [Alon, Gibbons, Matias and Szegedy] pioneered the framework for the algorithmic processing of massive data streams, and to this day their sketching methods and algorithms remain the main approach in streaming algorithms and the basis of an entire subfield of algorithms... In addition, the concepts of sketches and data summaries that they proposed are routinely used in data-analysis tasks across fields such as databases, network monitoring, usage analytics of Internet products, natural language processing and machine learning."

Prof. Alon explains: "Today, in the age of big data, we have enormous amounts of information in many fields, from machine learning, through speech analysis, to network communication. We can read the data as it arrives, but we cannot keep all of it in memory. The question, then, is which algorithms can be programmed to perform computations on the input without storing it, using data synopses, or information summaries."

Prof. Matias adds: "The challenge of computing on big data may sound obvious today, but in the 1990s we had to explain what the problem was. After all, computing statistical functions can be an exercise in a programming course; the significant computational problem arises only when there is a gap between the amount of data and the memory available for the computation. To this end, we proposed a computational model that uses data summaries, or 'sketches', and we developed theoretical and practical algorithms, some of which were implemented at the time in one of the world's largest business-intelligence systems. A few years after the article was published, the Internet began to explode with data, and a growing gap opened between the amount of information and what could be stored in memory. Out of this gap grew a new field of algorithms for streaming data, making smart use of data summaries."

In the foundational article, and in the many follow-up studies they have published over the years, Prof. Alon and Prof. Matias demonstrated how certain statistical features of the data can be used to run computations on big data without storing the data in the system.

Prof. Alon: "Suppose the data streaming to us is a collection of numbers, and I want to check whether some number has arrived twice. A naive algorithm would take each new number as it arrives and compare it to every number that came before. But for that you must store all the numbers that have arrived, an impossible task within the memory limitations. On the other hand, it is possible to maintain efficient statistics, which require little memory, that estimate approximately how many different numbers appeared in the input, or how many pairs of the arriving numbers are equal to each other. At the same time, we also showed statistical properties of big data that cannot be computed with limited memory."
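To make the idea concrete, below is a minimal Python sketch in the spirit of the Alon-Matias-Szegedy estimator for the second frequency moment F2 = Σ f_i², the quantity behind "how many pairs of equal numbers arrived" (the number of equal pairs is (F2 − stream length) / 2). It is an illustrative simplification, not the authors' exact construction: the class name and parameters are invented here, and Python's built-in hashing stands in for the 4-wise independent hash families used in the paper's analysis.

```python
import random
from collections import Counter


class AMSF2Sketch:
    """Maintains k counters; each counter adds a pseudo-random +/-1 sign per
    stream item. The square of each counter is an unbiased estimate of
    F2 = sum_i f_i^2, and averaging over counters reduces the variance."""

    def __init__(self, num_counters=64, seed=0):
        rng = random.Random(seed)
        # One random salt per counter stands in for an independent hash function.
        self.salts = [rng.getrandbits(64) for _ in range(num_counters)]
        self.counters = [0] * num_counters

    def _sign(self, item, salt):
        # Deterministic pseudo-random +/-1 sign for the pair (item, salt).
        return 1 if hash((item, salt)) & 1 else -1

    def update(self, item):
        # Processing one element touches O(k) memory, independent of stream length.
        for j, salt in enumerate(self.salts):
            self.counters[j] += self._sign(item, salt)

    def estimate_f2(self):
        # Mean of counter^2 values; the paper boosts accuracy and confidence
        # further with a median of several such means.
        return sum(c * c for c in self.counters) / len(self.counters)


if __name__ == "__main__":
    rng = random.Random(1)
    stream = [rng.randint(0, 99) for _ in range(10_000)]

    sketch = AMSF2Sketch(num_counters=256)
    for x in stream:
        sketch.update(x)

    # Exact F2 for comparison -- feasible here only because the toy stream is small.
    exact_f2 = sum(f * f for f in Counter(stream).values())
    print("exact F2:", exact_f2, "  sketch estimate:", round(sketch.estimate_f2()))
```

Each counter uses constant memory, so the whole sketch stays small no matter how long the stream grows; the accuracy of the estimate is tuned by the number of counters.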

Prof. Matias: "The challenge is to store summaries of the data in such a way that we can still run computations for a variety of problems and prove that the computations are correct. We proposed the theoretical-mathematical framework for smart summarization of data, and showed what can and cannot be computed this way. A very rich field has emerged, in which many researchers have done beautiful and important work, both theoretical and practical. Engineers use our framework to program algorithms for streaming data."
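As another example of a data synopsis of the kind described here, the following is a small k-minimum-values sketch for estimating the number of distinct elements in a stream. It is not the construction from the 1996 paper (which gave its own distinct-count algorithm); the class name, the parameter k and the SHA-1-based hashing are assumptions made for illustration.

```python
import hashlib
import heapq
import random


class KMVDistinctSketch:
    """Keeps only the k smallest hash values seen so far. If the k-th smallest
    hash is h, the number of distinct elements is estimated as (k - 1) / h.
    Memory is O(k), independent of the length of the stream."""

    def __init__(self, k=512):
        self.k = k
        self.heap = []       # max-heap via negated values: the k smallest hashes
        self.stored = set()  # hash values currently kept, to skip duplicates

    def _unit_hash(self, item):
        # Map the item to a pseudo-uniform value in (0, 1].
        digest = hashlib.sha1(str(item).encode()).hexdigest()
        return (int(digest[:15], 16) + 1) / 16 ** 15

    def update(self, item):
        h = self._unit_hash(item)
        if h in self.stored:
            return
        if len(self.heap) < self.k:
            heapq.heappush(self.heap, -h)
            self.stored.add(h)
        elif h < -self.heap[0]:
            # New hash is smaller than the current k-th smallest: swap it in.
            evicted = -heapq.heappushpop(self.heap, -h)
            self.stored.discard(evicted)
            self.stored.add(h)

    def estimate_distinct(self):
        if len(self.heap) < self.k:
            return len(self.heap)        # fewer than k distinct items seen: exact
        kth_smallest = -self.heap[0]
        return round((self.k - 1) / kth_smallest)


if __name__ == "__main__":
    rng = random.Random(7)
    stream = [rng.randint(0, 9_999) for _ in range(200_000)]

    sketch = KMVDistinctSketch(k=512)
    for x in stream:
        sketch.update(x)

    print("exact distinct:", len(set(stream)), "  estimate:", sketch.estimate_distinct())
```

Because the sketch keeps only the k smallest hash values, its memory is fixed in advance, and two such sketches built over different streams can be merged by keeping the k smallest values of their union, which is part of what makes summaries of this kind practical for distributed data.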

The president of Tel Aviv University, Prof. Ariel Porat: "I congratulate Prof. Alon and Prof. Matias on winning this prestigious award. Dealing with massive data in many and varied fields is more common today than ever, and it has both theoretical and practical importance. We are proud of Prof. Alon's and Prof. Matias's contribution to the development of algorithms for analyzing massive data, and the ACM award is the best possible recognition of their groundbreaking achievements."
