
06.02.2020
Martin Pagel

Thu, 13.02.2020, 16:30 - 17:30, TU Berlin, EN building, seminar room EN 719 (7th floor), Einsteinufer 17, 10587 Berlin: "Exploiting Incremental Evaluation for Efficient Distributed Matrix Computation" (Chen Xu, East China Normal University)

Abstract:
Distributed matrix computation is common in large-scale data processing and machine learning applications. Iterative-convergent algorithms involving matrix computation share a common property: their parameters converge non-uniformly. This property can be exploited to eliminate computational redundancy. Unfortunately, existing systems for distributed matrix computation, such as SystemML, do not fully do so. In this presentation, I will talk about IMAC, an incremental matrix computation prototype that supports both full matrix evaluation and incremental evaluation to leverage non-uniform convergence. IMAC builds on and improves SystemML, and our experiments show that it outperforms SystemML by an order of magnitude.
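To give a feel for the idea behind incremental evaluation, here is a toy NumPy sketch (not IMAC's actual implementation, and the function `sqrt_incremental` is hypothetical): an element-wise iterative computation in which entries converge at very different rates, so updating only the still-active entries does far less work than re-evaluating the full matrix every iteration.

```python
import numpy as np

def sqrt_incremental(a, tol=1e-10, max_iter=60):
    """Element-wise Newton iteration for sqrt(a), updating only
    entries that have not yet converged (a toy stand-in for
    exploiting non-uniform convergence)."""
    x = np.ones_like(a, dtype=float)
    active = np.ones(a.shape, dtype=bool)   # entries still changing
    sparse_cost = 0                         # work done by incremental updates
    full_cost = 0                           # work a full re-evaluation would do
    for _ in range(max_iter):
        if not active.any():
            break
        x_new = 0.5 * (x[active] + a[active] / x[active])  # update active only
        delta = np.abs(x_new - x[active])
        x[active] = x_new
        sparse_cost += int(active.sum())
        full_cost += a.size
        # keep only the entries whose values are still moving
        idx = np.flatnonzero(active)
        active[:] = False
        active.flat[idx[delta > tol]] = True
    return x, sparse_cost, full_cost

# Entries near 1 converge almost immediately; the 1e6 entry needs many steps.
a = np.array([[1.0, 4.0], [9.0, 1e6]])
x, sparse_cost, full_cost = sqrt_incremental(a)
```

Here `sparse_cost` ends up well below `full_cost`, which is the redundancy a system like IMAC aims to avoid at distributed-matrix scale.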

Bio:
Chen Xu is currently an associate professor at the School of Data Science and Engineering, East China Normal University (ECNU), Shanghai. From 2014 to 2018, he conducted postdoctoral research in the Database Systems and Information Management (DIMA) Group at Technische Universität Berlin. He received his PhD from ECNU in 2014. During his studies, he was a research intern in the Data & Knowledge Engineering (DKE) Group at The University of Queensland in 2011. His research interest is large-scale data management.