Machine learning algorithms are commonly specified in linear algebra (LA). LA expressions can be rewritten into more efficient forms by taking advantage of input properties such as sparsity, as well as program properties such as common subexpressions and fusible operators. The complex interaction between these properties and their impact on execution cost poses a challenge to optimizing compilers. Existing compilers resort to intricate heuristics that complicate the codebase and add maintenance cost, yet still fail to search the large space of equivalent LA expressions for the cheapest one. We introduce a general optimization technique for LA expressions: convert the LA expressions into relational algebra (RA) expressions, optimize the latter, then convert the result back into (optimized) LA expressions. One major advantage of this method is that it is complete, meaning that any equivalent LA expression can be found using the equivalence rules in RA. The challenge is the sheer size of the search space, which we address by adopting and extending a compiler technique called equality saturation. We integrate the optimizer into SystemML and validate it empirically across a spectrum of machine learning tasks; we show that we can derive all of SystemML's existing hand-coded optimizations, and perform new optimizations that lead to speedups of 1.2X to 5X.
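To make the LA-to-RA correspondence concrete, here is a minimal sketch, not SystemML's or the optimizer's actual code: `to_relation` and `matmul_as_ra` are invented names. It encodes a sparse matrix as a relation of (row, column, value) tuples and expresses matrix multiplication as a join on the shared index followed by a sum aggregation:

```python
from collections import defaultdict

def to_relation(m):
    """Encode a dense matrix (list of lists) as a relation of (i, j, v)
    tuples, keeping only nonzero entries -- the sparsity RA exploits."""
    return {(i, j, v) for i, row in enumerate(m)
                      for j, v in enumerate(row) if v != 0}

def matmul_as_ra(A, B):
    """(A @ B)[i, k] = sum_j A[i, j] * B[j, k], i.e. a join of the two
    relations on the shared index j, then a sum grouped by (i, k)."""
    by_j = defaultdict(list)      # hash-index B's relation on join key j
    for j, k, v in B:
        by_j[j].append((k, v))
    out = defaultdict(int)
    for i, j, v in A:             # join on j ...
        for k, w in by_j[j]:
            out[(i, k)] += v * w  # ... and aggregate products per (i, k)
    return {(i, k, v) for (i, k), v in out.items() if v != 0}

A = to_relation([[1, 0], [2, 3]])
B = to_relation([[0, 4], [5, 0]])
print(sorted(matmul_as_ra(A, B)))  # [(0, 1, 4), (1, 0, 15), (1, 1, 8)]
```

Once LA operators are phrased this way, classical RA equivalences such as join reordering and aggregation pushdown serve as the rewrite rules, and the equality-saturation search explores the space they generate.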
We present a new approach to e-matching based on relational join; in particular, we apply recent database query execution techniques to guarantee worst-case optimal run time. Compared to the conventional backtracking approach that always searches the …
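As a toy illustration of the relational encoding (the schema and the pattern f(g(x), x) are made up for this sketch, and a real engine would use a worst-case optimal join algorithm rather than SQLite's planner): each function symbol becomes a relation mapping child e-class ids to the e-node's e-class id, and a pattern becomes a conjunctive query:

```python
import sqlite3

# A miniature e-graph stored relationally: one table per function symbol,
# with one row per e-node (child e-class ids plus the node's e-class id).
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE f (id INTEGER, a INTEGER, b INTEGER);  -- e-nodes f(a, b)
CREATE TABLE g (id INTEGER, x INTEGER);             -- e-nodes g(x)
INSERT INTO f VALUES (10, 20, 1), (11, 20, 2), (12, 21, 9);
INSERT INTO g VALUES (20, 1), (20, 2), (21, 3);
""")

# E-matching the pattern f(g(x), x): the shared variable x becomes an
# equality predicate between the two relations, so the whole search is
# a single conjunctive query answered by a join.
rows = db.execute("""
    SELECT f.id AS root, g.x AS x
    FROM f JOIN g ON f.a = g.id AND f.b = g.x
""").fetchall()
print(rows)  # [(10, 1), (11, 2)]
```

Shared pattern variables become join predicates, so matches are enumerated by the database's join machinery instead of top-down backtracking over the e-graph.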
The practical success of deep learning has sparked interest in improving relational table tasks, like data search, with models trained on large table corpora. Existing corpora primarily contain tables extracted from HTML pages, limiting the capability …
Financial transactions, internet search, and data analysis are all placing increasing demands on databases. SQL, NoSQL, and NewSQL databases have been developed to meet these demands and each offers unique benefits. SQL, NoSQL, and NewSQL databases a…
We consider the question: what is the abstraction that should be implemented by the computational engine of a machine learning system? Current machine learning systems typically push whole tensors through a series of compute kernels such as matrix multiplication …
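As a rough sketch of the alternative, relational abstraction such a system argues for (`to_blocks` and `block_matmul` are invented names, not an API from the paper): a tensor becomes a relation of keyed sub-tensors, and an LA operation becomes a join, a per-match dense kernel, and an aggregation:

```python
import numpy as np
from collections import defaultdict

def to_blocks(M, bs):
    """Represent matrix M as a relation of (grid-key, sub-tensor) pairs."""
    return {(i, j): M[i*bs:(i+1)*bs, j*bs:(j+1)*bs]
            for i in range(M.shape[0] // bs)
            for j in range(M.shape[1] // bs)}

def block_matmul(A_rel, B_rel):
    """Matmul as relational operations over block relations: join on the
    shared grid index j, run the dense kernel (np.matmul) on each matched
    pair, and sum the partial blocks grouped by output key (i, k)."""
    out = defaultdict(lambda: 0)
    for (i, j), a in A_rel.items():
        for (j2, k), b in B_rel.items():
            if j == j2:                            # join condition
                out[(i, k)] = out[(i, k)] + a @ b  # kernel + aggregation
    return dict(out)

rng = np.random.default_rng(0)
A, B = rng.standard_normal((4, 4)), rng.standard_normal((4, 4))
C = block_matmul(to_blocks(A, 2), to_blocks(B, 2))
# The reassembled block result agrees with the monolithic kernel.
assert np.allclose(np.block([[C[0, 0], C[0, 1]],
                             [C[1, 0], C[1, 1]]]), A @ B)
```

Here only the per-pair kernel is tensor-specific; the join and the grouped sum are ordinary relational operations, which is what lets a relational engine plan and distribute the computation.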
Variability inherently exists in databases in various contexts, which creates database variants. For example, variants of a database could have different schemas/content (database evolution problem), variants of a database could root from different so…