
Factorize Factorization

Posted by Claudia Faggian
Publication date: 2020
Research field: Computer science
Paper language: English





Factorization -- a simple form of standardization -- is concerned with reduction strategies, i.e. how a result is computed. We present a new technique for proving factorization theorems for compound rewriting systems in a modular way, inspired by the Hindley-Rosen technique for confluence. Specifically, our technique is well adapted to deal with extensions of the call-by-name and call-by-value lambda-calculi. The technique is first developed abstractly. We isolate a sufficient condition, called linear swap, for lifting factorization from components to the compound system, which is compatible with beta-reduction. We then closely analyze some common factorization schemas for the lambda-calculus. Concretely, we apply our technique to diverse extensions of the lambda-calculus, among which de Liguoro and Piperno's non-deterministic lambda-calculus and -- for call-by-value -- Carraro and Guerrieri's shuffling calculus. For both calculi the literature contains factorization theorems. In both cases, we give a new proof which is neat, simpler than the original, and strikingly shorter.
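As a rough illustration of the shape of these statements, in generic rewriting notation rather than the paper's exact definitions: write the compound reduction as a union of essential and inessential steps. Factorization says that essential steps can always be anticipated, and a linear swap is a local commutation in which a single essential step suffices on the left of the result (whence linear); the paper's precise formulation may differ in detail.

\[
t \twoheadrightarrow u \;\Longrightarrow\; t \twoheadrightarrow_{\mathsf{e}} \cdot \twoheadrightarrow_{\mathsf{i}}\; u \qquad \text{(factorization, where } \to \;=\; \to_{\mathsf{e}} \cup \to_{\mathsf{i}}\text{)}
\]
\[
\to_{\mathsf{i}} \cdot \to_{\mathsf{e}} \;\subseteq\; \to_{\mathsf{e}} \cdot \twoheadrightarrow \qquad \text{(linear swap, schematically)}
\]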




Read also

Lambda-calculi come with no fixed evaluation strategy. Different strategies may then be considered, and it is important that they satisfy some abstract rewriting property, such as factorization or normalization theorems. In this paper we provide simple proof techniques for these theorems. Our starting point is a revisitation of Takahashi's technique to prove factorization for head reduction. Our technique is both simpler and more powerful, as it works in cases where Takahashi's does not. We then pair factorization with two other abstract properties, defining essential systems, and show that normalization follows. Concretely, we apply the technique to four case studies, two classic ones, head and leftmost-outermost reductions, and two less classic ones, non-deterministic weak call-by-value and least-level reductions.
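For orientation, the classical instance that such techniques generalize is head factorization for beta-reduction, a standard result: any beta-reduction sequence can be rearranged so that head steps come first, followed by non-head steps; normalization then says that if a term has a head normal form, head reduction reaches one.

\[
t \twoheadrightarrow_{\beta} u \;\Longrightarrow\; t \twoheadrightarrow_{\mathsf{h}} \cdot \twoheadrightarrow_{\neg\mathsf{h}}\; u
\]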
In each variant of the lambda-calculus, factorization and normalization are two key properties that show how results are computed. Instead of proving factorization/normalization for the call-by-name (CbN) and call-by-value (CbV) variants separately, we prove them only once, for the bang calculus (an extension of the lambda-calculus inspired by linear logic and subsuming CbN and CbV), and then we transfer the result via translations, obtaining factorization/normalization for CbN and CbV. The approach is robust: it still holds when extending the calculi with operators and extra rules to model some additional computational features.
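As background, and not specific to this paper: the bang calculus makes the !-modality of linear logic explicit in the syntax, and CbN and CbV embed into it along Girard's two translations of intuitionistic logic into linear logic, whose action on implication types is, in one standard presentation:

\[
(A \Rightarrow B)^{\mathsf{n}} \;=\; {!}A^{\mathsf{n}} \multimap B^{\mathsf{n}}
\qquad\qquad
(A \Rightarrow B)^{\mathsf{v}} \;=\; {!}\,(A^{\mathsf{v}} \multimap B^{\mathsf{v}})
\]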
We provide an algorithm that factorizes one-dimensional quantum walks into a protocol of two basic operations: a fixed conditional shift that transports particles between cells, and suitable coin operators that act locally in each cell. This allows quantum walk protocols to be tailored to any experimental setup by rephrasing them on the cell structure determined by the experimental limitations. We give the example of a walk defined on a qutrit chain compiled to run on a qubit chain.
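The two primitives can be made concrete with a small numerical sketch, assuming a qubit coin on a finite line of cells; this illustrates one coined-walk step (local coin, then fixed conditional shift), not the paper's compilation algorithm, and all names and sizes here are hypothetical.

import numpy as np

# State of the walker: an (n, 2) array of amplitudes, state[cell, coin].
def step(state, coin):
    state = state @ coin.T            # 1. local coin operator in every cell
    shifted = np.zeros_like(state)    # 2. fixed conditional shift:
    shifted[:-1, 0] = state[1:, 0]    #    coin |0> moves one cell to the left
    shifted[1:, 1] = state[:-1, 1]    #    coin |1> moves one cell to the right
    return shifted                    # amplitude leaving the chain is dropped

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard coin

n = 11
psi = np.zeros((n, 2), dtype=complex)
psi[n // 2, 0] = 1.0                  # walker starts in the middle cell
for _ in range(5):
    psi = step(psi, H)
print(np.round((np.abs(psi) ** 2).sum(axis=1), 3))   # position distribution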
Click-through rate (CTR) prediction is one of the core tasks in commercial recommender systems. It aims to predict the probability of a user clicking a particular item given user and item features. As feature interactions bring in non-linearity, they are widely adopted to improve the performance of CTR prediction models. Therefore, effectively modelling feature interactions has attracted much attention in both research and industry. The current approaches can generally be categorized into three classes: (1) naive methods, which do not model feature interactions and only use original features; (2) memorized methods, which memorize feature interactions by explicitly viewing them as new features and assigning trainable embeddings; (3) factorized methods, which learn latent vectors for original features and implicitly model feature interactions through factorization functions. Studies have shown that modelling feature interactions by any one of these methods alone is suboptimal due to the unique characteristics of different feature interactions. To address this issue, we first propose a general framework called OptInter which finds the most suitable modelling method for each feature interaction. Different state-of-the-art deep CTR models can be viewed as instances of OptInter. To realize the functionality of OptInter, we also introduce a learning algorithm that automatically searches for the optimal modelling method. We conduct extensive experiments on four large datasets. Our experiments show that OptInter improves the best-performing state-of-the-art baseline deep CTR models by up to 2.21%. Compared to the memorized method, which also outperforms baselines, we reduce up to 91% of parameters. In addition, we conduct several ablation studies to investigate the influence of different components of OptInter. Finally, we provide interpretable discussions on the results of OptInter.
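The three classes can be contrasted in a few lines of code; this is an illustrative sketch with hypothetical vocabulary sizes and names, not OptInter's implementation, shown for one pair of categorical features.

import numpy as np

rng = np.random.default_rng(0)
d, va, vb = 8, 100, 100                 # embedding size, vocabularies of A and B
emb_a = rng.normal(size=(va, d))        # embedding table for feature A
emb_b = rng.normal(size=(vb, d))        # embedding table for feature B
emb_ab = rng.normal(size=(va * vb, d))  # one embedding per (A, B) pair

a, b = 42, 7                            # feature values of one sample

# (1) naive: no interaction, only the original features are used.
naive = np.concatenate([emb_a[a], emb_b[b]])

# (2) memorized: the pair (a, b) is a new feature with its own trainable
#     embedding -- expressive, but the table grows as va * vb.
memorized = emb_ab[a * vb + b]

# (3) factorized: the interaction is modelled implicitly through a
#     factorization function, here an FM-style inner product.
factorized = emb_a[a] @ emb_b[b]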
S.M. Aybat, T.C. Rogers (2011)
We give an overview of the current status of perturbative QCD factorization theorems in processes that involve transverse momentum dependent (TMD) parton distribution functions (PDFs) and fragmentation functions (FFs). We enumerate those cases where TMD-factorization is well-established, and mention cases where it is likely to fail. We discuss recent progress in the implementation of specific TMD-factorization calculations, including the implementation of evolution. We also give examples of hard part calculations. We end by discussing future strategies for the implementation of TMD-factorization in phenomenological applications.
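Schematically, and in one common notation not taken from this paper, a TMD-factorized cross section (e.g. for semi-inclusive DIS at small transverse momentum) is a hard factor convolved with a TMD PDF and a TMD FF:

\[
\frac{d\sigma}{dx\,dz\,d^{2}\mathbf{q}_{T}} \;\sim\; H(Q;\mu) \int d^{2}\mathbf{k}_{T}\,d^{2}\mathbf{p}_{T}\; f_{q/h}(x,\mathbf{k}_{T};\mu,\zeta)\, D_{h'/q}(z,\mathbf{p}_{T};\mu,\zeta)\, \delta^{(2)}(\mathbf{k}_{T}+\mathbf{p}_{T}-\mathbf{q}_{T}) \;+\; \text{corrections},
\]

where mu and zeta are the renormalization and rapidity scales governing the evolution mentioned above.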
