
Estimating the Cheeger constant using machine learning

Submitted by: Kashyap Rajeevsarathy
Publication date: 2020
Research field: Informatics engineering
Paper language: English

In this paper, we use machine learning to show that the Cheeger constant of a connected regular graph has a predominant linear dependence on the largest two eigenvalues of the graph spectrum. We also show that a deep neural network trained on graphs of smaller sizes can be used as an effective estimator of the Cheeger constant of larger graphs.
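As an illustration of the kind of experiment described in the abstract, the following is a minimal sketch, not the authors' code: it generates small connected regular graphs, computes their Cheeger constants by brute force, and fits a linear model on the two largest adjacency eigenvalues. A plain linear regression stands in for the paper's deep neural network, and the library choices and parameters (networkx, scikit-learn, graph sizes, degrees, sample count) are assumptions made for illustration.

# Sketch: linear dependence of the Cheeger constant on the two largest
# eigenvalues of small connected regular graphs (brute-force ground truth).
import itertools
import networkx as nx
import numpy as np
from sklearn.linear_model import LinearRegression

def cheeger_constant(G):
    """Exact h(G) = min over cuts S with |S| <= n/2 of |boundary(S)| / |S|."""
    nodes = list(G.nodes())
    n = len(nodes)
    best = float("inf")
    for k in range(1, n // 2 + 1):
        for subset in itertools.combinations(nodes, k):
            S = set(subset)
            boundary = sum(1 for u, v in G.edges() if (u in S) != (v in S))
            best = min(best, boundary / len(S))
    return best

X, y = [], []
rng = np.random.default_rng(0)
while len(y) < 200:
    n = int(rng.choice([8, 10, 12]))        # assumed graph sizes
    d = int(rng.choice([3, 4, 5]))          # assumed degrees
    if (n * d) % 2:                         # n*d must be even for a d-regular graph
        continue
    G = nx.random_regular_graph(d, n, seed=int(rng.integers(10**6)))
    if not nx.is_connected(G):
        continue
    eigs = np.sort(np.linalg.eigvalsh(nx.to_numpy_array(G)))
    X.append([eigs[-1], eigs[-2]])          # two largest adjacency eigenvalues
    y.append(cheeger_constant(G))

model = LinearRegression().fit(np.array(X), np.array(y))
print("coefficients:", model.coef_, "intercept:", model.intercept_)
print("R^2 on the training sample:", model.score(np.array(X), np.array(y)))

A high R^2 from such a fit is what the abstract calls a predominant linear dependence; for estimating the Cheeger constant of larger graphs, the paper replaces the linear model with a deep neural network trained on small graphs.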


Read also

We review the theory of Cheeger constants for graphs and quantum graphs and their present and envisaged applications.
Estimating health benefits of reducing fossil fuel use from improved air quality provides important rationales for carbon emissions abatement. Simulating pollution concentration is a crucial step of the estimation, but traditional approaches often rely on complicated chemical transport models that require extensive expertise and computational resources. In this study, we develop a novel and succinct machine learning framework that is able to provide precise and robust annual average fine particle (PM2.5) concentration estimations directly from a high-resolution fossil energy use data set. The accessibility and applicability of this framework show the great potential of machine learning approaches for integrated assessment studies. Applications of the framework with Chinese data reveal highly heterogeneous health benefits of reducing fossil fuel use in different sectors and regions in China with a mean of $34/tCO2 and a standard deviation of $84/tCO2. Reducing rural and residential coal use offers the highest co-benefits with a mean of $360/tCO2. Our findings prompt careful policy designs to maximize cost-effectiveness in the transition towards a carbon-neutral energy system.
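The framework itself is not reproduced on this page; the sketch below only illustrates the general idea of the preceding abstract, namely a supervised regressor that maps gridded fossil-energy-use features to annual mean PM2.5. The file name, feature columns, and the random forest model are hypothetical placeholders, not the authors' actual pipeline.

# Hypothetical sketch: predict annual mean PM2.5 from gridded fossil energy use.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# One row per grid cell: energy use by fuel and sector, plus observed PM2.5.
df = pd.read_csv("energy_use_grid.csv")           # hypothetical data set
features = ["coal_industry", "coal_residential", "oil_transport", "gas_power"]
X, y = df[features], df["pm25_annual_mean"]

model = RandomForestRegressor(n_estimators=300, random_state=0)
print("cross-validated R^2:", cross_val_score(model, X, y, cv=5, scoring="r2").mean())

# A fitted model of this kind can be rerun under counterfactual energy-use
# scenarios to estimate concentration changes, which feed the health-benefit
# calculation described in the abstract.
model.fit(X, y)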
We compute the Cheeger constant of spherical shells and tubular neighbourhoods of complete curves in an arbitrary dimensional Euclidean space.
In this article we study the top of the spectrum of the normalized Laplace operator on infinite graphs. We introduce the dual Cheeger constant and show that it controls the top of the spectrum from above and below, in the same way that the Cheeger constant controls the bottom of the spectrum. Moreover, we show that the dual Cheeger constant at infinity can be used to characterize when the essential spectrum of the normalized Laplace operator shrinks to one point.
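For context, the classical Cheeger inequality bounds the bottom of the spectrum of the normalized Laplace operator on a finite graph; the article above establishes analogous two-sided control of the top of the spectrum by the dual Cheeger constant. A standard statement of the finite-graph inequality, recalled here for orientation rather than taken from the article, is:

% Cheeger constant of a finite graph G, with vol(S) the sum of degrees in S:
%   h(G) = min_{S : 0 < vol(S) <= vol(V)/2}  |E(S, V \setminus S)| / vol(S)
% First nonzero eigenvalue lambda_1 of the normalized Laplacian satisfies
\[
  \frac{h(G)^2}{2} \;\le\; \lambda_1 \;\le\; 2\,h(G).
\]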
Lattice constants such as unit cell edge lengths and plane angles are important parameters of the periodic structures of crystal materials. Predicting crystal lattice constants has wide applications in crystal structure prediction and materials property prediction. Previous work has used machine learning models such as neural networks and support vector machines combined with composition features for lattice constant prediction and has achieved a maximum performance for cubic structures with an average $R^2$ of 0.82. Other models tailored to special materials families of a fixed form, such as ABX3 perovskites, can achieve much higher performance due to the homogeneity of the structures. However, these models trained with small datasets are usually not applicable to generic lattice parameter prediction of materials with diverse compositions. Herein, we report MLatticeABC, a random forest machine learning model with a new descriptor set for lattice unit cell edge length ($a,b,c$) prediction, which achieves an $R^2$ score of 0.979 for lattice parameter $a$ of cubic crystals and significant performance improvement for other crystal systems as well. Source code and trained models can be freely accessed at https://github.com/usccolumbia/MLatticeABC
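As a rough illustration of composition-based lattice-constant regression in the spirit of MLatticeABC (this is not the released code; the descriptor set, file name, and column names below are hypothetical):

# Hypothetical sketch: random forest regression of the cubic lattice parameter a
# from simple composition-derived descriptors.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("cubic_crystals.csv")            # hypothetical data set
features = ["n_atoms", "mean_atomic_radius", "max_atomic_radius",
            "mean_electronegativity"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["lattice_a"], test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=500, random_state=0)
model.fit(X_train, y_train)
print("held-out R^2 for a:", r2_score(y_test, model.predict(X_test)))

The actual descriptor set and trained models are available at the GitHub link given in the abstract above.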
