In this paper, we use machine learning to show that the Cheeger constant of a connected regular graph has a predominantly linear dependence on the two largest eigenvalues of the graph spectrum. We also show that a deep neural network trained on graphs of smaller sizes can serve as an effective estimator of the Cheeger constant of larger graphs.
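As a rough illustration of the quantities involved, the following sketch (our own, not the paper's pipeline; a plain linear regression stands in for the deep network, and all helper names are ours) computes a brute-force Cheeger constant and the two largest adjacency eigenvalues for small random regular graphs and fits a linear model, assuming networkx, numpy, and scikit-learn.

```python
# Hypothetical sketch: relate the Cheeger constant of small random regular
# graphs to their two largest adjacency eigenvalues via linear regression.
# This is NOT the paper's model; it only illustrates the quantities involved.
import itertools
import networkx as nx
import numpy as np
from sklearn.linear_model import LinearRegression

def cheeger_constant(G):
    """Brute-force Cheeger constant: min |boundary(S)| / |S| over |S| <= n/2."""
    nodes = list(G.nodes)
    n = len(nodes)
    best = float("inf")
    for k in range(1, n // 2 + 1):
        for S in itertools.combinations(nodes, k):
            boundary = sum(1 for _ in nx.edge_boundary(G, S))
            best = min(best, boundary / k)
    return best

def top_two_eigenvalues(G):
    """Two largest eigenvalues of the adjacency matrix."""
    A = nx.to_numpy_array(G)
    eigs = np.sort(np.linalg.eigvalsh(A))
    return eigs[-1], eigs[-2]

# Build a small data set of connected 3-regular graphs on 10 vertices.
X, y = [], []
rng = np.random.default_rng(0)
while len(y) < 200:
    G = nx.random_regular_graph(3, 10, seed=int(rng.integers(1_000_000)))
    if nx.is_connected(G):
        X.append(top_two_eigenvalues(G))
        y.append(cheeger_constant(G))

model = LinearRegression().fit(np.array(X), np.array(y))
print("coefficients:", model.coef_, "intercept:", model.intercept_)
print("R^2 on training data:", model.score(np.array(X), np.array(y)))
```

For regular graphs the largest eigenvalue equals the degree, so in this toy setting the fit is effectively driven by the second eigenvalue; the brute-force Cheeger constant is feasible only for very small graphs.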
The fractional matching number of a graph $G$ is the maximum size of a fractional matching of $G$. In this paper we prove the following sharp lower bounds for a graph $G$ of order $n$ and characterize all extremal graphs. (1) The sum of the fractional matching number of $G$ and the fractional matching number of its complement is at least $n/2$ for $n \ge 2$. (2) If $G$ and its complement are both non-empty, then this sum is at least $(n+1)/2$ for $n \ge 28$. (3) If neither $G$ nor its complement has an isolated vertex, then this sum is at least $(n+4)/2$ for $n \ge 28$.
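The fractional matching number is the optimum of a small linear program, so bound (1) can be checked numerically on examples. The sketch below is illustrative only; the function names and the choice of solver (scipy's linprog) are ours, not the paper's.

```python
# Illustrative sketch (not from the paper): the fractional matching number of G
# is the value of the LP  max sum_e x_e  s.t.  sum_{e at v} x_e <= 1,  0 <= x_e <= 1.
import networkx as nx
import numpy as np
from scipy.optimize import linprog

def fractional_matching_number(G):
    edges = list(G.edges)
    if not edges:
        return 0.0
    index = {v: i for i, v in enumerate(G.nodes)}
    # Vertex-edge incidence matrix: one row per vertex, one column per edge.
    A = np.zeros((len(index), len(edges)))
    for j, (u, v) in enumerate(edges):
        A[index[u], j] = 1
        A[index[v], j] = 1
    # linprog minimizes, so negate the objective to maximize the sum of x_e.
    res = linprog(c=-np.ones(len(edges)), A_ub=A, b_ub=np.ones(len(index)),
                  bounds=(0, 1), method="highs")
    return -res.fun

# Check bound (1) on a random graph: nu_f(G) + nu_f(complement(G)) >= n/2.
G = nx.gnp_random_graph(12, 0.3, seed=1)
total = fractional_matching_number(G) + fractional_matching_number(nx.complement(G))
print(total, ">=", G.number_of_nodes() / 2, "?", total >= G.number_of_nodes() / 2 - 1e-9)
```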
The quadrilateral graph $Q(G)$ is obtained from $G$ by replacing each edge of $G$ with two parallel paths of lengths 1 and 3, whereas the pentagonal graph $W(G)$ is obtained from $G$ by replacing each edge of $G$ with two parallel paths of lengths 1 and 4. In this paper, closed-form formulas for the resistance distance and the Kirchhoff index of the quadrilateral graph and the pentagonal graph are obtained for an arbitrary graph $G$.
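The closed-form formulas themselves are not reproduced here, but the construction and the quantities they describe are easy to evaluate numerically. The following sketch (our own helper names and edge-replacement construction, assuming networkx and numpy) builds $Q(G)$ for a small graph and computes resistance distances and the Kirchhoff index from the Laplacian pseudoinverse.

```python
# Numerical sketch (illustrative, not the paper's formulas): build the
# quadrilateral graph Q(G) and compute resistance distances / Kirchhoff index.
import networkx as nx
import numpy as np

def quadrilateral_graph(G):
    """Replace each edge uv of G by two parallel paths of lengths 1 and 3."""
    Q = nx.Graph()
    Q.add_nodes_from(G.nodes)
    for u, v in G.edges:
        a, b = (u, v, "a"), (u, v, "b")             # two new internal vertices per edge
        Q.add_edge(u, v)                            # the path of length 1
        Q.add_edges_from([(u, a), (a, b), (b, v)])  # the path of length 3
    return Q

def resistance_matrix(G):
    """Resistance distances r(i, j) = L+_ii + L+_jj - 2 L+_ij."""
    L = nx.laplacian_matrix(G).toarray().astype(float)
    Lp = np.linalg.pinv(L)
    d = np.diag(Lp)
    return d[:, None] + d[None, :] - 2 * Lp

def kirchhoff_index(G):
    """Kf(G) = sum of resistance distances over unordered vertex pairs."""
    return resistance_matrix(G).sum() / 2

G = nx.cycle_graph(4)
print("Kf(G)    =", kirchhoff_index(G))
print("Kf(Q(G)) =", kirchhoff_index(quadrilateral_graph(G)))
```

An analogous helper for the pentagonal graph $W(G)$ would insert three internal vertices per edge instead of two.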
In this paper, we exploit the theory of dense graph limits to provide a new framework to study the stability of graph partitioning methods, which we call structural consistency. Both stability under perturbation and asymptotic consistency (i.e., convergence with probability $1$ as the sample size goes to infinity under a fixed probability model) follow from our notion of structural consistency. By formulating structural consistency as a continuity result on the graphon space, we obtain robust results that are completely independent of the data generating mechanism. In particular, our results apply in settings where observations are not independent, thereby significantly generalizing the common probabilistic approach where data are assumed to be i.i.d. In order to make precise the notion of structural consistency of graph partitioning, we begin by extending the theory of graph limits to include vertex-colored graphons. We then define continuous node-level statistics and prove that graph partitioning based on such statistics is consistent. Finally, we derive the structural consistency of commonly used clustering algorithms in a general model-free setting. These include clustering based on local graph statistics such as homomorphism densities, as well as the popular spectral clustering using the normalized Laplacian. We posit that proving the continuity of clustering algorithms in the graph limit topology can stand on its own as a more robust form of model-free consistency. We also believe that the mathematical framework developed in this paper goes beyond the study of clustering algorithms, and will guide the development of similar model-free frameworks to analyze other procedures in the broader mathematical sciences.
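As a concrete instance of one of the algorithms covered by this framework, the sketch below (a minimal implementation of our own, not code from the paper) runs spectral clustering with the normalized Laplacian on a graph sampled from a two-block step graphon, assuming numpy and scikit-learn.

```python
# Minimal illustration (not the paper's code): spectral clustering with the
# normalized Laplacian on a graph sampled from a two-block step graphon (SBM).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n, k = 300, 2
labels_true = rng.integers(k, size=n)
P = np.array([[0.30, 0.05],
              [0.05, 0.30]])            # two-block graphon values
probs = P[labels_true][:, labels_true]  # edge probabilities between vertices
A = (rng.random((n, n)) < probs).astype(float)
A = np.triu(A, 1)
A = A + A.T                             # symmetric adjacency, no self-loops

deg = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
L_sym = np.eye(n) - D_inv_sqrt @ A @ D_inv_sqrt   # normalized Laplacian

# Embed each vertex using the eigenvectors of the k smallest eigenvalues.
eigvals, eigvecs = np.linalg.eigh(L_sym)
embedding = eigvecs[:, :k]
labels_pred = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(embedding)

# Agreement with the planted partition, up to a label swap.
agree = max(np.mean(labels_pred == labels_true),
            np.mean(labels_pred == 1 - labels_true))
print("clustering accuracy:", agree)
```

In the language of the paper, structural consistency of this procedure would mean that the recovered partition stabilizes as the sampled graph converges to its graphon limit, regardless of the sampling mechanism.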