
An Algorithm for Continuously Edge Coloring a Set of Graphs

Arabic title (translated): An Algorithm for Continuous Edge Coloring of a Set of Graphs

Publication date: 2016
Research language: Arabic
Created by Shamra Editor





The Graph k-Colorability Problem (GCP) is a well-known NP-hard problem. It consists in finding the minimum number k of colors needed to paint the vertices of a graph in such a way that any two adjacent vertices, i.e. vertices connected by an edge, always have different colors. The edge version of the problem is analogous: how can we color the edges of a graph in such a way that any two edges sharing a vertex always have different colors? In this paper we introduce a new, effective algorithm for coloring the edges of a graph. The proposed algorithm enables us to achieve a Continuously Edge Coloring (CEC) for a set of known graphs.
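To make the edge-coloring constraint concrete, the following is a minimal greedy sketch, not the paper's CEC algorithm: each edge simply receives the smallest color not already used by an edge sharing one of its endpoints, which needs at most 2Δ − 1 colors for maximum degree Δ. The example graph is an assumption chosen purely for illustration.

```cpp
// Greedy edge coloring sketch: assign each edge the smallest color index
// not already used by an edge incident to either of its endpoints.
#include <cstdio>
#include <set>
#include <utility>
#include <vector>

int main() {
    // Hypothetical input: vertices 0..3 and a list of edges (u, v).
    std::vector<std::pair<int, int>> edges = {{0, 1}, {1, 2}, {2, 3}, {3, 0}, {0, 2}};
    int n = 4;

    std::vector<std::set<int>> colorsAt(n);   // colors already used at each vertex
    std::vector<int> edgeColor(edges.size());

    for (std::size_t i = 0; i < edges.size(); ++i) {
        int u = edges[i].first, v = edges[i].second;
        int c = 0;
        // Smallest color unused at both endpoints.
        while (colorsAt[u].count(c) || colorsAt[v].count(c)) ++c;
        edgeColor[i] = c;
        colorsAt[u].insert(c);
        colorsAt[v].insert(c);
    }

    for (std::size_t i = 0; i < edges.size(); ++i)
        std::printf("edge (%d,%d) -> color %d\n",
                    edges[i].first, edges[i].second, edgeColor[i]);
    return 0;
}
```

Running the sketch prints one color index per edge; any two edges that meet at a vertex are guaranteed different indices, which is exactly the constraint the edge-coloring problem imposes.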




Related research

This paper introduces a new algorithm that addresses several problems from which data clustering algorithms such as K-Means suffer. The new algorithm is able to cluster data on its own, without relying on other clustering algorithms.
The graph-theoretical thickness (shortly, thickness) of a graph G, denoted Φ(G), is the minimum number of planar subgraphs into which the graph can be decomposed; a graph that can be drawn in the plane without any of its edges intersecting is called a planar graph. Determining the thickness of a given graph is known to be an NP-complete problem. In this paper we introduce a heuristic algorithm for determining the thickness. Our algorithm is based on a simulated annealing optimization scheme that builds on the results of the New-thick heuristic (1). We show that simulated annealing is an efficient method for obtaining a good approximation of the thickness when the number of vertices is at most 30; for larger graphs it becomes slower. Finally, we apply simulated annealing on top of the New-thick heuristic and show that the combination produces a good approximate and optimized solution for the thickness. The algorithm is implemented in C++ and was run on a laptop with 2 GB of RAM and a 2.27 GHz CPU.
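The snippet above relies on simulated annealing as the optimization scheme. The following is a generic annealing skeleton, not the New-thick-based heuristic itself: the state, the local move, and the cost function are placeholders that a thickness heuristic would replace with an edge-to-layer assignment and a planarity-violation count.

```cpp
// Generic simulated annealing skeleton: accept improving moves always,
// accept worsening moves with probability exp(-delta / T), and cool T.
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

// Placeholder cost: number of adjacent equal entries in a 0/1 assignment.
static double cost(const std::vector<int>& state) {
    double c = 0;
    for (std::size_t i = 1; i < state.size(); ++i)
        if (state[i] == state[i - 1]) c += 1.0;
    return c;
}

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> unif(0.0, 1.0);

    std::vector<int> state(30, 0);                       // initial assignment
    std::uniform_int_distribution<int> pos(0, (int)state.size() - 1);

    double T = 1.0;                                      // initial temperature
    const double alpha = 0.995;                          // geometric cooling rate
    double best = cost(state);

    for (int step = 0; step < 20000; ++step) {
        std::vector<int> cand = state;
        cand[pos(rng)] ^= 1;                             // random local move
        double delta = cost(cand) - cost(state);
        if (delta <= 0 || unif(rng) < std::exp(-delta / T)) state = cand;
        double c = cost(state);
        if (c < best) best = c;
        T *= alpha;                                      // cool down
    }
    std::printf("best cost found: %.1f\n", best);
    return 0;
}
```

The cooling rate and move set are the usual tuning knobs; the slowdown reported above for more than roughly 30 vertices is typical when the cost evaluation per move grows with graph size.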
We propose a rolling version of the Latent Dirichlet Allocation, called RollingLDA. By a sequential approach, it enables the construction of LDA-based time series of topics that are consistent with previous states of LDA models. After an initial modeling, updates can be computed efficiently, allowing for real-time monitoring and detection of events or structural breaks. For this purpose, we propose suitable similarity measures for topics and provide simulation evidence of superiority over other commonly used approaches. The adequacy of the resulting method is illustrated by an application to an example corpus. In particular, we compute the similarity of sequentially obtained topic and word distributions over consecutive time periods. For a representative example corpus consisting of The New York Times articles from 1980 to 2020, we analyze the effect of several tuning parameter choices and we run the RollingLDA method on the full dataset of approximately 4 million articles to demonstrate its feasibility.
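One plausible way to compare topic-word distributions across consecutive time slices, as described above, is cosine similarity; the measures actually proposed for RollingLDA may differ, and the two distributions below are hypothetical.

```cpp
// Cosine similarity between two topic-word probability vectors.
#include <cmath>
#include <cstdio>
#include <vector>

static double cosineSimilarity(const std::vector<double>& a,
                               const std::vector<double>& b) {
    double dot = 0, na = 0, nb = 0;
    for (std::size_t i = 0; i < a.size(); ++i) {
        dot += a[i] * b[i];
        na  += a[i] * a[i];
        nb  += b[i] * b[i];
    }
    return dot / (std::sqrt(na) * std::sqrt(nb));
}

int main() {
    // Hypothetical word distributions of the same topic at times t and t+1.
    std::vector<double> topicT  = {0.40, 0.30, 0.20, 0.10};
    std::vector<double> topicT1 = {0.35, 0.35, 0.20, 0.10};
    std::printf("similarity = %.4f\n", cosineSimilarity(topicT, topicT1));
    return 0;
}
```

Values close to 1 indicate that the topic has remained stable between periods, while a sharp drop would flag a potential event or structural break.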
Network restoration after a failure is achieved through different mechanisms and algorithms, at different levels of the network and within a particular time. Recovery methods for GMPLS-based optical networks do not take the concentration of connections into consideration when establishing the backup path, which can lead to greater harm: a failure may occur on a link or path on which a large number of connections is concentrated. Such concentration may also have a negative impact on network survivability.
The all-nodes shortest paths problem is undoubtedly one of the most basic problems in algorithmic graph theory. In this paper, we introduce a simple and efficient algorithm for the all-nodes shortest paths problem on directed (and undirected) graphs. In this problem, we find the shortest path from a given source node to all other nodes in the graph, where the shortest path is the path with minimum cost, i.e., the sum of the edge weights. We prove that the complexity of the proposed algorithm depends only on the number of edges of the graph, and we show that its running time is linear, O(m). A comparison between the complexity of the proposed algorithm and the well-known shortest path algorithms has been made, and the obtained results show that the complexity of the proposed algorithm is the best.
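For context on the O(m) claim above, a standard baseline, not the proposed algorithm, is Dijkstra's single-source shortest paths with a binary heap, which runs in O(m log n). The example graph is assumed for illustration.

```cpp
// Dijkstra's algorithm with a binary heap (priority queue).
#include <cstdio>
#include <functional>
#include <limits>
#include <queue>
#include <utility>
#include <vector>

int main() {
    // Hypothetical weighted directed graph: adj[u] holds (v, w) for edge u -> v.
    int n = 5;
    std::vector<std::vector<std::pair<int, int>>> adj(n);
    adj[0] = {{1, 4}, {2, 1}};
    adj[1] = {{3, 1}};
    adj[2] = {{1, 2}, {3, 5}};
    adj[3] = {{4, 3}};

    const long long INF = std::numeric_limits<long long>::max();
    std::vector<long long> dist(n, INF);
    dist[0] = 0;  // source node

    using State = std::pair<long long, int>;  // (distance, vertex)
    std::priority_queue<State, std::vector<State>, std::greater<State>> pq;
    pq.push({0, 0});

    while (!pq.empty()) {
        auto [d, u] = pq.top();
        pq.pop();
        if (d > dist[u]) continue;            // stale queue entry
        for (auto [v, w] : adj[u])
            if (dist[u] + w < dist[v]) {
                dist[v] = dist[u] + w;
                pq.push({dist[v], v});
            }
    }

    for (int v = 0; v < n; ++v)
        std::printf("dist(0 -> %d) = %lld\n", v, dist[v]);
    return 0;
}
```

Any algorithm meeting the claimed O(m) bound would remove the logarithmic factor contributed by the heap operations in this baseline.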
