
A comparative study of compression algorithms and their impact on data communication in networks

Original Arabic title: دراسة مقارنة لخوارزميات الضغط و أثرها على تراسل المعطيات في الشبكات

Publication date: 2018
Research language: Arabic





Due to the large increase in the use of data communication and information-exchange services of different types across different environments, a standard markup language was needed that is well suited to scalability and development and can serve these growing needs in the best form and in the shortest possible time; XML became the most widely used such language. However, the adoption of this tag-based architecture sometimes creates a problem that affects the performance of data transmission networks, because of the large volume of data exchanged as well as the need for large storage capacity at both the sending and receiving ends. Effective ways of reducing the amount of data exchanged over the network therefore had to be found. Much scientific research and many practical experiments have addressed reducing the actual size of data by adopting different parameters that affect the file-compression process, so as to achieve better results by reducing the sizes of the exchanged files while also paying attention to compression and decompression times. In this research, we focus on studying and comparing several file-compression algorithms and their effect on data communication in networks.
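As a minimal illustration of the general idea (not the specific algorithms compared in this research), the following Python sketch compresses a small hypothetical XML document with the general-purpose gzip codec; the repetitive tag structure typical of XML is exactly what makes such documents compress well:

```python
import gzip

# A small hypothetical XML document; the repetitive tag names compress well.
xml = ("<catalog>" +
       "".join(f"<book id='{i}'><title>Title {i}</title>"
               f"<author>Author {i}</author></book>" for i in range(200)) +
       "</catalog>").encode("utf-8")

compressed = gzip.compress(xml, compresslevel=9)
ratio = len(compressed) / len(xml)
print(f"original={len(xml)} bytes, gzip={len(compressed)} bytes, ratio={ratio:.2f}")
```

In practice, as the abstract notes, the choice of compressor trades compression ratio against compression and decompression time, which is what such comparative studies measure.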

References used
S. Ansari; P. Sharma, "XML Optimization and Compression", International Journal of Innovations & Advancement in Computer Science, March 2015.
S. Sakr, "Investigate state-of-the-art XML compression techniques", IBM Corporation, 19 July 2011.
W. Ng; W.-Y. Lam; J. Cheng, "Comparative Analysis of XML Compression Technologies", March 2006.

Related research

Wireless sensor networks (WSNs) are often deployed by random scattering (for example, from an airplane), so most nodes cannot know their coordinates beforehand. How to obtain the position information of unknown nodes, known as the localization problem, has therefore become a hot topic in WSN research: without position information, a WSN cannot work properly. The Global Positioning System (GPS) is the most widespread and mature positioning system at present, but GPS receivers suffer from high power consumption, large size, and high cost, and require fixed base infrastructure, so GPS is unsuitable for low-cost, self-configuring sensor networks, and it is impractical to install GPS on every sensor node. In this paper, we study localization mechanisms for WSNs that are not based on GPS, and we test the effectiveness of using the MUSIC algorithm to determine the signal's angle of arrival, relying on SDMA technology and an ESPAR antenna.
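The angle-of-arrival step can be sketched with a generic textbook MUSIC implementation; this assumes a hypothetical half-wavelength uniform linear array and a single simulated source, not the ESPAR/SDMA setup tested in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 8, 200          # array sensors, snapshots
theta_true = 20.0      # simulated source direction (degrees)

def steering(theta_deg, M):
    # Steering vector of a half-wavelength uniform linear array
    phase = -1j * np.pi * np.arange(M) * np.sin(np.deg2rad(theta_deg))
    return np.exp(phase)

# Simulate received snapshots: one source plus white noise
a = steering(theta_true, M)
s = rng.standard_normal(N) + 1j * rng.standard_normal(N)
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
X = np.outer(a, s) + noise

# Sample covariance and its eigendecomposition (eigh: ascending eigenvalues)
R = X @ X.conj().T / N
eigvals, eigvecs = np.linalg.eigh(R)
En = eigvecs[:, :-1]   # noise subspace (K = 1 source)

# MUSIC pseudo-spectrum: peaks where the steering vector is
# orthogonal to the noise subspace
grid = np.arange(-90.0, 90.0, 0.1)
spectrum = np.array([1.0 / np.linalg.norm(En.conj().T @ steering(t, M))**2
                     for t in grid])
theta_hat = grid[np.argmax(spectrum)]
print(f"estimated angle of arrival: {theta_hat:.1f} degrees")
```

The resolution of the peak search and the number of snapshots are the main tuning knobs in such a simulation.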
The ability of data mining to provide predictive information derived from huge databases has made it an effective tool in the hands of companies and individuals, allowing them to focus on the areas that matter to them within the massive data generated by their daily lives. Along with the growing importance of this science, there has been a rapid increase in the tools produced to implement its theoretical concepts, so it can be hard to decide which of these tools is best for a given task. This study compares the two most commonly used data mining tools according to opinion polls, namely RapidMiner and the R programming language, to help researchers and developers choose the tool best suited to them. The comparison is based on seven criteria: platform, algorithms, input/output formats, visualization, users' evaluations, infrastructure and development potential, and performance, the last measured by applying a set of classification algorithms to a number of data sets using two data-splitting techniques, cross-validation and hold-out, to verify the results. The results show that R supports the largest number of algorithms, input/output formats, and visualizations, while RapidMiner is superior in ease of use and supports a greater number of platforms. In terms of performance, the accuracy of the classification models built with R packages was higher, although this did not hold in some cases imposed by the nature of the data, because no pre-processing stage was added. Ultimately, the preference for either tool depends on the user's experience and the purpose for which the tool is used.
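The two data-splitting techniques mentioned, hold-out and cross-validation, can be illustrated with a toy pure-Python example; the 1-nearest-neighbour classifier and the synthetic dataset are assumptions for illustration, not the algorithms or data used in the study:

```python
import random

random.seed(0)

# Tiny synthetic 1-D two-class dataset: class 0 near 0.0, class 1 near 5.0
data = [(random.gauss(0.0, 1.0), 0) for _ in range(40)] + \
       [(random.gauss(5.0, 1.0), 1) for _ in range(40)]
random.shuffle(data)

def nn_predict(train, x):
    # 1-nearest-neighbour: label of the closest training sample
    return min(train, key=lambda p: abs(p[0] - x))[1]

def accuracy(train, test):
    return sum(nn_predict(train, x) == y for x, y in test) / len(test)

# Hold-out: a single 70/30 train/test split
split = int(0.7 * len(data))
holdout_acc = accuracy(data[:split], data[split:])

# 5-fold cross-validation: each fold serves once as the test set
k = 5
fold = len(data) // k
cv_scores = []
for i in range(k):
    test = data[i * fold:(i + 1) * fold]
    train = data[:i * fold] + data[(i + 1) * fold:]
    cv_scores.append(accuracy(train, test))
cv_acc = sum(cv_scores) / k

print(f"hold-out accuracy={holdout_acc:.2f}, 5-fold CV accuracy={cv_acc:.2f}")
```

Cross-validation averages over k splits, so its estimate is typically more stable than a single hold-out split, which is why studies like this one use both to check their results.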
Application-Level Multicast (ALM) has been proposed as an alternative solution to overcome the lack of deployment of the IP multicast group-communication model. It builds an overlay tree consisting of end-to-end unicast connections between end-hosts, based on the collaboration of group members with each other. The efficiency of the constructed overlay tree depends entirely on the honesty and cooperation of all participating members. However, such behaviour cannot be guaranteed, and some selfish, non-cooperative nodes may profit from the honesty of other members in the overlay. Recently, many researchers have been investigating the impact of node selfishness on overlay multicast. Our contribution in this paper is to describe in detail the basic algorithms used to construct the overlay tree, and to evaluate the impact of cheating nodes on the stability and performance of the overlay trees these algorithms build.
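A minimal sketch of one common greedy overlay-tree join strategy: each arriving host attaches to the existing member with the lowest measured unicast delay, subject to a fan-out limit. The delay values and host names are hypothetical, and the paper's actual tree-construction algorithms are not specified here:

```python
# Greedy ALM tree construction under a fan-out (out-degree) constraint.
FANOUT = 2

# delays[(a, b)]: hypothetical measured unicast delay between hosts, in ms
delays = {
    ("src", "A"): 10, ("src", "B"): 40, ("src", "C"): 35,
    ("A", "B"): 15,   ("A", "C"): 50,   ("B", "C"): 12,
}

def delay(a, b):
    return delays.get((a, b)) or delays.get((b, a))

def build_tree(source, joiners):
    parent = {source: None}
    children = {source: []}
    for node in joiners:
        # Candidate parents: current members with spare fan-out capacity
        candidates = [m for m in parent if len(children[m]) < FANOUT]
        best = min(candidates, key=lambda m: delay(m, node))
        parent[node] = best
        children[best].append(node)
        children[node] = []
    return parent

tree = build_tree("src", ["A", "B", "C"])
print(tree)  # parent of each host in the overlay tree
```

This also shows why cheating matters: the join decision trusts each member's reported or measured delay, so a selfish node that misreports can attract (or repel) children and distort the whole tree.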
Many studies have tried to determine the impact of change orders on the cost and time of a project, which in turn leads to differences and disputes between contractors and owners; change orders arise in all kinds of engineering projects. This research presents the main causes of change orders occurring during the project life cycle in the Syrian coastal zone. Particular building projects are studied, and the most important indicators of their impact on project completion (cost and time) are discussed. The research also identifies the party responsible for each change, shows the weak points in following the change-order life cycle, and provides recommendations for each of the responsible parties, stressing the need to monitor performance in order to manage change orders, address their causes, and mitigate their impact. Prediction models were developed for the additional cost that may result from change orders.
Sound is an essential component of multimedia, and because it is needed in many everyday applications such as television broadcasting and communication programs, audio-signal-processing techniques such as compression, enhancement, and noise reduction have become necessary. Data compression aims to reduce the bit rate used, by encoding information with fewer bits than the original representation for transmission and storage. In this process, unnecessary information is identified and removed, so the compressed signal keeps the essential content we need rather than the minutest details. This research studies how sound and musical signals are processed, a field spanning a wide range of applications: coding and digital compression for efficient transmission and storage on mobile phones and portable music players, modeling and reproduction of the sound of musical instruments and concert halls, digital music harmonics, digital music editing, classification of music content, and more.
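One classic, generic example of audio bit-rate reduction (offered as an illustrative sketch, not a technique attributed to this research) is μ-law companding, which maps samples in [-1, 1] to 8-bit codes, so a 16-bit PCM stream shrinks to half its size at the cost of a small, perceptually shaped quantisation error:

```python
import math

MU = 255.0  # standard mu-law parameter (ITU-T G.711)

def mu_law_encode(x):
    # Compand a sample in [-1, 1], then quantise to an 8-bit code
    y = math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)
    return int(round((y + 1) / 2 * 255))

def mu_law_decode(code):
    # Invert the quantisation and the companding curve
    y = code / 255 * 2 - 1
    return math.copysign((math.pow(1 + MU, abs(y)) - 1) / MU, y)

# A 440 Hz test tone sampled at 8 kHz; each sample now needs 1 byte, not 2
samples = [math.sin(2 * math.pi * 440 * n / 8000) for n in range(100)]
codes = [mu_law_encode(s) for s in samples]
restored = [mu_law_decode(c) for c in codes]
max_err = max(abs(a - b) for a, b in zip(samples, restored))
print(f"max reconstruction error: {max_err:.4f}")
```

The logarithmic curve spends more of the 8-bit range on quiet samples, matching how hearing works, which is exactly the "keep the essential content, drop the minutest details" idea described above.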
