
We observe that the development cross-entropy loss of supervised neural machine translation models scales like a power law with the amount of training data and the number of non-embedding parameters in the model. We discuss some practical implications of these results, such as predicting the BLEU achieved by large-scale models and predicting the return on investment of labeling data in low-resource language pairs.
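As an illustration of how such a fit supports prediction, here is a minimal sketch that fits a power law with an irreducible-loss offset to made-up (data size, dev loss) points; the functional form and all numbers are assumptions for illustration, not the paper's fitted coefficients.

```python
import numpy as np
from scipy.optimize import curve_fit

def power_law(d, a, p, l_inf):
    """Dev cross-entropy as a power law in training-set size d,
    with an irreducible-loss offset l_inf."""
    return a * d ** (-p) + l_inf

# Hypothetical (sentence pairs, dev loss) observations, for illustration only.
sizes = np.array([1e5, 3e5, 1e6, 3e6, 1e7])
losses = np.array([4.1, 3.5, 3.0, 2.6, 2.3])

(a, p, l_inf), _ = curve_fit(power_law, sizes, losses, p0=[50.0, 0.25, 1.5])
print(f"L(D) ~= {a:.1f} * D^(-{p:.3f}) + {l_inf:.2f}")
# The fitted curve can then be extrapolated, e.g. to a 10x larger corpus.
print("predicted loss at 1e8 pairs:", power_law(1e8, a, p, l_inf))
```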
Best-Worst Scaling (BWS) is an annotation methodology based on comparing and ranking instances, rather than classifying or scoring individual instances. Studies have shown that, applied to NLP tasks, this methodology produces higher-quality datasets. In this system demonstration paper, we present Litescale, a free software library to create and manage BWS annotation tasks. Litescale computes the tuples to annotate, manages the users and the annotation process, and creates the final gold standard. The functionalities of Litescale can be accessed programmatically through a Python module, or via two alternative user interfaces: a textual console-based one and a graphical Web-based one. We further developed and deployed a fully online version of Litescale, complete with multi-user support.
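Litescale's own API is not reproduced here; the following is a generic sketch of the counting procedure that BWS tools of this kind automate: assemble tuples of four items, collect best/worst judgments from annotators, and score each item as (#best - #worst) / #appearances.

```python
import random
from collections import Counter

def make_tuples(items, tuple_size=4, appearances=2, seed=0):
    """Randomly pack each item into ~`appearances` tuples (a production tool
    would also prevent an item repeating within a single tuple)."""
    rng = random.Random(seed)
    pool = list(items) * appearances
    rng.shuffle(pool)
    return [pool[i:i + tuple_size]
            for i in range(0, len(pool) - tuple_size + 1, tuple_size)]

def bws_scores(judgments):
    """judgments: iterable of (tuple_items, best_item, worst_item)."""
    best, worst, seen = Counter(), Counter(), Counter()
    for items, b, w in judgments:
        best[b] += 1
        worst[w] += 1
        seen.update(items)
    # Standard BWS counting score, in [-1, 1].
    return {i: (best[i] - worst[i]) / seen[i] for i in seen}

items = ["a", "b", "c", "d", "e", "f", "g", "h"]
tuples = make_tuples(items)
# A simulated annotator that prefers alphabetically earlier items.
judgments = [(t, min(t), max(t)) for t in tuples]
print(sorted(bws_scores(judgments).items(), key=lambda kv: -kv[1]))
```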
Contrastive learning has been applied successfully to learn vector representations of text. Previous research demonstrated that learning high-quality representations benefits from a batch-wise contrastive loss with a large number of negatives. In practice, the technique of in-batch negatives is used: for each example in a batch, the positives of the other batch examples are taken as its negatives, avoiding the encoding of extra negatives. This, however, still conditions each example's loss on all batch examples and requires fitting the entire large batch into GPU memory. This paper introduces a gradient caching technique that decouples backpropagation between the contrastive loss and the encoder, removing the encoder backward pass's data dependency along the batch dimension. As a result, gradients can be computed for one subset of the batch at a time, leading to almost constant memory usage.
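The abstract compresses the mechanism, so here is a minimal PyTorch sketch of the idea as described: encode without gradients, compute the full-batch loss once on detached representations and cache their gradients, then re-encode each chunk with gradients and route the cached gradients in through a surrogate dot product. The toy encoder, dot-product logits, and chunk size are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn.functional as F

def cached_contrastive_step(encoder, queries, keys, chunk_size=8):
    # Step 1: graph-free forward pass over the whole batch, chunk by chunk.
    with torch.no_grad():
        q_reps = torch.cat([encoder(c) for c in queries.split(chunk_size)])
        k_reps = torch.cat([encoder(c) for c in keys.split(chunk_size)])

    # Step 2: full-batch contrastive loss on detached reps; cache the
    # gradient of the loss w.r.t. each representation.
    q_reps.requires_grad_(True)
    k_reps.requires_grad_(True)
    logits = q_reps @ k_reps.T                # in-batch negatives
    labels = torch.arange(len(q_reps))
    loss = F.cross_entropy(logits, labels)
    q_grads, k_grads = torch.autograd.grad(loss, [q_reps, k_reps])

    # Step 3: re-encode each chunk WITH gradients; the surrogate
    # (rep * cached_grad).sum() has exactly the cached gradient, so
    # backward() accumulates the correct encoder gradients per chunk.
    for data, grads in [(queries, q_grads), (keys, k_grads)]:
        for chunk, g in zip(data.split(chunk_size), grads.split(chunk_size)):
            reps = encoder(chunk)
            (reps * g).sum().backward()
    return loss.item()

# Usage with a toy encoder; peak memory now scales with chunk_size,
# not with the full batch size.
encoder = torch.nn.Linear(32, 16)
opt = torch.optim.SGD(encoder.parameters(), lr=0.1)
q, k = torch.randn(64, 32), torch.randn(64, 32)
print("loss:", cached_contrastive_step(encoder, q, k))
opt.step()
```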
This study aims to open a discussion of new ideas for a sustainable scenario based on the principle of an integrated spatial development ring between urban and rural areas along and between the AL-Abrash and AL-Hseen river watersheds, as an application of the bottom-up planning model, seeking to achieve ruralization in parallel with urbanization. The paper adopted a step-by-step approach to data collection and analysis. First, it investigated land-cover change (LCC) over 30 years, using multi-temporal satellite data of the same study area to create thematic land-cover maps suitable for change detection: three Landsat images from 1987, 2002, and 2017 were classified separately using supervised classification in ArcGIS, providing an economical way to quantify, map, and analyse changes in land cover over time. Next, a SWOT analysis examined the possibilities and constraints within the two-way flow of current and future economic activity, alongside a discussion of land-use (LU) opportunities that takes the slope map into account to give conservation priority to natural resources. Finally, the results were evaluated and a sustainable spatial scenario was established that can be scaled up and out to cover the coastal region's watersheds and support regional planning and decision-making in the future.
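As a toy illustration of the post-classification change detection step, the sketch below cross-tabulates two small classified rasters into a change matrix; the class codes and grids are invented and do not come from the study.

```python
import numpy as np

CLASSES = {0: "water", 1: "forest", 2: "agriculture", 3: "built-up"}

# Invented 4x4 classified rasters standing in for two classified scenes.
lc_1987 = np.array([[1, 1, 2, 2],
                    [1, 2, 2, 0],
                    [2, 2, 3, 0],
                    [2, 3, 3, 0]])
lc_2017 = np.array([[1, 2, 2, 3],
                    [2, 2, 3, 0],
                    [2, 3, 3, 0],
                    [3, 3, 3, 0]])

# Change matrix: rows = class in 1987, columns = class in 2017.
n = len(CLASSES)
change = np.zeros((n, n), dtype=int)
np.add.at(change, (lc_1987.ravel(), lc_2017.ravel()), 1)

for i, row in enumerate(change):
    for j, count in enumerate(row):
        if i != j and count:
            print(f"{CLASSES[i]} -> {CLASSES[j]}: {count} pixels")
```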
Response spectrum analysis and equivalent static analysis are widely used by engineers and engineering offices to estimate the response of buildings and structures to earthquakes. However, performance-based procedures for evaluating buildings and new designs according to the Syrian code and other international codes require response analysis using sets of earthquake records, from which engineering demand parameters (EDPs) such as floor displacements, story drifts, member forces, and member deformations of buildings and special structures subjected to ground motions can be estimated and the required performance criteria verified. These records should be properly selected and scaled in compliance with site-specific hazard conditions so that the estimated EDPs match the "expected" median demands. This study introduces the background, selection procedures compatible with the Syrian code, and a review of the most common scaling methods. The structural response was studied by comparing the displacements obtained from response spectrum analysis, from records scaled by PGA, and from synthetic time histories in the time and frequency domains (generated according to the Syrian response spectrum). Two three-dimensional models of real buildings in Lattakia city were used as case studies, with results obtained from 20 analysis runs. The results show that analysis using synthetic records compatible with the Syrian code gives noticeably lower displacement estimates compared with response spectrum analysis and with analysis using PGA-scaled records.
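A minimal sketch of the PGA scaling method compared above, under its usual definition (multiply the record by target PGA over record PGA); the record and the design PGA value are illustrative assumptions.

```python
import numpy as np

def scale_to_pga(accel, target_pga):
    """Uniformly scale an acceleration history to a target PGA (in g)."""
    record_pga = np.max(np.abs(accel))
    return accel * (target_pga / record_pga)

accel = np.sin(np.linspace(0, 20, 2000)) * 0.18   # fake record, PGA ~ 0.18 g
scaled = scale_to_pga(accel, target_pga=0.25)      # e.g., a design PGA of 0.25 g
print("scaled PGA:", np.max(np.abs(scaled)))       # -> 0.25
```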
Structural design for seismic loading, traditionally done for most common structure types by means of equivalent lateral static loading or modal spectrum analysis, is no longer the preferred methodology for the design of modern structures with complex topology and functionality under extreme loading scenarios. Nonlinear response history evaluation, on the other hand, is becoming a practical tool due to the availability of high-performance computing, the recommendations of new seismic guidelines, and the growth of available strong-ground-motion databases. Therefore, using and scaling real recorded accelerograms has become one of the most active research issues in this field. Seismological characteristics of the records, such as earthquake magnitude, epicentral distance, and site classification, are usually considered in the selection of real records, as they influence the shape of the response spectrum, the energy content, and the duration of strong ground shaking, and therefore the expected demand on structures. After real seismic records are selected, it is necessary to scale them to match the intensity of the earthquake expected at the site. Generally, scaling can be done by uniform scaling in the time domain, in which the ground motions are simply scaled up or down uniformly to best match (on average) the target spectrum within a period range of interest. Finding the scaling factors that best match the target spectrum is a complex task, so the Genetic Algorithm (GA) was employed to find them. When testing the selected and scaled ground motions, it is standard procedure to use nonlinear time history analysis to validate the results in terms of structural responses and their variation, which demonstrates the efficiency of the presented procedure. In this study, basic methodologies for selecting and scaling strong-ground-motion time histories are summarized, and the selection and scaling criteria for real records to satisfy the Syrian design code are discussed. The GA scaling procedure is used to scale 10 sets of records, each consisting of seven available real records, to match the Syrian design spectra. The resulting time histories are investigated and compared in terms of their suitability as input to time history analysis of civil engineering structures, by means of time history analyses of SDOF systems conducted to examine the efficiency of the scaling method in reducing the scatter in structural response. The nonlinear response of the SDOF systems is represented by a bilinear hysteretic model. Assuming five different periods, a yield strength reduction factor R = 4.5, and 3% post-yield stiffness (α = 0.03), a total of 700 analysis runs were conducted. The results are reported in terms of the elastic displacement D.
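The bilinear hysteretic SDOF model described above can be sketched numerically as follows. This is a generic central-difference implementation with the abstract's α = 3% and R = 4.5; the input record is invented, and the study's actual scaled accelerograms and runs are not reproduced here.

```python
import numpy as np

def sdof_bilinear(ag, dt, period, zeta=0.05, fy=np.inf, alpha=0.03):
    """Central-difference integration of m*u'' + c*u' + fs(u) = -m*ag.

    fs follows a bilinear kinematic-hardening rule: the incremental force
    k0*du is clamped to the hardening envelope alpha*k0*u +/- (1-alpha)*fy.
    With fy = inf the response is linear elastic.
    """
    m = 1.0
    k0 = m * (2 * np.pi / period) ** 2
    c = 2 * zeta * np.sqrt(k0 * m)
    u = np.zeros(len(ag))
    fs = 0.0
    khat = m / dt**2 + c / (2 * dt)
    b = m / dt**2 - c / (2 * dt)
    for i in range(1, len(ag) - 1):
        fs = np.clip(fs + k0 * (u[i] - u[i - 1]),
                     alpha * k0 * u[i] - (1 - alpha) * fy,
                     alpha * k0 * u[i] + (1 - alpha) * fy)
        u[i + 1] = (-m * ag[i] - fs + 2 * m / dt**2 * u[i] - b * u[i - 1]) / khat
    return u

# Invented record; real analyses would use selected, scaled accelerograms.
dt = 0.01
t = np.arange(0, 20, dt)
ag = 0.25 * 9.81 * np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.1 * t)

u_el = sdof_bilinear(ag, dt, period=0.5)                   # elastic run (fy = inf)
fy = (2 * np.pi / 0.5) ** 2 * np.max(np.abs(u_el)) / 4.5   # f_y = f_elastic / R
u_nl = sdof_bilinear(ag, dt, period=0.5, fy=fy)            # bilinear run
print("elastic peak:", np.max(np.abs(u_el)),
      "inelastic peak:", np.max(np.abs(u_nl)))
```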
Nonlinear response history evaluation is becoming a practical tool due to the availability of high-performance computing, the recommendations of new seismic guidelines, and the growth of available strong-ground-motion databases. When testing selected and scaled ground motions, it is standard procedure to use time history analysis to validate the results in terms of structural responses and their variation, which demonstrates the efficiency of the presented procedure. In this study, the selection and scaling criteria for real time history records to satisfy the Syrian design code are discussed. Ten sets of records, each consisting of seven available real records, were selected and scaled to match the Syrian design spectra. The resulting time histories are investigated and compared in terms of their suitability as input to time history analysis of civil engineering structures, by means of time history analyses of SDOF systems conducted to examine the efficiency of the scaling method in reducing the scatter in structural response. The nonlinear response of the SDOF systems is represented by a bilinear hysteretic model. Assuming five different periods and 3% post-yield stiffness (α = 0.03), 700 analysis runs were conducted for the SDOF systems, and a further 280 runs for MDOF systems.
In this study, basic methodologies and procedures for generating synthetic time histories in the time and frequency domains are summarized. These synthetic time histories match the Syrian spectrum and are compatible with a wide range of building models and soil types according to the seismic parameters of Lattakia city. The paper also discusses the selection and scaling criteria for three real time history records available in strong-ground-motion databases to satisfy the Syrian spectrum, and their suitability as input to time history analysis of civil engineering structures.
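As a rough illustration of frequency-domain generation, the sketch below imposes a target Fourier amplitude shape on random-phase noise and applies a time-domain envelope. The target shape is a generic Kanai-Tajimi-style soil filter, not the Syrian design spectrum; spectrum-compatible generation would add an iterative matching loop on top of this.

```python
import numpy as np

def synthetic_accelerogram(duration=20.0, dt=0.01, wg=15.0, zg=0.6, seed=1):
    """Random-phase synthetic accelerogram shaped in the frequency domain."""
    n = int(duration / dt)
    w = 2 * np.pi * np.fft.rfftfreq(n, dt)
    # Kanai-Tajimi amplitude shape (soil filter at wg rad/s, damping zg).
    amp = np.sqrt((1 + 4 * zg**2 * (w / wg) ** 2) /
                  ((1 - (w / wg) ** 2) ** 2 + 4 * zg**2 * (w / wg) ** 2))
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi, len(w))
    acc = np.fft.irfft(amp * np.exp(1j * phase), n)
    # Trapezoidal envelope: ramp up over 2 s, decay over the last 5 s.
    t = np.arange(n) * dt
    acc *= np.clip(t / 2.0, 0, 1) * np.clip((duration - t) / 5.0, 0, 1)
    return t, acc / np.max(np.abs(acc))  # normalized; rescale to design PGA

t, acc = synthetic_accelerogram()
print("samples:", len(acc), "normalized PGA:", np.max(np.abs(acc)))
```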
After selecting real seismic records, it is necessary to scale them to match the intensity of the earthquake expected at the site. Generally, scaling can be done by uniform scaling in the time domain, in which the ground motions are simply scaled up or down uniformly to best match (on average) the target spectrum within a period range of interest. Finding the scaling factors that best match the target spectrum is a complex task, so the Genetic Algorithm (GA) was employed to find those factors and achieve the best results. Genetic Algorithms (GAs) are probably the best-known type of artificial evolution search method, based on natural selection and the mechanisms of population genetics. These algorithms are often applied to large, complex, non-linear problems with multiple local optima. The power of the genetic algorithm is inherent in its capability to adapt: in natural systems, species adapt to the environment through successive interactions and generations, and after several consecutive generations only the species that adapt well to the environment survive, while the rest disappear. In mathematical terms, individuals are analogous to problem variables and the environment to the stated problem; the final generation of variable strings that adapt to the problem is the solution. In this study, the basic methodology of the GA and the scaling procedures are summarized, and the scaling criteria for real time history records to satisfy the Syrian design code are discussed. The traditional time-domain scaling procedure and the GA-based scaling procedure are used to scale a number of the available real records to match the Syrian design spectra, and the resulting time histories are investigated and compared in terms of meeting the criteria.
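As an illustration of the GA search described above, here is a toy sketch that evolves one scale factor per record so that the average scaled response spectrum approaches a target. The spectra are made-up arrays; a real workflow would compute Sa(T) from each accelerogram and use the Syrian code spectrum as the target.

```python
import numpy as np

rng = np.random.default_rng(0)
periods = np.linspace(0.1, 3.0, 30)
target = 0.6 * np.exp(-periods / 1.2) + 0.1                 # fake design spectrum
records = np.abs(rng.normal(0.3, 0.1, (7, len(periods))))   # fake record spectra

def fitness(factors):
    """Negative mean-squared mismatch of the average scaled spectrum."""
    mean_spec = (factors[:, None] * records).mean(axis=0)
    return -np.mean((mean_spec - target) ** 2)

pop = rng.uniform(0.25, 4.0, (50, 7))        # 50 candidate factor sets
for gen in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[::-1][:25]]   # truncation selection
    # Uniform crossover plus Gaussian mutation to refill the population.
    mates = parents[rng.integers(0, 25, 25)]
    mask = rng.random((25, 7)) < 0.5
    children = np.where(mask, parents, mates) + rng.normal(0, 0.05, (25, 7))
    pop = np.vstack([parents, np.clip(children, 0.25, 4.0)])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best factors:", np.round(best, 2))
```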
