In this study, a novel topology optimization approach based on conditional Wasserstein generative adversarial networks (CWGANs) is developed to replicate conventional topology optimization algorithms at a small fraction of their computational cost. The CWGAN consists of a generator and a discriminator, both of which are deep convolutional neural networks (CNNs). The limited training data, quasi-optimal planar structures, are generated using conventional topology optimization algorithms. With CWGANs, the topology optimization conditions can be set to required values before samples are generated: the CWGAN truncates the global design space through an equality constraint specified by the designer. The results are validated against an optimized planar structure generated by the conventional algorithms under the same settings. A proof of concept is presented which, to our knowledge, is the first such demonstration of the fusion of CWGANs and topology optimization.
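As a rough illustration of the conditioning mechanism described above, the sketch below shows a minimal conditional Wasserstein GAN in PyTorch whose generator and critic are both convolutional networks, with a scalar condition (taken here to be a target volume fraction) concatenated to the latent vector and broadcast as an input channel to the critic. The 64x64 resolution, layer sizes, weight-clipping variant of the Lipschitz constraint, and all hyperparameters are illustrative assumptions, not the architecture or settings used in the paper.

```python
# Minimal CWGAN sketch for planar topology generation (illustrative only).
# Assumptions: 64x64 single-channel density fields, a scalar volume-fraction
# condition, and the weight-clipping WGAN formulation.
import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self, latent_dim=100, cond_dim=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(latent_dim + cond_dim, 256, 4, 1, 0),  # -> 4x4
            nn.BatchNorm2d(256), nn.ReLU(True),
            nn.ConvTranspose2d(256, 128, 4, 2, 1),                    # -> 8x8
            nn.BatchNorm2d(128), nn.ReLU(True),
            nn.ConvTranspose2d(128, 64, 4, 2, 1),                     # -> 16x16
            nn.BatchNorm2d(64), nn.ReLU(True),
            nn.ConvTranspose2d(64, 32, 4, 2, 1),                      # -> 32x32
            nn.BatchNorm2d(32), nn.ReLU(True),
            nn.ConvTranspose2d(32, 1, 4, 2, 1), nn.Sigmoid(),         # -> 64x64 densities in [0, 1]
        )

    def forward(self, z, cond):
        # Concatenate noise and condition, reshape to a 1x1 "image" for the deconv stack.
        x = torch.cat([z, cond], dim=1).unsqueeze(-1).unsqueeze(-1)
        return self.net(x)

class Critic(nn.Module):
    def __init__(self, cond_dim=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1 + cond_dim, 32, 4, 2, 1), nn.LeakyReLU(0.2),  # -> 32x32
            nn.Conv2d(32, 64, 4, 2, 1), nn.LeakyReLU(0.2),            # -> 16x16
            nn.Conv2d(64, 128, 4, 2, 1), nn.LeakyReLU(0.2),           # -> 8x8
            nn.Conv2d(128, 1, 8, 1, 0),                               # -> scalar score
        )

    def forward(self, x, cond):
        # Broadcast the scalar condition over the spatial grid and append it as a channel.
        c = cond.view(cond.size(0), -1, 1, 1).expand(-1, -1, x.size(2), x.size(3))
        return self.net(torch.cat([x, c], dim=1)).view(-1)

# One critic/generator update, given a batch of conventionally optimized
# structures `real` and their volume fractions `vf` as the condition.
G, D = Generator(), Critic()
opt_g = torch.optim.RMSprop(G.parameters(), lr=5e-5)
opt_d = torch.optim.RMSprop(D.parameters(), lr=5e-5)

real = torch.rand(16, 1, 64, 64)              # placeholder for training structures
vf = real.mean(dim=(1, 2, 3)).unsqueeze(1)    # condition: volume fraction of each sample
z = torch.randn(16, 100)

# Critic step: maximize D(real | vf) - D(fake | vf).
loss_d = -(D(real, vf).mean() - D(G(z, vf).detach(), vf).mean())
opt_d.zero_grad(); loss_d.backward(); opt_d.step()
for p in D.parameters():
    p.data.clamp_(-0.01, 0.01)                # Lipschitz constraint via weight clipping

# Generator step: maximize D(fake | vf).
loss_g = -D(G(z, vf), vf).mean()
opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```

At inference time the designer would fix the condition (e.g. `vf = torch.full((n, 1), 0.4)`) and draw structures from the trained generator, which corresponds to the equality-constrained truncation of the design space described in the abstract.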