We present a method for stellarator coil design via gradient-based optimization of the coil-winding surface. The REGCOIL (Landreman 2017 Nucl. Fusion 57 046003) approach is used to obtain the coil shapes on the winding surface using a continuous current potential. We apply the adjoint method to calculate derivatives of the objective function, allowing for efficient computation of analytic gradients while eliminating the numerical noise of approximate derivatives. We are able to improve engineering properties of the coils by targeting the root-mean-squared current density in the objective function. We obtain winding surfaces for W7-X and HSX which simultaneously decrease the normal magnetic field on the plasma surface and increase the surface-averaged distance between the coils and the plasma in comparison with the actual winding surfaces. The coils computed on the optimized surfaces feature a smaller toroidal extent and curvature and increased inter-coil spacing. A technique for visualization of the sensitivity of figures of merit to normal surface displacement of the winding surface is presented, with potential applications for understanding engineering tolerances.
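To make the adjoint idea concrete, the sketch below works through a toy problem with the same structure as the one described above: an objective that depends on the solution of a shape-dependent linear system, loosely analogous to the REGCOIL current-potential solve on a parameterized winding surface. The affine parameter dependence, the quadratic objective, and every name in the code (A0, C, assemble, adjoint_gradient, and so on) are illustrative assumptions for this sketch, not the paper's implementation; the point is only that a single adjoint solve yields the full analytic gradient, which the finite-difference check confirms.

import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 4                      # state size, number of "shape" parameters (toy values)
A0 = 3.0 * np.eye(n) + 0.1 * rng.standard_normal((n, n))
A_list = [0.05 * rng.standard_normal((n, n)) for _ in range(m)]
b0 = rng.standard_normal(n)
b_list = [rng.standard_normal(n) for _ in range(m)]
C = rng.standard_normal((n, n))

def assemble(p):
    # System matrix and right-hand side as affine functions of the parameters p
    A = A0 + sum(pi * Ai for pi, Ai in zip(p, A_list))
    b = b0 + sum(pi * bi for pi, bi in zip(p, b_list))
    return A, b

def objective(p):
    # f(p) = 0.5 ||C x||^2, where x solves the parameter-dependent system A(p) x = b(p)
    A, b = assemble(p)
    x = np.linalg.solve(A, b)
    return 0.5 * np.dot(C @ x, C @ x)

def adjoint_gradient(p):
    # Exact gradient from one extra (adjoint) linear solve, independent of len(p)
    A, b = assemble(p)
    x = np.linalg.solve(A, b)
    lam = np.linalg.solve(A.T, C.T @ (C @ x))   # adjoint equation: A^T lam = df/dx
    # df/dp_i = lam^T (db/dp_i - (dA/dp_i) x)
    return np.array([lam @ (bi - Ai @ x) for Ai, bi in zip(A_list, b_list)])

p = rng.standard_normal(m)
grad_adj = adjoint_gradient(p)

# Compare with central finite differences, the noisier alternative the adjoint avoids
eps = 1e-6
grad_fd = np.array([
    (objective(p + eps * e) - objective(p - eps * e)) / (2 * eps)
    for e in np.eye(m)
])
print("adjoint gradient :", grad_adj)
print("finite difference:", grad_fd)

The design choice illustrated here is the one the abstract relies on: the adjoint gradient costs one additional linear solve regardless of the number of shape parameters, which is what makes gradient-based optimization of a finely parameterized winding surface tractable and free of finite-difference noise.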