
On the Whitney extension problem for near isometries and beyond

Added by: Dr. Steven Damelin
Publication date: 2021
Language: English





In this memoir, we develop a general framework which allows for a simultaneous study of labeled and unlabeled near alignment data problems in $\mathbb{R}^D$ and the Whitney near isometry extension problem for discrete and non-discrete subsets of $\mathbb{R}^D$ with certain geometries. In addition, we survey related work of ours on clustering, dimension reduction, manifold learning, and vision, as well as on minimal energy partitions, discrepancy, and min-max optimization. Numerous open problems in harmonic analysis, computer vision, manifold learning, and signal processing connected to our work are given. A significant portion of the work in this memoir is based on joint research with Charles Fefferman in the papers [48], [49], [50], [51].
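For concreteness, the labeled version of the near alignment problem has a classical least-squares baseline: given matched point sets in $\mathbb{R}^D$, the best rigid motion is given by the orthogonal Procrustes (Kabsch) solution. The sketch below shows only this standard baseline, not the method of the memoir; the function name `rigid_align` and the test data are illustrative assumptions.

```python
import numpy as np

def rigid_align(X, Y):
    """Least-squares rigid (Procrustes/Kabsch) alignment of labeled point
    sets X, Y of shape (n, D): find rotation R and shift t minimizing
    sum_i ||R @ X[i] + t - Y[i]||^2."""
    cx, cy = X.mean(axis=0), Y.mean(axis=0)       # centroids
    H = (X - cx).T @ (Y - cy)                     # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    S = np.eye(X.shape[1])
    S[-1, -1] = np.sign(np.linalg.det(Vt.T @ U.T))  # enforce det(R) = +1
    R = Vt.T @ S @ U.T
    t = cy - R @ cx
    return R, t

# Usage: recover a rotation of a noisy random cloud in R^3.
rng = np.random.default_rng(0)
X = rng.standard_normal((40, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
Y = X @ R_true.T + 0.01 * rng.standard_normal(X.shape)
R, t = rigid_align(X, Y)
print(np.linalg.norm(X @ R.T + t - Y))  # small residual
```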



Related research

Fushuai Jiang (2019)
Let $f$ be a real-valued function on a compact subset of $\mathbb{R}^n$. We show how to decide if $f$ extends to a nonnegative $C^1$ function on $\mathbb{R}^n$. There has been no previously known result for nonnegative $C^m$ extension from a general compact set $E$ when $m > 0$. The nonnegative extension problem for $m \geq 2$ remains open.
We characterize the validity of the Whitney extension theorem in the ultradifferentiable Roumieu setting with controlled loss of regularity. Specifically, we show that in the main Theorem 1.3 of [15] condition (1.3) can be dropped. Moreover, we clarify some questions that remained open in [15].
Let $D \geq 2$, let $S \subset \mathbb{R}^D$ be finite, and let $\phi\colon S \to \mathbb{R}^D$ be a small distortion on $S$. We solve the Whitney extension-interpolation-alignment problem of understanding when $\phi$ can be extended to a function $\Phi\colon \mathbb{R}^D \to \mathbb{R}^D$ which is a smooth small distortion on $\mathbb{R}^D$. Our main results include, in addition to Whitney extensions, results on interpolation and alignment of data in $\mathbb{R}^D$, and they complement those of [14,15,20].
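Informally, a small distortion on a finite set $S$ is a map that stretches or shrinks every pairwise distance by at most a factor close to 1. Below is a minimal numerical check of that condition; the function name `distortion` and the random test data are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from itertools import combinations

def distortion(S, phi_S):
    """Worst-case factor by which the map S[i] -> phi_S[i] stretches or
    shrinks a pairwise distance; a value near 1 means a near isometry.
    S, phi_S: arrays of shape (n, D) with distinct rows in S."""
    worst = 1.0
    for i, j in combinations(range(len(S)), 2):
        d0 = np.linalg.norm(S[i] - S[j])
        d1 = np.linalg.norm(phi_S[i] - phi_S[j])
        ratio = d1 / d0
        worst = max(worst, ratio, 1.0 / ratio)
    return worst

# A small random perturbation of points in R^4 is a small distortion:
rng = np.random.default_rng(1)
S = rng.standard_normal((20, 4))
phi_S = S + 1e-3 * rng.standard_normal(S.shape)
print(distortion(S, phi_S))  # close to 1
```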
We study approximately differentiable functions on metric measure spaces admitting a Cheeger differentiable structure. The main result is a Whitney-type characterization of approximately differentiable functions in this setting. As an application, we prove a Stepanov-type theorem and consider approximate differentiability of Sobolev, BV and maximal functions.
We analyze the Gambler's problem, a simple reinforcement learning problem in which the gambler bets repeatedly, doubling or losing each stake, until the target is reached. This is an early example introduced in the reinforcement learning textbook by Sutton and Barto (2018), where they note an interesting pattern in the optimal value function, with high-frequency components and repeating non-smooth points, but leave it without further investigation. We provide the exact formula for the optimal value function for both the discrete and the continuous cases. Simple as it might seem, the value function is pathological: fractal, self-similar, with derivative taking only the values zero or infinity, and not expressible in elementary functions. It is in fact one of the generalized Cantor functions, holding a complexity that has been uncharted thus far. Our analyses could provide insights into improving value function approximation, gradient-based algorithms, and Q-learning in real applications and implementations.
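The value function the authors analyze can be approximated with plain value iteration on the textbook formulation; the sketch below uses the standard Sutton and Barto setup (head probability 0.4, goal 100), with function and parameter names of our own choosing, and reproduces the non-smooth, self-similar profile the abstract describes.

```python
import numpy as np

def gambler_value_iteration(p_h=0.4, goal=100, tol=1e-12):
    """Value iteration for the Gambler's problem (Sutton & Barto, Ex. 4.3):
    from capital s, stake a in {1, ..., min(s, goal - s)}; the stake is won
    with probability p_h. Reward 1 is given only on reaching the goal, so
    V(s) is the probability of winning from capital s under optimal play."""
    V = np.zeros(goal + 1)
    V[goal] = 1.0                                   # absorbing win state
    while True:
        delta = 0.0
        for s in range(1, goal):
            stakes = range(1, min(s, goal - s) + 1)
            best = max(p_h * V[s + a] + (1 - p_h) * V[s - a] for a in stakes)
            delta = max(delta, abs(best - V[s]))
            V[s] = best                             # in-place (Gauss-Seidel) update
        if delta < tol:
            return V

V = gambler_value_iteration()
print(V[50])  # win probability from capital 50; betting everything gives 0.4
```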
