Sampling formulas involving differences in shift-invariant subspaces: a unified approach
September 12, 2017 · Declared Dead · arXiv.org
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Antonio G. GarcΓa, MarΓa J. MuΓ±oz-Bouzo
arXiv ID
1709.03893
Category
math.FA
Cross-listed
cs.IT
Citations
0
Venue
arXiv.org
Last Checked
2 months ago
Abstract
Successive differences of a sequence of data help to reveal smoothness features of the data; this was one of the main reasons for rewriting the classical interpolation formula in terms of such differences. The aim of this paper is to carry this idea over to a sequence of regular samples of a function in a shift-invariant subspace, allowing its stable recovery. A suitable expression for the functions in the shift-invariant subspace, obtained through an isomorphism with the space $L^2(0,1)$, is the key to identifying the simple pattern followed by the dual Riesz bases involved in the derived formulas. The paper contains examples illustrating several (non-exhaustive) situations, including the two-dimensional case.
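The abstract's central idea is the recovery of a function in a shift-invariant subspace from its regular samples, rewritten in terms of successive sample differences. The following is a minimal numerical sketch, not the paper's construction: it does not build the dual Riesz bases via the $L^2(0,1)$ isomorphism. It only assumes the shift-invariant space generated by the linear B-spline (hat function), for which the integer samples coincide with the expansion coefficients, and shows that first-order differences together with one anchor sample determine the samples and hence the function.

```python
import numpy as np

# Toy illustration (assumed setup, not the paper's formulas): in the
# shift-invariant space generated by the linear B-spline, f(k) = a_k,
# so f is recovered from its regular samples, and the samples are in
# turn recovered from their successive differences plus one anchor value.

def hat(t):
    """Linear B-spline (hat function) centered at 0, supported on [-1, 1]."""
    return np.maximum(1.0 - np.abs(t), 0.0)

def synthesize(coeffs, t):
    """Evaluate f(t) = sum_k coeffs[k] * hat(t - k)."""
    k = np.arange(len(coeffs))
    return (coeffs[None, :] * hat(t[:, None] - k[None, :])).sum(axis=1)

rng = np.random.default_rng(0)
coeffs = rng.standard_normal(12)            # coefficients a_k of f
t = np.linspace(0.0, 11.0, 500)
f = synthesize(coeffs, t)

# Regular samples at the integers; for the hat generator, f(k) = a_k.
samples = synthesize(coeffs, np.arange(12, dtype=float))

# Successive (forward) differences of the samples ...
diffs = np.diff(samples)
# ... determine the samples once the first one is known:
recovered_samples = np.concatenate(([samples[0]], samples[0] + np.cumsum(diffs)))

# Rebuild f from the recovered samples and compare.
f_rec = synthesize(recovered_samples, t)
print("max sample error:", np.max(np.abs(recovered_samples - samples)))
print("max reconstruction error:", np.max(np.abs(f_rec - f)))
```

Both printed errors are at machine precision for this toy generator; the paper's contribution is the corresponding stable formulas, with explicit dual Riesz bases, in general shift-invariant subspaces.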
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · math.FA
Tables of the existence of equiangular tight frames
Approximation spaces of deep neural networks
Sampling Theorems for Shift-invariant Spaces, Gabor Frames, and Totally Positive Functions
Eldan's Stochastic Localization and the KLS Conjecture: Isoperimetry, Concentration and Mixing
Equivalence of approximation by convolutional neural networks and fully-connected networks
Died the same way · 👻 Ghosted
Language Models are Few-Shot Learners
PyTorch: An Imperative Style, High-Performance Deep Learning Library
XGBoost: A Scalable Tree Boosting System