New Uniform Bounds for Almost Lossless Analog Compression
June 18, 2019 · Declared Dead · International Symposium on Information Theory
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors: Yonatan Gutman, Adam Śpiewak
arXiv ID: 1906.07620
Category: math.DS
Cross-listed: cs.IT
Citations: 7
Venue: International Symposium on Information Theory
Last Checked: 2 months ago
Abstract
Wu and Verdú developed a theory of almost lossless analog compression, where one imposes various regularity conditions on the compressor and the decompressor, with the input signal being modelled by a (typically infinite-entropy) stationary stochastic process. In this work we consider all stationary stochastic processes with trajectories in a prescribed set $\mathcal{S} \subset [0,1]^\mathbb{Z}$ of (bi)infinite sequences and find uniform lower and upper bounds for certain compression rates in terms of metric mean dimension and mean box dimension. An essential tool is the recent Lindenstrauss-Tsukamoto variational principle expressing metric mean dimension in terms of rate-distortion functions.
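For orientation, the Lindenstrauss-Tsukamoto variational principle cited in the abstract links metric mean dimension to rate-distortion functions. Schematically (omitting the precise hypotheses and the lim sup/lim inf distinctions of the exact statement) it takes roughly the following form:

```latex
% Schematic statement only; the precise hypotheses and the exact
% upper/lower limit versions are in Lindenstrauss-Tsukamoto,
% "From rate distortion theory to metric mean dimension:
% variational principle" (2018).
\[
  \operatorname{mdim}_{\mathrm{M}}(\mathcal{X}, T, d)
  \;=\;
  \lim_{\varepsilon \to 0}
  \frac{\sup_{\mu} R_{\mu}(\varepsilon)}{\log(1/\varepsilon)},
\]
% where the supremum runs over the T-invariant probability measures
% \mu on \mathcal{X}, and R_\mu(\varepsilon) denotes the
% rate-distortion function of \mu at distortion level \varepsilon.
```

This is why rate-distortion bounds for stationary processes supported on $\mathcal{S}$ translate into compression-rate bounds in terms of metric mean dimension.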
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers

In the same crypt – math.DS (all R.I.P. 👻 Ghosted):
- Linearly-Recurrent Autoencoder Networks for Learning Dynamics
- Gradient Descent Only Converges to Minimizers: Non-Isolated Critical Points and Invariant Regions
- Eigendecompositions of Transfer Operators in Reproducing Kernel Hilbert Spaces
- From rate distortion theory to metric mean dimension: variational principle
- Double variational principle for mean dimension

Died the same way – 👻 Ghosted:
- Language Models are Few-Shot Learners
- PyTorch: An Imperative Style, High-Performance Deep Learning Library
- XGBoost: A Scalable Tree Boosting System