For any finite point set in $D$-dimensional space equipped with the 1-norm, we present random linear embeddings into $k$-dimensional space, endowed with a new metric, that have the following properties. For any pair of points from the point set that are not too close, the distance between their images is a strictly concave increasing function of their original distance, up to multiplicative error. The target dimension $k$ need only be quadratic in the logarithm of the size of the point set for the result to hold with high probability. The linear embeddings are random matrices with i.i.d. standard Cauchy entries, and the proofs rely on Chernoff bounds for sums of i.i.d. random variables. The new metric is translation invariant, but it is not induced by a norm.
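The abstract does not spell out the new metric, so the sketch below illustrates only the classical ingredient it builds on: if $A$ has i.i.d. standard Cauchy entries, each coordinate of $A(x-y)$ is Cauchy-distributed with scale $\|x-y\|_1$, and since the Cauchy distribution has no mean, a robust statistic such as the median absolute coordinate recovers the original 1-norm distance. This is the standard median estimator for Cauchy sketches, not the metric introduced in the paper; the dimensions and seed are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
D, k = 50, 10_000  # original and target dimensions (illustrative)

# Random linear embedding: k x D matrix of i.i.d. standard Cauchy entries.
A = rng.standard_cauchy((k, D))

x = rng.uniform(-1.0, 1.0, D)
y = rng.uniform(-1.0, 1.0, D)

# Each coordinate of A @ (x - y) is Cauchy with scale ||x - y||_1,
# and the median of |Cauchy(scale s)| equals s, so the sample median
# of the absolute coordinates estimates the 1-norm distance.
estimate = np.median(np.abs(A @ (x - y)))
true_dist = np.sum(np.abs(x - y))

print(f"true ||x-y||_1 = {true_dist:.4f}, median estimate = {estimate:.4f}")
```

Note that averaging the coordinates would fail here, because Cauchy sums do not concentrate; the concentration arguments in the paper instead go through Chernoff bounds for suitable bounded functions of the coordinates.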
"Linear dimension reduction approximately preserving a function of the 1-norm." Electron. J. Statist. 14 (2) 4361 - 4394, 2020. https://doi.org/10.1214/20-EJS1773