DOI: 10.1145/2463664.2465217
PODS Conference Proceedings · research-article

Sketching via hashing: from heavy hitters to compressed sensing to sparse Fourier transform

Published: 22 June 2013


Published in

PODS '13: Proceedings of the 32nd ACM SIGMOD-SIGACT-SIGAI Symposium on Principles of Database Systems
June 2013 · 334 pages
ISBN: 9781450320665
DOI: 10.1145/2463664
General Chair: Richard Hull · Program Chair: Wenfei Fan

        Copyright © 2013 ACM


Publisher: Association for Computing Machinery, New York, NY, United States



Acceptance Rates

PODS '13 paper acceptance rate: 24 of 97 submissions (25%). Overall acceptance rate: 642 of 2,707 submissions (24%).
