research-article

Agnostically learning decision trees

Published: 17 May 2008

Abstract

We give a query algorithm for agnostically learning decision trees with respect to the uniform distribution on inputs. Given black-box access to an *arbitrary* binary function f on the n-dimensional hypercube, our algorithm finds a function that agrees with f on almost (within an ε fraction) as many inputs as the best size-t decision tree, in time poly(n, t, 1/ε).
This is the first polynomial-time algorithm for learning decision trees in a harsh noise model. We also give a *proper* agnostic learning algorithm for juntas, a subclass of decision trees, again using membership queries.
Conceptually, the present paper parallels recent work towards agnostic learning of halfspaces (Kalai et al., 2005); algorithmically, it is more challenging. The core of our learning algorithm is a procedure to implicitly solve a convex optimization problem over the L1 ball in 2^n dimensions using an approximate gradient projection method.
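The gradient projection idea at the heart of the algorithm (cf. Rosen [18], Zinkevich [20]) can be illustrated explicitly in low dimension. The sketch below is not the paper's algorithm, which performs the analogous update *implicitly* over the 2^n-dimensional vector of Fourier coefficients; it is a toy, explicit version with hypothetical helper names `project_l1_ball` and `projected_gradient_descent`, minimizing a convex objective over an L1 ball by alternating gradient steps with Euclidean projections.

```python
import math

def project_l1_ball(v, radius=1.0):
    """Euclidean projection of v onto the L1 ball {x : sum_i |x_i| <= radius},
    using the standard sort-based method."""
    if sum(abs(x) for x in v) <= radius:
        return list(v)
    u = sorted((abs(x) for x in v), reverse=True)
    css, theta = 0.0, 0.0
    for i, ui in enumerate(u, start=1):
        css += ui  # css is the prefix sum of the i largest magnitudes
        if ui - (css - radius) / i > 0:
            theta = (css - radius) / i  # threshold from the largest valid index
    # soft-threshold each coordinate by theta, keeping its sign
    return [math.copysign(max(abs(x) - theta, 0.0), x) for x in v]

def projected_gradient_descent(grad, x0, radius, steps=200, lr=0.1):
    """Minimize a convex objective over the L1 ball, given (possibly
    approximate) gradients: gradient step, then project back onto the ball."""
    x = project_l1_ball(x0, radius)
    for _ in range(steps):
        g = grad(x)
        x = project_l1_ball([xi - lr * gi for xi, gi in zip(x, g)], radius)
    return x

# Toy objective ||x - b||^2 with b outside the unit L1 ball: the constrained
# minimizer is simply the projection of b onto the ball.
b = [3.0, -1.0, 0.5]
x_star = projected_gradient_descent(
    lambda x: [2 * (xi - bi) for xi, bi in zip(x, b)],
    [0.0, 0.0, 0.0],
    radius=1.0,
)
```

The hard part in the paper is that the ball lives in 2^n dimensions, so neither the iterate nor the gradient can be written down explicitly; both must be maintained via sparse (sampled) approximations.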

References

[1]
A. Beimel, F. Bergadano, N. H. Bshouty, E. Kushilevitz, and S. Varricchio, Learning functions represented as multiplicity automata, J. ACM, 47 (2000), pp. 506--530.
[2]
L. Breiman, J. Friedman, R. Olshen, and C. Stone, Classification and Regression Trees, Wadsworth, 1984.
[3]
N. H. Bshouty, The monotone theory for the PAC-model, Inf. Comput., 186(1) (2003), pp. 20--35.
[4]
R. Caruana and A. Niculescu-Mizil, An empirical comparison of supervised learning algorithms, in Proc. 23rd Intl. Conf. Machine learning (ICML'06), 2006, pp. 161--168.
[5]
A. Ehrenfeucht and D. Haussler, Learning decision trees from random examples, Information and Computation, 82 (1989), pp. 231--246.
[6]
V. Feldman, P. Gopalan, S. Khot, and A. K. Ponnuswami, New results for learning noisy parities and halfspaces, in Proc. 47th IEEE Symp. on Foundations of Computer Science (FOCS'06), 2006.
[7]
A. Flaxman, A. T. Kalai, and H. B. McMahan, Online convex optimization in the bandit setting: gradient descent without a gradient, in ACM Symposium on Discrete Algorithms (SODA'05), 2005, pp. 385--394.
[8]
A. C. Gilbert, S. Guha, P. Indyk, S. Muthukrishnan, and M. Strauss, Near-optimal sparse Fourier representations via sampling, in Proc. 34th Ann. ACM Symp. on Theory of Computing (STOC'02), 2002, pp. 152--161.
[9]
A. C. Gilbert, M. J. Strauss, J. A. Tropp, and R. Vershynin, One sketch for all: Fast algorithms for compressed sensing, in Proc. 39th ACM Symposium on the Theory of Computing (STOC'07), 2007.
[10]
O. Goldreich and L. Levin, A hard-core predicate for all one-way functions, in Proc. 21st ACM Symp. on the Theory of Computing (STOC'89), 1989, pp. 25--32.
[11]
J. Jackson, The Harmonic sieve: a novel application of Fourier analysis to machine learning theory and practice, PhD thesis, Carnegie Mellon University, August 1995.
[12]
J. C. Jackson, Uniform-distribution learnability of noisy linear threshold functions with restricted focus of attention, in Proc. Conf. on Learning Theory (COLT'06), 2006, pp. 304--318.
[13]
A. T. Kalai, A. R. Klivans, Y. Mansour, and R. Servedio, Agnostically learning halfspaces, in Proc. 46th IEEE Symp. on Foundations of Computer Science (FOCS'05), 2005.
[14]
M. Kearns, R. Schapire, and L. Sellie, Toward Efficient Agnostic Learning, Machine Learning, 17 (1994), pp. 115--141.
[15]
E. Kushilevitz and Y. Mansour, Learning decision trees using the Fourier spectrum, SIAM Journal on Computing, 22(6) (1993), pp. 1331--1348.
[16]
N. Linial, Y. Mansour, and N. Nisan, Constant depth circuits, Fourier transform and learnability, Journal of the ACM, 40 (1993), pp. 607--620.
[17]
J. R. Quinlan, C4.5: Programs for Machine Learning, Morgan Kaufmann, 1992.
[18]
J. B. Rosen, The gradient projection method for nonlinear programming, Part I: Linear constraints, Journal of the Society for Industrial and Applied Mathematics, 8 (1960), pp. 181--217.
[19]
L. Valiant, A theory of the learnable, Communications of the ACM, 27 (1984), pp. 1134--1142.
[20]
M. Zinkevich, Online convex programming and generalized infinitesimal gradient ascent, in Proc. 20th Intl. Conf. on Machine Learning (ICML'03), 2003, pp. 928--936.

Cited By

  • (2024) Agnostically Learning Multi-Index Models with Queries. 2024 IEEE 65th Annual Symposium on Foundations of Computer Science (FOCS), pp. 1931--1952. DOI: 10.1109/FOCS61266.2024.00116
  • (2024) Fast Decision Tree Learning Solves Hard Coding-Theoretic Problems. 2024 IEEE 65th Annual Symposium on Foundations of Computer Science (FOCS), pp. 1893--1910. DOI: 10.1109/FOCS61266.2024.00114
  • (2024) A New Bound for the Fourier-Entropy-Influence Conjecture. Combinatorica, 45(1). DOI: 10.1007/s00493-024-00133-z
  • (2023) Lifting Uniform Learners via Distributional Decomposition. Proceedings of the 55th Annual ACM Symposium on Theory of Computing (STOC'23), pp. 1755--1767. DOI: 10.1145/3564246.3585212
  • (2023) A survey on the complexity of learning quantum states. Nature Reviews Physics, 6(1), pp. 59--69. DOI: 10.1038/s42254-023-00662-4
  • (2022) Properly Learning Decision Trees in almost Polynomial Time. Journal of the ACM, 69(6), pp. 1--19. DOI: 10.1145/3561047
  • (2022) Sharper bounds on the Fourier concentration of DNFs. 2021 IEEE 62nd Annual Symposium on Foundations of Computer Science (FOCS), pp. 930--941. DOI: 10.1109/FOCS52979.2021.00094
  • (2022) Properly learning decision trees in almost polynomial time. 2021 IEEE 62nd Annual Symposium on Foundations of Computer Science (FOCS), pp. 920--929. DOI: 10.1109/FOCS52979.2021.00093
  • (2020) Provable guarantees for decision tree induction. Proceedings of the 37th International Conference on Machine Learning (ICML'20), pp. 941--949.
  • (2020) Universal guarantees for decision tree induction via a higher-order splitting criterion. Proceedings of the 34th International Conference on Neural Information Processing Systems (NeurIPS'20), pp. 9475--9484.


    Published In

    STOC '08: Proceedings of the fortieth annual ACM symposium on Theory of computing
    May 2008
    712 pages
    ISBN:9781605580470
    DOI:10.1145/1374376

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. agnostic learning
    2. decision trees
    3. learning in the presence of noise

    Qualifiers

    • Research-article

    Conference

STOC '08: Symposium on Theory of Computing
May 17 - 20, 2008
Victoria, British Columbia, Canada

    Acceptance Rates

    STOC '08 Paper Acceptance Rate 80 of 325 submissions, 25%;
    Overall Acceptance Rate 1,469 of 4,586 submissions, 32%


