
Kernelizing the output of tree-based methods

Published: 25 June 2006
DOI: 10.1145/1143844.1143888

ABSTRACT

We extend tree-based methods to the prediction of structured outputs using a kernelization of the algorithm that allows one to grow trees whenever a kernel can be defined on the output space. The resulting algorithm, called output kernel trees (OK3), generalizes classification and regression trees as well as tree-based ensemble methods in a principled way. It inherits several features of these methods, such as interpretability, robustness to irrelevant variables, and input scalability. When only the Gram matrix over the outputs of the learning sample is given, it learns the output kernel as a function of inputs. We show that the proposed algorithm works well on an image reconstruction task and on a biological network inference problem.
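To make the kernelization concrete, the sketch below shows the kind of split criterion such a tree can use: the variance of the outputs in the feature space induced by the output kernel, which, by the kernel trick, is computable from the Gram matrix alone, without explicit output feature vectors. This is a minimal illustration of the idea, not the authors' implementation; the function names and the NumPy-based interface are assumptions.

```python
import numpy as np

def output_kernel_variance(K, idx):
    """Mean squared distance of the outputs {y_i : i in idx} to their
    centre in the feature space induced by the output kernel k, using
    only the Gram matrix K (K[i, j] = k(y_i, y_j)):
        (1/n) * sum_i k(y_i, y_i) - (1/n^2) * sum_{i,j} k(y_i, y_j).
    """
    Ks = K[np.ix_(idx, idx)]
    n = len(idx)
    return np.trace(Ks) / n - Ks.sum() / n**2

def split_score(K, left, right):
    """Reduction in kernelized output variance obtained by splitting a
    node into the index sets `left` and `right` (larger is better)."""
    parent = np.concatenate([left, right])
    n, nl, nr = len(parent), len(left), len(right)
    return (output_kernel_variance(K, parent)
            - (nl / n) * output_kernel_variance(K, left)
            - (nr / n) * output_kernel_variance(K, right))

# Toy check with a linear output kernel over 2-D output vectors: the
# split separating the two output clusters scores higher than a mixed one.
Y = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [1.1, 1.0]])
K = Y @ Y.T
print(split_score(K, np.array([0, 1]), np.array([2, 3])))  # clean split
print(split_score(K, np.array([0, 2]), np.array([1, 3])))  # mixed split
```

Swapping the linear kernel for any other positive semi-definite kernel on the output space (e.g., a diffusion kernel over the nodes of a biological network) leaves the tree-growing code unchanged, which is the point of working through the Gram matrix.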


Published in

ICML '06: Proceedings of the 23rd International Conference on Machine Learning
June 2006, 1154 pages
ISBN: 1595933832
DOI: 10.1145/1143844
Copyright © 2006 ACM


Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

ICML '06 paper acceptance rate: 140 of 548 submissions, 26%
Overall acceptance rate: 140 of 548 submissions, 26%
