DOI: 10.1145/1958746.1958796

Automatic estimation of performance requirements for software tasks of mobile devices

Published: 30 September 2011

ABSTRACT

This paper introduces a new method to predict the performance requirements of software tasks on mobile devices using system models that describe the hardware and software. With the help of clustering algorithms and linear regression, behavioral models of the software tasks are generated automatically. These models are used to project the runtime of representative parts of the software tasks. The runtime of these representative execution parts is determined with instruction-accurate simulations, which are not feasible for whole executions. The inputs for the projection are a model of the hardware platform and input data parameters, especially the data size. A major advantage of this approach is that developers do not have to estimate the performance requirements themselves; in this way the method helps to seamlessly integrate performance analysis into the development process. The paper presents the ideas in detail and evaluates the proposed method on typical software tasks of mobile devices.
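To make the projection step concrete, the following is a minimal sketch of the general idea, assuming k-means clustering over hypothetical per-interval execution profiles and an ordinary least-squares fit of runtime against input data size. The abstract does not specify the exact algorithms or data formats, so every name, array, and value below is illustrative rather than the paper's implementation.

```python
# Illustrative sketch (not the paper's implementation): cluster
# execution-profile vectors of a software task, pick one representative
# interval per cluster for expensive instruction-accurate simulation,
# then fit a linear model of runtime versus input data size.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

# Hypothetical per-interval profile vectors (e.g., basic-block
# execution counts), one row per execution interval.
rng = np.random.default_rng(0)
profiles = rng.random((200, 32))

# Group similar intervals; only the cluster representatives would need
# the costly instruction-accurate simulation.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(profiles)
representatives = []
for c in range(kmeans.n_clusters):
    members = np.where(kmeans.labels_ == c)[0]
    dists = np.linalg.norm(
        profiles[members] - kmeans.cluster_centers_[c], axis=1)
    # The interval closest to the centroid stands in for its cluster.
    representatives.append(members[np.argmin(dists)])

# Hypothetical simulation results: runtime of a representative part
# measured for several input data sizes.
sizes = np.array([[64], [128], [256], [512]])   # e.g., input size in KiB
runtimes = np.array([1.9, 3.8, 7.5, 15.1])      # e.g., megacycles

# Linear regression projects the runtime for an unseen input size.
model = LinearRegression().fit(sizes, runtimes)
print(model.predict(np.array([[1024]])))        # projected runtime
```

Under these assumptions, the whole-execution estimate would combine the projected runtimes of the representatives, weighted by the sizes of their clusters.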
