DOI: 10.1145/3149572.3149585

MTCB: A Multi-Tenant Customizable database Benchmark

Published: 09 October 2017

Abstract

We argue that there is a need for Multi-Tenant Customizable OLTP systems. Such systems need a Multi-Tenant Customizable Database (MTC-DB) as a backing store. To stimulate the development of such databases, we propose the benchmark MTCB. Benchmarks for OLTP exist and multi-tenant benchmarks exist, but no benchmark exists for MTC-DBs that accounts for customizability. We formulate seven requirements for the benchmark: it must be realistic, unambiguous, comparable, correct, scalable, simple and independent. The benchmark focuses on performance aspects and produces nine metrics: Aulbach compliance, size on disk, tenants created, types created, attributes created, transaction data type instances created per minute, transaction data type instances loaded by ID per minute, conjunctive searches per minute and disjunctive searches per minute. We present a specification and an example implementation in Java 8, which can be accessed in this public repository: https://bitbucket.org/actfact/mtcdb-benchmark. The same repository also contains a naive implementation of an MTC-DB in which each tenant has its own schema. We believe that this benchmark is a valuable contribution to the community of MTC-DB developers, because it provides objective comparability as well as a precise definition of the concept of an MTC-DB.
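
The throughput-style metrics above (instances created, loaded and searched per minute) all reduce to counting completed operations over a fixed measurement window. The Java 8 sketch below illustrates one way such a per-minute figure could be computed; the class and method names are illustrative only and are not taken from the MTCB repository.

    import java.util.concurrent.TimeUnit;

    // Minimal sketch of a per-minute throughput metric in the spirit of the
    // benchmark's "per minute" metrics. Names are illustrative, not from MTCB.
    public class ThroughputSketch {

        // Runs the workload repeatedly for the measurement window and reports
        // completed operations per minute.
        static double operationsPerMinute(Runnable operation, long windowMillis) {
            long deadline = System.nanoTime() + TimeUnit.MILLISECONDS.toNanos(windowMillis);
            long completed = 0;
            while (System.nanoTime() < deadline) {
                operation.run();   // e.g. create one transaction data type instance
                completed++;
            }
            return completed / (windowMillis / 60_000.0);
        }

        public static void main(String[] args) {
            // Placeholder workload; a real run would issue a tenant-scoped
            // operation against the MTC-DB under test.
            Runnable dummyInsert = () -> { };
            System.out.printf("~%.0f operations/minute%n",
                    operationsPerMinute(dummyInsert, 5_000));
        }
    }

In a real measurement the Runnable would wrap a call against the MTC-DB under test (for example, a tenant-scoped insert), and the window would be chosen long enough to amortise warm-up effects.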

References

[1] S. Aulbach, T. Grust, D. Jacobs, A. Kemper, and J. Rittinger. 2008. Multi-Tenant Databases for Software as a Service: Schema-Mapping Techniques. In SIGMOD '08: Proceedings of the 2008 ACM SIGMOD International Conference on Management of Data. ACM, 1195--1206.
[2] C.-P. Bezemer and A. Zaidman. 2010. Challenges of Reengineering into Multi-Tenant SaaS Applications. Technical Report. Delft University of Technology, Software Engineering Research Group.
[3] S. Chen, A. Ailamaki, M. Athanassoulis, P. B. Gibbons, R. Johnson, I. Pandis, and R. Stoica. 2010. TPC-E vs. TPC-C: Characterizing the new TPC-E benchmark via an I/O comparison study. SIGMOD Record 39, 3 (2010), 5--10.
[4] W. R. Friedrich and J. A. Van Der Poll. 2007. Towards a methodology to elicit tacit domain knowledge from users. Interdisciplinary Journal of Information, Knowledge, and Management 2 (2007), 179--193.
[5] R. Krebs, A. Wert, and S. Kounev. 2013. Multi-tenancy performance benchmark for web application platforms. In Lecture Notes in Computer Science, Vol. 7977. 424--438.
[6] R. M. Locke. 2002. The Promise and Perils of Globalization: The Case of Nike. MIT Working Paper (2002). Retrieved 16 December 2016 from https://ipc.mit.edu/sites/default/files/documents/02-007.pdf
[7] Salesforce.com. 2008. The Force.com Multitenant Architecture: Understanding the Design of Salesforce.com's Internet Application Development Platform. (2008). Retrieved 4 January 2017 from http://www.developerforce.com/media/ForcedotcomBookLibrary/Force.com_Multitenancy_WP_101508.pdf
[8] Transaction Processing Performance Council. 2010. TPC Benchmark C Standard Specification, Revision 5.11. (2010). http://www.tpc.org/TPC_Documents_Current_Versions/pdf/tpc-c_v5.11.0.pdf
[9] Transaction Processing Performance Council. 2015. TPC Benchmark E Standard Specification, Version 1.14.0. (2015).
[10] L. Wevers. 2012. A Persistent Functional Language for Concurrent Transaction Processing. Master's thesis. University of Twente.
[11] R. J. Wieringa. 2014. Design Science Methodology for Information Systems and Software Engineering. Springer.

Cited By

  • (2019) A Versatile Framework for Painless Benchmarking of Database Management Systems. In Databases Theory and Applications. https://doi.org/10.1007/978-3-030-12079-5_4, 45--56. Online publication date: 23-Jan-2019.

Published In

ICIME 2017: Proceedings of the 9th International Conference on Information Management and Engineering
October 2017
233 pages
ISBN:9781450353373
DOI:10.1145/3149572
Publication rights licensed to ACM. ACM acknowledges that this contribution was authored or co-authored by an employee, contractor or affiliate of a national government. As such, the Government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.

In-Cooperation

  • University of Salford

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 09 October 2017

Author Tags

  1. Benchmark
  2. Database
  3. Multi-Tenant Customizable
  4. Multi-level customizability
  5. OLTP

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

ICIME 2017

Acceptance Rates

Overall Acceptance Rate 19 of 31 submissions, 61%

