Performance Test Automation with Distributed Database Systems

Mungamuru Nirmala ¹
Dr. R. Mahammad Shafi ²

¹ Ministry of Education - NBHE

GJCST Volume 12 Issue B12

Research ID: CSTB93635

Abstract

Our previous paper, 'A Focus on Testing Issues in Distributed Database Systems', concluded that Distributed Database Systems support many good engineering practices but still leave room for refinement. A Distributed Database (DDB) is a collection of multiple, logically inter-related databases spread over a computer network. Besides managing a plethora of complicated tasks, database management systems must also be efficient in terms of concurrency, reliability, fault tolerance and performance. With the paradigm shift from centralized to distributed databases, a testing process applied to a DDB spans a series of stages in the construction of a DDB project from scratch and is typically employed in homogeneous systems. In this paper, we describe how to establish performance testing for DDB systems. We focus on the need to maintain performance and on some techniques for achieving it in DDB systems. Three sample web-based systems are tested with TestMaker, an open-source tool, to highlight the role of performance in the context of testing. The strengths and weaknesses of the chosen performance testing tools, namely TestMaker, OpenSTA, and httperf, are discussed.
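The kind of measurement tools such as httperf, OpenSTA, and TestMaker perform can be sketched with a minimal load driver. The sketch below is illustrative only: it uses Python's standard library, a throwaway local server standing in for a DDB's web front end, and hypothetical names (`start_test_server`, `run_load`, port 8099) that are not part of the paper's setup. It fires a batch of concurrent HTTP GETs and reports throughput and mean latency, the two headline numbers such tools produce.

```python
# Minimal load-driver sketch (illustrative; not the paper's actual test harness).
import http.server
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def start_test_server(port=8099):
    """Start a throwaway HTTP server in a daemon thread (stand-in for a web front end)."""
    handler = http.server.SimpleHTTPRequestHandler
    server = http.server.ThreadingHTTPServer(("127.0.0.1", port), handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

def run_load(url, num_requests=50, concurrency=5):
    """Issue num_requests GETs at the given concurrency.

    Returns (throughput in requests/s, mean per-request latency in seconds).
    """
    def one_request(_):
        t0 = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()
        return time.perf_counter() - t0

    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = list(pool.map(one_request, range(num_requests)))
    elapsed = time.perf_counter() - start
    return num_requests / elapsed, sum(latencies) / len(latencies)

if __name__ == "__main__":
    server = start_test_server()
    rps, mean_lat = run_load("http://127.0.0.1:8099/", num_requests=20, concurrency=4)
    server.shutdown()
    print(f"throughput: {rps:.1f} req/s, mean latency: {mean_lat * 1000:.1f} ms")
```

A dedicated tool adds what this sketch omits: controlled connection rates, latency percentiles, error accounting, and sustained-load scheduling, which is why the paper compares purpose-built tools rather than ad-hoc scripts.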

Cited by 13 articles

References

  1. David Mosberger, Tai Jin (1998). httperf: a tool for measuring web server performance.
  2. B. Neuman. Scale in Distributed Systems.
  3. R. Shafi, Dr. Kavitha. A Framework for Designing and Testing a Distributed Database System.
  4. A. Tanenbaum, M. Steen. Distributed Systems: Principles and Paradigms.
  5. W. Emmerich (1997). Distributed System Principles.
  6. Yuanling Zhu, Kevin Lü (2000). Performance Analysis of Web Database Systems.
  7. E. Codd (1970). A relational model of data for large shared data banks.
  8. Michael Stonebraker, Dorothy Moore (1996). Object-Relational DBMSs: The Next Great Wave.
  9. S. Bing Yao, Alan Hevner (1984). A Guide to Performance Evaluation of Database Systems.
  10. Haran Boral, David DeWitt (1984). A methodology for database system performance evaluation.
  11. Peter Harrison, Catalina Llado (2000). Performance Evaluation of a Distributed Enterprise Data Mining System.
  12. C. Date (1990). What is a Distributed Database System?
  13. Stefano Ceri, Giuseppe Pelagatti (1984). Distributed Databases: Principles and Systems.

Funding

No external funding was declared for this work.

Conflict of Interest

The authors declare no conflict of interest.

Ethical Approval

No ethics committee approval was required for this article type.

Data Availability

Not applicable for this article.

Mungamuru Nirmala. 2012. "Performance Test Automation with Distributed Database Systems". Global Journal of Computer Science and Technology - B: Cloud & Distributed, Volume 12, Issue B12, pp. 9-14.

GJCST Volume 12 Issue B12, pp. 9-14
Journal Specifications

Crossref Journal DOI 10.17406/gjcst

Print ISSN 0975-4350

e-ISSN 0975-4172

Version of record

v1.2

Issue date

December 28, 2012

Language

English
