Database Benchmark

From Geoinformation HSR
Synonym: Database Contest.

See also: "Performance-Vergleich von PostgreSQL, SQLite, db4o und MongoDB" [http://wiki.hsr.ch/Datenbanken/wiki.cgi?SeminarDatenbanksystemeHS1112 Seminar Database Systems Autumn 2011/2012], Master of Science in Engineering, HSR.
  
== Spatial Database Benchmarks ==

* [[HSR Texas Geo Database Benchmark]]
** Comparing PostGIS, SpatiaLite and Geocouch.
** Comparing PostGIS with Lucene/Solr [http://wiki.hsr.ch/Datenbanken/SeminarDBS1ThemaWolski Seminar Database Systems Autumn 2013/2014], Master of Science in Engineering, HSR.
* Spatial overlay/clipping:
** "ArcGIS vs QGIS etc Clipping Contest Rematch revisited": [http://courses.neteler.org/arcgis-vs-qgis-etc-clipping-contest-rematch-revisited/]
** Clipping contest with SpatiaLite: [https://www.gaia-gis.it/fossil/libspatialite/wiki?name=benchmark-4.0]
  
== About Database Performance Benchmarking ==

Existing DB benchmarks:
* TPC-C for OLTP benchmarks [https://en.wikipedia.org/wiki/Transaction_Processing_Performance_Council].
** TPC-R & TPC-H (formerly TPC-D) for data warehouse & decision support systems.
** TPC-W benchmark for web-based systems.
* "The Engineering Database Benchmark" [http://research.microsoft.com/en-us/um/people/gray/benchmarkhandbook/chapter7.pdf].
* "Open Source Database Benchmark" [http://osdb.sourceforge.net/].
* PolePosition open source database benchmark [http://www.polepos.org/].
  
== Guidelines ==

A database performance benchmark has to consider the following aspects:
# Cold versus warm start (beware that with a warm start, caching will take place!).
# Equality and range queries.
# Query result sets that return a single tuple versus those that return more than half of the tuples in the dataset.
# Single-user versus multi-user operation.
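The first three aspects above can be sketched in a few lines of Python. This is a minimal illustration only, not one of the HSR benchmark scripts: the table name, row count and value distribution are made up, and SQLite is used here purely so the sketch is self-contained (an in-memory database, so "cold" versus "warm" reflects cache and statement-reuse effects rather than disk I/O).

```python
# Minimal sketch (hypothetical schema and sizes) of benchmark aspects 1-3:
# cold vs. warm start, equality vs. range queries, small vs. large result sets.
import sqlite3
import time

def timed(cur, sql, params=()):
    """Run a query, return (elapsed seconds, number of rows fetched)."""
    t0 = time.perf_counter()
    rows = cur.execute(sql, params).fetchall()
    return time.perf_counter() - t0, len(rows)

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val INTEGER)")
cur.executemany("INSERT INTO t (val) VALUES (?)",
                [(i % 1000,) for i in range(100_000)])
conn.commit()

# Aspect 1: first ("cold") run vs. repeated ("warm") run of the same query.
cold, _ = timed(cur, "SELECT * FROM t WHERE val = ?", (42,))
warm, _ = timed(cur, "SELECT * FROM t WHERE val = ?", (42,))

# Aspects 2/3: equality query returning one tuple vs. range query
# returning more than half of the tuples (val < 600 matches 60%).
_, n_eq = timed(cur, "SELECT * FROM t WHERE id = ?", (1,))
_, n_range = timed(cur, "SELECT * FROM t WHERE val < ?", (600,))

print(f"cold {cold:.4f}s, warm {warm:.4f}s, "
      f"equality rows {n_eq}, range rows {n_range}")
```

For aspect 4 (multi-user), the same queries would be issued from several concurrent connections, which this single-connection sketch does not cover.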
Software (scripts) for benchmark automation:
* [http://wiki.hsr.ch/Datenbanken/files/db-benchmark_mott.zip PostgreSQL hstore Benchmark] - Benchmarking in Python by Michel Ott, 2011.
* [http://www.postgresql.org/docs/devel/static/pgbench.html pgbench] - Benchmark tool for PostgreSQL.
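The core loop of such an automation tool can be sketched as follows: run a fixed workload for a given duration and report transactions per second, which is roughly what pgbench does. This is a hypothetical harness for illustration only (the `accounts` table and `run_benchmark` function are invented here), again using SQLite so the sketch runs stand-alone; a real run would target PostgreSQL.

```python
# Tiny benchmark-automation sketch in the spirit of pgbench: execute simple
# update transactions for a fixed duration and report throughput.
# Hypothetical harness for illustration, not pgbench itself.
import random
import sqlite3
import time

def run_benchmark(conn, duration_s=0.5):
    """Run single-row update transactions for duration_s; return (count, tps)."""
    cur = conn.cursor()
    done = 0
    t0 = time.perf_counter()
    while time.perf_counter() - t0 < duration_s:
        acct = random.randrange(1, 101)
        cur.execute("UPDATE accounts SET balance = balance + 1 WHERE id = ?",
                    (acct,))
        conn.commit()  # each update is its own transaction
        done += 1
    elapsed = time.perf_counter() - t0
    return done, done / elapsed

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, 0)",
                 [(i,) for i in range(1, 101)])
conn.commit()

count, tps = run_benchmark(conn)
print(f"{count} transactions, {tps:.0f} tps")
```

pgbench itself follows the same pattern against a real PostgreSQL server: initialize with `pgbench -i`, then run e.g. `pgbench -c 10 -T 60` for a 60-second run with 10 client connections.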
== Weblinks / References ==

* "The Engineering Database Benchmark", Rick Cattell, [http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.89.1518&rep=rep1&type=pdf PDF].
* [http://osdb.sourceforge.net/ Open Source Database Benchmark]
* http://www.delicious.com/sfkeller/database+benchmark and http://www.delicious.com/tag/database,benchmark/alltime

Current revision as of 16 September 2015, 01:44.
