Listing: Datenbank-Spektrum 13(3) - November 2013, sorted by publication date
1 - 10 of 11
- Journal article: Cloud Data Management for Online Games: Potentials and Open Issues (Datenbank-Spektrum: Vol. 13, No. 3, 2013) Diao, Ziqiang; Schallehn, Eike; Wang, Shuo; Mohammad, Siba. The number of players of massively multiplayer online role-playing games (MMORPGs) typically reaches millions of people, geographically distributed throughout the world. Worldwide revenues for these games increase by billions of dollars each year. Unfortunately, their complex architecture makes them hard to maintain, resulting in considerable costs and development risks. For normal operation, MMORPGs have to access huge amounts of diverse data. With increasing numbers of players, managing the growing volume of data in a relational database becomes a big challenge, which cannot be overcome by simply adding new servers. Cloud storage systems are emerging solutions focused on providing scalability and high performance for Cloud applications, social media, etc. However, Cloud storage systems are in general not designed for processing transactions or providing high levels of consistency. In this paper, we present our current work in progress by analyzing the existing architecture of MMORPGs and classifying the relevant data. Based on this, we highlight the design requirements, identify the major research challenges, and propose a Cloud-based model for MMORPGs that we are currently implementing as a testbed for further evaluation.
- Journal article: Vorstellung des Lehrstuhls für Datenbanksysteme der TUM (Datenbank-Spektrum: Vol. 13, No. 3, 2013) Kemper, Alfons; Neumann, Thomas. The Chair for Database Systems at Technische Universität München focuses its research on the development, optimization, and application of "modern" database technology. In recent years, its main areas of work have been multi-tenancy-capable databases, eScience database applications, workload management for heterogeneous applications, RDF databases, and, in particular, main-memory databases for hybrid OLTP & OLAP applications.
- Journal article: „Gib mir so viel Gold, wie die Metzger im Nachbardorf zusammen besitzen und ich lasse den Piloten frei!“ – Spielbasiertes Lernen von SQL-Grundlagen (Datenbank-Spektrum: Vol. 13, No. 3, 2013) Schildgen, Johannes; Deßloch, Stefan. Imagine landing on a deserted island whose inhabitants understand only the language SQL. The game SQL Island (http://www.sql-island.de) teaches and exercises SQL basics and is controlled by entering SQL queries. Its purpose is to show the player, in an entertaining way, how data in relational databases can be queried and manipulated. No prior SQL knowledge is required.
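The kind of query such a game drills can be sketched with Python's built-in sqlite3 module; the table, names, and values below are invented for illustration and are not taken from SQL Island itself:

```python
import sqlite3

# Tiny in-memory database resembling a game scenario
# (table name and contents are hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE villager (name TEXT, job TEXT, gold INTEGER)")
conn.executemany(
    "INSERT INTO villager VALUES (?, ?, ?)",
    [("Alrik", "butcher", 30), ("Berta", "baker", 12), ("Curd", "butcher", 25)],
)

# A SELECT with aggregation, the kind of SQL basics the game exercises:
# total gold held by all butchers.
total = conn.execute(
    "SELECT SUM(gold) FROM villager WHERE job = 'butcher'"
).fetchone()[0]
print(total)  # 55
```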
- Journal article: Database and Data Management Requirements for Equalization of Contactless Acquired Traces for Forensic Purposes—Provenance and Performance (Datenbank-Spektrum: Vol. 13, No. 3, 2013) Kirst, Stefan; Schäler, Martin. The importance of fingerprints and microtraces within the field of criminalistics and forensics is well known. An upcoming field is the contactless acquisition of traces, because it preserves the integrity of the traces (Hildebrandt et al., MM&Sec, pp. 1–8, 2011). A further issue of such an acquisition method is the potential presence of perspective distortions, which we already began to address in Kirst et al. (SPIE 8546 Conf., pp. 0A/1–0A/12, 2012). For a productive use of contactless acquisition methods, preprocessing steps such as equalization become necessary. In this paper, we present requirements for an underlying database and database management system to support the methods of Kirst et al. (SPIE 8546 Conf., pp. 0A/1–0A/12, 2012) as a potential real-case scenario. In doing so, we point out possible starting points for parallelization and evaluate their benefit. Finally, we integrate an approach to ensure the chain of custody by means of provenance, which is essential for any forensic investigation, and evaluate its effect on the overall system performance.
- Journal article: JEPC: The Java Event Processing Connectivity (Datenbank-Spektrum: Vol. 13, No. 3, 2013) Hoßbach, Bastian; Glombiewski, Nikolaus; Morgen, Andreas; Ritter, Franz; Seeger, Bernhard. Today, event processing (EP) is the first-choice technology for analyzing massive event streams in a timely manner. EP makes it possible to detect user-defined situations of interest, for example in streams of position events, in near real time, so that actions can be taken immediately. Unfortunately, each specific EP system has its very own API and query language, because there are no standards. Exchanging EP systems, as well as using them within a federation, is therefore challenging, error-prone, and expensive. To overcome these problems, we introduce the Java Event Processing Connectivity (JEPC), a middleware for uniform EP functionality in Java. JEPC always provides the same API and query language for EP, completely independent of the EP system beneath. Furthermore, we show in detail how JEPC can integrate database systems besides EP systems, and we evaluate the performance of EP powered by database systems.
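The notion of a user-defined situation over a stream of position events can be illustrated in a few lines of plain Python; this is a conceptual sketch only, not JEPC's actual API, and the event fields and zone predicate are invented:

```python
from dataclasses import dataclass
from typing import Iterable, Iterator

@dataclass
class PositionEvent:
    object_id: str
    x: float
    y: float
    ts: int  # timestamp

def inside_zone(events: Iterable[PositionEvent],
                x_min: float, x_max: float,
                y_min: float, y_max: float) -> Iterator[PositionEvent]:
    """Emit every position event inside a rectangular zone, the kind of
    user-defined situation an EP query language would express."""
    for e in events:
        if x_min <= e.x <= x_max and y_min <= e.y <= y_max:
            yield e

stream = [
    PositionEvent("a", 1.0, 1.0, 1),
    PositionEvent("b", 9.0, 9.0, 2),
    PositionEvent("a", 2.0, 3.0, 3),
]
hits = list(inside_zone(stream, 0, 5, 0, 5))
print([e.ts for e in hits])  # [1, 3]
```

In a real EP system the same predicate would be registered once as a continuous query and evaluated as events arrive, rather than over a finished list.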
- Journal article: An Interactive System for Visual Analytics of Dynamic Topic Models (Datenbank-Spektrum: Vol. 13, No. 3, 2013) Günnemann, Nikou; Derntl, Michael; Klamma, Ralf; Jarke, Matthias. The vast amount and rapid growth of data on the Web and in document repositories make knowledge extraction and trend analysis a challenging task. A well-proven approach for the unsupervised analysis of large text corpora is dynamic topic modeling. While there is a solid body of research on the fundamentals and applications of this technique, visual-interactive analysis systems that allow end-users to perform analysis tasks using topic models are still rare. In this paper, we present D-VITA, an interactive text analysis system that exploits dynamic topic modeling to detect the latent topic structure and dynamics in a collection of documents. D-VITA supports end-users in understanding and exploiting the topic modeling results by providing interactive visualizations of the topic evolution in document collections and by enabling document browsing based on keyword search and on the similarity of topic distributions. The system was evaluated by a scientific community that used D-VITA for trend analysis in its data sources. The results indicate high usability of D-VITA and its usefulness for productive analysis tasks.
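A full dynamic topic model is beyond a few lines, but the trend-analysis idea behind it, how strongly a theme is represented in each time period, can be sketched with a crude term-frequency stand-in; the corpus and the tracked term below are invented:

```python
# Toy corpus of (year, text) pairs; data is invented for illustration.
docs = [
    (2011, "database cloud storage"),
    (2011, "database query optimization"),
    (2012, "cloud storage scalability"),
    (2012, "cloud consistency model"),
]

def term_trend(docs, term):
    """Relative frequency of `term` per year, a crude stand-in for the
    per-period topic proportions a dynamic topic model would estimate."""
    per_year = {}
    for year, text in docs:
        words = text.split()
        hits, total = per_year.get(year, (0, 0))
        per_year[year] = (hits + words.count(term), total + len(words))
    return {y: h / t for y, (h, t) in sorted(per_year.items())}

trend = term_trend(docs, "cloud")
# The term's share grows from 2011 to 2012, i.e. an upward trend.
print(trend)
```

A system like D-VITA would instead estimate full topic distributions per period and visualize their evolution interactively; the counting above only conveys the shape of the output.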
- Journal article: On the Integration of Electrical/Electronic Product Data in the Automotive Domain (Datenbank-Spektrum: Vol. 13, No. 3, 2013) Tiedeken, Julian; Reichert, Manfred; Herbst, Joachim. The recent innovation of modern cars has mainly been driven by the development of new, as well as the continuous improvement of existing, electrical and electronic (E/E) components, including sensors, actuators, and electronic control units. This trend has been accompanied by an increasing complexity of E/E components and their numerous interdependencies. In addition, external impact factors (e.g., changes of regulations, product innovations) demand more sophisticated E/E product data management (E/E-PDM). Since E/E product data is usually scattered over a large number of distributed, heterogeneous IT systems, application-spanning use cases are difficult to realize (e.g., ensuring the consistency of artifacts corresponding to different development phases, or the plausibility of logical connections between electronic control units). To tackle this challenge, a partial integration of E/E product data as well as of the corresponding schemas becomes necessary. This paper presents the properties of a typical IT system landscape related to E/E-PDM, reveals challenges emerging in this context, and elicits requirements for E/E-PDM. Based on this, insights into our framework, which targets the partial integration of E/E product data, are given. Such an integration will foster the consistency of E/E product data and hence contribute to improved E/E product quality.
- Journal article: Editorial (Datenbank-Spektrum: Vol. 13, No. 3, 2013) Härder, Theo
- Journal article: News (Datenbank-Spektrum: Vol. 13, No. 3, 2013)
- Journal article: Möglichkeiten und Konzepte zur XML-Schemavalidierung am Beispiel von DB2 for z/OS V9.1 (Datenbank-Spektrum: Vol. 13, No. 3, 2013) Koch, Christoph. IBM's relational database management system (DBMS) DB2 for z/OS V9.1 enables the native storage and processing of Extensible Markup Language (XML) data through its implementation of the pureXML technology (IBM Corporation 2012). This includes mechanisms for handling XML schemas, the de facto standard instrument for data modeling and integrity enforcement in the XML context. Following (Koch 2012), this article reviews the DB2 functionality for XML schema validation against a previously developed requirements profile. Building on this, two concepts for XML schema validation are presented, subsequent schema validation and automatic schema validation, whose implementation allows the DB2 functionality to be extended in a targeted way according to the requirements. Finally, the article's findings are summarized with respect to the question of how far database-side XML schema validation is already worthwhile with DB2 for z/OS V9.1.
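DB2's own mechanism validates documents against a registered XML schema; Python's standard library cannot check XSD conformance, but the workflow of subsequent validation (store first, check later, flag failures) can be sketched with a well-formedness check as a stand-in. Document IDs and contents below are invented:

```python
import xml.etree.ElementTree as ET

def is_acceptable(xml_text: str) -> bool:
    """Stand-in check: real subsequent validation would test the document
    against a registered XML schema; here we only test well-formedness."""
    try:
        ET.fromstring(xml_text)
        return True
    except ET.ParseError:
        return False

# "Subsequent validation": documents were stored first, checked afterwards.
stored_docs = {
    1: "<order><item>book</item></order>",
    2: "<order><item>book</order>",  # broken: mismatched tags
}
invalid_ids = [doc_id for doc_id, text in stored_docs.items()
               if not is_acceptable(text)]
print(invalid_ids)  # [2]
```

Automatic schema validation, the second concept named in the abstract, would run the same check inside the insert path (e.g., via a trigger) instead of in a later sweep.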