
Foundations of Data Quality Management - (Synthesis Lectures on Data Management) by Wenfei Fan & Floris Geerts (Paperback)

Store: Target
Last Price: 40.49 USD


Product info

Book Synopsis

Data quality is one of the most important problems in data management. A database system typically aims to support the creation, maintenance, and use of large amounts of data, focusing on the quantity of data. However, real-life data are often dirty: inconsistent, duplicated, inaccurate, incomplete, or stale. Dirty data in a database routinely generate misleading or biased analytical results and decisions, and lead to losses of revenue, credibility, and customers. With this comes the need for data quality management. In contrast to traditional data management tasks, data quality management aims to detect and correct errors in the data, whether syntactic or semantic, in order to improve the quality of the data and hence add value to business processes.

This monograph gives an overview of the fundamental issues underlying central aspects of data quality, namely data consistency, deduplication, accuracy, currency, and information completeness. We promote a uniform logical framework for dealing with these issues, based on data quality rules. The text is organized into seven chapters, focusing on relational data. Chapter 1 introduces data quality issues. Chapter 2 develops a conditional dependency theory for capturing data inconsistencies. Chapter 3 follows with practical techniques for discovering conditional dependencies, and for detecting inconsistencies and repairing data based on them. Chapter 4 introduces matching dependencies as matching rules for data deduplication. Chapter 5 studies a theory of relative information completeness, revising the classical Closed World Assumption and the Open World Assumption to characterize incomplete information in the real world. Chapter 6 presents a data currency model for identifying the current values of entities in a database and answering queries with those values, in the absence of reliable timestamps. Finally, Chapter 7 explores the interactions between these data quality issues.

Important theoretical results and practical algorithms are covered, but formal proofs are omitted. The bibliographical notes contain pointers to the papers in which the results were presented and proved, as well as references to materials for further reading. This text is intended for a seminar course at the graduate level, and it also serves as a useful resource for researchers and practitioners interested in the study of data quality. The fundamental research on data quality draws on several areas, including mathematical logic, computational complexity, and database theory. It has raised as many questions as it has answered, and is a rich source of questions and vitality.
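The synopsis centers on rule-based detection of dirty data, such as the conditional dependencies of Chapters 2 and 3. As an illustration only (the rule, the sample records, and the code below are hypothetical and not taken from the book), the following Python sketch flags tuples that violate a conditional functional dependency of the form "within records where country is UK, zip code determines city":

```python
from collections import defaultdict

# Hypothetical dirty relation; "Bob" disagrees with "Alice" on city for the same UK zip.
records = [
    {"name": "Alice", "country": "UK", "zip": "EH8 9AB", "city": "Edinburgh"},
    {"name": "Bob",   "country": "UK", "zip": "EH8 9AB", "city": "London"},
    {"name": "Carol", "country": "US", "zip": "02139",   "city": "Cambridge"},
]

def cfd_violations(rows, condition, lhs, rhs):
    """Group the rows that satisfy `condition` by their `lhs` value and
    return every group whose `rhs` values disagree, i.e. the groups that
    violate the dependency lhs -> rhs on the conditioned subset."""
    groups = defaultdict(list)
    for row in rows:
        if condition(row):
            groups[row[lhs]].append(row)
    return [grp for grp in groups.values() if len({r[rhs] for r in grp}) > 1]

# Conditional functional dependency (hypothetical): for UK records, zip -> city.
for group in cfd_violations(records, lambda r: r["country"] == "UK", "zip", "city"):
    print("Inconsistent tuples:", [r["name"] for r in group])
```

Running the sketch prints the conflicting pair (Alice and Bob); in the framework the book describes, such violations would then be candidates for repair rather than simply reported.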
