Measuring Data Quality for Ongoing Improvement
Author: Laura Sebastian-Coleman
Publisher: Newnes
ISBN: 0123977541
Category : Computers
Languages : en
Pages : 404
Book Description
The Data Quality Assessment Framework shows you how to measure and monitor data quality, ensuring quality over time. You'll start with general concepts of measurement and work your way through a detailed framework of more than three dozen measurement types related to five objective dimensions of quality: completeness, timeliness, consistency, validity, and integrity. Ongoing measurement, rather than one-time activities, will help your organization reach a new level of data quality. This plain-language approach to measuring data can be understood by both business and IT, and it provides practical guidance on how to apply the DQAF within any organization, enabling you to prioritize measurements and effectively report on results. Strategies for using data measurement to govern and improve the quality of data, and guidelines for applying the framework within a data asset, are included. You'll come away able to prioritize which measurement types to implement, knowing where to place them in a data flow and how frequently to measure. Common conceptual models for defining and storing data quality results for trend analysis are also included, as well as generic business requirements for ongoing measuring and monitoring, including calculations and comparisons that make the measurements meaningful and help you understand trends and detect anomalies.
- Demonstrates how to leverage a technology-independent data quality measurement framework for your specific business priorities and data quality challenges
- Enables discussions between business and IT with a non-technical vocabulary for data quality measurement
- Describes how to measure data quality on an ongoing basis with generic measurement types that can be applied to any situation
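The dimensions named above lend themselves to simple, repeatable calculations. As a minimal, hedged sketch of the general idea (the field name, sample records, and threshold are invented for illustration, not taken from the book), a completeness measurement for one column can be expressed as the share of populated values in a batch, compared against an agreed expectation:

```python
# Minimal sketch of a column-completeness measurement. The records, field
# name, and threshold are hypothetical, invented purely for illustration.

def completeness_ratio(records, field):
    """Return the fraction of records in which `field` is populated."""
    if not records:
        return 0.0
    populated = sum(1 for r in records if r.get(field) not in (None, ""))
    return populated / len(records)

batch = [
    {"member_id": "A1", "birth_date": "1980-02-14"},
    {"member_id": "A2", "birth_date": ""},
    {"member_id": "A3", "birth_date": "1975-07-30"},
]

threshold = 0.95  # assumed expectation agreed with the data's consumers
ratio = completeness_ratio(batch, "birth_date")
print(f"birth_date completeness: {ratio:.1%} (threshold {threshold:.0%})")
if ratio < threshold:
    print("Below threshold: record the result and investigate.")
```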
Measuring Data Quality for Ongoing Improvement
Author: Laura Sebastian-Coleman
Publisher: Morgan Kaufmann
ISBN: 9780123970336
Category : Computers
Languages : en
Pages : 376
Book Description
The Data Quality Assessment Framework shows you how to measure and monitor data quality, ensuring quality over time. You'll start with general concepts of measurement and work your way through a detailed framework of more than three dozen measurement types related to five objective dimensions of quality: completeness, timeliness, consistency, validity, and integrity. Ongoing measurement, rather than one-time activities, will help your organization reach a new level of data quality. This plain-language approach to measuring data can be understood by both business and IT, and it provides practical guidance on how to apply the DQAF within any organization, enabling you to prioritize measurements and effectively report on results. Strategies for using data measurement to govern and improve the quality of data, and guidelines for applying the framework within a data asset, are included. You'll come away able to prioritize which measurement types to implement, knowing where to place them in a data flow and how frequently to measure. Common conceptual models for defining and storing data quality results for trend analysis are also included, as well as generic business requirements for ongoing measuring and monitoring, including calculations and comparisons that make the measurements meaningful and help you understand trends and detect anomalies.
- Demonstrates how to leverage a technology-independent data quality measurement framework for your specific business priorities and data quality challenges
- Enables discussions between business and IT with a non-technical vocabulary for data quality measurement
- Describes how to measure data quality on an ongoing basis with generic measurement types that can be applied to any situation
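The "calculations and comparisons" mentioned above usually amount to comparing each new measurement against its own stored history. A hedged sketch, assuming prior results are kept as a simple list of ratios (the sample values and the three-sigma convention are illustrative, not the book's prescribed method):

```python
# Sketch of comparing a new measurement against stored history. The values
# and the three-sigma convention are illustrative, not the book's method.
from statistics import mean, stdev

def is_anomalous(history, new_value, sigmas=3.0):
    """Flag new_value if it falls more than `sigmas` standard deviations
    from the mean of prior measurement results."""
    if len(history) < 2:
        return False  # too little history to judge
    mu, sd = mean(history), stdev(history)
    if sd == 0:
        return new_value != mu
    return abs(new_value - mu) > sigmas * sd

history = [0.97, 0.96, 0.98, 0.97, 0.96]  # prior completeness ratios
print(is_anomalous(history, 0.97))  # False: consistent with the trend
print(is_anomalous(history, 0.80))  # True: a drop worth investigating
```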
The Practitioner's Guide to Data Quality Improvement
Author: David Loshin
Publisher: Elsevier
ISBN: 0080920349
Category : Computers
Languages : en
Pages : 423
Book Description
The Practitioner's Guide to Data Quality Improvement offers a comprehensive look at data quality for business and IT, encompassing people, process, and technology. It shares the fundamentals for understanding the impacts of poor data quality, and guides practitioners and managers alike in socializing, gaining sponsorship for, planning, and establishing a data quality program. It demonstrates how to institute and run a data quality program, from first thoughts and justifications to maintenance and ongoing metrics. It includes an in-depth look at the use of data quality tools, including business case templates, and tools for analysis, reporting, and strategic planning. This book is recommended for data management practitioners, including database analysts, information analysts, data administrators, data architects, enterprise architects, data warehouse engineers, and systems analysts, and their managers.
- Offers a comprehensive look at data quality for business and IT, encompassing people, process, and technology
- Shows how to institute and run a data quality program, from first thoughts and justifications to maintenance and ongoing metrics
- Includes an in-depth look at the use of data quality tools, including business case templates, and tools for analysis, reporting, and strategic planning
Meeting the Challenges of Data Quality Management
Author: Laura Sebastian-Coleman
Publisher: Academic Press
ISBN: 0128217561
Category : Computers
Languages : en
Pages : 353
Book Description
Meeting the Challenges of Data Quality Management outlines the foundational concepts of data quality management and its challenges. The book enables data management professionals to help their organizations get more value from data by addressing the five challenges of data quality management: the meaning challenge (recognizing how data represents reality), the process/quality challenge (creating high-quality data by design), the people challenge (building data literacy), the technical challenge (enabling organizational data to be accessed and used, as well as protected), and the accountability challenge (ensuring organizational leadership treats data as an asset). Organizations that fail to meet these challenges get less value from their data than organizations that address them directly. The book describes core data quality management capabilities and introduces new and experienced DQ practitioners to practical techniques for getting value from activities such as data profiling, DQ monitoring, and DQ reporting. It extends these ideas to the management of data quality within big data environments. This book will appeal to data quality and data management professionals, especially those involved with data governance, across a wide range of industries, as well as academic and government organizations. Readership extends to people higher up the organizational ladder (chief data officers, data strategists, analytics leaders) and in different parts of the organization (finance professionals, operations managers, IT leaders) who want to leverage their data and their organizational capabilities (people, processes, technology) to drive value and gain competitive advantage. It will also be a key reference for graduate students in computer science programs, which normally have a limited focus on the data itself and where data quality management is an often-overlooked aspect of data management courses.
- Describes the importance of high-quality data to organizations wanting to leverage their data and, more generally, to people living in today's digitally interconnected world
- Explores the five challenges in relation to organizational data, including "Big Data," and proposes approaches to meeting them
- Clarifies how to apply the core capabilities required for an effective data quality management program (data standards definition, data quality assessment, monitoring and reporting, issue management, and improvement) as both stand-alone processes and as integral components of projects and operations
- Provides data quality practitioners with ways to communicate consistently with stakeholders
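The core capabilities mentioned above, such as data standards definition and data quality assessment, can be made concrete with a small, hedged sketch (the allowed code values and incoming data are invented, not examples from the book): a validity check reports the share of values that conform to the documented standard.

```python
# Sketch of a validity check against a defined data standard. The allowed
# codes and incoming values are hypothetical, purely for illustration.

ALLOWED_STATUS_CODES = {"ACTIVE", "LAPSED", "TERMINATED"}  # assumed standard

def validity_rate(values, allowed):
    """Fraction of non-null values that conform to the allowed domain."""
    non_null = [v for v in values if v not in (None, "")]
    if not non_null:
        return 1.0
    return sum(1 for v in non_null if v in allowed) / len(non_null)

incoming = ["ACTIVE", "LAPSED", "ACTV", None, "TERMINATED"]
print(f"validity: {validity_rate(incoming, ALLOWED_STATUS_CODES):.0%}")
# 'ACTV' does not conform, so 3 of the 4 non-null values are valid (75%).
```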
Executing Data Quality Projects
Author: Danette McGilvray
Publisher: Academic Press
ISBN: 0128180161
Category : Computers
Languages : en
Pages : 378
Book Description
Executing Data Quality Projects, Second Edition presents a structured yet flexible approach for creating, improving, sustaining, and managing the quality of data and information within any organization. Studies show that data quality problems are costing businesses billions of dollars each year, with poor data linked to waste and inefficiency, damaged credibility among customers and suppliers, and an organizational inability to make sound decisions. Help is here! This book describes a proven Ten Steps approach that combines a conceptual framework for understanding information quality with techniques, tools, and instructions for practically putting the approach to work, with the end result of high-quality, trusted data and information, so critical to today's data-dependent organizations. The Ten Steps approach applies to all types of data and all types of organizations: for-profit in any industry, non-profit, government, education, healthcare, science, research, and medicine. This book includes numerous templates, detailed examples, and practical advice for executing every step. At the same time, readers are advised on how to select relevant steps and apply them in different ways to best address the many situations they will face. The layout allows for quick reference, with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, best practices, and warnings. The experience of actual clients and users of the Ten Steps provides real examples of outputs for the steps, plus highlighted sidebar case studies called Ten Steps in Action. This book uses projects as the vehicle for data quality work, using the word broadly to include: 1) focused data quality improvement projects, such as improving data used in supply chain management; 2) data quality activities in other projects, such as building new applications, migrating data from legacy systems, integrating data because of mergers and acquisitions, or untangling data due to organizational breakups; and 3) ad hoc use of data quality steps, techniques, or activities in the course of daily work. The Ten Steps approach can also be used to enrich an organization's standard SDLC (whether sequential or Agile), and it complements general improvement methodologies such as Six Sigma or Lean. No two data quality projects are the same, but the flexible nature of the Ten Steps means the methodology can be applied to all. The new Second Edition highlights topics such as artificial intelligence and machine learning, the Internet of Things, security and privacy, analytics, legal and regulatory requirements, data science, big data, data lakes, and cloud computing, among others, to show their dependence on data and information and why data quality is more relevant and critical now than ever before.
- Includes concrete instructions, numerous templates, and practical advice for executing every step of the Ten Steps approach
- Contains real examples from around the world, gleaned from the author's consulting practice and from those who implemented based on her training courses and the earlier edition of the book
- Allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, and best practices
- A companion Web site includes links to numerous data quality resources, including many of the templates featured in the text, quick summaries of key ideas from the Ten Steps methodology, and other tools and information that are available online
Data Quality
Author: Jack E. Olson
Publisher: Elsevier
ISBN: 0080503691
Category : Computers
Languages : en
Pages : 313
Book Description
Data Quality: The Accuracy Dimension is about assessing the quality of corporate data and improving its accuracy using the data profiling method. Corporate data is increasingly important as companies continue to find new ways to use it. Likewise, improving the accuracy of data in information systems is fast becoming a major goal as companies realize how much it affects their bottom line. Data profiling is a new technology that supports and enhances the accuracy of databases throughout major IT shops. Jack Olson explains data profiling and shows how it fits into the larger picture of data quality.
- Provides an accessible, enjoyable introduction to the subject of data accuracy, peppered with real-world anecdotes
- Provides a framework for data profiling with a discussion of analytical tools appropriate for assessing data accuracy
- Is written by one of the original developers of data profiling technology
- Is a must-read for any data management staff, IT management staff, and CIOs of companies with data assets
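Data profiling, as described above, is largely about computing descriptive statistics per column and inspecting them for surprises. A minimal sketch of that idea, assuming tabular data held as a list of dictionaries (the rows and column names are invented for illustration, not drawn from the book):

```python
# Minimal column-profiling sketch: null counts, distinct counts, and the
# most frequent values per column. The rows and column names are invented.
from collections import Counter

def profile(rows):
    """Return simple per-column statistics for a list of dict records."""
    columns = {key for row in rows for key in row}
    report = {}
    for col in sorted(columns):
        values = [row.get(col) for row in rows]
        non_null = [v for v in values if v not in (None, "")]
        report[col] = {
            "null_count": len(values) - len(non_null),
            "distinct_count": len(set(non_null)),
            "top_values": Counter(non_null).most_common(3),
        }
    return report

rows = [
    {"state": "TX", "zip": "75001"},
    {"state": "tx", "zip": "75001"},
    {"state": "CA", "zip": ""},
]
for col, stats in profile(rows).items():
    print(col, stats)
# The mixed-case 'TX'/'tx' and the empty zip are exactly the kinds of
# surprises profiling is meant to surface for investigation.
```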
Data Governance
Author: John Ladley
Publisher: Academic Press
ISBN: 0128158328
Category : Technology & Engineering
Languages : en
Pages : 352
Book Description
Managing data continues to grow as a necessity for modern organizations. There are seemingly infinite opportunities for organic growth, reduction of costs, and creation of new products and services. It has become apparent that none of these opportunities can happen smoothly without data governance. The cost of exponential data growth and privacy and security concerns are becoming burdensome, and organizations will encounter unexpected consequences in new sources of risk. The solution to these challenges is also data governance: ensuring balance between risk and opportunity. Data Governance, Second Edition, is for any executive, manager, or data professional who needs to understand or implement a data governance program; such a program is required to ensure consistent, accurate, and reliable data across the organization. This book offers an overview of why data governance is needed, how to design, initiate, and execute a program, and how to keep the program sustainable. This valuable resource provides comprehensive guidance to beginning professionals, managers, and analysts looking to improve their processes, and to advanced students in data management and related courses. With the provided framework and case studies, all professionals in the data governance field will gain key insights into launching a successful and money-saving data governance program.
- Incorporates industry changes, lessons learned, and new approaches
- Explores various ways in which data analysts and managers can ensure consistent, accurate, and reliable data across their organizations
- Includes new case studies which detail real-world situations
- Explores all of the capabilities an organization must adopt to become data driven
- Provides guidance on various approaches to data governance, to determine whether an organization should be low profile, centrally controlled, agile, or traditional
- Provides guidance on using technology and separating vendor hype from sincere delivery of necessary capabilities
- Offers readers insights into how their organizations can improve the value of their data through data quality, data strategy, and data literacy
- Provides up to 75% brand-new content compared to the first edition
Data Quality Assessment
Author: Arkady Maydanchik
Publisher:
ISBN: 9780977140022
Category : Computers
Languages : en
Pages : 0
Book Description
Imagine a group of prehistoric hunters armed with stone-tipped spears. Their primitive weapons made hunting large animals, such as mammoths, dangerous work. Over time, however, a new breed of hunters developed. They would stretch the skin of a previously killed mammoth on the wall and throw their spears, while observing which spear, thrown from which angle and distance, penetrated the skin the best. The data gathered helped them make better spears and develop better hunting strategies. Quality data is the key to any advancement, whether it is from the Stone Age to the Bronze Age or from the Information Age to whatever age comes next. The success of corporations and government institutions largely depends on the efficiency with which they can collect, organise, and utilise data about products, customers, competitors, and employees. Fortunately, improving your data quality does not have to be such a mammoth task. This book is a must-read for anyone who needs to understand, correct, or prevent data quality issues in their organisation. Skipping theory and focusing purely on what is practical and what works, this text contains a proven approach to identifying, warehousing, and analysing data errors. Master techniques in data profiling and gathering metadata, designing data quality rules, organising rule and error catalogues, and constructing the dimensional data quality scorecard. David Wells, Director of Education of the Data Warehousing Institute, says: "This is one of those books that marks a milestone in the evolution of a discipline. Arkady's insights and techniques fuel the transition of data quality management from art to science, from crafting to engineering. From deep experience, with thoughtful structure, and with engaging style, Arkady brings the discipline of data quality to practitioners."
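The techniques named above (data quality rules, rule and error catalogues, and a dimensional scorecard) can be hinted at with a tiny sketch. The rules, records, and scoring below are hypothetical illustrations of the general idea, not the book's specific designs:

```python
# Hypothetical sketch: evaluate simple data quality rules, collect failures
# into an error catalogue, and roll results up into a per-rule score. The
# rules and records are invented; they are not the book's designs.
import re

rules = {  # rule name -> predicate over a record
    "dob_not_null": lambda r: bool(r.get("dob")),
    "zip_is_5_digits": lambda r: bool(re.fullmatch(r"\d{5}", r.get("zip", ""))),
}

records = [
    {"id": 1, "dob": "1980-01-01", "zip": "75001"},
    {"id": 2, "dob": "", "zip": "7500"},
]

error_catalogue = []  # one entry per (record, failed rule)
for record in records:
    for name, check in rules.items():
        if not check(record):
            error_catalogue.append({"record_id": record["id"], "rule": name})

for name in rules:  # a very small "scorecard": pass rate per rule
    failures = sum(1 for e in error_catalogue if e["rule"] == name)
    print(f"{name}: {1 - failures / len(records):.0%} of records pass")
```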
Measuring Quality Improvement in Healthcare
Author: Raymond G. Carey
Publisher: Quality Press
ISBN: 1636940811
Category : Health & Fitness
Languages : en
Pages : 339
Book Description
This ground-breaking book addresses the critical, growing need among healthcare administrators and practitioners to measure the effectiveness of quality improvement efforts. Written by respected healthcare quality professionals, Measuring Quality Improvement in Healthcare covers practical applications of the tools and techniques of statistical process control (SPC), including control charts, in healthcare settings. The authors' straightforward discussions of data collection, variation, and process improvement set the context for the use and interpretation of control charts. Their approach incorporates "the voice of the customer" as a key element driving the improvement processes and outcomes. The core of the book is a set of 12 case studies that show how to apply statistical thinking to healthcare processes, and when and how to use different types of control charts. The practical, down-to-earth orientation of the book makes it accessible to a wide readership. "Only authors who have used statistics and control charts to solve real-world healthcare problems could have written a book so practical and timely." (Barry S. Bader, Publisher, The Quality Letter for Healthcare Leaders) "Many clinicians and other healthcare leaders underestimate the great contributions that better statistical thinking could make toward reducing costs and improving outcomes. This fascinating and timely book is a fine guide for getting started." (Donald M. Berwick, M.D., President and CEO, Institute for Healthcare Improvement; Associate Professor of Pediatrics, Harvard Medical School) Contents: Planning Your CQI Journey, Preparing to Collect Data, Data Collection, Understanding Variation, Using Run and Control Charts to Analyze Process Variation, Control Chart Case Studies, Developing Improvement Strategies, Using Patient Surveys for CQI, Formulas for Calculating Control Limits.
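The control-chart material described above rests on standard limit formulas. As a hedged illustration (the monthly counts are invented, and this is not the book's own worked example), a p-chart for a proportion such as a readmission rate uses the conventional limits p_bar ± 3·sqrt(p_bar(1 − p_bar)/n):

```python
# Sketch of p-chart control limits using the standard formula
# p_bar +/- 3*sqrt(p_bar*(1 - p_bar)/n). The monthly counts are invented;
# consult the book's case studies and formulas for real applications.
from math import sqrt

events = [12, 9, 14, 11, 10, 13]               # e.g., readmissions per month
denominators = [200, 190, 210, 205, 195, 200]  # discharges per month

p_bar = sum(events) / sum(denominators)        # center line
for x, n in zip(events, denominators):
    p = x / n
    half_width = 3 * sqrt(p_bar * (1 - p_bar) / n)
    ucl, lcl = p_bar + half_width, max(0.0, p_bar - half_width)
    flag = "OUT OF CONTROL" if (p > ucl or p < lcl) else "in control"
    print(f"p={p:.3f}  limits=({lcl:.3f}, {ucl:.3f})  {flag}")
```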