Book Title
Data Quality Engineering in Financial Services

Applying Manufacturing Techniques to Data

Brian Buzzelli

Paperback: 177 Pages
Publisher: O'Reilly
Edition: 1
Language: English
Year: 2022
ISBN: 9781098136932
Print options:
  • Hardcover: 415,000 Toman
  • Softcover: 355,000 Toman
  • Papco clear cover with spiral binding: 365,000 Toman
Text quality: Original publisher's
Trim size: B5
Page color: Includes colored text and frames

#Data

#Data_Quality

#Data_Engineering

#Data_Management

#Financial

#DQS

Description

Data quality will either make or break you in the financial services industry. Missing prices, wrong market values, trading violations, client performance restatements, and incorrect regulatory filings can all lead to harsh penalties, lost clients, and financial disaster. This practical guide gives data analysts, data scientists, and data practitioners in financial services firms a framework for applying manufacturing principles to financial data management: understanding data dimensions, engineering precise data quality tolerances at the datum level, and integrating them into your data processing pipelines.


You'll get invaluable advice on how to:

  • Evaluate data dimensions and how they apply to different data types and use cases
  • Determine data quality tolerances for your data quality specification
  • Choose the points along the data processing pipeline where data quality should be assessed and measured
  • Apply tailored data governance frameworks within a business or technical function or across an organization
  • Precisely align data with applications and data processing pipelines
  • And more
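
As a rough illustration of the datum-level tolerances described above, the sketch below (in Python) defines a few hypothetical checks for a single end-of-day price record and evaluates them at a pipeline checkpoint. The Tolerance and assess helpers, field names, and thresholds are illustrative assumptions, not code from the book.

from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Tolerance:
    """A single datum-level data quality tolerance (illustrative)."""
    name: str
    check: Callable[[Any], bool]

def assess(record, spec):
    """Return the list of tolerance violations for one record."""
    violations = []
    for field, rules in spec.items():
        value = record.get(field)
        for rule in rules:
            if not rule.check(value):
                violations.append(f"{field}: failed '{rule.name}'")
    return violations

# Hypothetical specification for an end-of-day equity price record.
PRIOR_CLOSE = 101.00  # assumed reference value for the day-over-day move check
price_spec = {
    "close_price": [
        Tolerance("present", lambda v: v is not None),
        Tolerance("positive", lambda v: v is not None and v > 0),
        Tolerance("within 20% daily move",
                  lambda v: v is not None and abs(v / PRIOR_CLOSE - 1) <= 0.20),
    ],
    "currency": [
        Tolerance("ISO 4217 code", lambda v: v in {"USD", "EUR", "GBP", "JPY"}),
    ],
}

record = {"close_price": 102.35, "currency": "USD"}
print(assess(record, price_spec))  # [] -> the datum is within tolerance

In a pipeline, a non-empty violation list would typically quarantine the record rather than pass it downstream.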


Table of Contents

Chapter 1. Thinking Like a Manufacturer

Chapter 2. The Shape of Data

Chapter 3. Data Quality Specifications

Chapter 4. DQS Model Example

Chapter 5. Data Quality Metrics and Visualization

Chapter 6. Operational Efficiency Cost Model

Chapter 7. Data Governance

Chapter 8. Master Data Management

Chapter 9. Data Project Methodology

Chapter 10. Enterprise Data Management


Most people would say we live in a world where we trust the manufacturing discipline and quality standards used to provide the food we eat, the water we drink, the medications we take, and the sophisticated technology products we use in our daily lives. We can appreciate the years of evolution in science, refinement in manufacturing techniques, and codification of product specifications that form the basis of the trust we enjoy in consuming and using physical products today. Given the monumental achievements in science, technology, and manufacturing, what then is so different about the data used in the financial industry that data and information must be constantly checked, rechecked, and reconciled to ensure their accuracy and quality?


Data is the fundamental raw material used in the financial industry to manage your retirement and family's wealth assets, provide operating and growth capital to companies, and drive the global financial system as the lifeblood of the global economy. Unlike the manufacturing industry, the financial industry's data flows have evolved from being based on open outcry, telephone, paper trails, and ticker tapes to being grounded in sophisticated and complex computational, artificial intelligence, and machine learning applications. We capture, store, and pass along data through complex applications, and we use data in business processes with a general assumption that the data is reliable and suitable for use.


However, data has no physical form and is infinitely malleable. By contrast, the raw materials used in manufacturing have physical form: their properties can be measured and assessed for suitability against a specification and the tolerances within which the raw material is certified for use. This is one of the key concepts of this book: we will apply a similar manufacturing framework to data and define the properties of data that can be measured against a specification. Examples will present data as if it had mass and physical form, but in the context of measurable data dimensions: completeness, timeliness, accuracy, precision, conformity, congruence, collection, and cohesion.
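
As a minimal sketch of how two of these dimensions, completeness and timeliness, might be measured over a small batch of records, assume a few hypothetical end-of-day price records and an assumed delivery cut-off; the field names and values are invented for illustration.

from datetime import datetime

# Hypothetical vendor records; 'delivered_at' is when each datum arrived.
records = [
    {"symbol": "ABC", "close_price": 102.35, "delivered_at": datetime(2022, 6, 1, 17, 5)},
    {"symbol": "XYZ", "close_price": None,   "delivered_at": datetime(2022, 6, 1, 17, 9)},
    {"symbol": "LMN", "close_price": 55.10,  "delivered_at": datetime(2022, 6, 1, 19, 40)},
]

cutoff = datetime(2022, 6, 1, 18, 0)  # assumed delivery deadline

# Completeness: share of records with a populated close_price.
completeness = sum(r["close_price"] is not None for r in records) / len(records)

# Timeliness: share of records delivered by the cut-off.
timeliness = sum(r["delivered_at"] <= cutoff for r in records) / len(records)

print(f"completeness = {completeness:.0%}, timeliness = {timeliness:.0%}")
# completeness = 67%, timeliness = 67%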


The premise of this book is that data has shape and measurable dimensions, and can be inspected and measured relative to data quality specifications and tolerances; the results of those measurements yield data quality metrics. This mirrors the quality processes in manufacturing, which evaluate specific measurements of physical materials relative to control specifications and analyze the results to determine whether the material's measurements fall within design specifications and acceptable tolerances.
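
Continuing the manufacturing analogy, a hypothetical data quality specification could set a minimum acceptable value for each dimension's metric, much as a control specification bounds a physical measurement. The thresholds and measured values below are assumptions for illustration.

# Hypothetical specification: minimum acceptable metric per data dimension.
spec = {"completeness": 0.99, "timeliness": 0.95, "accuracy": 0.98}

# Dimension metrics measured earlier in the pipeline (values assumed here).
metrics = {"completeness": 0.67, "timeliness": 0.67, "accuracy": 0.99}

out_of_tolerance = {dim: (observed, spec[dim])
                    for dim, observed in metrics.items()
                    if observed < spec[dim]}

if out_of_tolerance:
    # A breach would typically quarantine the batch instead of passing it downstream.
    print("Out of tolerance (observed, required):", out_of_tolerance)
else:
    print("All dimensions within tolerance")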


While the financial industry struggles with a lack of industry standards for data identification, definition, and representation, compounded by the fast pace of financial product innovation, manufacturing's evolution and maturity demonstrate highly robust methodologies, accuracy and purity in techniques, and precision in materials processing. Today we enjoy, and perhaps take for granted, the technical complexities that give us modern medicines, genetics, super crops, jets, satellites, smartphones, flat-screen TVs, artificial intelligence, robotics, wristwatch computers—the list is endless.


The financial industry can learn a great deal from applying mature manufacturing techniques to our immature data management discipline. The primary benefit of applying similar precision in data quality validations is high-quality data. However, from a business perspective, additional benefits include the following:

  • Operational efficiency
  • Lower cost of operations
  • Less wasted effort
  • Higher data precision
  • More accurate business decision making


This book is intended to provide useful frameworks and techniques that can be introduced into your data structures and data management operations. The expectation is that applying these techniques and frameworks will improve data processing efficiency, data identification, and data quality; reduce operational data issues; and increase trust that the data is business ready and fit for purpose.


Review

"This book is a must for any data professional, regardless of industry. Brian has provided a definitive guide on how to best ensure that data processes - from sourcing and ingestion, to firmwide utilization - are properly monitored, measured and controlled. The insights that he illustrates are born out of a long history of working with content and enabling financial professionals to perform their jobs. The principles presented herein are applicable to any organization that needs to build proper and efficient data governance and data management. Finally, here is a tool that can help everyone from Chief Data Officers to data engineers in the performance of their roles."

-- Barry S. Raskin, Head of Data Practice, Relevate Data Monetization Corp.


"Brian Buzzelli presents a clear how-to guide for the finance professional to motivate, design, and implement a comprehensive data quality framework. Even in early stages, the data quality program will improve efficiency, reduce risk, and build trust with clients and across functions. Brian demonstrates the connection between data integrity and fiduciary obligation with relevant examples. Borrowing unabashedly from concepts in high precision manufacturing, Brian provides a step-by-step plan to engineer an enterprise level data quality program with solutions designed for specific functions. The code examples are especially applicable, providing the reader with a set of practical tools. I believe these concepts are an important contribution to the field."

-- Matthew Lyberg, CFA, Quantitative Researcher, NDVR Inc.


"This book is an essential reading not only for the data management specialists but for anyone who works with and relies on data. Brian Buzzelli harnesses his many years of practical, "been there, done that, have scars to prove it" experience to teach the reader how to apply manufacturing quality control principles to "find a needle in a haystack" - that one erroneous attribute that will have an outside impact."

-- Julia Bardmesser, SVP, Head of Data, Architecture and Salesforce Development, Voya Financial


"This is the perfect playbook that, if implemented, will allow any financial services company to put their data on an offensive footing to drive alpha and insights without sacrificing quality, governance, or compliance."

-- Michael McCarthy, Principal Investment Data Architect, Investment Data Management Office, MFS


"The approach to data quality expressed in this book is based on an original idea of using quality and standardization principles applied from manufacturing. It provides insights into a pragmatic and tested data quality framework that will be useful to any data practitioner."


About the Author

Mr. Buzzelli is Senior Vice President, Head of Enterprise Data Management for Acadian, a quantitative institutional asset management firm specializing in active global, emerging, and frontier investments utilizing sophisticated analytical models and specialized research expertise. Brian has defined a systematic and rigorous approach to data quality engineering through the application of specific tolerances to data dimensions, based on manufacturing principles and his expertise developed over 27 years of experience. His leadership in implementing data governance, data usage policies, data standards, data quality measurement, data taxonomies, architecture, and metadata has supported some of the most complex financial business functions at Acadian, Nomura, Thomson Reuters, and Mellon Financial. Data quality engineering, data management, and the application of manufacturing principles to data dimensions and data quality validation are at the center of his professional focus.


He is a graduate of Carnegie Mellon University with a Bachelor of Science degree in Information and Decision Systems and holds two master's degrees: Management of Information Systems and an MBA in Finance from the Katz Business School at the University of Pittsburgh.

Similar Books

  • Building Integrations with MuleSoft (Data): 519,000 Toman
  • Essential SQLAlchemy (Data): 389,000 Toman
  • Pro Database Migration to Azure (Data): 540,000 Toman
  • Practical Lakehouse Architecture (Data): 475,000 Toman
  • Working with Data in Public Health (Data): 382,000 Toman
  • Data Plane Development Kit (DPDK) (Data): 518,000 Toman
  • CompTIA Data+ Study Guide (Data): 569,000 Toman
  • Optimizing Databricks Workloads (Data): 413,000 Toman
  • Data Lakehouse in Action (Data): 387,000 Toman
  • Data Wrangling with Python (Python): 872,000 Toman