In Data Modelling, the generalization of the concepts of person and organization into a party enables:
Integrating data security with document and content management knowledge areas.
guides the implementation of:
Data in the data warehouse and marts differs from data in applications in that it is organized by subject rather than by function.
Characteristics that minimise distractions and maximise useful information include, but are not limited to, consistent object attributes.
Data mining is a sub-field of supervised learning where users attempt to model data elements and predict future outcomes through the evaluation of probability estimates.
Achieving security risk reduction in an organisation begins with developing what?
The business glossary application is structured to meet the functional requirements of the three core audiences:
Controlling data availability requires management of user entitlements and of structures that technically control access based on entitlements.
Through similarity analysis, slight variation in data can be recognized and data values can be consolidated. Two basic approaches, which can be used together, are:
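To make the idea of similarity analysis concrete, here is a minimal Python sketch using the standard-library difflib; the 0.85 threshold and the sample values are assumptions chosen only for illustration.

    from difflib import SequenceMatcher

    def similar(a: str, b: str, threshold: float = 0.85) -> bool:
        # Two values are candidates for consolidation when their similarity
        # ratio meets the (assumed) threshold.
        return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

    print(similar("Acme Ltd.", "ACME Ltd"))      # True  -> consolidate
    print(similar("Acme Ltd.", "Apex Pty Ltd"))  # False -> keep distinct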
An implemented warehouse and its customer-facing BI tools are a technology product.
When recovering from multiple system failures, what is the biggest difficulty faced by a DBA?
The better an organization understands the lifecycle and lineage of its data, the better able it will be to manage its data. Please select the correct implication of the focus of data management on the data lifecycle.
Data modelling tools and model repositories are necessary for managing the enterprise data model at all levels.
Data warehousing describes the operational extract, cleaning, transformation, control and load processes that maintain the data in a data warehouse.
Data Management maturity has many goals for accomplishment including having a positive effect on culture. This is important to a Data Governance program for the following reason:
Internal data security audits, which ensure that data security and regulatory compliance policies are followed, should be conducted regularly and consistently.
People often incorrectly combine the concepts of data management and information technology into one. Which of the following is NOT an example of this?
The goals of implementing best practices around document and content management include:
Examples of concepts that can be standardized within the data architecture knowledge area include:
What are the three characteristics of effective Data Governance communication?
The area into which data for Big Data is ingested can also be called the data lake. This needs to be carefully managed, or the data lake will become:
A data warehouse deployment with multiple ETL, storage and querying tools often suffers due to the lack of:
The ethics of data handling are complex, but centre on several core concepts. Please select the correct answers.
The search function associated with a document management store is failing to return known artefacts. This is due to a failure of:
Risk classifications describe the sensitivity of the data and the likelihood that it might be sought after for malicious purposes.
When doing reference data management, there are many organizations that have standardized data sets that are incredibly valuable and should be subscribed to. Which of these organizations would be least useful?
Data quality management is a key capability of a data management practice and organization.
Enterprise data architects in an application migration project are primarily concerned with:
With respect to health data, what is the difference between the privacy and the security of the data?
The advantage of a decentralised Data Governance model over a centralised model is:
Which of the following is NOT a stage of the Data Quality Management Cycle?
In a data warehouse, where the classification lists for organisation type are inconsistent in different source systems, there is an indication that there is a lack of focus on:
Business metadata focuses largely on the content and condition of the data and includes details related to data governance.
When data is classified as either security data or regulatory data, the result will be:
A project scope requires the collection, exchange, and reporting of data from multiple in-house custom systems. Documents gathered include business concepts, existing database schemas, XSDs, and reporting layouts. How many models of each layer of abstraction can be expected?
The load step of ETL is physically storing or presenting the results of the transformation in the target system.
Which model has one Data Governance organization coordinate with multiple Business Units to maintain consistent definitions and standards?
A ‘Golden Record’ means that it is always a 100% complete and accurate representation of all entities within the organization.
Functionality-focused requirements associated with a comprehensive metadata solution, include:
The Belmont principles that may be adapted for Information Management disciplines include:
Defining quality content requires understanding the context of its production and use, including:
The Data Governance Office (DGO) focuses on enterprise-level data definitions and data management standards across all DAMA-DMBOK knowledge areas. It consists of coordinating data management roles.
Some document management systems have a module that may support different types of workflows such as:
Practitioners identify development of staff capability to be a primary concern of Data Governance. Why would this be a main concern?
Field overloading: Unnecessary data duplication is often a result of poor data management.
Domains can be identified in different ways including: data type; data format; list; range; and rule-based.
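As an illustration of those five ways of identifying a domain, a minimal Python sketch follows; the field rules, codes and ranges are assumptions chosen only for the example.

    import re
    from datetime import date

    def in_type_domain(value) -> bool:            # data type
        return isinstance(value, int)

    def in_format_domain(value: str) -> bool:     # data format (assumed 5-digit code)
        return re.fullmatch(r"\d{5}", value) is not None

    def in_list_domain(value: str) -> bool:       # list of allowed values
        return value in {"ACTIVE", "INACTIVE", "PENDING"}

    def in_range_domain(value: float) -> bool:    # numeric range
        return 0.0 <= value <= 100.0

    def in_rule_domain(start: date, end: date) -> bool:  # rule-based (cross-field)
        return start <= end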
According to the DMBoK, Data Governance is central to Data Management. In practical terms, what other functions of Data Management are required to ensure that your Data Governance programme is successful?
An implemented warehouse and its customer-facing BI tool is a data product.
Quality Assurance Testing (QA) is used to test functionality against requirements.
Customer value comes when the economic benefit of using data outweighs the costs of acquiring and storing it, as well as managing risk related to usage. Which of these is not a way to measure value?
Looking at the DMBoK definition of Data Governance, and other industry definitions, what are some of the common key elements of Data Governance?
Which ISO standard defines characteristics that can be tested by any organisation in the data supply chain to objectively determine the data's conformance to that standard?
Obfuscating or redacting data is the practice of making information anonymous or removing sensitive information. Risks are present in the following instances:
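As a rough illustration of the practice itself, here is a minimal Python sketch of pseudonymising an identifier and redacting email addresses from free text; the salt, pattern and sample values are assumptions for the example only.

    import hashlib
    import re

    SALT = "example-salt"  # assumed placeholder; a real salt would be a protected secret

    def pseudonymise(value: str) -> str:
        # Replace a direct identifier with a repeatable, non-reversible token.
        return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:12]

    def redact_emails(text: str) -> str:
        # Blank out email addresses embedded in free text.
        return re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[REDACTED]", text)

    print(pseudonymise("jane.doe@example.com"))
    print(redact_emails("Contact jane.doe@example.com for details."))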
Effectiveness metrics for a data governance programme include: achievement of goals and objectives; the extent to which stewards are using the relevant tools; effectiveness of communication; and effectiveness of education.
Big Data and Data Science Governance should address such data questions as:
The term data quality refers only to the characteristics associated with high-quality data.
Data Governance includes developing alignment of the data management approach with organizational touchpoints outside of the direct authority of the Chief Data Officer. Select the example of such a touchpoint.
The goal of data security practices is to protect information assets in alignment with privacy and confidentiality regulations, contractual agreements and business requirements. These requirements come from:
Which of the following are must-dos for any successful Data Governance programme?
A Metadata repository contains information about the data in an organization, including:
An input in the data architecture context diagram includes data governance.
Data integrity is the state of being partitioned – protected from being whole.
Change only requires change agents in special circumstances, especially when there is little to no adoption.
What areas should you consider when constructing an organization's Data Governance operating model?
It is unwise to implement data quality checks to ensure that the copies of the attributes are correctly stored.
Validity, as a dimension of data quality, refers to whether data values are consistent with a defined domain of values.
Over a decade an organization has rationalized implementation of party concepts from 48 systems to 3. This is a result of good:
Emergency contact phone number would be found in which master data management program?
Bold means doing something that might cause short term pain, not just something that looks good in a marketing email.
DMMA ratings represent a snapshot of the organization’s capability level.
Please select the correct component pieces that form part of an Ethical Handling Strategy and Roadmap.
An effective team is based on two simple foundations: trust and a common goal.
It is recommended that organizations not print their business data glossaries for general use. Why would you not want to print the glossary?
What position should be responsible for leading the Data Governance Council (DGC)?
The impact of the changes from new volatile data must be isolated from the bulk of the historical, non-volatile DW data. There are three main approaches, including:
The roles associated with enterprise data architecture are data architects, data modellers and data stewards.
The biggest business driver for developing organizational capabilities around Big Data and Data Science is the desire to find and act on business opportunities that may be discovered through data sets generated through a diversified range of processes.
A goal of reference and master data is to provide an authoritative source of reconciled and quality-assessed master and reference data.
Data governance requires control mechanisms and procedures for, but not limited to, assignment and tracking of action items.
All metadata management solutions include architectural layers including:
XML provides a language for representing both structured and unstructured data and information.
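A minimal sketch of that dual role, using Python's standard-library xml.etree; the element names and content are made-up assumptions for the example.

    import xml.etree.ElementTree as ET

    # One fragment carrying both structured fields and unstructured narrative text.
    doc = """
    <claim id="C-1001">
      <claimant>Jane Doe</claimant>
      <amount currency="USD">1250.00</amount>
      <narrative>Water damage to the kitchen ceiling after a burst pipe.</narrative>
    </claim>
    """

    root = ET.fromstring(doc)
    print(root.get("id"))              # structured attribute
    print(root.findtext("amount"))     # structured element value
    print(root.findtext("narrative"))  # unstructured free text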
A sandbox is an alternate environment that allows write-only connections to production data and can be managed by the administrator.
DAMA International’s Certified Data Management Professional (CDMP) certification requires that data management professionals subscribe to a formal code of ethics, including an obligation to handle data ethically for the sake of society beyond the organization that employs them.
Data architect: A senior analyst responsible for data architecture and data integration.
Data Management Professionals only work with the technical aspects related to data.
Information gaps represent enterprise liabilities with potentially profound impacts on operational effectiveness and profitability.
Which of these is NOT likely in the scope of Data Governance and Stewardship?
Resource Description Framework (RDF), a common framework used to describe information about any Web resource, is a standard model for data interchange in the Web.
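A minimal sketch of the triple (subject-predicate-object) model behind RDF, assuming the third-party rdflib package is available; the namespace and resource identifiers are invented for the example.

    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import FOAF, RDF

    EX = Namespace("http://example.org/")
    g = Graph()

    person = URIRef("http://example.org/people/jane")
    g.add((person, RDF.type, FOAF.Person))            # subject, predicate, object
    g.add((person, FOAF.name, Literal("Jane Doe")))
    g.add((person, EX.worksFor, URIRef("http://example.org/org/acme")))

    print(g.serialize(format="turtle"))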
Communication should start later in the process as too many inputs will distort the vision.
Organizations conduct capability maturity assessments for a number of reasons, including:
Companies do not rely on their information systems to run their operations.
Corrective actions are implemented after a problem has occurred and been detected.
Measuring the effects of change management in five key areas, including: Awareness of the need to change; Desire to participate and support the change; Knowledge about how to change; Ability to implement new skills and behaviors; and Reinforcement to keep the change in place.
Operationality and interoperability depend on data quality. In order to measure the efficiency of a repository, the data quality needs to be:
Customer relationship management systems manage Master Data about customers.
In designing and building the database, the DBA should keep the following design principles in mind:
The failure to gain acceptance of a business glossary may be due to ineffective:
Data governance can be understood in terms of political governance. It includes the following three function types:
The language used in file-based solutions is called MapReduce. This language has three main steps:
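A minimal pure-Python sketch of the three steps usually associated with this model (map, shuffle/sort, reduce), using a word count; it is an illustration only, not a Hadoop or cluster implementation.

    from collections import defaultdict

    records = ["big data", "data lake", "big big data"]

    # Map: emit (key, 1) pairs from each input record.
    mapped = [(word, 1) for record in records for word in record.split()]

    # Shuffle/sort: group all emitted values by key.
    grouped = defaultdict(list)
    for key, value in mapped:
        grouped[key].append(value)

    # Reduce: aggregate the values for each key.
    reduced = {key: sum(values) for key, values in grouped.items()}
    print(reduced)  # {'big': 3, 'data': 3, 'lake': 1}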
Data quality rules and standards are a critical form of Metadata. To be effective, they need to be managed as Metadata. Rules include:
A database uses foreign keys from code tables for column values. This is a way of implementing:
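A minimal sketch of the mechanism this question describes, using Python's standard-library sqlite3; the table names and codes are assumptions for the example.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("PRAGMA foreign_keys = ON")

    # Code table holding the small set of valid values.
    con.execute("CREATE TABLE order_status_code (code TEXT PRIMARY KEY, description TEXT)")
    con.executemany("INSERT INTO order_status_code VALUES (?, ?)",
                    [("NEW", "New"), ("SHP", "Shipped"), ("CLS", "Closed")])

    # Business table whose column is constrained to those values via a foreign key.
    con.execute("""CREATE TABLE customer_order (
                       order_id INTEGER PRIMARY KEY,
                       status_code TEXT NOT NULL REFERENCES order_status_code(code))""")

    con.execute("INSERT INTO customer_order VALUES (1, 'NEW')")      # accepted
    try:
        con.execute("INSERT INTO customer_order VALUES (2, 'XXX')")  # rejected
    except sqlite3.IntegrityError as exc:
        print("Rejected by the code table constraint:", exc)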
The information governance maturity model describes the characteristics of the information governance and recordkeeping environment at five levels of maturity for each of the eight GARP principles. Please select the correct level descriptions:
A ‘Content Distribution Network’ supporting a multi-national website is likely to use:
With reliable Metadata an organization does not know what data it has, what the data represents and how it moves through the systems, who has access to it, or what it means for the data to be of high quality.
Data asset valuation is the process of understanding and calculating the economic value of data to an organisation. Value comes when the economic benefit of using data outweighs the costs of acquiring and storing it, as well as managing risk related to usage.
Small reference data value sets in the logical data model can be implemented in a physical model in three common ways:
Normalisation is the process of applying rules in order to organise business complexity into stable data structures.
Archiving is the process of moving data off immediately accessible storage media and onto media with lower retrieval performance.
Deliverables in the data management maturity assessment context diagram include:
Time-based patterns are used when data values must be associated in chronological order and with specific time values.
Uniqueness, as a dimension of data quality, states no entity exists more than once within the data set.
Advantages of a centralized repository include: high availability, since it is independent of the source systems.
Use business rules to support Data Integration and Interoperability at various points, to:
The library of Alexandria was one of the largest collections of books in the ancient world. Which DMBoK knowledge area is most aligned with managing the collection?
Where does the ethical responsibility lie with respect to managing data to reduce risks of misrepresentation, misuse, or misunderstanding?
The most common drivers for initiating a Master Data Management Program are:
Part of alignment includes developing organizational touchpoints for data governance work. Some examples of touchpoints include: Procurement and Contracts; Budget and Funding; Regulatory Compliance; and the SDLC framework.
In an information management context, the short-term wins and goals often arise from the resolution of an identified problem.
Factors that have been shown to play a key role in the success of effective data management organizations do not include:
In the Abate Information Triangle, the past moves through the following echelons before it becomes insight:
Data science merges data mining, statistical analysis, and machine learning with data integration and data modelling capabilities, to build predictive models that explore data content patterns.
The independent updating of data into a system of reference is likely to cause:
Data management professionals who understand formal change management will be more successful in bringing about changes that will help their organizations get more value from their data. To do so, it is important to understand:
A business driver for Master Data Management program is managing data quality.
The need to manage data movement efficiently is a primary driver for Data Integration and Interoperability.
Three classic implementation approaches that support Online Analytical Processing include:
Misleading visualisations could be an example where a base level of truthfulness and transparency is not adhered to.
An Operational Data Mart is a data mart focused on tactical decision support.
What position is responsible for the quality and use of their organization’s data assets?
Media monitoring and text analysis are automated methods for retrieving insights from large unstructured or semi-structured data, such as transaction data, social media, blogs, and web news sites.
A dimensional physical data model is usually a star schema, meaning there is one structure for each dimension.
Select the areas to consider when constructing an organization’s operating model:
Which Data Architecture Artifact describes how data transforms into business assets?
If the target system has more transformation capability than either the source or the intermediary application system, the order of processes may be switched to ELT – Extract, Load, Transform.
Who should write the main content for a security policy for an organisation?
Machine learning explores the construction and study of learning algorithms.
All data is of equal importance. Data quality management efforts should be spread between all the data in the organization.
Multidimensional databases are accessed using a variant of SQL called MDX, or Multidimensional Expressions.
Data and enterprise architecture deal with complexity from two viewpoints: