We at Crack4sure are committed to giving students preparing for the DAMA DMF-1220 exam the most current and reliable questions. To help people study, we've made some of our Data Management Fundamentals exam materials available for free to everyone. You can take the free DMF-1220 practice test as many times as you want. Answers to the practice questions are provided, and each answer is explained.
In Data Modelling, the generalization of the concepts of person and organization into a party enables:
Metrics tied to Reference and Master Data Quality include:
The 'Data Governance Steering Committee' is best described as:
Operational reports are outputs from the data stewards.
Which of the following is a Data Quality principle?
Creating the CDM involves the following steps:
Repository scanning can be accomplished in two distinct approaches, including:
Integrating data security with document and content management knowledge areas guides the implementation of:
There are three techniques for data-based change data capture, namely:
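For study purposes only, one commonly cited data-based technique relies on a last-modified timestamp on source rows. The table and column names below are hypothetical, and this is a minimal sketch rather than a complete answer to the question above.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, last_modified TEXT)")
conn.execute("INSERT INTO customer VALUES (1, 'Acme Ltd', '2024-01-01T10:00:00')")
conn.execute("INSERT INTO customer VALUES (2, 'Beta GmbH', '2024-03-01T09:30:00')")

# Only rows changed since the previous extract are picked up for the next load.
last_extract = '2024-02-01T00:00:00'
changed = conn.execute(
    "SELECT id, name FROM customer WHERE last_modified > ?", (last_extract,)
).fetchall()
print(changed)  # [(2, 'Beta GmbH')]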
Data in the data warehouse and marts differs from data in applications in that it is organized by subject rather than by function.
Characteristics that minimise distractions and maximise useful information include, but are not limited to, consistent object attributes.
Data mining is a sub-field of supervised learning where users attempt to model data elements and predict future outcomes through the evaluation of probability estimates.
Achieving security risk reduction in an organisation begins with developing what?
The business glossary application is structured to meet the functional requirements of the three core audiences:
The minority of operational metadata is generated as data is processed.
What is one of the most important things about collecting data?
Controlling data availability requires management of user entitlements and of structures that technically control access based on entitlements.
Through similarity analysis, slight variations in data can be recognized and data values can be consolidated. Two basic approaches, which can be used together, are:
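The two approaches the question refers to are left for the answer key; the comparison mechanics themselves can be sketched with Python's standard-library difflib module. The names below are purely illustrative.

from difflib import SequenceMatcher

def similarity(a, b):
    # Return a similarity ratio between 0.0 and 1.0 for two strings.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

candidates = ["ACME Ltd.", "Acme Limited", "Beta GmbH"]
target = "acme ltd"
for name in candidates:
    print(f"{name!r}: {similarity(target, name):.2f}")
# Values close to 1.0 suggest the records may refer to the same party.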
An implemented warehouse, together with its customer-facing BI tools, is a technology product.
During the initial scoping of a project, a data model can be used to:
Advantages of a centralized metadata repository include:
When recovering from multiple system failures, what is the biggest difficulty faced by a DBA?
The better an organization understands the lifecycle and lineage of its data, the better able it will be to manage its data. Please select the correct implication of the focus of data management on the data lifecycle.
Data modelling tools and model repositories are necessary for managing the enterprise data model at all levels.
A complexity in documenting data lineage is:
CMA is an abbreviation for Capability Maturity Assessment.
Data warehousing describes the operational extract, cleaning, transformation, control and load processes that maintain the data in a data warehouse.
Data Management maturity has many goals for accomplishment including having a positive effect on culture. This is important to a Data Governance program for the following reason:
Reduced risk is a benefit of high quality data.
Internal data security audits, which ensure that data security and regulatory compliance policies are followed, should be conducted regularly and consistently.
People often incorrectly combine the concepts of data management and information technology into one. Which of the following is NOT an example of this?
The goals of implementing best practices around document and content management include:
What is the best definition of Crowdsourced data collection?
Class operations can be:
Examples of concepts that can be standardized within the data architecture knowledge area include:
What are the three characteristics of effective Data Governance communication?
Please select the correct name for the LDM abbreviation
A data dictionary is necessary to support the use of a DW.
Which answer is considered to be the best definition of data security?
The area where data is ingested for Big Data can also be called the data lake. This needs to be carefully managed, or the data lake will become:
A data warehouse deployment with multiple ETL, storage and querying tools often suffers due to the lack of:
The ethics of data handling are complex but are centred on several core concepts. Please select the correct answers.
The search function associated with a document management store is failing to return known artefacts. This is due to a failure of:
Risk classifications describe the sensitivity of the data and the likelihood that it might be sought after for malicious purposes.
When doing reference data management, there are many organizations that have standardized data sets that are incredibly valuable and should be subscribed to. Which of these organizations would be least useful?
Data quality management is a key capability of a data management practice and organization.
Enterprise data architects in an application migration project are primarily concerned with:
There are several methods for masking data:
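As a study illustration only (not an exhaustive list of masking methods), two common techniques, substitution and one-way hashing, can be sketched as follows; the field names and salt are hypothetical.

import hashlib

def mask_card_number(card_number):
    # Substitution: keep the last four digits, replace the rest.
    return "*" * (len(card_number) - 4) + card_number[-4:]

def hash_identifier(national_id, salt="static-demo-salt"):
    # Hashing: replace an identifier with an irreversible token.
    return hashlib.sha256((salt + national_id).encode()).hexdigest()[:16]

print(mask_card_number("4111111111111111"))
print(hash_identifier("9001015009087"))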
With respect to health data, what is the difference between the privacy and the security of the data?
Please select the types of DBA specializations:
Device security standards include:
The advantage of a decentralised Data Governance model over a centralised model is:
Which of the following is NOT a stage of the Data Quality Management Cycle?
In a data warehouse, where the classification lists for organisation type are inconsistent in different source systems, there is an indication that there is a lack of focus on:
Business metadata focuses largely on the content and condition of the data and includes details related to data governance.
Databases are categorized in three general ways:
The placement of Data Governance is most effective when:
When data is classified as either security data or regulatory data, the result will be:
Malware types include:
Latency can be:
A project scope requires the collection, exchange, and reporting of data from multiple in-house custom systems. Documents gathered include business concepts, existing database schemas, XSDs, and reporting layouts. How many models of each layer of abstraction can be expected?
Data Fabric is:
The load step of ETL is physically storing or presenting the results of the transformation in the target system.
Which model has one Data Governance organization coordinate with multiple Business Units to maintain consistent definitions and standards?
A ‘Golden Record’ means that it is always a 100% complete and accurate representation of all entities within the organization.
Functionality-focused requirements associated with a comprehensive metadata solution, include:
Identify indicative components of a Data Strategy.
The Belmont principles that may be adapted for Information Management disciplines, include:
Defining quality content requires understanding the context of its production and use, including:
Examples of interaction models include:
The Data Governance Office (DGO) focuses on enterprise-level data definitions and data management standards across all DAMA-DMBOK knowledge areas, and consists of coordinating data management roles.
Some document management systems have a module that may support different types of workflows such as:
Practitioners identify development of staff capability to be a primary concern of Data Governance. Why would this be a main concern?
Field overloading: Unnecessary data duplication is often a result of poor data management.
The four main types of NoSQL databases are:
Enterprise Data Architects should support projects for potential:
Domains can be identified in different ways including: data type; data format; list; range; and rule-based.
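To make the idea concrete, here is a minimal sketch of validity checks against list-, range- and format-based domains. The rules and values are hypothetical and for illustration only.

import re

allowed_codes = {"ACTIVE", "INACTIVE", "PENDING"}        # list-based domain
def valid_status(value):
    return value in allowed_codes

def valid_age(value):                                    # range-based domain
    return 0 <= value <= 120

postcode_pattern = re.compile(r"^\d{4}$")                # format-based domain
def valid_postcode(value):
    return bool(postcode_pattern.match(value))

print(valid_status("ACTIVE"), valid_age(150), valid_postcode("12AB"))  # True False False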
According to the DMBoK, Data Governance is central to Data Management. In practical terms, what other functions of Data Management are required to ensure that your Data Governance programme is successful?
An implemented warehouse and its customer-facing BI tool is a data product.
Quality Assurance Testing (QA) is used to test functionality against requirements.
Reference and master data require governance processes, including:
Customer value comes when the economic benefit of using data outweighs the costs of acquiring and storing it, as well as managing risk related to usage. Which of these is not a way to measure value?
Looking at the DMBoK definition of Data Governance, and other industry definitions, what are some of the common key elements of Data Governance?
Examples of transformation in the ETL process include:
Triplestores can be classified into these categories:
Which ISO standard defines characteristics that can be tested by any organisation in the data supply chain to objectively determine conformance of the data to the standard?
Obfuscating or redacting data is the practice of making information anonymous or removing sensitive information. Risks are present in the following instances:
Effectiveness metrics for a data governance programme include: achievement of goals and objectives; the extent to which stewards are using the relevant tools; effectiveness of communication; and effectiveness of education.
Big Data and Data Science Governance should address such data questions as:
The term data quality refers only to the characteristics associated with high quality data.
Data Governance includes developing alignment of the data management approach with organizational touchpoints outside of the direct authority of the Chief Data Officer. Select the example of such a touchpoint.
The goal of data security practices is to protect information assets in alignment with privacy and confidentiality regulations, contractual agreements and business requirements. These requirements come from:
Which of the following are must-dos for any successful Data Governance programme?
A Metadata repository contains information about the data in an organization, including:
An input in the data architecture context diagram includes data governance.
Data integrity is the state of being partitioned – protected from being whole.
Change only requires change agents in special circumstances, especially when there is little to no adoption.
What areas should you consider when constructing an organization's Data Governance operating model?
Self-service is a fundamental delivery channel in the BI portfolio.
It is unwise to implement data quality checks to ensure that the copies of the attributes are correctly stored.
Validity, as a dimension of data quality, refers to whether data values are consistent with a defined domain of values.
Which of the following is NOT a type of Data Steward?
The four A’s in security processes include:
Select the three correct attributes. A data governance programme must be:
Over a decade an organization has rationalized implementation of party concepts from 48 systems to 3. This is a result of good:
What is the best reason for capturing synonyms in a data repository?
Emergency contact phone number would be found in which master data management program?
Bold means doing something that might cause short term pain, not just something that looks good in a marketing email.
DMMA ratings represent a snapshot of the organization’s capability level.
Please select the correct component pieces that form part of an Ethical Handling Strategy and Roadmap.
An effective team is based on two simple foundations: trust and a common goal.
It is recommended that organizations not print their business data glossaries for general use. Why would you not want to print the glossary?
What position should be responsible for leading the Data Governance Council (DGC)?
The impact of the changes from new volatile data must be isolated from the bulk of the historical, non-volatile DW data. There are three main approaches, including:
The roles associated with enterprise data architecture are data architects, data modellers and data stewards.
Please select the correct types of data stewards:
The biggest business driver for developing organizational capabilities around Big Data and Data Science is the desire to find and act on business opportunities that may be discovered through data sets generated through a diversified range of processes.
Examples of transformation include:
The accuracy dimension has to do with the precision of data values.
A goal of reference and master data is to provide an authoritative source of reconciled and quality-assessed master and reference data.
Control activities to manage metadata stores include:
Data governance requires control mechanisms and procedures for, but not limited to, assignment and tracking of action items.
The list of V’s includes:
All metadata management solutions include architectural layers including:
XML provides a language for representing both structured and unstructured data and information.
A sandbox is an alternate environment that allows write-only connections to production data and can be managed by the administrator.
Sustainable Data Governance depends on:
The business glossary should capture business term attributes such as:
DAMA International’s Certified Data Management Professional (CDMP) certification requires that data management professionals subscribe to a formal code of ethics, including an obligation to handle data ethically for the sake of society beyond the organization that employs them.
A data lineage tool enables a user to:
Data architect: A senior analyst responsible for data architecture and data integration.
Data Management Professionals only work with the technical aspects related to data.
Information gaps represent enterprise liabilities with potentially profound impacts on operational effectiveness and profitability.
Developing complex event processing solutions requires:
Which of these is NOT likely in the scope of Data Governance and Stewardship?
Resource Description Framework (RDF), a common framework used to describe information about any Web resource, is a standard model for data interchange in the Web.
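As a study aid, a minimal RDF example can be built with the open-source rdflib package (a widely used Python RDF library; assumes rdflib 6 or later is installed via pip install rdflib). The resources and names below are illustrative, not part of the exam material.

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import FOAF

EX = Namespace("http://example.org/")
g = Graph()
g.add((EX.alice, FOAF.name, Literal("Alice")))   # triple: subject, predicate, object
g.add((EX.alice, FOAF.knows, EX.bob))

# Serialise the triples as Turtle, a common RDF interchange syntax.
print(g.serialize(format="turtle"))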
OCR is the abbreviation for Optical Character Recognition.
Communication should start later in the process as too many inputs will distort the vision.
Organizations conduct capability maturity assessments for a number of reasons, including:
Dimensions of data quality include:
Companies do not rely on their information systems to run their operations.
Corrective actions are implemented after a problem has occurred and been detected.
Data Governance deliverables commonly include:
Most document programs have policies related to:
Metadata is described using different sets of categories, including:
Measuring the effects of change management covers five key areas, including: Awareness of the need to change; Desire to participate and support the change; Knowledge about how to change; Ability to implement new skills and behaviors; and Reinforcement to keep the change in place.
Operationality and interoperability depend on data quality. In order to measure the efficiency of a repository, the data quality needs to be:
Customer relationship management systems manage Master Data about customers.
In designing and building the database, the DBA should keep the following design principles in mind:
The failure to gain acceptance of a business glossary may be due to ineffective:
Data governance can be understood in terms of political governance. It includes the following three function types:
The language used in file-based solutions is called MapReduce. This language has three main steps:
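Without giving the answer away, the general map / shuffle-sort / reduce pattern the question refers to can be illustrated with a toy word count in plain Python. No Hadoop or actual MapReduce runtime is involved; this is only a study sketch.

from collections import defaultdict

documents = ["master data", "reference data", "master data management"]

# Map: emit (key, value) pairs.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle/sort: group the emitted values by key.
grouped = defaultdict(list)
for word, count in mapped:
    grouped[word].append(count)

# Reduce: aggregate the values for each key.
reduced = {word: sum(counts) for word, counts in grouped.items()}
print(reduced)  # {'master': 2, 'data': 3, 'reference': 1, 'management': 1}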
Data quality rules and standards are a critical form of Metadata. To be effective they need to be managed as Metadata. Rules include:
A database uses foreign keys from code tables for column values. This is a way of implementing:
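To make the mechanism in the question concrete, here is a small sketch of a column whose values are constrained by a foreign key to a code table. The tables and codes are hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE country_code (code TEXT PRIMARY KEY, description TEXT)")
conn.execute("INSERT INTO country_code VALUES ('NL', 'Netherlands'), ('ZA', 'South Africa')")
conn.execute("""CREATE TABLE customer (
    id INTEGER PRIMARY KEY,
    name TEXT,
    country TEXT REFERENCES country_code(code))""")

conn.execute("INSERT INTO customer VALUES (1, 'Acme Ltd', 'NL')")        # valid code
try:
    conn.execute("INSERT INTO customer VALUES (2, 'Beta GmbH', 'XX')")   # not in the code table
except sqlite3.IntegrityError as e:
    print("Rejected:", e)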
Barriers to effective management of data quality include:
The information governance maturity model describes the characteristics of the information governance and recordkeeping environment at five levels of maturity for each of the eight GARP principles. Please select the correct level descriptions:
Sample value metrics for a data governance program include:
The goals of data security include:
A ‘Content Distribution Network’ supporting a multi-national website is likely to use:
With reliable Metadata an organization does not know what data it has, what the data represents and how it moves through the systems, who has access to it, or what it means for the data to be of high quality.
Data asset valuation is the process of understanding and calculating the economic value of data to an organisation. Value comes when the economic benefit of using data outweighs the costs of acquiring and storing it, as well as managing risk related to usage.
Small reference data value sets in the logical data model can be implemented in a physical model in three common ways:
Normalisation is the process of applying rules in order to organise business complexity into stable data structures.
A weak point in an organization’s defenses is a:
Basic profiling of data involves analysis of:
Archiving is the process of moving data off immediately accessible storage media and onto media with lower retrieval performance.
Deliverables in the data management maturity assessment context diagram include:
Time-based patterns are used when data values must be associated in chronological order and with specific time values.
Uniqueness, as a dimension of data quality, states no entity exists more than once within the data set.
Poorly managed metadata leads to:
Advantages of a centralized repository include: high availability, since it is independent of the source systems.
Business rules are used to support Data Integration and Interoperability at various points, to:
Data quality issues only emerge at initial stages of the data lifecycle.
The library of Alexandria was one of the largest collections of books in the ancient world. Which DMBoK knowledge area is most aligned with managing the collection?
Where does the ethical responsibility lie with respect to managing data to reduce risks of misrepresentation, misuse, or misunderstanding?
Reference and Master Data Management follow these guiding principles:
The most common drivers for initiating a Master Data Management program are:
Part of alignment includes developing organizational touchpoints for data governance work. Some examples of touchpoints include: Procurement and Contracts; Budget and Funding; Regulatory Compliance; and the SDLC framework.
In an information management context, the short-term wins and goals often arise from the resolution of an identified problem.
Factors that have been shown to play a key role in the success of effective data management organizations do not include:
In the Abate Information Triangle, the past moves through the following echelons before it becomes insight:
Data Governance focuses exclusively on:
Typically, DW/BI projects have three concurrent development tracks:
Data science merges data mining, statistical analysis, and machine learning with data integration and data modelling capabilities, to build predictive models that explore data content patterns.
The independent updating of data into a system of reference is likely to cause:
The goal of data architecture is to:
Data management professionals who understand formal change management will be more successful in bringing about changes that will help their organizations get more value from their data. To do so, it is important to understand:
A business driver for Master Data Management program is managing data quality.
The need to manage data movement efficiently is a primary driver for Data Integration and Interoperability.
Three classic implementation approaches that support Online Analytical Processing include:
Misleading visualisations could be an example where a base level of truthfulness and transparency is not adhered to.
Types of metadata include:
An Operational Data Mart is a data mart focused on tactical decision support.
Examples of data enhancement include:
Deliverables in the data quality context diagram include:
A primary business driver of data storage and operations is:
What position is responsible for the quality and use of their organization’s data assets?
Media monitoring and text analysis are automated methods for retrieving insights from large unstructured or semi-structured data, such as transaction data, social media, blogs, and web news sites.
A dimensional physical data model is usually a star schema, meaning there is one structure for each dimension.
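As a study illustration of the statement above (table and column names are made up), a minimal star schema has one central fact table and one table per dimension:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, calendar_date TEXT);
CREATE TABLE dim_product  (product_key INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, customer_name TEXT);

CREATE TABLE fact_sales (
    date_key     INTEGER REFERENCES dim_date(date_key),
    product_key  INTEGER REFERENCES dim_product(product_key),
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    quantity     INTEGER,
    amount       REAL
);
""")
print("Star schema tables:", [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name")])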
Select the areas to consider when constructing an organization’s operating model:
Which Data Architecture artifact describes how data transforms into business assets?
Well prepared records have characteristics such as:
If the target system has more transformation capability than either the source or the intermediary application system, the order of processes may be switched to ELT – Extract, Load, Transform.
Who should write the main content for a security policy for an organisation?
Machine learning explores the construction and study of learning algorithms.
All data is of equal importance. Data quality management efforts should be spread between all the data in the organization.
Access to data in multidimensional databases uses a variant of SQL called MDX, or Multidimensional Expressions.
Which of the following answers best describes an Active Data Dictionary?
Data and enterprise architecture deal with complexity from two viewpoints:
The acronym BASE is made up of:
3 Months Free Update