Big Data Fundamentals Exam

Description

Certification Exam 1
Quiz by Juan Taborda, updated more than 1 year ago
Created by Juan Taborda almost 8 years ago

Resource summary

Question 1

Question
Big Data
Answer
  • A Not-only SQL (NoSQL) database is a non-relational database that can be used to store it
  • is an open-source framework for large-scale data storage and data processing that more or less runs on commodity hardware
  • are capable of providing highly scalable, on-demand IT resources that can be leased via pay-as-you-go models
  • Is a field dedicated to the analysis, processing and storage of large collections of data that frequently originate from disparate sources

Question 2

Question
Big Data Solutions
Answer
  • queries can take several minutes or even longer, depending on the complexity of the query and the number of records queried
  • is a measure for gauging success within a particular context
  • Examples can include EDI, e-mails, spreadsheets, RSS feeds and sensor data
  • are typically required when traditional data analysis, processing and storage technologies and techniques are insufficient

Question 3

Question
Big Data Addresses
Answer
  • Arrives at such fast speeds that enormous datasets can accumulate within very short periods of time
  • does not conform to a data model or data schema
  • Data acquired via online customer registrations usually contains less noise
  • distinct requirements, such as the combining of multiple unrelated datasets, processing of large amounts of unstructured data and harvesting of hidden information, in a time-sensitive manner

Question 4

Question
Using Big Data Solutions
Answer
  • are closely linked with an enterprise's strategic objectives
  • further use databases that store historical data in multidimensional arrays and can answer complex queries based on multiple dimensions of the data
  • multiple formats and types of data that need to be supported by Big Data Solutions
  • complex analysis tasks can be carried out to arrive at deeply meaningful and insightful analysis results for the benefit of the business

Question 5

Question
Big Data Solutions
Answer
  • Some streams are public. Other streams go to vendors and business directly
  • Analytics and Data Science
  • are relevant to big data in that they can serve as both a data source as well as a data sink that is capable of receiving data
  • can process massive quantities of data that arrive at varying speeds, may be of many different varieties and have numerous incompatibilities

Question 6

Question
Data within Big Data
Answer
  • is the process of gaining insights into the workings of an enterprise to improve decision-making by analyzing external data and data generated by its business processes
  • can have multiple data marts
  • is a process of loading data from a source system into a target system; the source system can be a database, a flat file or an application; similarly, the target system can be a database or some other information system
  • accumulates from being amassed within the enterprise (via applications) or from external sources that are then stored by the big data solution

Question 7

Question
Data processed by Big Data
Answer
  • does generally require special or customized logic when it comes to pre-processing and storage
  • Data acquired from blog postings usually contains more noise
  • store historical data that is aggregated and denormalized to support fast reporting capability
  • can be used by enterprise applications directly, or fed into a data warehouse to enrich existing data. This data is typically analyzed and subjected to analytics

Question 8

Question
Processed data and analysis results
Answer
  • are closely linked with an enterprise's strategic objectives
  • represents the main operation through which data warehouses are fed data
  • does often have special pre-processing and storage requirements, especially if the underlying format is not text-based
  • are commonly used for meaningful and complex reporting and assessment tasks and can also be fed back into applications to enhance their behavior (such as when product recommendations are displayed online)

Question 9

Question
Data processed by Big Data
Answer
  • Analytics and Data Science
  • actionable intelligence
  • operational optimization
  • can be human-generated or machine-generated, although it is ultimately the responsibility of machines to generate the processing results

Question 10

Question
Human-generated data
Answer
  • is a subset of the data stored in a data warehouse, that typically belongs to a department, division or specific line of business
  • each technology is uniquely relevant to modern-day Big Data Solutions and ecosystems
  • used to identify problem areas in order to take corrective actions
  • is the result of human interaction with systems, such as online services and digital devices (Ex. Social media, micro blogging, e-mails, photo sharing and messaging)

Question 11

Question
Machine-generated data
Answer
  • represents the main operation through which data warehouses are fed data
  • With periodic data imports from across the enterprise, the amount of data contained will continue to increase. Query response times for data analysis tasks performed as part of BI can suffer as a result
  • defined as the usefulness of data for an enterprise
  • is the result of the automated, event-driven generation of data by software programs or hardware devices (Ex. Web logs, sensor data, telemetry data, smart meter data and appliance usage data)
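
As an illustrative sketch of machine-generated data, consider parsing a Web log entry into structured fields; the log line format and field names below are invented for illustration, not taken from the quiz:

```python
# Machine-generated data, such as Web logs, is produced automatically per event.
# This hypothetical log line uses a simplified space-separated format.
log_line = "2016-03-01T10:15:42 GET /products 200"

# Split the raw line into its fields and build a structured record.
timestamp, method, path, status = log_line.split()
record = {
    "timestamp": timestamp,
    "method": method,
    "path": path,
    "status": int(status),  # status codes are numeric, so cast them
}
```

Records like this can then be accumulated into datasets for downstream analysis.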

Question 12

Question
BDS processing results
Answer
  • is the process of gaining insights into the workings of an enterprise to improve decision-making by analyzing external data and data generated by its business processes
  • scientific and research data (Large Hadron Collider, Atacama Large Millimeter/Submillimeter Array Telescope)
  • operational optimization
  • actionable intelligence

Question 13

Question
BDS processing results
Answer
  • is crucial to big data processing storage and analysis
  • With periodic data imports from across the enterprise, the amount of data contained will continue to increase. Query response times for data analysis tasks performed as part of BI can suffer as a result
  • identification of new markets
  • accurate predictions

Question 14

Question
BDS processing results
Answer
  • is directly related to the veracity characteristic
  • The required data is first obtained from the sources, after which the extracts are modified by applying rules
  • fault and fraud detection
  • more detailed records

Question 15

Question
BDS processing results
Answer
  • related to collecting and processing large quantities of diverse data has become increasingly affordable
  • simple insert, delete and update operations with sub-second response times
  • improved decision-making
  • scientific discoveries

Question 16

Question
Datasets
Answer
  • improved decision-making
  • representing a common source of structured analytics input
  • The anticipated volume of data that is processed by Big Data solutions is substantial and usually ever-growing
  • Collections or groups of related data (Ex. Tweets stored in a flat file, collection of image files, extract of rows stored in a table, historical weather observations that are stored as XML Files)

Question 17

Question
Datum
Answer
  • Shares the same set of attributes as others in the same dataset
  • Are the data analysis results being accurately communicated to the appropriate decision-makers?
  • The anticipated volume of data that is processed by Big Data solutions is substantial and usually ever-growing
  • is based on a quantifiable indicator that is identified and agreed upon beforehand

Question 18

Question
Data analysis
Answer
  • either exists in textual or binary form
  • is the result of human interaction with systems, such as online services and digital devices (Ex. Social media, micro blogging, e-mails, photo sharing and messaging)
  • is the process of examining data to find facts, relationships, patterns, insights and/or trends. The eventual goal is to support decision-making
  • helps establish patterns and relationships among the data being analyzed

Question 19

Question
Analytics
Answer
  • semi-structured data
  • Can exist as a separate DBMS, as in the case of an OLAP database
  • is the discipline of gaining an understanding of data by analyzing it via a multitude of scientific techniques and automated tools, with a focus on locating hidden patterns and correlations
  • is usually applied using highly scalable distributed technologies and frameworks for analyzing large volumes of data from different sources

Question 20

Question
Analytics
Answer
  • generally involves sifting through large amounts of raw, unstructured data to extract meaningful information that can serve as an input for identifying patterns, enriching existing enterprise data, or performing large-scale searches
  • may not always be high. For example, MRI scan images are usually not generated as frequently as log entries from a high-traffic Web server
  • Shares the same set of attributes as others in the same dataset
  • attributes providing the file size and resolution of a digital photograph

Question 21

Question
In business-oriented environments, analytics results can lower operational costs and facilitate strategic decision-making?
Answer
  • True
  • False

Question 22

Question
scientific domain
Answer
  • does often have special pre-processing and storage requirements, especially if the underlying format is not text-based
  • is also dependent on how long data processing takes; value and processing time are inversely proportional to each other
  • is a data analysis technique that focuses on quantifying the patterns and correlations found in the data
  • analytics can help identify the cause of a phenomenon to improve the accuracy of predictions

Question 23

Question
services-based environments
Answer
  • are relevant to big data in that they can serve as both a data source as well as a data sink that is capable of receiving data
  • each technology is uniquely relevant to modern-day Big Data Solutions and ecosystems
  • are commonly used for meaningful and complex reporting and assessment tasks and can also be fed back into applications to enhance their behavior (such as when product recommendations are displayed online)
  • analytics can help strengthen the focus on delivering high quality services by driving down cost

Question 24

Question
Analytics
Answer
  • are closely linked with an enterprise's strategic objectives
  • Shares the same set of attributes as others in the same dataset
  • generally makes up 80% of the data within an enterprise, and has a faster growth rate than structured data
  • enables data-driven decision-making with scientific backing, so that decisions can be based on a factual data and not on past experience or intuition alone

Question 25

Question
Business Intelligence
Answer
  • generally involves sifting through large amounts of raw, unstructured data to extract meaningful information that can serve as an input for identifying patterns, enriching existing enterprise data, or performing large-scale searches
  • can be used as an ETL engine, or as an analytics engine for processing large amounts of structured, semi-structured and unstructured data
  • is the process of gaining insights into the workings of an enterprise to improve decision-making by analyzing external data and data generated by its business processes
  • applies analytics to large amounts of data across the enterprise

Question 26

Question
Business Intelligence
Answer
  • store historical data that is aggregated and denormalized to support fast reporting capability
  • is the process of examining data to find facts, relationships, patterns, insights and/or trends. The eventual goal is to support decision-making
  • The anticipated volume of data that is processed by Big Data solutions is substantial and usually ever-growing
  • can further utilize the consolidated data contained in data warehouses to run analytical queries

Question 27

Question
KPI
Answer
  • is crucial to big data processing storage and analysis
  • is mostly machine-generated and automatically appended to the data
  • is a measure for gauging success within a particular context
  • are closely linked with an enterprise's strategic objectives

Question 28

Question
KPI
Answer
  • Shares the same set of attributes as others in the same dataset
  • ticket reservation systems and banking and POS transactions
  • used to identify problem areas in order to take corrective actions
  • used to achieve regulatory compliance

Question 29

Question
KPI
Answer
  • more detailed records
  • big data solutions particularly rely on it when processing semi-structured and unstructured data
  • act as quick reference points for measuring the overall performance of the business
  • is based on a quantifiable indicator that is identified and agreed upon beforehand
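
The KPI bullets above can be sketched in a few lines; the metric name, target and figures below are hypothetical, chosen only to show the "quantifiable indicator agreed upon beforehand" idea:

```python
# A KPI is based on a quantifiable indicator identified and agreed upon beforehand.
kpi = {"name": "on-time delivery rate", "target": 0.95}  # agreed target (hypothetical)

# Measured figures for the period (also hypothetical).
delivered_on_time, delivered_total = 460, 500
measured = delivered_on_time / delivered_total  # quantifiable indicator: 0.92

# Quick reference point: comparing against the target identifies a problem area
# so corrective action can be taken.
needs_corrective_action = measured < kpi["target"]
```

Here the measured rate falls short of the agreed target, which flags the KPI for corrective action.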

Question 30

Question
primary business and technology drivers
Answer
  • the relational data is stored as denormalized data in the form of cubes; this allows the data to be queried during any data analysis tasks that are performed later
  • XML tags providing the author and creation date of a document
  • Analytics and Data Science
  • Digitization

Question 31

Question
primary business and technology drivers
Answer
  • A Not-only SQL (NoSQL) database is a non-relational database that can be used to store it
  • are capable of providing highly scalable, on-demand IT resources that can be leased via pay-as-you-go models
  • Affordable Technology & Commodity Hardware
  • Social Media

Question 32

Question
primary business and technology drivers
Answer
  • does often have special pre-processing and storage requirements, especially if the underlying format is not text-based
  • is directly related to the veracity characteristic
  • Hyper-Connected Communities & Devices
  • Cloud Computing

Question 33

Question
Analytics & Data Science
Answer
  • generally makes up 80% of the data within an enterprise, and has a faster growth rate than structured data
  • more detailed records
  • fault and fraud detection
  • The maturity of these fields of practice inspired and enabled much of the core functionality expected from contemporary Big Data solutions and tools

Question 34

Question
Digitized data
Answer
  • How well has the data been stored?
  • is always fed with data from multiple OLTP systems using regular batch processing jobs
  • The longer it takes for data to be turned into meaningful information, the less potential it may have for the business
  • Leads to an opportunity to collect further "secondary" data, such as when individuals carry out searches or complete surveys

Question 35

Question
Collecting secondary data
Answer
  • accurate predictions
  • Extract Transform Load (ETL)
  • data bearing value leading to meaningful information
  • can be important to businesses. Mining this data may allow for customized marketing, automated recommendations and the development of optimized product features

Question 36

Question
Affordable Technology
Answer
  • Hyper-Connected Communities & Devices
  • is usually applied using highly scalable distributed technologies and frameworks for analyzing large volumes of data from different sources
  • are relevant to big data in that they can serve as both a data source as well as a data sink that is capable of receiving data
  • related to collecting and processing large quantities of diverse data has become increasingly affordable

Question 37

Question
Typical Big Data solutions
Answer
  • is typically stored in relational databases and frequently generated by custom enterprise applications, ERP systems and CRM systems
  • The longer it takes for data to be turned into meaningful information, the less potential it may have for the business
  • operational optimization
  • are based on open-source software that requires little more than commodity hardware

Question 38

Question
commodity hardware
Answer
  • How well has the data been stored?
  • Hyper-Connected Communities & Devices
  • fault and fraud detection
  • makes the adoption of big data solutions accessible to businesses without large capital investments

Question 39

Question
Social Media
Answer
  • does not conform to a data model or data schema
  • store historical data that is aggregated and denormalized to support fast reporting capability
  • provide feedback in near-realtime via open and public mediums
  • businesses are storing increasing amounts of data on customer interactions and from social media avenues in an attempt to harvest this data to increase sales, enable targeted marketing and create new products and services

Question 40

Question
Social Media
Answer
  • may not always be high. For example, MRI scan images are usually not generated as frequently as log entries from a high-traffic Web server
  • Are the data analysis results being accurately communicated to the appropriate decision-makers?
  • operational optimization
  • businesses are also increasingly interested in incorporating publicly available datasets from social media and other external data sources

Question 41

Question
Hyper-Connected Communities & Devices
Answer
  • Examples can include EDI, e-mails, spreadsheets, RSS feeds and sensor data
  • is the process of examining data to find facts, relationships, patterns, insights and/or trends. The eventual goal is to support decision-making
  • The broadening coverage of the Internet and the proliferation of cellular and Wi-Fi networks have enabled more people to be continuously active in virtual communities
  • This happens either directly through online interaction or indirectly through the usage of connected devices; this has resulted in massive data streams

Question 42

Question
Hyper-Connected Communities & Devices
Answer
  • is an open-source framework for large-scale data storage and data processing that more or less runs on commodity hardware
  • can be important to businesses. Mining this data may allow for customized marketing, automated recommendations and the development of optimized product features
  • can also be fed back into OLTPs
  • Some streams are public. Other streams go to vendors and business directly

Question 43

Question
Cloud Computing
Answer
  • is the process of gaining insights into the workings of an enterprise to improve decision-making by analyzing external data and data generated by its business processes
  • attributes providing the file size and resolution of a digital photograph
  • have led to the creation of remote environments
  • are capable of providing highly scalable, on-demand IT resources that can be leased via pay-as-you-go models

Question 44

Question
Cloud Computing
Answer
  • multiple formats and types of data that need to be supported by Big Data Solutions
  • applies analytics to large amounts of data across the enterprise
  • Businesses have the opportunity to leverage the infrastructure, storage and processing capabilities provided by these environments in order to build large-scale Big Data Solutions
  • Can be leveraged for its scaling capabilities to perform Big Data processing tasks

Question 45

Question
Cloud Computing
Answer
  • either exists in textual or binary form
  • actionable intelligence
  • have a greater noise-to-signal ratio
  • can be leased, which dramatically reduces the required up-front investment of big data projects

Question 46

Question
Technologies Related to Big Data
Answer
  • It also periodically pulls data from other sources for consolidation into a dataset (such as from OLTP, ERP, CRM, and SCM systems).
  • This happens either directly through online interaction or indirectly through the usage of connected devices; this has resulted in massive data streams
  • Online Transaction Processing (OLTP)
  • Online Analytical Processing (OLAP)

Question 47

Question
Technologies Related to Big Data
Answer
  • each technology is uniquely relevant to modern-day Big Data Solutions and ecosystems
  • represents the main operation through which data warehouses are fed data
  • Extract Transform Load (ETL)
  • Data Warehouses

Question 48

Question
Technologies Related to Big Data
Answer
  • are capable of providing highly scalable, on-demand IT resources that can be leased via pay-as-you-go models
  • is the discipline of gaining an understanding of data by analyzing it via a multitude of scientific techniques and automated tools, with a focus on locating hidden patterns and correlations
  • is crucial to big data processing storage and analysis
  • Hadoop

Question 49

Question
OLTP
Answer
  • further use databases that store historical data in multidimensional arrays and can answer complex queries based on multiple dimensions of the data
  • is a process of loading data from a source system into a target system; the source system can be a database, a flat file or an application; similarly, the target system can be a database or some other information system
  • store operational data that is fully normalized
  • is a software system that processes transaction-oriented data

Question 50

Question
Online Transaction
Answer
  • operational optimization
  • A Not-only SQL (NoSQL) database is a non-relational database that can be use to store it
  • Collections or groups of related data (Ex. Tweets stored in a flat file, collection of image files, extract of rows stored in a table, historical weather observations that are stored as XML Files)
  • the completion of an activity in realtime and not batch-processed

Question 51

Question
OLTP
Answer
  • representing a common source of structured analytics input
  • generally involves sifting through large amounts of raw, unstructured data to extract meaningful information that can serve as an input for identifying patterns, enriching existing enterprise data, or performing large-scale searches
  • require automated data cleansing and data verification when carrying out ETL processes
  • are closely linked with an enterprise's strategic objectives

Question 52

Question
Big Data Analysis Results
Answer
  • used to identify problem areas in order to take corrective actions
  • either exists in textual or binary form
  • enables data-driven decision-making with scientific backing, so that decisions can be based on a factual data and not on past experience or intuition alone
  • can also be fed back into OLTPs

Question 53

Question
Queries Supported by OLTP
Answer
  • mostly exist in textual form such as XML or JSON files.
  • data bearing value leading to meaningful information
  • The broadening coverage of the Internet and the proliferation of cellular and Wi-Fi networks have enabled more people to be continuously active in virtual communities
  • simple insert, delete and update operations with sub-second response times
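
The OLTP bullet above ("simple insert, delete and update operations with sub-second response times") can be sketched with an in-memory SQLite database; the reservations table and its values are hypothetical, chosen to echo the ticket-reservation example used elsewhere in this quiz:

```python
import sqlite3

# In-memory database standing in for an OLTP store of ticket reservations.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reservations (id INTEGER PRIMARY KEY, seat TEXT, status TEXT)")

# Simple insert, update and delete operations, each completing well under a second.
conn.execute("INSERT INTO reservations (seat, status) VALUES ('12A', 'held')")
conn.execute("UPDATE reservations SET status = 'confirmed' WHERE seat = '12A'")
conn.execute("INSERT INTO reservations (seat, status) VALUES ('12B', 'held')")
conn.execute("DELETE FROM reservations WHERE seat = '12B'")
conn.commit()

rows = conn.execute("SELECT seat, status FROM reservations").fetchall()
```

Contrast this with OLAP queries, which (as another bullet notes) can take minutes or longer depending on complexity.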

Question 54

Question
Examples of OLTP
Answer
  • Data Warehouses
  • big data solutions particularly rely on it when processing semi-structured and unstructured data
  • structured data
  • ticket reservation systems and banking and POS transactions

Question 55

Question
OLAP
Answer
  • related to collecting and processing large quantities of diverse data has become increasingly affordable
  • XML tags providing the author and creation date of a document
  • is a system used for processing data analysis queries
  • form an integral part of business intelligence, data mining and machine learning processes

Question 56

Question
OLAP
Answer
  • Collections or groups of related data (Ex. Tweets stored in a flat file, collection of image files, extract of rows stored in a table, historical weather observations that are stored as XML Files)
  • store historical data that is aggregated and denormalized to support fast reporting capability
  • are relevant to big data in that they can serve as both a data source as well as a data sink that is capable of receiving data
  • are used in diagnostic, predictive and prescriptive analysis

Question 57

Question
OLAP
Answer
  • Social Media
  • Sensor Data (RFID, Smart meters, GPS sensors)
  • further use databases that store historical data in multidimensional arrays and can answer complex queries based on multiple dimensions of the data
  • is always fed with data from multiple OLTP systems using regular batch processing jobs

Question 58

Question
OLAP
Answer
  • have a lower noise-to-signal ratio
  • Are the right types of question being asked during data analysis?
  • queries can take several minutes or even longer, depending on the complexity of the query and the number of records queried
  • the relational data is stored as denormalized data in the form of cubes; this allows the data to be queried during any data analysis tasks that are performed later
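
The cube bullet above can be illustrated with a toy pre-aggregation in plain Python; the dimensions (region, product, quarter) and the sales figures are invented for illustration:

```python
from collections import defaultdict

# Hypothetical denormalized sales facts: (region, product, quarter, amount).
facts = [
    ("EU", "widget", "Q1", 100),
    ("EU", "widget", "Q2", 150),
    ("US", "widget", "Q1", 200),
    ("US", "gadget", "Q1", 50),
]

# Build a tiny "cube": pre-aggregated totals for every combination of dimensions
# ("*" meaning "all values"), so analysis queries become lookups instead of scans.
cube = defaultdict(int)
for region, product, quarter, amount in facts:
    for r in (region, "*"):
        for p in (product, "*"):
            for q in (quarter, "*"):
                cube[(r, p, q)] += amount

widget_total = cube[("*", "widget", "*")]  # roll-up across region and quarter
eu_q1 = cube[("EU", "*", "Q1")]            # slice along two dimensions
```

Real OLAP databases do this at scale, but the principle is the same: complex multidimensional questions are answered from denormalized, pre-aggregated data.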

Question 59

Question
ETL
Answer
  • either exists in textual or binary form
  • generally involves sifting through large amounts of raw, unstructured data to extract meaningful information that can serve as an input for identifying patterns, enriching existing enterprise data, or performing large-scale searches
  • is a process of loading data from a source system into a target system; the source system can be a database, a flat file or an application; similarly, the target system can be a database or some other information system
  • represents the main operation through which data warehouses are fed data

Question 60

Question
ETL
Answer
  • online transactions (point-of-sale, banking)
  • act as quick reference points for measuring the overall performance of the business
  • A big data solution encompasses this tool feature-set for converting data of different types
  • The required data is first obtained from the sources, after which the extracts are modified by applying rules

Question 61

Question
ETL
Answer
  • analytics results can lower operational costs and facilitate strategic decision-making
  • Collections or groups of related data (Ex. Tweets stored in a flat file, collection of image files, extract of rows stored in a table, historical weather observations that are stored as XML Files)
  • generally involves sifting through large amounts of raw, unstructured data to extract meaningful information that can serve as an input for identifying patterns, enriching existing enterprise data, or performing large-scale searches
  • The data is inserted into a target system
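
The ETL bullets above (extract from a source, apply rules, insert into a target) can be sketched end to end; the source rows, the transformation rules and the list standing in for the target system are all hypothetical:

```python
# Minimal Extract-Transform-Load sketch.
# Extract: rows pulled from a hypothetical flat-file source.
source = [
    {"name": " Alice ", "amount": "120.50"},
    {"name": "bob", "amount": "80"},
]

def transform(row):
    # Apply rules: trim and title-case names, cast amounts to numbers.
    return {"name": row["name"].strip().title(), "amount": float(row["amount"])}

# Load: insert the modified extracts into the target system
# (here just a list standing in for a warehouse table).
target = []
for row in source:
    target.append(transform(row))
```

A big data solution encompasses this same feature-set, applying it to far larger and more varied inputs.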

Question 62

Question
Data Warehouse
Answer
  • impose distinct data storage and processing demands, as well as management and access processes
  • is based on a quantifiable indicator that is identified and agreed upon beforehand
  • is a central, enterprise-wide repository, consisting of historical and current data
  • are heavily used by BI to run various analytical queries

Question 63

Question
Data Warehouse
Answer
  • The required data is first obtained from the sources, after which the extracts are modified by applying rules
  • analytics can help identify the cause of a phenomenon to improve the accuracy of predictions
  • usually interface with an OLAP system to support analytical queries
  • It also periodically pulls data from other sources for consolidation into a dataset (such as from OLTP, ERP, CRM, and SCM systems).

Question 64

Question
Data Warehouse
Answer
  • This happens either directly through online interaction or indirectly through the usage of connected devices; this has resulted in massive data streams
  • conforms to a data model or schema
  • Data pertaining to multiple business entities from different operational systems is periodically extracted, validated, transformed and consolidated into a single database
  • With periodic data imports from across the enterprise, the amount of data contained will continue to increase. Query response times for data analysis tasks performed as part of BI can suffer as a result

Question 65

Question
Data Warehouse
Answer
  • can also be fed back into OLTPs
  • helps establish patterns and relationships among the data being analyzed
  • the relational data is stored as denormalized data in the form of cubes; this allows the data to be queried during any data analysis tasks that are performed later
  • Usually contain optimized databases called analytical databases to handle reporting and data analysis tasks

Question 66

Question
Analytical Database
Answer
  • This happens either directly through online interaction or indirectly through the usage of connected devices; this has resulted in massive data streams
  • Brings challenges for enterprises in terms of data integration, transformation, processing and storage
  • does not conform to a data model or data schema
  • Can exist as a separate DBMS, as in the case of an OLAP database

Question 67

Question
Data Mart
Answer
  • act as quick reference points for measuring the overall performance of the business
  • online transactions (point-of-sale, banking)
  • can also be fed back into OLTPs
  • is a subset of the data stored in a data warehouse, that typically belongs to a department, division or specific line of business
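
The data mart bullet above ("a subset of the data stored in a data warehouse, belonging to a department") can be sketched as a simple filter; the warehouse schema and rows below are hypothetical:

```python
# Sketch: a data mart as a department-specific subset of warehouse rows.
warehouse = [
    {"dept": "sales",   "year": 2016, "revenue": 120},
    {"dept": "sales",   "year": 2017, "revenue": 140},
    {"dept": "support", "year": 2017, "revenue": 30},
]

# The sales data mart carries only the rows belonging to the sales department,
# giving that line of business a focused view of the enterprise-wide repository.
sales_mart = [row for row in warehouse if row["dept"] == "sales"]
```

A warehouse can carve out many such marts, one per department, division or line of business.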

Question 68

Question
Data Warehouse
Answer
  • does generally require special or customized logic when it comes to pre-processing and storage
  • is directly related to the veracity characteristic
  • can have multiple data marts
  • a single version of "truth" is based on cleansed data, which is a prerequisite for accurate and error-free reports

Question 69

Question
Hadoop
Answer
  • further use databases that store historical data in multidimensional arrays and can answer complex queries based on multiple dimensions of the data
  • identification of new markets
  • is an open-source framework for large-scale data storage and data processing that more or less runs on commodity hardware
  • has established itself as a de facto industry platform for contemporary Big Data Solutions

Question 70

Question
Hadoop
Answer
  • analytics can help strengthen the focus on delivering high quality services by driving down cost
  • have led to the creation of remote environments
  • are closely linked with an enterprise's strategic objectives
  • can be used as an ETL engine, or as an analytics engine for processing large amounts of structured, semi-structured and unstructured data
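
Hadoop itself is a distributed framework; as a single-process sketch, its MapReduce processing flow (map, shuffle, reduce) can be mimicked on a toy corpus. The documents and the word-count job below are invented for illustration:

```python
from collections import defaultdict

# Toy corpus standing in for files stored across a Hadoop cluster.
documents = ["big data big insights", "data beats intuition"]

def map_phase(doc):
    # Map: emit an intermediate (key, value) pair per word.
    return [(word, 1) for word in doc.split()]

# Shuffle: group intermediate pairs by key.
groups = defaultdict(list)
for doc in documents:
    for key, value in map_phase(doc):
        groups[key].append(value)

# Reduce: aggregate the values of each group into a final count.
counts = {word: sum(values) for word, values in groups.items()}
```

In a real deployment, the map and reduce phases run in parallel across commodity machines, which is what makes the same pattern viable for massive datasets.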

Question 71

Question
Data Characteristics
Answer
  • does not conform to a data model or data schema
  • Are the data analysis results being accurately communicated to the appropriate decision-makers?
  • is the process of examining data to find facts, relationships, patterns, insights and/or trends. The eventual goal is to support decision-making
  • Volume, Velocity, Variety, Veracity & Value

Question 72

Question
Volume
Answer
  • scientific and research data (Large Hadron Collider, Atacama Large Millimeter/Submillimeter Array Telescope)
  • is the process of gaining insights into the workings of an enterprise to improve decision-making by analyzing external data and data generated by its business processes
  • The anticipated volume of data that is processed by Big Data solutions is substantial and usually ever-growing
  • impose distinct data storage and processing demands, as well as management and access processes

Question 73

Question
Volume
Answer
  • Leads to an opportunity to collect further "secondary" data, such as when individuals carry out searches or complete surveys
  • Digitization
  • online transactions (point-of-sale, banking)
  • Sensor Data (RFID, Smart meters, GPS sensors)

Question 74

Question
Volume
Answer
  • is crucial to big data processing storage and analysis
  • can be leased, which dramatically reduces the required up-front investment of big data projects
  • impose distinct data storage and processing demands, as well as management and access processes
  • Social Media (Facebook, Twitter)

Question 75

Question
Velocity
Answer
  • can be human-generated or machine-generated, although it is ultimately the responsibility of machines to generate the processing results
  • analytics can help strengthen the focus on delivering high quality services by driving down cost
  • Arrives at such fast speeds that enormous datasets can accumulate within very short periods of time
  • translates into the amount of time it takes for the data to be processed once it enters the enterprise perimeter

Question 76

Question
Velocity
Answer
  • Examples can include EDI, e-mails, spreadsheets, RSS feeds and sensor data
  • is a measure for gauging success within a particular context
  • Coping with the fast inflow of data requires the enterprise to design highly elastic and available processing solutions and corresponding data storage capabilities
  • may not always be high. For example, MRI scan images are usually not generated as frequently as log entries from a high-traffic Web server
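The idea that fast-arriving data must be handled elastically can be sketched with a bounded window that keeps only the most recent readings (a simplified illustration with invented sensor values, not a production stream processor):

```python
from collections import deque

# Keep only the most recent readings; older ones are evicted automatically,
# so memory stays bounded no matter how fast data arrives.
window = deque(maxlen=5)
for reading in [3, 7, 4, 9, 2, 8, 6]:
    window.append(reading)

# A rolling summary computed over the bounded window
recent_average = sum(window) / len(window)
```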

Question 77

Question
Variety
Answer
  • data bearing value leading to meaningful information
  • big data solutions particularly rely on it when processing semi-structured and unstructured data
  • multiple formats and types of data that need to be supported by Big Data Solutions
  • Brings challenges for enterprises in terms of data integration, transformation, processing and storage

Question 78

Question
Veracity
Answer
  • Online Transaction Processing (OLTP)
  • Shares the same set of attributes as others in the same dataset
  • generally makes up 80% of the data within an enterprise, and has a faster growth rate than structured data
  • refers to the quality or fidelity of data

Question 79

Question
Noise
Answer
  • has a defined level of structure and consistency, but is not relational in nature
  • The anticipated volume of data that is processed by Big Data solutions is substantial and usually ever-growing
  • Coping with the fast inflow of data requires the enterprise to design highly elastic and available processing solutions and corresponding data storage capabilities
  • data carrying no value

Question 80

Question
Signal
Answer
  • is a subset of the data stored in a data warehouse, that typically belongs to a department, division or specific line of business
  • provide feedback in near-realtime via open and public mediums
  • A Not-only SQL (NoSQL) database is a non-relational database that can be used to store it
  • data bearing value leading to meaningful information

Question 81

Question
controlled source
Answer
  • are heavily used by BI to run various analytical queries
  • Examples can include EDI, e-mails, spreadsheets, RSS feeds and sensor data
  • makes the adoption of big data solutions accessible to businesses without large capital investments
  • Data acquired, such as via online customer registrations, usually contains less noise

Question 82

Question
uncontrolled source
Answer
  • businesses are also increasingly interested in incorporating publicly available datasets from social media and other external data sources
  • accurate predictions
  • Businesses have the opportunity to leverage the infrastructure, storage and processing capabilities provided by these environments in order to build large-scale Big Data Solutions
  • Data acquired, such as blog postings, usually contains more noise

Question 83

Question
Degree of noise
Answer
  • is a measure for gauging success within a particular context
  • act as quick reference points for measuring the overall performance of the business
  • analytics results can lower operational costs and facilitate strategic decision-making
  • Depends on the type of data present

Question 84

Question
Value
Answer
  • store historical data that is aggregated and denormalized to support fast reporting capability
  • is an open-source framework for large-scale data storage and data processing that more or less runs on commodity hardware
  • defined as the usefulness of data for an enterprise
  • is directly related to the veracity characteristic

Question 85

Question
Value
Answer
  • is the process of gaining insights into the workings of an enterprise to improve decision-making by analyzing external data and data generated by its business processes
  • Brings challenges for enterprises in terms of data integration, transformation, processing and storage
  • is also dependent on how long data processing takes; value and time are inversely proportional to each other
  • The longer it takes for data to be turned into meaningful information, the less potential it may have for the business

Question 86

Question
Value Considerations
Answer
  • scientific discoveries
  • ticket reservation systems and banking and POS transactions
  • How well has the data been stored?
  • Has the data been stripped of any valuable attributes?

Question 87

Question
Value Considerations
Answer
  • have a lower noise-to-signal ratio
  • attributes providing the file size and resolution of a digital photograph
  • Are the right types of questions being asked during data analysis?
  • Are the data analysis results being accurately communicated to the appropriate decision-makers?

Question 88

Question
Data Types
Answer
  • improved decision-making
  • does not conform to a data model or data schema
  • structured data
  • unstructured data

Question 89

Question
Data Types
Answer
  • translates into the amount of time it takes for the data to be processed once it enters the enterprise perimeter
  • have led to the creation of remote environments
  • The anticipated volume of data that is processed by Big Data solutions is substantial and usually ever-growing
  • semi-structured data

Question 90

Question
structured data
Answer
  • Are the right types of questions being asked during data analysis?
  • makes the adoption of big data solutions accessible to businesses without large capital investments
  • conforms to a data model or schema
  • is stored in a tabular form

Question 91

Question
structured data
Answer
  • is crucial to big data processing, storage and analysis
  • is the process of examining data to find facts, relationships, patterns, insights and/or trends. The eventual goal is to support decision-making
  • can be relational
  • is typically stored in relational databases and frequently generated by custom enterprise applications, ERP systems and CRM systems

Question 92

Question
structured data
Answer
  • can be important to businesses. Mining this data may allow for customized marketing, automated recommendations and the development of optimized product features
  • analytics can help strengthen the focus on delivering high quality services by driving down cost
  • The anticipated volume of data that is processed by Big Data solutions is substantial and usually ever-growing
  • does not generally have any special pre-processing or storage requirements. Examples include banking transactions, OLTP system records and customer records
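The tabular, schema-bound nature of structured data can be shown with SQLite, where every row conforms to the same schema and SQL applies directly (the table and sample rows are invented for illustration):

```python
import sqlite3

# In-memory relational store: structured data conforms to a fixed schema
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE transactions (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
)
conn.executemany(
    "INSERT INTO transactions (customer, amount) VALUES (?, ?)",
    [("alice", 120.0), ("bob", 75.5), ("alice", 30.0)],
)
# SQL works without pre-processing because every row shares the same attributes
total = conn.execute(
    "SELECT SUM(amount) FROM transactions WHERE customer = 'alice'"
).fetchone()[0]
```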

Question 93

Question
unstructured data
Answer
  • qualitative analysis
  • enables data-driven decision-making with scientific backing, so that decisions can be based on factual data and not on past experience or intuition alone
  • does not conform to a data model or data schema
  • is generally inconsistent and non-relational

Question 94

Question
unstructured data
Answer
  • simple insert, delete and update operations with sub-second response times
  • Shares the same set of attributes as others in the same dataset
  • either exists in textual or binary form
  • generally makes up 80% of the data within an enterprise, and has a faster growth rate than structured data

Question 95

Question
unstructured data
Answer
  • is mostly machine-generated and automatically appended to the data
  • Shares the same set of attributes as others in the same dataset
  • does generally require special or customized logic when it comes to pre-processing and storage
  • cannot be inherently processed or queried using SQL or traditional programming features and is usually an awkward fit with relational databases

Question 96

Question
unstructured data
Answer
  • has a defined level of structure and consistency, but is not relational in nature
  • are relevant to big data in that they can serve as both a data source and a data sink that is capable of receiving data
  • A big data solution encompasses this tool feature-set for converting data of different types
  • A Not-only SQL (NoSQL) database is a non-relational database that can be used to store it

Question 97

Question
semi-structured data
Answer
  • Are the right types of questions being asked during data analysis?
  • How well has the data been stored?
  • has a defined level of structure and consistency, but is not relational in nature
  • mostly exists in textual form, such as XML or JSON files

Question 98

Question
semi-structured data
Answer
  • defined as the usefulness of data for an enterprise
  • may not always be high. For example, MRI scan images are usually not generated as frequently as log entries from a high-traffic Web server
  • Examples can include EDI, e-mails, spreadsheets, RSS feeds and sensor data
  • does often have special pre-processing and storage requirements, especially if the underlying format is not text-based
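The point that semi-structured data has structure but no fixed schema can be sketched with JSON, where fields may be present in some records and absent in others (the sample records are made up):

```python
import json

# Semi-structured: each record has some structure, but fields vary per record
raw = '[{"user": "a", "tags": ["big-data"]}, {"user": "b"}]'
records = json.loads(raw)

# Fields must be accessed defensively because no schema guarantees them
tags = [tag for rec in records for tag in rec.get("tags", [])]
```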

Question 99

Question
metadata
Answer
  • is the process of gaining insights into the workings of an enterprise to improve decision-making by analyzing external data and data generated by its business processes
  • require automated data cleansing and data verification when carrying out ETL processes
  • provides information about a dataset's characteristics and structure
  • is mostly machine-generated and automatically appended to the data

Question 100

Question
metadata
Answer
  • refers to the quality or fidelity of data
  • Has the data been stripped of any valuable attributes?
  • XML tags providing the author and creation date of a document
  • attributes providing the file size and resolution of a digital photograph
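The two metadata examples above (XML tags and file attributes) can be illustrated for the XML case with the standard library; the document, author and date here are hypothetical:

```python
import xml.etree.ElementTree as ET

# XML attributes carrying metadata about a document (author, creation date)
doc = """<document>
  <meta author="J. Smith" created="2016-03-01"/>
  <body>Quarterly report</body>
</document>"""

root = ET.fromstring(doc)
meta = root.find("meta")
author = meta.get("author")    # machine-readable provenance information
created = meta.get("created")
```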

Question 101

Question
metadata
Answer
  • The data is inserted into a target system
  • semi-structured data
  • single version of "truth" is based on cleansed data, which is a prerequisite for accurate and error-free reports
  • big data solutions particularly rely on it when processing semi-structured and unstructured data

Question 102

Question
structured data
Answer
  • data carrying no value
  • can also be fed back into OLTPs
  • quantitative analysis
  • have a lower noise-to-signal ratio

Question 103

Question
semi-structured data and unstructured data
Answer
  • identification of new markets
  • Are the data analysis results being accurately communicated to the appropriate decision-makers?
  • improved decision-making
  • have a greater noise-to-signal ratio

Question 104

Question
Noise
Answer
  • A big data solution encompasses this tool feature-set for converting data of different types
  • structured data
  • Brings challenges for enterprises in terms of data integration, transformation, processing and storage
  • require automated data cleansing and data verification when carrying out ETL processes
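The automated cleansing step mentioned above can be sketched as a filter that drops records carrying no value; the field names and validity rules here are assumptions for illustration:

```python
def cleanse(records):
    # Drop records that carry no value (noise): missing identifier or measurement
    cleaned = []
    for rec in records:
        if not rec.get("id") or rec.get("amount") is None:
            continue  # noise: cannot be used downstream
        cleaned.append(rec)
    return cleaned

raw = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},   # noise: no identifier
    {"id": 3, "amount": None},     # noise: no measurable value
]
signal = cleanse(raw)  # only the value-bearing record survives
```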

Question 105

Question
Types of data analysis
Answer
  • is a measure for gauging success within a particular context
  • Shares the same set of attributes as others in the same dataset
  • quantitative analysis
  • qualitative analysis

Question 106

Question
Types of data analysis
Answer
  • does not generally have any special pre-processing or storage requirements. Examples include banking transactions, OLTP system records and customer records
  • online transactions (point-of-sale, banking)
  • is a central, enterprise-wide repository, consisting of historical and current data
  • data mining

Question 107

Question
quantitative analysis
Answer
  • The longer it takes for data to be turned into meaninful information, the less potential it may have for the business
  • queries can take several minutes or even longer, depending on the complexity of the query and the number of records queried
  • translates into the amount of time it takes for the data to be processed once it enters the enterprise perimeter
  • is a data analysis technique that focuses on quantifying the patterns and correlations found in the data

Question 108

Question
quantitative analysis
Answer
  • cannot be inherently processed or queried using SQL or traditional programming features and is usually an awkward fit with relational databases
  • refers to the quality or fidelity of data
  • this technique involves analyzing a large number of observations from a dataset
  • since the sample size is large, the results can be applied in a generalized manner to the entire dataset

Question 109

Question
quantitative analysis
Answer
  • defined as the usefulness of data for an enterprise
  • provide more value than any other type of analytics and correspondingly require the most advanced skillset, as well as specialized software and tools
  • single version of "truth" is based on cleansed data, which is a prerequisite for accurate and error-free reports
  • are absolute in nature and can therefore be used for numerical comparisons
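Because quantitative results are absolute and numerical, they can be computed and compared directly; a minimal sketch with the standard statistics module (the daily figures are invented):

```python
import statistics

# Quantitative analysis quantifies patterns numerically, so the results
# can be compared in absolute terms (e.g. across weeks or stores).
daily_orders = [120, 135, 128, 150, 142, 138, 131]
mean_orders = statistics.mean(daily_orders)
peak_vs_mean = max(daily_orders) - mean_orders  # an absolute numerical comparison
```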

Question 110

Question
qualitative analysis
Answer
  • Data pertaining to multiple business entities from different operational systems is periodically extracted, validated, transformed and consolidated into a single database
  • can also be fed back into OLTPs
  • is a data analysis technique that focuses on describing various data qualities using words
  • involves analyzing a smaller sample in greater depth compared to quantitative data analysis

Question 111

Question
qualitative analysis
Answer
  • accurate predictions
  • the information is generated at periodic intervals in realtime or near realtime
  • these analysis results cannot be generalized to an entire dataset due to the small sample size
  • they also cannot be measured numerically or used for numerical comparisons

Question 112

Question
data mining
Answer
  • policies for data privacy and data anonymization
  • aim to determine the cause of a phenomenon that occurred in the past, using questions that focus on the reason behind the event
  • also known as data discovery, is a specialized form of data analysis that targets large datasets
  • refers to automated, software-based techniques that sift through massive datasets to identify patterns and trends

Question 113

Question
data mining
Answer
  • is typically stored in relational databases and frequently generated by custom enterprise applications, ERP systems and CRM systems
  • actionable intelligence
  • involves extracting hidden or unknown patterns in the data with the intention of identifying previously unknown patterns
  • forms the basis for predictive analytics and business intelligence (BI)

Question 114

Question
Analysis & Analytics
Answer
  • based on the input data, the algorithm develops an understanding of which data belongs to which category
  • data carrying no value
  • act as quick reference points for measuring the overall performance of the business
  • These techniques may not provide accurate findings in a timely manner because of the data's volume, velocity and/or variety

Question 115

Question
Analytics tools
Answer
  • enables multiple outcomes to be visualized by enabling related factors to be dynamically changed
  • are often carried out via ad-hoc reporting or dashboards
  • some realtime data analysis solutions that do exist are proprietary
  • can automate data analyses through the use of highly scalable computational technologies that apply automated statistical quantitative analysis, data mining and machine learning techniques

Question 116

Question
Types of Analytics
Answer
  • the adoption of a big data environment may necessitate that some or all of that environment be hosted within a cloud
  • Are the right types of questions being asked during data analysis?
  • descriptive analytics
  • diagnostic analytics

Question 117

Question
Types of Analytics
Answer
  • involves analyzing a smaller sample in greater depth compared to quantitative data analysis
  • also known as data discovery, is a specialized form of data analysis that targets large datasets
  • predictive analytics
  • prescriptive analytics

Question 118

Question
Types of Analytics
Answer
  • does not generally have any special pre-processing or storage requirements. Examples include banking transactions, OLTP system records and customer records
  • policies for data cleansing and filtering
  • can be important to businesses. Mining this data may allow for customized marketing, automated recommendations and the development of optimized product features
  • Value and complexity increase as we move from descriptive to prescriptive analytics

Question 119

Question
descriptive analytics
Answer
  • is generally inconsistent and non-relational
  • This involves identifying patterns in the training data and classifying new or unseen data based on known patterns
  • is carried out to answer questions about events that have already occurred
  • Around 80% of analytics are ________ in nature

Question 120

Question
descriptive analytics
Answer
  • refers to the information about the source of the data that helps determine its authenticity and quality. It is also used for auditing purposes
  • This is either directly through online interaction or indirectly through the usage of connected devices; this has resulted in massive data streams
  • provides the least value and requires a relatively basic skillset
  • are often carried out via ad-hoc reporting or dashboards

Question 121

Question
descriptive analytics
Answer
  • is directly related to the veracity characteristic
  • Businesses have the opportunity to leverage the infrastructure, storage and processing capabilities provided by these environments in order to build large-scale Big Data Solutions
  • The reports are generally static in nature and display historical data that is presented in the form of data grids or charts
  • Queries are executed on the OLTP systems or data obtained from various other information systems, such as CRMs and ERPs

Question 122

Question
diagnostic analytics
Answer
  • aim to determine the cause of a phenomenon that occurred in the past, using questions that focus on the reason behind the event
  • are considered to provide more value than descriptive analysis, requiring a more advanced skillset
  • data bearing value leading to meaningful information
  • The data is inserted into a target system

Question 123

Question
diagnostic analytics
Answer
  • single version of "truth" is based on cleansed data, which is a prerequisite for accurate and error-free reports
  • a substantial budget may still be required to obtain external data
  • usually require collecting data from multiple sources and storing it in a structure that lends itself to performing drill-downs and roll-ups
  • analytics results are viewed via interactive visualization tools that enable users to identify trends and patterns

Question 124

Question
diagnostic analytics
Answer
  • can join structured and unstructured data that is kept in memory for fast data access
  • impose distinct data storage and processing demands, as well as management and access processes
  • will be required to control how data flows in and out of big data solutions and how feedback loops can be established to enable the processed data to undergo repeated refinements
  • the executed queries are more complex compared to descriptive analytics, and are performed on multi-dimensional data held in OLAP systems

Question 125

Question
predictive analytics
Answer
  • the adoption of a big data environment may necessitate that some or all of that environment be hosted within a cloud
  • is also dependent on how long data processing takes; value and time are inversely proportional to each other
  • are carried out to attempt to determine the outcome of an event that might occur in the future
  • try to predict the event outcome and predictions are made based on patterns, trends and exceptions found in historical and current data

Question 126

Question
predictive analytics
Answer
  • as big data initiatives are inherently business-driven, there needs to be a clear business case for adopting a big data solution to ensure that it is justified and that expectations are met
  • Graphically representing data can make it easier to understand reports, view trends and identify patterns
  • This can lead to the identification of risk and opportunities
  • involve the use of large datasets (comprised of both internal and external data), statistical techniques, quantitative analysis, machine learning and data mining techniques

Question 127

Question
predictive analytics
Answer
  • may employ machine learning algorithms, such as unsupervised learning, to extract previously unknown attributes
  • is considered to provide more value and require a more advanced skillset than both descriptive and diagnostic analytics
  • tools generally abstract underlying statistical intricacies by providing user-friendly front-end interfaces
  • enables a detailed view of the data of interest by focusing in on a data subset from the summarized view

Question 128

Question
prescriptive analytics
Answer
  • is the process of teaching computers to learn from existing data and apply the acquired knowledge to formulate predictions about unknown data
  • incorporate predictive and prescriptive data analytics and data transformation features
  • build upon the results of predictive analytics by prescribing actions that should be taken. The focus is on which prescribed options to follow, and why and when it should be followed, to gain an advantage or mitigate a risk
  • provide more value than any other type of analytics and correspondingly require the most advanced skillset, as well as specialized software and tools

Question 129

Question
prescriptive analytics
Answer
  • rely on BI and data warehouses as core components of big data environments and ecosystems
  • risks associated with collecting accurate and relevant data, and with integrating the big data environment itself, need to be identified and quantified
  • various outcomes are calculated, and the best course of action for each outcome is suggested
  • The approach shifts from explanatory to advisory and can include the simulation of various scenarios

Question 130

Question
prescriptive analytics
Answer
  • helps establish patterns and relationships among the data being analyzed
  • unstructured data
  • incorporate internal data (current and historical sales data, customer information, product data, business rules) and external data (social media data, weather data, demographic data)
  • involve the use of business rules and large amounts of internal and/or external data to simulate outcomes and prescribe the best course of action

Question 131

Question
machine learning
Answer
  • coupling a traditional data warehouse with these new technologies results in a hybrid data warehouse
  • various outcomes are calculated, and the best course of action for each outcome is suggested
  • is the process of teaching computers to learn from existing data and apply the acquired knowledge to formulate predictions about unknown data
  • This involves identifying patterns in the training data and classifying new or unseen data based on known patterns

Question 132

Question
machine learning types
Answer
  • even analyzing separate datasets that contain seemingly benign data can reveal private information when the datasets are analyzed jointly
  • scientific discoveries
  • supervised learning
  • unsupervised learning

Question 133

Question
supervised learning
Answer
  • distinct requirements, such as the combining of multiple unrelated datasets, processing of large amounts of unstructured data and harvesting of hidden information, in a time-sensitive manner
  • these analysis results cannot be generalized to an entire dataset due to the small sample size
  • algorithm is first fed sample data where the data categories are already known
  • based on the input data, the algorithm develops an understanding of which data belongs to which category

Question 134

Question
supervised learning
Answer
  • refers to the quality or fidelity of data
  • usually require collecting data from multiple sources and storing it in a structure that lends itself to performing drill-downs and roll-ups
  • the information is generated at periodic intervals in realtime or near realtime
  • having developed an understanding, the algorithm can then apply the learned behavior to categorize unknown data
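The train-then-classify flow in these answers can be sketched with a tiny nearest-average classifier (a toy stand-in for real supervised-learning algorithms; the sample values and category names are invented):

```python
def train(samples):
    # samples: (value, category) pairs where the categories are already known
    sums, counts = {}, {}
    for value, category in samples:
        sums[category] = sums.get(category, 0.0) + value
        counts[category] = counts.get(category, 0) + 1
    # the "understanding": an average value per known category
    return {c: sums[c] / counts[c] for c in sums}

def classify(model, value):
    # apply the learned behaviour to categorize unknown data
    return min(model, key=lambda c: abs(model[c] - value))

model = train([(1.0, "low"), (2.0, "low"), (10.0, "high"), (12.0, "high")])
label = classify(model, 9.0)  # unseen value, categorized by nearest average
```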

Question 135

Question
unsupervised learning
Answer
  • identification of new markets
  • try to predict the event outcome and predictions are made based on patterns, trends and exceptions found in historical and current data
  • data categories are unknown and no sample data is fed
  • Instead, the algorithm attempts to categorize data by grouping data with similar attributes together
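Grouping data by similar attributes without any labels can be sketched as simple one-dimensional gap-based clustering (a toy illustration with invented values; real unsupervised algorithms such as k-means are more involved):

```python
def cluster(values, gap=3.0):
    # No categories are given: values whose gap to the previous value is
    # small end up grouped together, so similar attributes cluster.
    clusters = []
    for v in sorted(values):
        if clusters and v - clusters[-1][-1] <= gap:
            clusters[-1].append(v)
        else:
            clusters.append([v])
    return clusters

groups = cluster([1.0, 2.0, 10.0, 11.0, 2.5])
```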

Question 136

Question
data mining
Answer
  • is directly related to the veracity characteristic
  • Online Transaction Processing (OLTP)
  • unearths hidden patterns and relationships based on previously unknown attributes of data
  • may employ machine learning algorithms, such as unsupervised learning, to extract previously unknown attributes

Question 137

Question
machine learning
Answer
  • This can lead to the identification of risk and opportunities
  • is not "intelligent" as such because it only provides answers to correctly formulated questions
  • makes predictions by categorizing data based on known patterns
  • can use the output from data mining (identified patterns) for further data classification through supervised learning

Question 138

Question
data mining
Answer
  • provide a holistic view of key business areas
  • Due to the volumes of data that some big data solutions are required to process, performance can sometimes become a concern
  • may employ machine learning algorithms, such as unsupervised learning, to extract previously unknown attributes
  • this is accomplished by categorizing data which leads to the identification of patterns

Question 139

Question
Big Data Solutions
Answer
  • is stored in a tabular form
  • aim to determine the cause of a phenomenon that occurred in the past, using questions that focus on the reason behind the event
  • rely on BI and data warehouses as core components of big data environments and ecosystems
  • has advanced BI and data warehouse technologies and practices to a point where a new generation of these platforms has emerged

Question 140

Question
Traditional BI
Answer
  • queries and statistical formulae can then be applied as part of various data analysis tasks for viewing data in a user-friendly format, such as on a dashboard
  • more detailed records
  • utilizes descriptive and diagnostic analysis to provide information on historical and current events
  • is not "intelligent" as such because it only provides answers to correctly formulated questions

Question 141

Question
Traditional BI
Answer
  • can also be fed back into OLTPs
  • is mostly machine-generated and automatically appended to the data
  • they also cannot be measured numerically or used for numerical comparisons
  • correctly formulating questions requires an understanding of business problems and issues, and of the data itself

Question 142

Question
BI reports on KPI
Answer
  • Sensor Data (RFID, Smart meters, GPS sensors)
  • tools generally abstract underlying statistical intricacies by providing user-friendly front-end interfaces
  • ad-hoc reports
  • dashboards

Question 143

Question
ad-hoc reporting
Answer
  • are commonly used for meaningful and complex reporting and assessment tasks and can also be fed back into applications to enhance their behavior (such as when product recommendations are displayed online)
  • Online Analytical Processing (OLAP)
  • is a process that involves manually processing data to produce custom-made reports
  • the focus is usually on a specific area of the business, such as its marketing or supply chain management.

Question 144

Question
ad-hoc reporting
Answer
  • Data acquired, such as via online customer registrations, usually contains less noise
  • policies for data privacy and data anonymization
  • makes the adoption of big data solutions accessible to businesses without large capital investments
  • the generated custom reports are detailed and often tabular in nature

Question 145

Question
OLAP and OLTP data sources
Answer
  • Instead, the algorithm attemps to categorize data by grouping data with similar attributes together
  • each iteration can then help fine-tune processing steps, algorithms and data models to improve the accuracy of the result and deliver greater value to the business
  • Big data solutions require tools that can seamlessly connect to structured, semi-structured and unstructured data sources and are further capable of handling millions of data records
  • can be used by BI tools for both ad-hoc reporting and dashboards

Question 146

Question
dashboards
Answer
  • analytics results are viewed via interactive visualization tools that enable users to identify trends and patterns
  • in-house hardware resources are inadequate
  • provide a holistic view of key business areas
  • the information is generated at periodic intervals in realtime or near realtime

Question 147

Question
dashboards
Answer
  • are not turn-key solutions
  • does often have special pre-processing and storage requirements, especially if the underlying format is not text-based
  • performing analytics on datasets can reveal confidential information about organizations or individuals
  • the presentation of data is graphical in nature, such as column charts, pie charts and gauges

Question 148

Question
OLAP and OLTP
Answer
  • The longer it takes for data to be turned into meaningful information, the less potential it may have for the business
  • datasets that need to be processed reside in a cloud
  • provide feedback in near-realtime via open and public mediums
  • BI tools use to display the information on dashboards

Question 149

Question
data warehouse and data marts
Answer
  • is carried out to answer questions about events that have already occurred
  • either exists in textual or binary form
  • can have multiple data marts
  • contain consolidated and validated information about enterprise-wide business entities
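The warehouse-to-mart relationship can be sketched in SQLite, where a view stands in for a department-specific data mart (the table, departments and figures are made up for illustration):

```python
import sqlite3

# The warehouse: enterprise-wide, consolidated data
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE warehouse (dept TEXT, metric TEXT, value REAL)")
conn.executemany("INSERT INTO warehouse VALUES (?, ?, ?)", [
    ("marketing", "leads", 40.0),
    ("marketing", "spend", 900.0),
    ("finance", "revenue", 5000.0),
])

# The "mart": a segregated slice for one line of business, ready for reporting
conn.execute(
    "CREATE VIEW marketing_mart AS "
    "SELECT metric, value FROM warehouse WHERE dept = 'marketing'"
)
rows = conn.execute("SELECT metric, value FROM marketing_mart ORDER BY metric").fetchall()
```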

Question 150

Question
Traditional BI
Answer
  • policies that regulate the kind of external data that can be acquired
  • does often have special pre-processing and storage requirements, especially if the underlying format is not text-based
  • cannot function effectively without data marts because they contain the optimized and segregated data required for reporting purposes
  • without data marts, data needs to be extracted from the data warehouse via an ETL process on an ad-hoc basis whenever a query needs to be run

Question 151

Question
Traditional BI
Answer
  • can be used as an ETL engine, or as an analytics engine for processing large amounts of structured, semi-structured and unstructured data
  • accumulates from being amassed within the enterprise (via applications) or from external sources that are then stored by the big data solution
  • Near-realtime data processing can be achieved by processing transactional data as it arrives and combining it with already summarized batch-processed data
  • uses data warehouses and data marts for reporting and data analysis, because they allow complex data analysis queries with multiple joins and aggregations to be issued

Question 152

Question
Big Data BI
Answer
  • each feedback cycle may reveal the need for existing steps to be modified, or new steps, such as pre-processing for data cleansing, to be added
  • policies for data archiving data sources and analysis results
  • builds upon BI by acting on the cleansed, consolidated enterprise-wide data in the data warehouse and combining it with semi-structured and unstructured data sources
  • comprises both predictive and prescriptive analysis to facilitate the development of an enterprise-wide understanding of the way a business works

Question 153

Question
Big Data BI
Answer
  • The broadening coverage of the internet and the proliferation of cellular and Wi-Fi networks have enabled more people to be continuously active in virtual communities
  • they also cannot be measured numerically or used for numerical comparisons
  • sound processes and sufficient skillsets for those who will be responsible for implementing, customizing, populating and using big data solutions are also necessary
  • analyses focus on multiple business processes simultaneously

Question 154

Question
Traditional BI
Answer
  • analyses generally focus on individual business processes
  • Depends on the type of data present
  • as big data initiatives are inherently business-driven, there needs to be a clear business case for adopting a big data solution to ensure that it is justified and that expectations are met
  • refers to the information about the source of the data that helps determine its authenticity and quality. It is also used for auditing purposes

Question 155

Question
Big Data BI
Answer
  • it is important to accept that big data solutions are not necessary for all businesses
  • businesses are also increasingly interested in incorporating publicly available datasets from social media and other external data sources
  • This helps reveal patterns and anomalies across a broader scope within the enterprise
  • It also leads to data discovery by identifying insights and information that may have been previously absent or unknown

Question 156

Question
Big Data BI
Answer
  • distinct requirements, such as the combining of multiple unrelated datasets, processing of large amounts of unstructured data and harvesting of hidden information, in a time-sensitive manner
  • generally involves sifting through large amounts of raw, unstructured data to extract meaningful information that can serve as an input for identifying patterns, enriching existing enterprise data, or performing large-scale searches
  • requires the analysis of unstructured, semi-structured and structured data residing in the enterprise data warehouse
  • requires a "next-generation" data warehouse that uses new features and technologies to store cleansed data originating from a variety of sources in a single uniform data format

Question 157

Question
Big Data BI
Answer
  • has advanced BI and data warehouse technologies and practices to a point where a new generation of these platforms has emerged
  • Volume, Velocity, Variety, Veracity & Value
  • coupling a traditional data warehouse with these new technologies results in a hybrid data warehouse
  • this type of data warehouse acts as a uniform and central repository of structured, semi-structured and unstructured data that can provide tools with all of the data they require

Question 158

Question
Big Data BI
Answer
  • Arround 80% of analytics are ________ in nature
  • is directly related to the veracity characteristic
  • this eliminates the need for tools to have to connect to multiple data sources to retrieve or access data
  • A next-generation data warehouse establishes a standardized data access layer across a range of data sources

Question 159

Question
Data Visualization
Answer
  • conforms to a data model or schema
  • is based on a quantifiable indicator that is identified and agreed upon beforehand
  • is a technique whereby analytical results are graphically communicated using elements like charts, maps, data grids, infographics and alerts
  • Graphically representing data can make it easier to understand reports, view trends and identify patterns

Question 160

Question
Traditional Data Visualization
Answer
  • contain consolidated and validated information about enterprise-wide business entities
  • the nature of the business may make external data very valuable. The greater the volume and variety of data, the higher the chances of finding hidden insights from patterns
  • provided mostly static charts and graphs in reports and dashboards
  • query data from relational databases, OLAP systems, data warehouses and spreadsheets to present both descriptive and diagnostic analytics results

Question 161

Question
contemporary data visualization
Answer
  • unearths hidden patterns and relationships based on previously unknown attributes of data
  • can be human-generated or machine-generated, although it is ultimately the responsibility of machines to generate the processing results
  • can be used by enterprise applications directly, or fed into a data warehouse to enrich existing data. This data is typically analyzed and subjected to analytics
  • are interactive and can provide both summarized and detailed views of data

Question 162

Question
Data Visualization
Answer
  • analyses focus on multiple business processes simultaneously
  • semi-structured data
  • they are designed to help people who lack statistical and/or mathematical skills to better understand analytical results, without having to resort to spreadsheets
  • Big data solutions require tools that can seamlessly connect to structured, semi-structured and unstructured data sources and are further capable of handling millions of data records

Question 163

Question
Data Visualization
Answer
  • has advanced BI and data warehouse technologies and practices to a point where a new generation of these platforms has emerged
  • policies for archiving data sources and analysis results
  • generally use in-memory analytical technologies that reduce the latency normally attributed to traditional, disk-based tools
  • Big data solutions require tools that can seamlessly connect to structured, semi-structured and unstructured data sources and are further capable of handling millions of data records

Question 164

Question
Data Visualization Features
Answer
  • does not generally have any special pre-processing or storage requirements. Examples include banking transactions, OLTP system records and customer records
  • each technology is uniquely relevant to modern-day Big Data Solutions and ecosystems
  • Aggregation
  • Drill-Down

Question 165

Question
Data Visualization Features
Answer
  • also known as data discovery, is a specialized form of data analysis that targets large datasets
  • this type of data warehouse acts as a uniform and central repository of structured, semi-structured and unstructured data that can provide tools with all of the data they require
  • Filtering
  • Roll-Up

Question 166

Question
Data Visualization Features
Answer
  • are closely linked with an enterprise's strategic objectives
  • Filtering
  • used to achieve regulatory compliance
  • What-if Analysis

Question 167

Question
Aggregation
Answer
  • in-house hardware resources are inadequate
  • distinct requirements, such as the combining of multiple unrelated datasets, processing of large amounts of unstructured data and harvesting of hidden information, in a time-sensitive manner
  • involves extracting hidden or unknown patterns in the data with the intention of identifying previously unknown patterns
  • provides a holistic and summarized view of data across multiple contexts
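As a minimal sketch of the aggregation feature described above (the sales records, region and product names are illustrative assumptions, not part of the exam material):

```python
from collections import defaultdict

# Illustrative raw records: (region, product, amount) -- assumed layout
sales = [
    ("north", "widget", 120),
    ("north", "gadget", 80),
    ("south", "widget", 200),
    ("south", "gadget", 50),
]

# Summarize the same data across two contexts: per region and per product
by_region = defaultdict(int)
by_product = defaultdict(int)
for region, product, amount in sales:
    by_region[region] += amount
    by_product[product] += amount

print(dict(by_region))   # {'north': 200, 'south': 250}
print(dict(by_product))  # {'widget': 320, 'gadget': 130}
```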

Question 168

Question
Drill-Down
Answer
  • Big data solutions access data and generate data, all of which become assets of the business
  • forms the basis for predictive analytics and business intelligence (BI)
  • since the sample size is large, the results can be applied in a generalized manner to the entire dataset
  • enables a detailed view of the data of interest by focusing in on a data subset from the summarized view

Question 169

Question
Filtering
Answer
  • Value and complexity increase as we move from descriptive to prescriptive analytics
  • provides a holistic and summarized view of data across multiple contexts
  • is a data analysis technique that focuses on describing various data qualities using words
  • helps focus on a particular set of data by filtering away the data that is not of immediate interest
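The filtering feature amounts to discarding rows outside the analyst's immediate interest; a minimal sketch (the order records and the country field are illustrative assumptions):

```python
# Illustrative records: (customer, country, amount) -- assumed layout
orders = [
    ("ana", "CO", 35),
    ("bob", "US", 120),
    ("eva", "CO", 80),
]

# Keep only the data of immediate interest; filter the rest away
co_orders = [o for o in orders if o[1] == "CO"]
print(co_orders)  # [('ana', 'CO', 35), ('eva', 'CO', 80)]
```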

Question 170

Question
Roll-Up
Answer
  • qualitative analysis
  • structured data
  • queries can take several minutes or even longer, depending on the complexity of the query and the number of records queried
  • groups data across multiple categories to show subtotals and totals
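Roll-up, i.e. grouping across categories to show subtotals and a grand total, can be sketched as follows (the line items and category names are illustrative assumptions):

```python
from collections import defaultdict

# Illustrative line items: (category, subcategory, amount) -- assumed layout
items = [
    ("hardware", "cpu", 300),
    ("hardware", "ram", 150),
    ("software", "os", 90),
]

# Roll up to per-category subtotals, then to a grand total
subtotals = defaultdict(int)
for category, _subcategory, amount in items:
    subtotals[category] += amount
grand_total = sum(subtotals.values())

print(dict(subtotals))  # {'hardware': 450, 'software': 90}
print(grand_total)      # 540
```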

Question 171

Question
What-if Analysis
Answer
  • addressing concerns can require the annotation of data with source information and other metadata, when it is generated or as it arrives
  • scientific discoveries
  • also, the quality of the data targeted for processing by big data solutions needs to be assessed
  • enables multiple outcomes to be visualized by enabling related factors to be dynamically changed
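What-if analysis boils down to re-evaluating a model while dynamically changing related factors; a minimal sketch (the revenue formula and all numbers are illustrative assumptions):

```python
# Hypothetical revenue model for scenario comparison -- not a real formula
def projected_revenue(units, price, discount=0.0):
    return units * price * (1 - discount)

baseline = projected_revenue(1000, 5.0)                 # current plan
scenario = projected_revenue(1000, 5.0, discount=0.25)  # what if we discount 25%?
print(baseline, scenario)  # 5000.0 3750.0
```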

Question 172

Question
advance visualization tools
Answer
  • is stored in a tabular form
  • These techniques may not provide accurate findings in a timely manner because of the data's volume, velocity and/or variety
  • incorporate predictive and prescriptive data analytics and data transformation features
  • these tools eliminate the need for data pre-processing methods (such as ETL) and provide the ability to directly connect to structured, semi-structured and unstructured data sources

Question 173

Question
advance visualization tools
Answer
  • based on the input data, the algorithm develops an understanding of which data belongs to which category
  • can join structured and unstructured data that is kept in memory for fast data access
  • queries and statistical formulae can then be applied as part of various data analysis tasks for viewing data in a user-friendly format, such as on a dashboard
  • correctly formulating questions requires an understanding of business problems and issues, and of the data itself

Question 174

Question
business justification
Answer
  • this eliminates the need for tools to have to connect to multiple data sources to retrieve or access data
  • It also leads to data discovery by identifying insights and information that may have been previously absent or unknown
  • as big data initiatives are inherently business-driven, there needs to be a clear business case for adopting a big data solution to ensure that it is justified and that expectations are met
  • clear goals regarding the measurable business value of an enterprise's big data solution need to be set

Question 175

Question
business justification
Answer
  • algorithm is first fed sample data where the data categories are already known
  • Leads to an opportunity to collect further "secondary" data, such as when individuals carry out searches or complete surveys
  • anticipated benefits need to be weighed against risk and investments
  • risks associated with collecting accurate and relevant data, and with integrating the big data environment itself, need to be identified and quantified

Question 176

Question
business justification
Answer
  • refers to the quality or fidelity of data
  • a substantial budget may still be required to obtain external data
  • distinct requirements, such as the combining of multiple unrelated datasets, processing of large amounts of unstructured data and harvesting of hidden information, in a time-sensitive manner
  • it is important to accept that big data solutions are not necessary for all businesses

Question 177

Question
big data frameworks
Answer
  • based on the input data, the algorithm develops an understanding of which data belongs to which category
  • provides a holistic and summarized view of data across multiple contexts
  • are interactive and can provide both summarized and detailed views of data
  • are not turn-key solutions

Question 178

Question
organizational prerequisites
Answer
  • prescriptive analytics
  • enables multiple outcomes to be visualized by enabling related factors to be dynamically changed
  • in order for data analysis and analytics to be successful and offer value, enterprises need to have data management and big data governance frameworks
  • sound processes and sufficient skillsets for those who will be responsible for implementing, customizing, populating and using big data solutions are also necessary

Question 179

Question
organizational prerequisites
Answer
  • is mostly machine-generated and automatically appended to the data
  • Big data solutions require tools that can seamlessly connect to structured, semi-structured and unstructured data sources and are further capable of handling millions of data records
  • also, the quality of the data targeted for processing by big data solutions needs to be assessed
  • outdated, invalid or poorly identified data will result in low-quality input which, regardless of how good the big data solution is, will continue to produce low-quality output

Question 180

Question
organizational prerequisites
Answer
  • refers to automated, software-based techniques that sift through massive datasets to identify patterns and trends
  • makes predictions by categorizing data based on known patterns
  • the longevity of the big data environment also needs to be planned for
  • a roadmap needs to be defined to ensure that any necessary expansion or augmentation of the environment is planned out to stay in sync with the requirements of the enterprise

Question 181

Question
data procurement
Answer
  • can be important to businesses. Mining this data may allow for customized marketing, automated recommendations and the development of optimized product features
  • Hadoop
  • the acquisition of big data solutions themselves can be economical, due to open-source platform availability and opportunities to leverage commodity hardware
  • a substantial budget may still be required to obtain external data

Question 182

Question
data procurement
Answer
  • build upon the results of predictive analytics by prescribing actions that should be taken. The focus is on which prescribed options to follow, and why and when they should be followed, to gain an advantage or mitigate a risk
  • they are designed to help people who lack statistical and/or mathematical skills to better understand analytical results, without having to resort to spreadsheets
  • the nature of the business may make external data very valuable. The greater the volume and variety of data, the higher the chances of finding hidden insights from patterns
  • external data sources include data markets and the government. Government-provided data, like geo-spatial data may be free

Question 183

Question
data procurement
Answer
  • predictive analytics
  • can process massive quantities of data that arrive at varying speeds, may be of many different varieties and have numerous incompatibilities
  • Value and complexity increase as we move from descriptive to prescriptive analytics
  • most commercially relevant data will need to be purchased. Such an investment may be on-going in order to obtain updated versions of the datasets

Question 184

Question
privacy
Answer
  • store operational data that is fully normalized
  • Coping with the fast inflow of data requires the enterprise to design highly elastic and available processing solutions and corresponding data storage capabilities
  • performing analytics on datasets can reveal confidential information about organizations or individuals
  • even analyzing separate datasets that contain seemingly benign data can reveal private information when the datasets are analyzed jointly

Question 185

Question
privacy
Answer
  • predictive analytics
  • descriptive analytics
  • this can lead to intentional or inadvertent breaches of privacy
  • addressing these privacy concerns requires an understanding of the nature of data being accumulated and relevant data privacy regulations, as well as special techniques for data tagging and anonymization
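One of the anonymization techniques mentioned above can be sketched as salted-hash pseudonymization (the salt, field names and truncation length are assumptions of this sketch; real anonymization must also account for quasi-identifiers and applicable regulations):

```python
import hashlib

SALT = b"example-salt"  # assumption: in practice this must be kept secret

def pseudonymize(identifier: str) -> str:
    # Replace a direct identifier with a truncated salted hash
    return hashlib.sha256(SALT + identifier.encode()).hexdigest()[:12]

record = {"customer_id": "c-1001", "purchase": "book"}
record["customer_id"] = pseudonymize(record["customer_id"])
print(record)  # customer_id is now an opaque 12-character token
```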

Question 186

Question
privacy
Answer
  • big data security further involves establishing data access levels for different categories of users
  • The maturity of these fields of practice inspired and enabled much of the core functionality expected from contemporary Big Data solutions and tools
  • some of the components of big data solutions lack the robustness of traditional enterprise solution environments when it comes to access control and data security
  • securing big data involves ensuring that data networks provide access to repositories that are sufficiently secured, via custom authentication and authorization mechanisms

Question 187

Question
provenance
Answer
  • provide a holistic view of key business areas
  • without data marts, data needs to be extracted from the data warehouse via an ETL process on an ad-hoc basis whenever a query needs to be run
  • refers to the information about the source of the data that helps determine its authenticity and quality. It also used for auditing purposes
  • maintaining provenance as large volumes of data are acquired, combined and put through multiple processing stages can be a complex task

Question 188

Question
provenance
Answer
  • provide a holistic view of key business areas
  • store historical data that is aggregated and denormalized to support fast reporting capability
  • addressing concerns can require the annotation of data with source information and other metadata, when it is generated or as it arrives
  • data may also need to be annotated with the source dataset attributes and processing step details as it passes through the data transformation steps
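The annotation approach described above can be sketched as wrapping each record with source metadata and appending a step entry at every transformation (the field names and the conversion step are assumptions of this sketch):

```python
from datetime import datetime, timezone

def annotate(record, source):
    # Attach source information and an empty processing history on arrival
    return {"data": record, "provenance": {"source": source, "steps": []}}

def apply_step(wrapped, step_name, fn):
    # Transform the data and record the step for later auditing
    wrapped["data"] = fn(wrapped["data"])
    wrapped["provenance"]["steps"].append(
        {"step": step_name, "at": datetime.now(timezone.utc).isoformat()}
    )
    return wrapped

rec = annotate({"temp_f": 98.6}, source="sensor-feed-A")
rec = apply_step(rec, "fahrenheit_to_celsius",
                 lambda d: {"temp_c": round((d["temp_f"] - 32) * 5 / 9, 1)})
print(rec["data"])                  # {'temp_c': 37.0}
print(rec["provenance"]["source"])  # sensor-feed-A
```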

Question 189

Question
Limited Realtime Support
Answer
  • is stored in a tabular form
  • performing analytics on datasets can reveal confidential information about organizations or individuals
  • Dashboards and other applications that require streaming data and alerts often demand realtime or near-realtime data transmissions
  • Many contemporary open-source big data solutions and tools are batch-oriented, meaning support for streaming data analysis may either be limited or non-existent

Question 190

Question
Limited Realtime Support
Answer
  • algorithm is first fed sample data where the data categories are already known
  • they are designed to help people who lack statistical and/or mathematical skills to better understand analytical results, without having to resort to spreadsheets
  • some realtime data analysis solutions that do exist are proprietary
  • Near-realtime data processing can be achieved by processing transactional data as it arrives and combining it with already summarized batch-processed data
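The batch-plus-streaming combination described above can be sketched as serving a merged view of a pre-summarized batch result and transactions as they arrive (the metric name and counts are assumptions of this sketch):

```python
# View produced by an earlier batch job -- assumed numbers
batch_view = {"page_views": 10_000}

def serve(batch, live_events):
    # Merge the summarized batch view with transactional data as it arrives
    merged = dict(batch)
    for event, count in live_events:
        merged[event] = merged.get(event, 0) + count
    return merged

live = [("page_views", 42), ("page_views", 17)]  # arriving transactions
print(serve(batch_view, live))  # {'page_views': 10059}
```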

Question 191

Question
Distinct performance challenges
Answer
  • refers to automated, software-based techniques that sift through massive datasets to identify patterns and trends
  • anticipated benefits need to be weighed against risk and investments
  • queries can take several minutes or even longer, depending on the complexity of the query and the number of records queried
  • Due to the volumes of data that some big data solutions are required to process, performance can sometimes become a concern

Question 192

Question
Distinct governance requirements
Answer
  • having developed an understanding, the algorithm can then apply the learned behavior to categorize unknown data
  • are considered to provide more value than descriptive analysis, requiring a more advanced skillset
  • the relational data is stored as denormalized data in the form of cubes; this allows the data to be queried during any data analysis tasks that are performed later
  • Big data solutions access data and generate data, all of which become assets of the business

Question 193

Question
governance framework
Answer
  • can use the output from data mining (identified patterns) for further data classification through supervised learning
  • businesses are storing increasing amounts of data on customer interaction and from social media avenues in an attempt to harvest this data to increase sales, enable targeted marketing and create new products and services
  • analyses focus on multiple business processes simultaneously
  • is required to ensure that the data and the solution environment itself are regulated, standardized and evolved in a controlled manner

Question 194

Question
what a big data governance framework would encompass
Answer
  • does not conform to a data model or data schema
  • big data solutions particularly rely on it when processing semi-structured and unstructured data
  • standardizing how data is tagged and the metadata used for tagging
  • policies that regulate the kind of external data that can be adquired

Question 195

Question
what a big data governance framework would encompass
Answer
  • policies for data cleansing and filtering
  • can also be fed back into OLTPs
  • policies for data privacy and data anonymization
  • policies for archiving data sources and analysis results

Question 196

Question
Distinct methodology
Answer
  • upfront capital investment is not available
  • simple insert, delete and update operations with sub-second response times
  • will be required to control how data flows in and out of big data solutions and how feedback loops can be established to enable the processed data to undergo repeated refinements
  • each feedback cycle may reveal the need for existing steps to be modified, or new steps, such as pre-processing for data cleansing, to be added

Question 197

Question
Distinct methodology
Answer
  • the focus is usually on a specific area of the business, such as its marketing or supply chain management.
  • A Not-only SQL (NoSQL) database is a non-relational database that can be used to store it
  • Extract Transform Load (ETL)
  • each iteration can then help fine-tune processing steps, algorithms and data models to improve the accuracy of the result and deliver greater value to the business
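A minimal ETL sketch showing the kind of cleansing step a feedback cycle might add (the source rows and the cleansing rule are illustrative assumptions):

```python
def extract():
    # Raw rows from an assumed source; one row has a missing age
    return [" Alice ,30", "Bob,", "Carol,41"]

def transform(rows):
    cleaned = []
    for row in rows:
        name, _, age = row.partition(",")
        if not age.strip():  # cleansing rule added after a feedback cycle
            continue
        cleaned.append({"name": name.strip(), "age": int(age)})
    return cleaned

def load(records, target):
    target.extend(records)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)  # [{'name': 'Alice', 'age': 30}, {'name': 'Carol', 'age': 41}]
```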

Question 198

Question
Cloud Computing
Answer
  • generally makes up 80% of the data within an enterprise, and has a faster growth rate than structured data
  • A next-generation data warehouse establishes a standardized data access layer across a range of data sources
  • introduces remote environments that can host IT infrastructure for, among other things, large-scale storage and processing
  • the adoption of a big data environment may necessitate that some or all of that environment be hosted within a cloud

Question 199

Question
Cloud Computing
Answer
  • Collections or groups of related data (Ex. Tweets stored in a flat file, collection of image files, extract of rows stored in a table, historical weather observations that are stored as XML Files)
  • as big data initiatives are inherently business-driven, there needs to be a clear business case for adopting a big data solution to ensure that it is justified and that expectations are met
  • upfront capital investment is not available
  • the project is to be isolated from the rest of the business so that existing business processes are not impacted

Question 200

Question
Cloud Computing
Answer
  • the limits of available computing and storage resources used by an in-house Big Data solution are being reached
  • is typically stored in relational databases and frequently generated by custom enterprise applications, ERP systems and CRM systems
  • the big data initiative is a proof of concept
  • datasets that need to be processed reside in a cloud
