3.5.2 Module 3 Quiz – Preparing and Cleaning Data for Analysis Exam Answers Full 100% | Data Analytics Essentials 2023 2024

These are the answers for the 3.5.2 Module 3 Quiz – Preparing and Cleaning Data for Analysis, full 100% and valid for 2023 and 2024. This is the Module 3 quiz in the Cisco NetAcad SkillsForAll Data Analytics Essentials course. Our experts have verified all the answers and added explanations to help you score 100%.

  1. What term is used for the process of conditioning data into a form that can be analyzed?

    • transforming
    • extracting
    • loading
    • processing

      Explanation & Hint:

      Three processes must occur before data can be analyzed:

      • Extracting – the process of gathering data
      • Transforming – the process of conditioning data into a usable form
      • Loading – the process of transferring data to a database
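
      A minimal sketch of these three steps, assuming pandas, SQLite, and a hypothetical sales.csv file:

      ```python
      import sqlite3
      import pandas as pd

      # Extract: gather the raw data (here, from a hypothetical CSV file)
      raw = pd.read_csv("sales.csv")

      # Transform: condition the data into a usable form
      clean = raw.drop_duplicates().dropna()

      # Load: transfer the prepared data into a database
      with sqlite3.connect("analysis.db") as conn:
          clean.to_sql("sales", conn, if_exists="replace", index=False)
      ```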
  2. What task is performed as part of the loading process in data preparation?

    • transferring the data into a database
    • providing values for missing data
    • collecting and recording the data
    • locating and removing duplicate data

      Explanation & Hint:

      Data preparation consists of three processes:

      • Extracting – the process of gathering data
      • Transforming – the process of conditioning data into a usable form
      • Loading – the process of transferring data to a database
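
      As a minimal illustration of the loading step, assuming SQLite and a hypothetical grades table:

      ```python
      import sqlite3

      # Prepared records ready to be loaded (hypothetical student grades)
      records = [("S001", "CS101", 92), ("S002", "CS101", 85)]

      with sqlite3.connect("analysis.db") as conn:
          conn.execute(
              "CREATE TABLE IF NOT EXISTS grades (student_id TEXT, course TEXT, grade INTEGER)"
          )
          # Load: transfer the prepared rows into the database table
          conn.executemany("INSERT INTO grades VALUES (?, ?, ?)", records)
      ```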
  3. Which two tasks are part of the transforming data process? (Choose two.)

    • collecting data required to perform the analysis
    • reviewing the data and addressing erroneous or missing values
    • eliminating the data that is not relevant to the analysis question to be answered
    • creating visual representations of the data
    • presenting the knowledge gained from the data

      Explanation & Hint:

      Transforming data is the process of conditioning data into a usable form, which includes tasks such as sorting data and removing duplicate data. Collecting data is part of the extracting process. Creating visual representations of the data and presenting the knowledge gained from it are final steps in data analysis, not part of data preparation.
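
      A minimal pandas sketch of these transform tasks, using a hypothetical survey dataset:

      ```python
      import pandas as pd

      # Hypothetical raw data with a duplicate row and a missing value
      df = pd.DataFrame({"respondent": [1, 1, 2, 3],
                         "score": [4.0, 4.0, None, 5.0]})

      # Transforming: remove duplicates, address the missing value, and sort
      df = df.drop_duplicates()
      df["score"] = df["score"].fillna(df["score"].mean())
      df = df.sort_values("respondent")
      ```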

  4. In a data analysis project, which process gathers all grades for students enrolled in a college course?

    • action
    • extract
    • load
    • transform

      Explanation & Hint:

      There are three processes in data preparation:

      • Extract – The process of gathering the data to be analyzed.
      • Transform – The process of conditioning data into a usable form.
      • Load – The process of transferring data to a database.

      Action is the application of knowledge, but it is not one of the processes used in data preparation.
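
      A minimal sketch of the extract step for this scenario, assuming the grades are exported as hypothetical per-section CSV files:

      ```python
      import pandas as pd

      # Extract: gather all grade records for the course from several sources
      sources = ["section_a_grades.csv", "section_b_grades.csv"]
      grades = pd.concat([pd.read_csv(path) for path in sources], ignore_index=True)
      ```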

  5. What is a characteristic of open data?

    • data that lacks intellectual property restrictions
    • data that lacks predefined organization
    • data that does not need to be stored
    • data that does not generate new knowledge

      Explanation & Hint:

      Open data is not protected by intellectual property restrictions and can be used and redistributed without legal, technical, or social restrictions.

  6. An analyst is reviewing data gathered from various sources and modifying incompatible metrics into a usable form. In which ETL step would this task be completed?

    • action
    • extract
    • load
    • transform

      Explanation & Hint:

      There are three processes in data preparation:

      • Extract – the process of gathering the data to be analyzed
      • Transform – the process of aggregating data and modifying it into a usable form
      • Load – the process of transferring data to a database
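
      For example, modifying incompatible metrics might mean standardizing units. A minimal sketch, assuming one hypothetical source reports temperatures in Fahrenheit and another in Celsius:

      ```python
      import pandas as pd

      # Hypothetical readings from two sources that use incompatible units
      fahrenheit = pd.DataFrame({"sensor": ["A"], "temp_f": [98.6]})
      celsius = pd.DataFrame({"sensor": ["B"], "temp_c": [21.0]})

      # Transform: convert everything to one metric (Celsius) before loading
      fahrenheit["temp_c"] = (fahrenheit["temp_f"] - 32) * 5 / 9
      combined = pd.concat([fahrenheit[["sensor", "temp_c"]], celsius],
                           ignore_index=True)
      ```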
  7. What are web scraping tools used for in big data analytics?

    • to automatically extract data from HTML pages
    • to provide standardized interfaces for data collection
    • to store real-time data in its original format
    • to create a historical record of everything and anything that happens within a system, including events such as transactions, errors, or intrusions

      Explanation & Hint:

      Web scraping tools are used to automatically extract data from HTML pages. Typically, web scraping is an automated process which uses a bot or web crawler to find and obtain data. Specific data is gathered and copied from the web to a database or spreadsheet.
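
      A minimal scraping sketch using the requests and BeautifulSoup libraries (the URL and the CSS selector are hypothetical):

      ```python
      import requests
      from bs4 import BeautifulSoup

      # Fetch a page and extract data from its HTML (hypothetical URL and markup)
      response = requests.get("https://example.com/products")
      soup = BeautifulSoup(response.text, "html.parser")

      # Copy each product name into a list for later storage in a database
      names = [tag.get_text(strip=True) for tag in soup.select("h2.product-name")]
      ```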

  8. Which universal file format is organized in rows and has the columns separated by a delimiter character?

    • HTML
    • CSV
    • JSON
    • XML

      Explanation & Hint:

      Comma-separated values (CSV) files are a plaintext file format in which a row contains a complete record, and a delimiter character separates the individual columns. The other file formats use markup language to define the data.
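
      A minimal sketch of reading such a file with Python's built-in csv module (the file name and column names are hypothetical):

      ```python
      import csv

      # Each line of the file is one record; the comma delimiter separates columns
      with open("grades.csv", newline="") as f:
          reader = csv.reader(f, delimiter=",")
          header = next(reader)  # e.g. ["student_id", "course", "grade"]
          rows = list(reader)    # each row is a list of column values
      ```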

  9. Which two methods are used to collect unstructured data for analysis? (Choose two.)

    • querying a relational database
    • using a web-scraping tool
    • copying a spreadsheet
    • obtaining data through an API
    • downloading a CSV file

      Explanation & Hint:

      Unstructured data is not organized in any predefined manner. It does not easily fit into relational databases, CSV files, or spreadsheets. Methods to collect unstructured data include retrieving it from NoSQL databases or data lakes, web-scraping content from the internet, and using application program interfaces (APIs).
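
      A minimal sketch of collecting data through an API with the requests library (the endpoint and parameters are hypothetical):

      ```python
      import requests

      # Request data from a hypothetical API endpoint; many APIs return JSON
      response = requests.get("https://api.example.com/v1/posts",
                              params={"topic": "analytics", "limit": 50})
      response.raise_for_status()
      posts = response.json()  # parsed into Python lists and dictionaries
      ```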

  10. What are two important considerations when selecting data for an analysis project? (Choose two.)

    • the data is relevant to the original business question
    • the data must be free and unlicensed
    • the data is current and updated frequently
    • the data can be downloaded from the internet
    • the data is not being streamed live

      Explanation & Hint:

      When choosing data for analysis, important considerations include ensuring that the data is relevant to the original business question and that the data is current.
