CCSP : Certified Cloud Security Professional (CCSP) : Part 11

  1. Your boss has tasked your team with getting your legacy systems and applications connected with new cloud-based services that management has decided are crucial to customer service and offerings.

    Which role would you be assuming under this directive?

    • Cloud service administrator
    • Cloud service user
    • Cloud service integrator
    • Cloud service business manager

    Explanation: 
    The cloud service integrator role is responsible for connecting and integrating existing services and applications with cloud-based services. A cloud service administrator is responsible for testing, monitoring, and securing cloud services, as well as providing usage reporting and dealing with service problems. The cloud service user is someone who consumes cloud services. The cloud service business manager is responsible for overseeing the billing, auditing, and purchasing of cloud services.

  2. One of the main components of system audits is the ability to track changes over time and to match these changes with continued compliance and internal processes.

    Which aspect of cloud computing makes this particular component more challenging than in a traditional data center?

    • Portability
    • Virtualization
    • Elasticity
    • Resource pooling
    Explanation: 
    Cloud services make exclusive use of virtualization, and systems change over time, including the addition, subtraction, and reimaging of virtual machines. It is extremely unlikely that the exact same virtual machines and images used in a previous audit would still be in use or even available for a later audit, making the tracking of changes over time extremely difficult, or even impossible. Elasticity refers to the ability to add and remove resources from a system or service to meet current demand, and although it is a factor in making the tracking of virtual machines very difficult over time, it is not the best answer in this case. Resource pooling pertains to a cloud environment sharing a large amount of resources between different customers and services. Portability refers to the ability to move systems or services easily between different cloud providers.
  3. In the wake of many scandals with major corporations involving fraud and the deception of investors and regulators, which of the following laws was passed to govern accounting and financial records and disclosures?

    • GLBA
    • Safe Harbor
    • HIPAA
    • SOX
    Explanation: 
    The Sarbanes-Oxley Act (SOX) regulates the financial and accounting practices used by organizations in order to protect shareholders from improper practices and accounting errors. The Health Insurance Portability and Accountability Act (HIPAA) pertains to the protection of patient medical records and privacy. The Gramm-Leach-Bliley Act (GLBA) focuses on the use of PII within financial institutions. The Safe Harbor program was designed by the US government as a way for American companies to comply with European Union privacy laws.
  4. Which one of the following threat types to applications and services involves the sending of requests that are invalid and manipulated through a user’s client to execute commands on the application under the user’s own credentials?

    • Injection
    • Missing function-level access control
    • Cross-site scripting
    • Cross-site request forgery
    Explanation: 
    A cross-site request forgery (CSRF) attack forces a client that a user has already used to authenticate to an application to send forged requests under the user’s own credentials, so that the application executes commands it believes are coming from a trusted client and user. Although this type of attack cannot be used to steal data directly, because the attacker has no way of seeing the results of the commands, it does open other avenues for compromising an application. Missing function-level access control occurs when an application checks for authorization only during the initial login process and does not validate again with each function call. Cross-site scripting occurs when an attacker is able to send untrusted data to a user’s browser without it going through validation processes. An injection attack occurs when a malicious actor sends commands or other arbitrary data through input and data fields with the intent of having the application or system execute it as part of its normal processing and queries.
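
    To illustrate why a forged cross-site request fails when the standard synchronizer-token defense is in place, here is a minimal Python sketch using only the standard library. The session store and function names are hypothetical, not part of any particular framework.

    ```python
    import secrets
    import hmac

    # Illustrative server-side session store; a real application would use
    # its framework's session mechanism instead.
    _sessions = {}

    def issue_csrf_token(session_id: str) -> str:
        """Generate a per-session token and embed it in each rendered form."""
        token = secrets.token_urlsafe(32)
        _sessions[session_id] = token
        return token

    def is_request_authentic(session_id: str, submitted_token: str) -> bool:
        """Reject state-changing requests whose token does not match.

        A forged cross-site request carries the victim's cookies
        automatically, but the attacker cannot read the token embedded in
        the page, so this comparison fails for a CSRF attempt.
        """
        expected = _sessions.get(session_id, "")
        return hmac.compare_digest(expected, submitted_token)

    # Usage sketch
    sid = "abc123"
    form_token = issue_csrf_token(sid)
    assert is_request_authentic(sid, form_token)    # legitimate form post
    assert not is_request_authentic(sid, "forged")  # CSRF attempt is rejected
    ```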
  5. Which cloud service category would be most ideal for a cloud customer that is developing software to test its applications among multiple hosting providers to determine the best option for its needs?

    • DaaS
    • PaaS
    • IaaS
    • SaaS
    Explanation: 
    Platform as a Service would allow software developers to quickly and easily deploy their applications among different hosting providers for testing and validation in order to determine the best option, because PaaS provides a ready-to-use environment from the outset. IaaS would not be appropriate in this scenario because it would require too much configuration of application servers and libraries before code could be tested, and the developers would also have to deploy and maintain the operating system images or contract with another firm to do so. DaaS would not be appropriate in any way for software developers to use to deploy applications. SaaS, being a fully functional software platform, would not be appropriate for deploying applications into.
  6. You just hired an outside developer to modernize some applications with new web services and functionality. In order to implement a comprehensive test platform for validation, the developer needs a data set that resembles a production data set in both size and composition.

    In order to accomplish this, what type of masking would you use?

    • Development
    • Replicated
    • Static
    • Dynamic
    Explanation: 
    Static masking takes a data set and produces a copy of it, but with sensitive data fields masked. This allows for a full data set from production for testing purposes, but without any sensitive data. Dynamic masking works with a live system and is not used to produce a distinct copy. The terms “replicated” and “development” are not types of masking.
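
    To make the idea concrete, here is a minimal Python sketch of static masking: it produces a full copy of a (hypothetical) production record set with the sensitive fields replaced, preserving size and composition for testing. The field names and hashing scheme are illustrative assumptions, not any specific product's behavior.

    ```python
    import copy
    import hashlib

    # Hypothetical production records; "ssn" and "email" are the sensitive fields.
    PRODUCTION = [
        {"id": 1, "name": "Alice", "ssn": "123-45-6789", "email": "alice@example.com"},
        {"id": 2, "name": "Bob",   "ssn": "987-65-4321", "email": "bob@example.com"},
    ]

    SENSITIVE_FIELDS = {"ssn", "email"}

    def static_mask(records):
        """Return a full copy of the data set with sensitive fields replaced.

        Hashing preserves uniqueness (useful for joins and realistic test
        volume) while removing the actual sensitive values.
        """
        masked = copy.deepcopy(records)
        for row in masked:
            for field in SENSITIVE_FIELDS.intersection(row):
                digest = hashlib.sha256(row[field].encode()).hexdigest()[:12]
                row[field] = f"MASKED-{digest}"
        return masked

    test_set = static_mask(PRODUCTION)  # same size and composition, no live data
    ```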
  7. In order to prevent cloud customers from potentially consuming enormous amounts of resources within a cloud environment and thus having a negative impact on other customers, what concept is commonly used by a cloud provider?

    • Limit
    • Cap
    • Throttle
    • Reservation
    Explanation: 
    A limit puts a maximum value on the amount of resources that may be consumed by a system, a service, or a cloud customer. It is commonly used to prevent one entity from consuming enormous amounts of resources and having an operational impact on other tenants within the same cloud system. Limits can be either hard or somewhat flexible; a flexible limit allows a customer to temporarily borrow unused resources from other customers while their formal limit remains in place. A reservation is a guarantee to a cloud customer that a certain level of resources will always be available to them, regardless of what operational demands are currently placed on the cloud environment. Both cap and throttle are terms that sound similar to limit, but they are not the correct terms in this case.
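
    Here is a minimal Python sketch of how a provider might enforce the limit/reservation model described above. The class name, units, and values are illustrative assumptions, not any provider's actual mechanism.

    ```python
    class TenantQuota:
        """Minimal sketch of the reservation/limit model.

        - reservation: resources guaranteed to the tenant at all times
        - limit: hard ceiling the tenant may never exceed
        (Values and units are illustrative, e.g. vCPUs.)
        """

        def __init__(self, reservation: int, limit: int):
            assert reservation <= limit
            self.reservation = reservation
            self.limit = limit
            self.in_use = 0

        def request(self, amount: int) -> bool:
            """Grant the request only if it stays within the tenant's limit."""
            if self.in_use + amount > self.limit:
                return False  # denied: protects other tenants in the pool
            self.in_use += amount
            return True

    quota = TenantQuota(reservation=4, limit=16)
    print(quota.request(12))  # True  - within the limit
    print(quota.request(8))   # False - would exceed the 16-vCPU limit
    ```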
  8. Where is a DLP solution generally installed when utilized for monitoring data at rest?

    • Network firewall
    • Host system
    • Application server
    • Database server
    Explanation: 
    To monitor data at rest appropriately, the DLP solution would be installed on the host system where the data resides. A database server, in some situations, may be an appropriate answer, but the host system is the best answer because a database server is only one example of where data could reside. An application server processes data and typically sits between the data and presentation zones, and as such, does not store data at rest. A network firewall would be more appropriate for data in transit because it is not a place where data would reside.
  9. Which of the following aspects of security is solely the responsibility of the cloud provider?

    • Regulatory compliance
    • Physical security
    • Operating system auditing
    • Personal security of developers
    Explanation: 
    Regardless of the particular cloud service used, physical security of hardware and facilities is always the sole responsibility of the cloud provider. The cloud provider may release information about its physical security policies and procedures so that potential customers can verify that their particular regulatory requirements will be met. Personal security of developers and regulatory compliance are always the responsibility of the cloud customer. Responsibility for operating systems, and the auditing of them, will differ based on the cloud service category used.
  10. Humidity levels for a data center are a prime concern for maintaining electrical and computing resources properly as well as ensuring that conditions are optimal for top performance.

    Which of the following is the optimal humidity level, as established by ASHRAE?

    • 20 to 40 percent relative humidity
    • 50 to 75 percent relative humidity
    • 40 to 60 percent relative humidity
    • 30 to 50 percent relative humidity
    Explanation: 
    The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) recommends 40 to 60 percent relative humidity for data centers; the other options do not match the ASHRAE recommendation.
  11. Within a SaaS environment, what is the responsibility on the part of the cloud customer in regard to procuring the software used?

    • Maintenance
    • Licensing
    • Development
    • Purchasing
    Explanation: 
    Within a SaaS implementation, the cloud customer licenses the use of the software from the cloud provider because SaaS delivers a fully functional application to the customer. With SaaS, the cloud provider is responsible for the entire software application and any necessary infrastructure to develop, run, and maintain it. The purchasing, development, and maintenance are fully the responsibility of the cloud provider.
  12. Implementing baselines on systems would take an enormous amount of time and resources if the staff had to apply them to each server, and over time, it would be almost impossible to keep all the systems in sync on an ongoing basis.

    Which of the following is NOT a package that can be used for implementing and maintaining baselines across an enterprise?

    • Puppet
    • SCCM
    • Chef
    • GitHub
    Explanation: 
    GitHub is a software development platform that serves as a code repository and versioning system. It is used solely for software development and would not be appropriate for applying baselines to systems. Puppet is an open-source configuration management tool that runs on many platforms and can be used to apply and maintain baselines. System Center Configuration Manager (SCCM) is Microsoft's tool for managing configurations across large groups of servers. Chef is another configuration management system for maintaining large groups of systems throughout an enterprise.
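
    For a sense of what these tools automate, here is a minimal Python sketch of baseline drift detection. The baseline settings and function names are hypothetical; real tools such as Puppet, Chef, and SCCM also remediate the drift they find rather than merely reporting it.

    ```python
    # Hypothetical baseline, expressed as setting -> required value.
    BASELINE = {
        "ssh_root_login": "no",
        "password_min_length": "14",
        "firewall_enabled": "yes",
    }

    def audit_against_baseline(current: dict) -> list:
        """Return (setting, expected, actual) tuples for every deviation."""
        drift = []
        for setting, expected in BASELINE.items():
            actual = current.get(setting, "<missing>")
            if actual != expected:
                drift.append((setting, expected, actual))
        return drift

    # Usage sketch: a server that has drifted on two settings
    server_config = {"ssh_root_login": "yes", "password_min_length": "14"}
    for deviation in audit_against_baseline(server_config):
        print("drift:", deviation)
    ```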
  13. From the perspective of compliance, what is the most important consideration when it comes to data center location?

    • Natural disasters
    • Utility access
    • Jurisdiction
    • Personnel access
    Explanation: 
    Jurisdiction will dictate much of the compliance and audit requirements for a data center. Although all the aspects listed are very important to security, from a strict compliance perspective, jurisdiction is the most important. Personnel access, natural disasters, and utility access are all important operational considerations for selecting a data center location, but they are not related to compliance issues like jurisdiction is.
  14. Different certifications and standards take different approaches to data center design and operations. Although many traditional approaches use a tiered methodology, which of the following utilizes a macro-level approach to data center design?

    • IDCA
    • BICSI
    • Uptime Institute
    • NFPA
    Explanation: 
    The Infinity Paradigm of the International Data Center Authority (IDCA) takes a macro-level approach to data center design; rather than focusing on specific individual components to achieve tier status, it evaluates the data center as a whole. Building Industry Consulting Services International (BICSI) issues certifications for data center cabling. The National Fire Protection Association (NFPA) publishes a broad range of fire safety and design standards for many different types of facilities. The Uptime Institute publishes the most widely known and used standard for data center topologies and tiers.
  15. The European Union is often considered the world leader in regard to the privacy of personal data and has declared privacy to be a “human right.”

    In what year did the EU first assert this principle?

    • 1995
    • 2000
    • 2010
    • 1999
    Explanation: 
    The EU passed the Data Protection Directive (Directive 95/46/EC) in 1995, which established data privacy as a human right. The other years listed are incorrect.
  16. A DLP solution/implementation has three main components.

    Which of the following is NOT one of the three main components?

    • Monitoring
    • Enforcement
    • Auditing
    • Discovery and classification
    Explanation: 
    Auditing, which can be supported to varying degrees by DLP solutions, is not a core component of them. Data loss prevention (DLP) solutions have core components of discovery and classification, enforcement, and monitoring. Discovery and classification are concerned with determining which data should be applied to the DLP policies, and then determining its classification level. Monitoring is concerned with the actual watching of data and how it’s used through its various stages. Enforcement is the actual application of policies determined from the discovery stage and then triggered during the monitoring stage.
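
    The three components can be illustrated with a minimal Python sketch; the regex rules and policy action below are simplified assumptions, not how any particular DLP product works.

    ```python
    import re

    # Discovery and classification rules: pattern -> classification label.
    RULES = {
        re.compile(r"\b\d{3}-\d{2}-\d{4}\b"): "SSN",
        re.compile(r"\b\d{16}\b"): "CREDIT_CARD",
    }

    def discover_and_classify(text: str) -> set:
        """Discovery and classification: determine which policies apply."""
        return {label for pattern, label in RULES.items() if pattern.search(text)}

    def monitor_and_enforce(text: str, action: str) -> str:
        """Monitoring watches data as it is used; enforcement applies policy."""
        labels = discover_and_classify(text)
        if labels and action == "email_external":
            return f"BLOCKED: {', '.join(sorted(labels))} may not leave the organization"
        return "ALLOWED"

    # Usage sketch: sensitive data triggers the policy during monitoring
    print(monitor_and_enforce("Customer SSN: 123-45-6789", "email_external"))
    ```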
  17. What type of storage structure does object storage employ to maintain files?

    • Directory
    • Hierarchical
    • Tree
    • Flat
    Explanation: 
    Object storage uses a flat file system to hold storage objects; it assigns files a key value that is then used to access them, rather than relying on directories or descriptive filenames. Typical storage layouts such as tree, directory, and hierarchical structures are used within volume storage, whereas object storage maintains a flat structure with key values.
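
    A short Python sketch of the difference: object storage behaves much like a dictionary keyed by opaque values, with no directory tree to traverse. The keys and data shown are illustrative.

    ```python
    # Volume (hierarchical) storage addresses a file by walking a directory
    # tree, e.g. /finance/2024/q3/invoice.pdf.
    #
    # Object storage is flat: each object is addressed only by its key,
    # much like a dictionary lookup.
    object_store = {
        "9f1c2d7a": b"...annual report bytes...",
        "4b8e0c55": b"...invoice bytes...",
    }

    invoice = object_store["4b8e0c55"]  # one key lookup; no tree to walk
    print(invoice)
    ```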
  18. Which cloud storage type requires special consideration on the part of the cloud customer to ensure they do not program themselves into a vendor lock-in situation?

    • Unstructured
    • Object
    • Volume
    • Structured
    Explanation:
    Structured storage is designed, maintained, and implemented by a cloud service provider as part of a PaaS offering. It is specific to that cloud provider and the way they have opted to implement systems, so special care is required to ensure that applications are not designed in a way that will lock the cloud customer into a specific cloud provider with that dependency. Unstructured storage for auxiliary files would not lock a customer into a specific provider. With volume and object storage, because the cloud customer maintains their own systems with IaaS, moving and replicating to a different cloud provider would be very easy.
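
    One common way to reduce this lock-in risk is to code against a thin, provider-neutral interface rather than against the provider's structured-storage API directly. The following Python sketch is illustrative; the interface and class names are hypothetical.

    ```python
    from abc import ABC, abstractmethod

    class TableStore(ABC):
        """Provider-neutral interface the application codes against."""

        @abstractmethod
        def put(self, table: str, key: str, item: dict) -> None: ...

        @abstractmethod
        def get(self, table: str, key: str) -> dict: ...

    class InMemoryStore(TableStore):
        """Stand-in backend; a real deployment would wrap a provider's
        structured-storage SDK behind this same interface."""

        def __init__(self):
            self._tables = {}

        def put(self, table, key, item):
            self._tables.setdefault(table, {})[key] = item

        def get(self, table, key):
            return self._tables[table][key]

    # The application depends only on TableStore, so switching providers
    # means writing one new adapter, not rewriting the application.
    store: TableStore = InMemoryStore()
    store.put("users", "u1", {"name": "Alice"})
    print(store.get("users", "u1"))
    ```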
  19. Which cloud deployment model would be ideal for a group of universities looking to work together, where each university can gain benefits according to its specific needs?

    • Private
    • Public
    • Hybrid
    • Community
    Explanation:
    A community cloud is owned and maintained by similar organizations working toward a common goal. In this case, the universities would all have very similar needs and calendar requirements, and they would not be financial competitors of each other. Therefore, this would be an ideal group for working together within a community cloud. A public cloud model would not work in this scenario because it is designed to serve the largest number of customers, would not likely be targeted toward specific requirements for individual customers, and would not be willing to make changes for them. A private cloud could accommodate such needs, but would not meet the criteria for a group working together, and a hybrid cloud spanning multiple cloud providers would not fit the specifics of the question.
  20. Data centers have enormous power resources that are distributed and consumed throughout the entire facility.

    Which of the following standards pertains to the proper fire safety standards within that scope?

    • IDCA
    • BICSI
    • NFPA
    • Uptime Institute
    Explanation: 
    The National Fire Protection Association (NFPA) publishes a broad range of fire safety and design standards for many different types of facilities. Building Industry Consulting Services International (BICSI) issues certifications for data center cabling. The Uptime Institute publishes the most widely known and used standard for data center topologies and tiers. The International Data Center Authority (IDCA) offers the Infinity Paradigm, which takes a macro-level approach to data center design.