CCSP : Certified Cloud Security Professional (CCSP) : Part 13

  1. Three central concepts define what type of data and information an organization is responsible for pertaining to eDiscovery.

    Which of the following are the three components that comprise required disclosure?

    • Possession, ownership, control
    • Ownership, use, creation
    • Control, custody, use
    • Possession, custody, control

    Explanation: 
    Data that falls under the purview of an eDiscovery request is that which is in the possession, custody, or control of the organization. Although this is an easy concept in a traditional data center, it can be difficult to distinguish who actually possesses and controls the data in a cloud environment due to multitenancy and resource pooling. The other options contain similar-sounding terms, but they are incorrect.

  2. Which of the following threat types involves the sending of commands or arbitrary data through input fields in an application in an attempt to get that code executed as part of normal processing?

    • Cross-site scripting
    • Missing function-level access control
    • Injection
    • Cross-site forgery
    Explanation: 
    An injection attack is where a malicious actor will send commands or other arbitrary data through input and data fields with the intent of having the application or system execute the code as part of its normal processing and queries. This can trick an application into exposing data that is not intended or authorized to be exposed, or it could potentially allow an attacker to gain insight into configurations or security controls. Missing function-level access control exists where an application only checks for authorization during the initial login process and does not further validate with each function call. Cross-site request forgery occurs when an attack forces an authenticated user to send forged requests to an application running under their own access and credentials. Cross-site scripting occurs when an attacker is able to send untrusted data to a user’s browser without going through validation processes.
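The standard defense against injection is to bind user input as data rather than concatenating it into the query text. Here is a minimal sketch using Python's built-in sqlite3 module; the table and data are illustrative only.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

# Malicious input that would alter a naively concatenated query
user_input = "alice' OR '1'='1"

# Unsafe pattern (shown only as a comment): string concatenation lets
# the input change the query's logic:
#   "SELECT role FROM users WHERE name = '" + user_input + "'"

# Safe: the driver binds the value strictly as data, never as SQL syntax
rows = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()
print(rows)  # [] -- the injected text matched no user
```

Because the placeholder binds the whole string as a literal value, the `OR '1'='1'` fragment is never interpreted as SQL.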
  3. With a cloud service category where the cloud customer is responsible for deploying all services, systems, and components needed for their applications, which of the following storage types are MOST likely to be available to them?

    • Structured and hierarchical
    • Volume and object
    • Volume and database
    • Structured and unstructured
    Explanation:
    The question is describing the Infrastructure as a Service (IaaS) cloud offering, and as such, the volume and object storage types will be available to the customer. Structured and unstructured are storage types associated with PaaS, and although the other answers present similar-sounding storage types, they are a mix of real and fake names.
  4. Which of the following roles would be responsible for managing memberships in federations and the use and integration of federated services?

    • Inter-cloud provider
    • Cloud service business manager
    • Cloud service administrator
    • Cloud service integrator
    Explanation:
    The inter-cloud provider is responsible for peering with other cloud services and providers, as well as overseeing and managing federations and federated services. A cloud service administrator is responsible for testing, monitoring, and securing cloud services, as well as providing usage reporting and dealing with service problems. The cloud service integrator is responsible for connecting existing systems and services with a cloud. The cloud service business manager is responsible for overseeing the billing, auditing, and purchasing of cloud services.
  5. Which data state would be most likely to use TLS as a protection mechanism?

    • Data in use
    • Data at rest
    • Archived
    • Data in transit
    Explanation:
    TLS would be used with data in transit, when packets are exchanged between clients or services and sent across a network. Data in use relies on other technologies, such as digital signatures, for protection while it is being processed; TLS has already done its job by the time the data reaches that state. The data-at-rest state primarily uses encryption for stored file objects. Archived data would be treated the same as data at rest.
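As a brief sketch of how a client applies TLS to data in transit, Python's standard ssl module builds a context with certificate validation and hostname checking enabled by default; the connection code is shown only as a comment so nothing here touches the network.

```python
import ssl

# A client protects data in transit by wrapping its TCP socket in TLS.
# create_default_context() turns on certificate validation and hostname
# checking and disables protocol versions known to be insecure.
context = ssl.create_default_context()

print(context.verify_mode)     # CERT_REQUIRED -- server cert must validate
print(context.check_hostname)  # True -- cert must match the hostname

# Usage (not executed here): wrap a socket before sending any data
# with socket.create_connection(("example.com", 443)) as sock:
#     with context.wrap_socket(sock, server_hostname="example.com") as tls:
#         tls.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n")
```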
  6. You are working for a cloud service provider and receive an eDiscovery order pertaining to one of your customers.

    Which of the following would be the most appropriate action to take first?

    • Take a snapshot of the virtual machines
    • Escrow the encryption keys
    • Copy the data
    • Notify the customer
    Explanation:
    When a cloud service provider receives an eDiscovery order pertaining to one of their customers, the first action they must take is to notify the customer. This allows the customer to be aware of what was received, as well as to conduct a review to determine if any challenges are necessary or warranted. Taking snapshots of virtual machines, copying data, and escrowing encryption keys are all processes involved in the actual collection of data and should not be performed until the customer has been notified of the request.
  7. If a cloud computing customer wishes to guarantee that a minimum level of resources will always be available, which of the following sets of services would comprise the reservation?

    • Memory and networking
    • CPU and software
    • CPU and storage
    • CPU and memory
    Explanation:
    A reservation guarantees to a cloud customer that they will have access to a minimal level of resources to run their systems, which will help mitigate against DoS attacks or systems that consume high levels of resources. A reservation pertains to memory and CPU resources. Under the concept of a reservation, memory and CPU are the guaranteed resources, but storage and networking are not included even though they are core components of cloud computing. Software would be out of scope for a guarantee and doesn’t really pertain to the concept.
  8. Which of the following threat types can occur when baselines are not appropriately applied or when unauthorized changes are made?

    • Security misconfiguration
    • Insecure direct object references
    • Unvalidated redirects and forwards
    • Sensitive data exposure
    Explanation:
    Security misconfigurations occur when applications and systems are not properly configured or maintained in a secure manner. This can be due to a shortcoming in security baselines or configurations, unauthorized changes to system configurations, or a failure to patch and upgrade systems as the vendor releases security patches. Insecure direct object references occur when code references aspects of the infrastructure, especially internal or private systems, and an attacker can use that knowledge to glean more information about the infrastructure. Unvalidated redirects and forwards occur when an application has functions to forward users to other sites, and these functions are not properly secured to validate the data and redirect requests, allowing spoofing for malware or phishing attacks. Sensitive data exposure occurs when an application does not use sufficient encryption and other security controls to protect sensitive application data.
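Detecting this threat type in practice usually means comparing running settings against an approved baseline. The sketch below is a minimal illustration of that idea; the setting names are hypothetical.

```python
# Approved security baseline for a system (hypothetical settings)
baseline = {
    "password_min_length": 12,
    "tls_min_version": "1.2",
    "admin_interface_public": False,
}

# Settings actually observed on the running system
running_config = {
    "password_min_length": 8,        # unauthorized change
    "tls_min_version": "1.2",
    "admin_interface_public": True,  # unauthorized change
}

# Report every setting that has drifted from the baseline
drift = {
    key: {"expected": expected, "actual": running_config.get(key)}
    for key, expected in baseline.items()
    if running_config.get(key) != expected
}
print(sorted(drift))  # ['admin_interface_public', 'password_min_length']
```

Real configuration-management tools apply the same compare-and-report logic at scale, which is why maintaining an accurate baseline is the prerequisite for catching misconfigurations.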
  9. Which of the following is considered an internal redundancy for a data center?

    • Power feeds
    • Chillers
    • Network circuits
    • Generators
    Explanation:
    Chillers and cooling systems are internal to a data center and its operations, and as such they are considered an internal redundancy. Power feeds, network circuits, and generators are all external to a data center and provide utility services to them, which makes them an external redundancy.
  10. Which of the following threat types involves the sending of invalid and manipulated requests through a user’s client to execute commands on the application under their own credentials?

    • Injection
    • Cross-site request forgery
    • Missing function-level access control
    • Cross-site scripting
    Explanation:
    A cross-site request forgery (CSRF) attack forces a client that a user has used to authenticate to an application to send forged requests under the user’s own credentials to execute commands and requests that the application thinks are coming from a trusted client and user. Although this type of attack cannot be used to steal data directly because the attacker has no way to see the results of the commands, it does open other ways to compromise an application. Missing function-level access control exists where an application only checks for authorization during the initial login process and does not further validate with each function call. An injection attack is where a malicious actor sends commands or other arbitrary data through input and data fields with the intent of having the application or system execute the code as part of its normal processing and queries. Cross-site scripting occurs when an attacker is able to send untrusted data to a user’s browser without going through validation processes.
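The common mitigation for CSRF is the synchronizer-token pattern: the server issues a random per-session token, embeds it in its own forms, and rejects any state-changing request that does not echo it back. A minimal sketch using only the Python standard library:

```python
import hmac
import secrets

# Random per-session token, stored server-side with the session and
# embedded in the application's own forms
session_token = secrets.token_hex(32)

def is_valid_request(submitted_token: str) -> bool:
    """Reject state-changing requests that lack the session's token."""
    # compare_digest avoids leaking information via timing differences
    return hmac.compare_digest(session_token, submitted_token)

print(is_valid_request(session_token))  # True  -- came from our own form
print(is_valid_request("0" * 64))       # False -- forged request is rejected
```

A forged request sent by an attacker's page carries the victim's cookies automatically, but it cannot know the unpredictable token, so the check fails.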
  11. With finite resources available within a cloud, even the largest cloud providers will at times need to determine which customers will receive additional resources first.

    What is the term associated with this determination?

    • Weighting
    • Prioritization
    • Shares
    • Scoring
    Explanation:
    Shares are used within a cloud environment to prioritize resource allocation when customer requests exceed the available resources. Cloud providers utilize shares by assigning a priority score to each customer and allocating resources to those with the highest scores first. Scoring is a component of shares that determines the actual order in which to allocate resources. Neither weighting nor prioritization is the correct term in this case.
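The allocation logic can be sketched in a few lines: when requests exceed capacity, customers are served in descending order of their share score until the resource runs out. The tenant names and numbers below are illustrative.

```python
# Hypothetical tenants, each with a share score and a resource request
customers = [
    {"name": "tenant-a", "shares": 100, "requested": 40},
    {"name": "tenant-b", "shares": 300, "requested": 40},
    {"name": "tenant-c", "shares": 200, "requested": 40},
]

available = 100  # units of some contended resource

# Serve the highest share score first; later tenants get what remains
allocation = {}
for cust in sorted(customers, key=lambda c: c["shares"], reverse=True):
    granted = min(cust["requested"], available)
    allocation[cust["name"]] = granted
    available -= granted

print(allocation)  # tenant-b and tenant-c are satisfied; tenant-a gets 20
```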
  12. In order to comply with regulatory requirements, which of the following secure erasure methods would be available to a cloud customer using volume storage within the IaaS service model?

    • Demagnetizing
    • Shredding
    • Degaussing
    • Cryptographic erasure
    Explanation:
    Cryptographic erasure is a secure method to destroy data by destroying the keys that were used to encrypt it. This method is universally available for volume storage on IaaS and is also extremely quick. Shredding, degaussing, and demagnetizing are all physically destructive methods that would not be permitted within a cloud environment using shared resources.
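The principle can be illustrated with a toy example. Note that the XOR keystream below is NOT real encryption (a production system would use AES through a vetted library); it only demonstrates that once every copy of the key is destroyed, the ciphertext left on the shared media is permanently unreadable.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy cipher: derive a repeating keystream from the key and XOR it
    # with the data. Illustration only -- not cryptographically sound.
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

key = secrets.token_bytes(32)
plaintext = b"customer records on a shared cloud volume"
ciphertext = keystream_xor(key, plaintext)

# While the key exists, the data is recoverable
assert keystream_xor(key, ciphertext) == plaintext

# Cryptographic erasure: destroy the key, not the physical media.
# Only unreadable ciphertext remains on the shared volume.
key = None
print(ciphertext != plaintext)  # True
```

This is why the method suits multitenant volume storage: the customer never needs physical access to the disks, only control over the key's lifecycle.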
  13. Where is a DLP solution generally installed when utilized for monitoring data in use?

    • Application server
    • Database server
    • Network perimeter
    • User’s client
    Explanation:
    To monitor data in use, the DLP solution’s optimal location would be on the user’s client or workstation, where the data would be used or processed, and where it would be most vulnerable to access or exposure. The network perimeter is most appropriate for data in transit, and an application server would serve as a middle stage between data at rest and data in use, but it is a less correct answer than the user’s client. A database server would be an example of a location appropriate for monitoring data at rest.
  14. Which of the following aspects of cloud computing would make it more likely that a cloud provider would be unwilling to satisfy specific certification requirements?

    • Regulation
    • Multitenancy
    • Virtualization
    • Resource pooling
    Explanation:
    With cloud providers hosting a number of different customers, it would be impractical for them to pursue additional certifications based on the needs of a specific customer. Cloud environments are built to a common denominator to serve the greatest number of customers. Especially within a public cloud model, it is not possible or practical for a cloud provider to alter its services for specific customer demands. Resource pooling and virtualization within a cloud environment would be the same for all customers, and would not impact certifications that a cloud provider might be willing to pursue. Regulations would form the basis for certification problems and would be a reason for a cloud provider to pursue specific certifications to meet customer requirements.
  15. Which phase of the cloud data lifecycle would be the MOST appropriate for the use of DLP technologies to protect the data?

    • Use
    • Store
    • Share
    • Create
    Explanation:
    During the share phase, data is allowed to leave the application for consumption by other vendors, systems, or services. At this point, as the data is leaving the security controls of the application, the use of DLP technologies is appropriate to control how the data is used or to force expiration. During the use, create, and store phases, traditional security controls are available and are more appropriate because the data is still internal to the application.
  16. During which phase of the cloud data lifecycle is it possible for the classification of data to change?

    • Use
    • Archive
    • Create
    • Share
    Explanation:
    The create phase encompasses any time data is created, imported, or modified. With any change in the content or value of data, the classification may also change. It must be continually reevaluated to ensure proper security. During the use, share, and archive phases, the data is not modified in any way, so the original classification is still relevant.
  17. If a key feature of cloud computing that your organization desires is the ability to scale and expand without limit or concern about available resources, which cloud deployment model would you MOST likely be considering?

    • Public
    • Hybrid
    • Private
    • Community
    Explanation:
    Public clouds, such as AWS and Azure, are massive systems run by major corporations, and they account for a significant share of Internet traffic and services. They are always expanding, offer enormous resources to customers, and are the least likely to run into resource constraints compared to the other deployment models. A private cloud would have resources provisioned for specific uses and could not be assumed to have a large pool available for expansion. A community cloud, being targeted at a set of similar organizations, would face the same limitations as a private cloud. A hybrid cloud spans multiple deployment models, so its ability to scale would depend on the individual clouds that compose it rather than being a property of the model itself.
  18. What is a serious complication an organization faces from the compliance perspective with international operations?

    • Multiple jurisdictions
    • Different certifications
    • Different operational procedures
    • Different capabilities
    Explanation:
    When operating within a global framework, a security professional runs into a multitude of jurisdictions and requirements, which often may not be clearly applicable or may be in contention with each other. These requirements can involve the location of the users and the type of data they enter into systems, the laws governing the organization that owns the application and any regulatory requirements they may have, and finally the appropriate laws and regulations for the jurisdiction housing the IT resources and where the data is actually stored, which may be multiple jurisdictions as well. Different certifications would not come into play as a challenge because the major IT and data center certifications are international and would apply to any cloud provider. Different capabilities and different operational procedures would be mitigated by the organization’s selection of a cloud provider and would not be a challenge if an appropriate provider was chosen, regardless of location.
  19. ISO/IEC has established international standards for many aspects of computing and any processes or procedures related to information technology.

    Which ISO/IEC standard has been established to provide a framework for handling eDiscovery processes?

    • ISO/IEC 27001
    • ISO/IEC 27002
    • ISO/IEC 27040
    • ISO/IEC 27050
    Explanation:
    ISO/IEC 27050 strives to establish an internationally accepted standard for eDiscovery processes and best practices. It encompasses all steps of the eDiscovery process, including the identification, preservation, collection, processing, review, analysis, and the final production of the requested data archive. ISO/IEC 27001 is a general security specification for an information security management system. ISO/IEC 27002 gives best practice recommendations for information security management. ISO/IEC 27040 is focused on the security of storage systems.
  20. If a company needed to guarantee through contract and SLAs that a cloud provider would always have available sufficient resources to start their services and provide a certain level of provisioning, what would the contract need to refer to?

    • Limit
    • Reservation
    • Assurance
    • Guarantee
    Explanation:
    A reservation guarantees to a cloud customer that they will have access to a minimal level of resources to run their systems, which will help mitigate against DoS attacks or systems that consume high levels of resources. A limit refers to the enforcement of a maximum level of resources that can be consumed by or allocated to a cloud customer, service, or system. Both guarantee and assurance are terms that sound similar to reservation, but they are not correct choices.