CCSP : Certified Cloud Security Professional (CCSP) : Part 20

  1. Which of the following types of data would fall under data rights management (DRM) rather than information rights management (IRM)?

    • Personnel data
    • Security profiles
    • Publications
    • Financial records

    Explanation: 
    Whereas IRM is used to protect a broad range of data, DRM is focused specifically on the protection of consumer media, such as publications, music, movies, and so on. IRM is used to protect general institution data, so financial records, personnel data, and security profiles would all fall under the auspices of IRM.

  2. Different security testing methodologies offer different strategies and approaches to testing systems, requiring security personnel to determine the best type to use for their specific circumstances.

    What does dynamic application security testing (DAST) NOT entail that SAST does?

    • Discovery
    • Knowledge of the system
    • Scanning
    • Probing
    Explanation: 
    Dynamic application security testing (DAST) is considered “black-box” testing and begins with no inside knowledge of the application or its configurations; everything about the system must be discovered during testing. Like most testing methodologies, DAST does involve probing, scanning, and a discovery process for system information. What it does not entail, compared to SAST, is knowledge of the system.
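
    As a rough illustration of the black-box discovery that DAST starts from, the sketch below probes a running application over HTTP with no prior knowledge of it; the target host and paths are hypothetical.

```python
# Minimal sketch of black-box (DAST-style) discovery: the tester starts with
# nothing but a URL and probes the running application for information.
# The target host and paths below are hypothetical examples.
import requests

TARGET = "https://app.example.com"
COMMON_PATHS = ["/", "/login", "/admin", "/api/health"]

for path in COMMON_PATHS:
    try:
        resp = requests.get(TARGET + path, timeout=5)
    except requests.RequestException as exc:
        print(f"{path}: unreachable ({exc})")
        continue
    # Response codes and headers are discovered facts about the system;
    # nothing here relies on inside knowledge of the application.
    server = resp.headers.get("Server", "unknown")
    print(f"{path}: HTTP {resp.status_code}, Server: {server}")
```
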
  3. You need to gain approval to begin moving your company’s data and systems into a cloud environment. However, your CEO has mandated the ability to easily remove your IT assets from the cloud provider as a precondition.

    Which of the following cloud concepts would this pertain to?

    • Removability
    • Extraction
    • Portability
    • Reversibility
    Explanation: 
    Reversibility is the cloud concept involving the ability for a cloud customer to remove all of its data and IT assets from a cloud provider. Processes and agreements would also be in place with the cloud provider to ensure all removals are completed fully within the agreed-upon time frame. Portability refers to the ability to easily move between different cloud providers and not be locked into a specific one. Removability and extraction are both provided as terms similar to reversibility, but neither is the official term or concept.
  4. What does static application security testing (SAST) offer as a tool to the testers that makes it unique compared to other common security testing methodologies?

    • Live testing
    • Source code access
    • Production system scanning
    • Injection attempts
    Explanation: 
    Static application security testing (SAST) is conducted against offline systems with prior knowledge of them, including access to their source code. Live testing is not part of static testing but rather is associated with dynamic testing. Production system scanning is not appropriate because static testing is done against offline systems. Injection attempts are performed with many different types of testing and are not unique to one particular type, so that option is not the best answer to the question.
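
    To make the source-code access concrete, the sketch below scans local source files offline for hardcoded credentials, the kind of check that is only possible with the code in hand; the patterns and directory are illustrative, not a complete rule set.

```python
# Simplified illustration of the source-code access that distinguishes SAST:
# scan local source files (offline, no running system) for hardcoded secrets.
# The patterns and the project directory are illustrative assumptions.
import re
from pathlib import Path

SECRET_PATTERNS = [
    re.compile(r"password\s*=\s*['\"].+['\"]", re.IGNORECASE),
    re.compile(r"api[_-]?key\s*=\s*['\"].+['\"]", re.IGNORECASE),
]

def scan_source(root: str) -> None:
    for source_file in Path(root).rglob("*.py"):
        lines = source_file.read_text(errors="ignore").splitlines()
        for line_no, line in enumerate(lines, 1):
            if any(p.search(line) for p in SECRET_PATTERNS):
                print(f"{source_file}:{line_no}: possible hardcoded secret")

scan_source("./src")  # hypothetical project directory
```
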
  5. A main objective for an organization when utilizing cloud services is to avoid vendor lock-in so as to ensure flexibility and maintain independence.

    Which core concept of cloud computing is most related to vendor lock-in?

    • Scalability
    • Interoperability
    • Portability
    • Reversibility
    Explanation: 
    Portability is the ability for a cloud customer to easily move their systems, services, and applications among different cloud providers. By avoiding reliance on proprietary APIs and other vendor-specific cloud features, an organization can maintain the flexibility to move among the various cloud providers with greater ease. Reversibility refers to the ability for a cloud customer to quickly and easily remove all of their services and data from a cloud provider. Interoperability is the ability to reuse services and components for other applications and uses. Scalability refers to the ability of a cloud environment to add or remove resources to meet current demands.
  6. Which of the following areas of responsibility always falls completely under the purview of the cloud provider, regardless of which cloud service category is used?

    • Infrastructure
    • Data
    • Physical
    • Governance
    Explanation: 
    Regardless of the cloud service category used, the physical environment is always the sole responsibility of the cloud provider. In many instances, the cloud provider will supply audit reports or some general information about their physical security practices, especially to those customers or potential customers that may have regulatory requirements, but otherwise the cloud customer will have very little insight into the physical environment. With IaaS, the infrastructure is a shared responsibility between the cloud provider and cloud customer. With all cloud service categories, the data and governance are always the sole responsibility of the cloud customer.
  7. What type of masking would you employ to produce a separate data set for testing purposes based on production data without any sensitive information?

    • Dynamic
    • Tokenized
    • Replicated
    • Static
    Explanation: 
    Static masking involves taking a data set and replacing sensitive fields and values with non-sensitive or garbage data. This is done to enable testing of an application against data that resembles production data, both in size and format, but without containing anything sensitive. Dynamic masking involves the live and transactional masking of data while an application is using it. Tokenized would refer to tokenization, which is the replacing of sensitive data with a key value that can later be matched back to the original value, and although it could be used as part of the production of test data, it does not refer to the overall process. Replicated is provided as an erroneous answer, as replicated data would be identical in value and would not accomplish the production of a test set.
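
    A minimal static-masking sketch follows, assuming a simple list-of-dictionaries record set; the field names, masking rules, and sample record are hypothetical.

```python
# Minimal static-masking sketch: copy a production-like record set and replace
# sensitive fields with generated, non-sensitive values of a similar format.
# Field names and the sample record are hypothetical.
import random
import string

def mask_ssn(_: str) -> str:
    # Generate a throwaway SSN-shaped value.
    return ("900-" + "".join(random.choices(string.digits, k=2))
            + "-" + "".join(random.choices(string.digits, k=4)))

def mask_name(_: str) -> str:
    return "User-" + "".join(random.choices(string.ascii_uppercase, k=5))

MASKERS = {"ssn": mask_ssn, "name": mask_name}

def static_mask(records: list[dict]) -> list[dict]:
    # Produce a separate test data set; the original records are left untouched.
    return [{k: MASKERS.get(k, lambda v: v)(v) for k, v in r.items()}
            for r in records]

production = [{"name": "Jane Doe", "ssn": "123-45-6789", "plan": "gold"}]
print(static_mask(production))
```
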
  8. Which aspect of data poses the biggest challenge to using automated tools for data discovery and programmatic data classification?

    • Quantity
    • Language
    • Quality
    • Number of sources
    Explanation: 
    The biggest challenge for properly using any programmatic tools in data discovery is the actual quality of the data, including the data being uniform and well structured, labels being properly applied, and other similar facets. Without data being organized in such a manner, it is extremely difficult for programmatic tools to automatically synthesize and make determinations from it. The overall quantity of data, as well as the number of sources, does not pose an enormous challenge for data discovery programs, other than requiring a longer time to process the data. The language of the data itself should not matter to a program that is designed to process it, as long as the data is well formed and consistent.
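
    The sketch below illustrates, in simplified form, why programmatic classification depends on well-formed data: pattern-based rules catch consistently formatted values and miss everything else. The rules and sample values are illustrative only.

```python
# Simplified illustration of programmatic data classification: pattern-based
# rules work only when the data is uniform and well structured.
# The rules and sample values are illustrative assumptions.
import re

RULES = {
    "PII": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),             # SSN-like values
    "Financial": re.compile(r"\b\d{4}(?:[ -]?\d{4}){3}\b"),  # card-like values
}

def classify(value: str) -> str:
    for label, pattern in RULES.items():
        if pattern.search(value):
            return label
    return "Unclassified"

print(classify("123-45-6789"))        # well-formed: classified as PII
print(classify("ssn one two three"))  # poorly structured: the tool misses it
```
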
  9. When an organization is considering a cloud environment for hosting BCDR solutions, which of the following would be the greatest concern?

    • Self-service
    • Resource pooling
    • Availability
    • Location
    Explanation: 
    If an organization wants to use a cloud service for BCDR, the location of the cloud hosting becomes a very important security consideration due to regulations and jurisdiction, which could be dramatically different from the organization’s normal hosting locations. Availability is a hallmark of any cloud service provider, and likely will not be a prime consideration when an organization is considering using a cloud for BCDR; the same goes for self-service options. Resource pooling is common among all cloud systems and would not be a concern when an organization is dealing with the provisioning of resources during a disaster.
  10. Just like the risk management process, the BCDR planning process has a defined sequence of steps and processes to follow to ensure the production of a comprehensive and successful plan.

    Which of the following is the correct sequence of steps for a BCDR plan?

    • Define scope, gather requirements, assess risk, implement
    • Define scope, gather requirements, implement, assess risk
    • Gather requirements, define scope, implement, assess risk
    • Gather requirements, define scope, assess risk, implement
    Explanation: 
    The correct sequence for a BCDR plan is to define the scope, gather requirements based on the scope, assess overall risk, and implement the plan. The other sequences provided are not in the correct order.
  11. What type of solution is at the core of virtually all directory services?

    • WS
    • LDAP
    • ADFS
    • PKI
    Explanation: 
    The Lightweight Directory Access Protocol (LDAP) forms the basis of virtually all directory services, regardless of the specific vendor or software package. WS is a protocol for information exchange between two systems and does not actually store the data. ADFS is a Windows component for enabling single sign-on for the operating system and applications, but it relies on data from an LDAP server. PKI is used for managing and issuing security certificates.
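
    As a minimal illustration of querying a directory service over LDAP, the sketch below uses the Python ldap3 library (one of several common clients); the host, bind DN, base DN, and credentials are placeholders.

```python
# Minimal LDAP lookup sketch using the ldap3 library.
# Host, bind DN, base DN, and credentials are hypothetical placeholders.
from ldap3 import Server, Connection, ALL

server = Server("ldap.example.com", get_info=ALL)
conn = Connection(
    server,
    user="cn=reader,dc=example,dc=com",
    password="change-me",          # placeholder credential
    auto_bind=True,
)

# Directory services resolve queries like this against the LDAP tree.
conn.search(
    search_base="dc=example,dc=com",
    search_filter="(uid=jdoe)",
    attributes=["cn", "mail"],
)
for entry in conn.entries:
    print(entry.entry_dn, entry.mail)
```
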
    The different cloud service models have varying levels of responsibility for functions and operations, depending on the model’s level of service.

    In which of the following models would the responsibility for patching lie predominantly with the cloud customer?

    • DaaS
    • SaaS
    • PaaS
    • IaaS
    Explanation:
    With Infrastructure as a Service (IaaS), the cloud customer is responsible for deploying and maintaining its own systems and virtual machines. Therefore, the customer is solely responsible for patching and any other security updates it finds necessary. With Software as a Service (SaaS), Platform as a Service (PaaS), and Desktop as a Service (DaaS), the cloud provider maintains the infrastructure components and is responsible for maintaining and patching them.
  13. Which component of ITIL involves the creation of an RFC ticket and obtaining official approvals for it?

    • Problem management
    • Release management
    • Deployment management
    • Change management
    Explanation: 
    The change management process involves the creation of the official Request for Change (RFC) ticket, which is used to document the change, obtain the required approvals from management and stakeholders, and track the change to completion. Release management is a subcomponent of change management, where the actual code or configuration change is put into place. Deployment management is similar to release management, but it is where changes are actually implemented on systems. Problem management is focused on identifying and mitigating known problems and deficiencies before they can result in incidents.
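
    The sketch below models, in simplified form, the kind of record an RFC ticket captures and the approval gate that change management enforces before release and deployment proceed; the field names are illustrative and not tied to any specific ITSM product.

```python
# Minimal sketch of the data an RFC (Request for Change) ticket captures and
# the approval step change management requires before release/deployment.
# Field names are illustrative, not taken from any specific ITSM product.
from dataclasses import dataclass, field

@dataclass
class RequestForChange:
    change_id: str
    description: str
    requested_by: str
    risk_assessment: str
    approvals: list[str] = field(default_factory=list)

    def approve(self, approver: str) -> None:
        self.approvals.append(approver)

    def ready_for_release(self, required: set[str]) -> bool:
        # Release management only proceeds once all required approvals exist.
        return required.issubset(self.approvals)

rfc = RequestForChange("CHG-1001", "Patch web tier", "ops-team", "low")
rfc.approve("change-manager")
print(rfc.ready_for_release({"change-manager", "service-owner"}))  # False
```
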
  14. Which of the following are attributes of cloud computing?

    • Minimal management effort and shared resources
    • High cost and unique resources
    • Rapid provisioning and slow release of resources
    • Limited access and service provider interaction
    Explanation: 
    Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.
  15. In a cloud environment, encryption should be used for all the following, except:

    • Secure sessions/VPN
    • Long-term storage of data
    • Near-term storage of virtualized images
    • Profile formatting
    Explanation: 
    All of these activities should incorporate encryption, except for profile formatting, which is a made-up term.
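
    As a minimal example of the kind of encryption expected for stored data, the sketch below uses the Fernet recipe from the Python cryptography package; key management is deliberately simplified and would normally be handled by a KMS or HSM.

```python
# Minimal symmetric-encryption sketch for data at rest, using the
# `cryptography` package's Fernet recipe. Key management is simplified here;
# in a cloud deployment the key would live in a KMS/HSM, not beside the data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)

plaintext = b"archived customer record"
ciphertext = cipher.encrypt(plaintext)   # what goes to long-term storage
restored = cipher.decrypt(ciphertext)    # only possible with the key

assert restored == plaintext
print(len(ciphertext), "bytes stored, unreadable without the key")
```
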
  16. Which of the following is considered a technological control?

    • Firewall software
    • Firing personnel
    • Fireproof safe
    • Fire extinguisher
    Explanation: 
    A firewall is a technological control; the safe and the extinguisher are physical controls, and firing someone is an administrative control.
  17. When using an IaaS solution, what is the capability provided to the customer?

    • To provision processing, storage, networks, and other fundamental computing resources when the consumer is able to deploy and run arbitrary software, which can include OSs and applications.
    • To provision processing, storage, networks, and other fundamental computing resources when the auditor is able to deploy and run arbitrary software, which can include OSs and applications.
    • To provision processing, storage, networks, and other fundamental computing resources when the provider is able to deploy and run arbitrary software, which can include OSs and applications.
    • To provision processing, storage, networks, and other fundamental computing resources when the consumer is not able to deploy and run arbitrary software, which can include OSs and applications.
    Explanation: 
    According to “The NIST Definition of Cloud Computing,” in IaaS, “the capability provided to the consumer is to provision processing, storage, networks, and other fundamental computing resources where the consumer is able to deploy and run arbitrary software, which can include operating systems and applications. The consumer does not manage or control the underlying cloud infrastructure but has control over operating systems, storage, and deployed applications; and possibly limited control of select networking components (e.g., host firewalls).”
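
    To make the IaaS capability concrete, the sketch below provisions a compute instance through a provider API, using boto3 and EC2 purely as one familiar example; the AMI ID, instance type, and region are placeholders.

```python
# Sketch of the IaaS capability described above: the consumer provisions
# fundamental computing resources through the provider's API. boto3/EC2 is
# used as one familiar example; the AMI ID, instance type, and region are
# placeholders, and valid credentials are assumed to be configured.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder image ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print("Provisioned instance:", instance_id)
# The consumer now controls the OS and deployed applications on this instance,
# but not the underlying physical infrastructure.
```
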
  18. When using an IaaS solution, what is a key benefit provided to the customer?

    • Metered and priced on the basis of units consumed
    • Increased energy and cooling system efficiencies
    • Transferred cost of ownership
    • The ability to scale up infrastructure services based on projected usage
    Explanation: 
    IaaS has a number of key benefits for organizations, which include but are not limited to these (a minimal metering sketch follows this list):
    – Usage is metered and priced on the basis of units (or instances) consumed. This can also be billed back to specific departments or functions.
    – The ability to scale infrastructure services up and down based on actual usage. This is particularly useful and beneficial where there are significant spikes and dips within the usage curve for infrastructure.
    – Reduced cost of ownership. There is no need to buy assets for everyday use, no loss of asset value over time, and reduced costs of maintenance and support.
    – Reduced energy and cooling costs, along with a “green IT” effect from optimum use of IT resources and systems.
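
    The following is a tiny worked example of metered, unit-based billing with chargeback to departments; the rate and usage figures are hypothetical.

```python
# Tiny worked example of metered IaaS billing: usage is priced per unit
# consumed and charged back to departments. Rates and usage are hypothetical.
RATE_PER_INSTANCE_HOUR = 0.09   # assumed price, USD

usage_by_department = {
    "marketing": 1_200,   # instance-hours this month
    "analytics": 4_500,
    "web":       2_300,
}

for dept, hours in usage_by_department.items():
    print(f"{dept}: {hours} instance-hours -> ${hours * RATE_PER_INSTANCE_HOUR:,.2f}")

total = sum(usage_by_department.values()) * RATE_PER_INSTANCE_HOUR
print(f"total monthly bill: ${total:,.2f}")
```
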
  19. Which of the following is considered an administrative control?

    • Keystroke logging
    • Access control process
    • Door locks
    • Biometric authentication
    Explanation: 
    A process is an administrative control; sometimes, the process includes elements of other types of controls (in this case, the access control mechanism might be a technical control, or it might be a physical control), but the process itself is administrative. Keystroke logging is a technical control (or an attack, if done for malicious purposes, and not for auditing); door locks are a physical control; and biometric authentication is a technological control.
  20. What is a key capability or characteristic of PaaS?

    • Support for a homogenous environment
    • Support for a single programming language
    • Ability to reduce lock-in
    • Ability to manually scale
    Explanation: 
    PaaS should have the following key capabilities and characteristics:
    – Support multiple languages and frameworks: PaaS should support multiple programming languages and frameworks, thus enabling the developers to code in whichever language they prefer or the design requirements specify. In recent times, significant strides and efforts have been taken to ensure that open source stacks are both supported and utilized, thus reducing “lock-in” or issues with interoperability when changing CSPs.
    – Multiple hosting environments: The ability to support a wide variety of underlying hosting environments for the platform is key to meeting customer requirements and demands. Whether public cloud, private cloud, local hypervisor, or bare metal, supporting multiple hosting environments allows the application developer or administrator to migrate the application when and as required. This can also serve as a form of contingency and continuity and help ensure ongoing availability.
    – Flexibility: Traditionally, platform providers offered the features and requirements that they felt suited both the client requirements and their own service offering, positioning themselves as the provider of choice and leaving customers limited options to move easily. This has changed drastically, with extensibility and flexibility now afforded to meet the needs and requirements of developer audiences. This has been heavily influenced by open source, which allows relevant plug-ins to be introduced into the platform quickly and efficiently.
    – Allow choice and reduce lock-in: PaaS learns from previous horror stories, in which proprietary platforms meant red tape, barriers, and restrictions on what developers could do when it came to migration or adding features and components to the platform. Although providers may still require coding to specific APIs, developers can run their apps in various environments based on common, standard API structures, ensuring a level of consistency and quality for customers and users.
    – Ability to auto-scale: This enables the application to seamlessly scale up and down as required to accommodate the cyclical demands of users. The platform will allocate resources and assign these to the application as required. This serves as a key driver for any seasonal organizations that experience spikes and drops in usage.
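
    A minimal sketch of the auto-scale decision a platform makes on an application’s behalf is shown below; the thresholds, limits, and load figures are hypothetical.

```python
# Minimal sketch of the auto-scaling decision a PaaS platform makes on the
# application's behalf: add or remove instances based on observed load.
# Thresholds, limits, and load figures are hypothetical.
def desired_instances(current: int, cpu_utilization: float,
                      scale_up_at: float = 0.75, scale_down_at: float = 0.25,
                      minimum: int = 1, maximum: int = 20) -> int:
    if cpu_utilization > scale_up_at:
        return min(current + 1, maximum)
    if cpu_utilization < scale_down_at:
        return max(current - 1, minimum)
    return current

# Seasonal spike: utilization climbs, the platform scales out step by step,
# then scales back in as demand drops.
instances = 2
for load in (0.40, 0.80, 0.85, 0.90, 0.30, 0.10):
    instances = desired_instances(instances, load)
    print(f"load={load:.2f} -> {instances} instance(s)")
```
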