AIS-05: Automated Application Security Testing

This control is new to this version of the control set and incorporates the following controls from the previous version: AIS-01: Application Security and AIS-03: Data Integrity.

Control Statement

Implement a testing strategy, including criteria for acceptance of new information systems, upgrades and new versions, which provides application security assurance and maintains compliance while enabling organizational speed of delivery goals. Automate when applicable and possible.

Implementation Guidance

Note: The implementation guidelines of AIS-05 should be interpreted as further guidance in addition to what is specified in AIS-03 and AIS-04.

Automation of security testing should be implemented to reduce risks and errors and to enable the scaling of security practices to meet organizational demands. Multiple test types and integration points will likely be needed to provide the appropriate level of assurance throughout the SDLC. Criteria should be developed for use when assessing the automation required by an application, as not all systems will benefit equally.

Strategy:

  1. Identify the goals and requirements of the automation implementation.

Example goals:

  • Security requirements are not relaxed to improve speed.
  • All developers can leverage tools to detect security weaknesses while developing software.
  • All third-party libraries are scanned for known vulnerabilities.
  • All authentication and authorization functions “pass” abuse case unit tests before deployment.
  • All website security headers are verified to meet security requirements when deployed.
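As an illustration of the last goal, a minimal post-deployment check is sketched below (Python, assuming the requests library is available; the URL and the required header names and values are illustrative assumptions, not part of the control):

    import sys
    import requests

    # Headers the security standard requires; the names and values below are
    # common hardening choices standing in for the organization's real policy.
    REQUIRED_HEADERS = {
        "Strict-Transport-Security": "max-age=31536000",
        "X-Content-Type-Options": "nosniff",
        "Content-Security-Policy": None,  # presence alone is checked in this sketch
    }

    def verify_security_headers(url: str) -> list:
        """Return a list of header violations for the deployed site."""
        response = requests.get(url, timeout=10)
        violations = []
        for name, expected in REQUIRED_HEADERS.items():
            actual = response.headers.get(name)
            if actual is None:
                violations.append(f"missing header: {name}")
            elif expected is not None and expected not in actual:
                violations.append(f"unexpected value for {name}: {actual}")
        return violations

    if __name__ == "__main__":
        problems = verify_security_headers("https://app.example.com")  # hypothetical URL
        for problem in problems:
            print(problem)
        sys.exit(1 if problems else 0)  # a non-zero exit fails the deployment pipeline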

Example requirements:

  • Applicable programming languages should be supported by static analysis tools.
  • Python and C# should be supported by select static analysis tools.
  • Automation should not require infrastructure support.
  • All automation tools should offer an application programming interface (API).
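For instance, the third-party-library goal above can be met in an automated build step. The sketch below (Python) wraps the open-source pip-audit tool, which must be installed separately; the JSON fields follow pip-audit's documented output format, and the build fails when known vulnerabilities are reported:

    import json
    import subprocess
    import sys

    def run_sca_scan(requirements_file="requirements.txt"):
        """Scan pinned dependencies for known vulnerabilities; return an exit code."""
        result = subprocess.run(
            ["pip-audit", "-r", requirements_file, "-f", "json"],
            capture_output=True,
            text=True,
        )
        report = json.loads(result.stdout) if result.stdout else {}
        vulnerable = [d for d in report.get("dependencies", []) if d.get("vulns")]
        for dep in vulnerable:
            ids = ", ".join(v["id"] for v in dep["vulns"])
            print(f"{dep['name']} {dep['version']}: {ids}")
        return 1 if vulnerable else 0  # a non-zero exit fails the automated build

    if __name__ == "__main__":
        sys.exit(run_sca_scan())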

The strategy can also include, but is not limited to:

  1. Security testing for unintentional side effects and behaviors that are not specified in the test plan or design.
  2. Security testing for incident response procedures, such as simulating breaches.
  3. Determining which portfolio applications warrant investment in automation, prioritizing the adoption order based on criticality.
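A minimal sketch of item 3 follows (Python; the application attributes and scoring weights are illustrative assumptions):

    from dataclasses import dataclass

    @dataclass
    class Application:
        name: str
        internet_facing: bool
        handles_sensitive_data: bool
        releases_per_month: int

    def criticality_score(app):
        """Weight the attributes that make automation investment most valuable."""
        score = 0
        score += 3 if app.internet_facing else 0
        score += 3 if app.handles_sensitive_data else 0
        score += min(app.releases_per_month, 4)  # frequent releases benefit most
        return score

    portfolio = [
        Application("billing-api", True, True, 8),
        Application("intranet-wiki", False, False, 1),
        Application("mobile-backend", True, False, 4),
    ]

    # Adopt automation in descending order of criticality.
    for app in sorted(portfolio, key=criticality_score, reverse=True):
        print(f"{criticality_score(app):>2}  {app.name}")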

Considerations:

  1. Security requirements
  2. Risk, business, and compliance requirements
  3. Development methodology
  4. Lifecycle
  5. Metrics establishment

Example metrics:

  • Count or percentage of applications that have adopted a given test type (SAST, DAST, SCA, etc.) among those that require it.
  • Count or percentage of false positives produced by test automation.
  • Count or percentage of execution-time SLA breaches by test automation.
  • Soft measures, including satisfaction levels with usability.
  • Change in the number of security weaknesses discovered after release.
  • Percentage coverage of automated test cases for exposed APIs and SDK functions by service (i.e., the total number of automated test cases for APIs/SDK functions divided by the total number of APIs/SDK functions, per service).
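To make these metrics concrete, a short sketch is given below (Python; the inventories and counts are made-up inputs):

    # Hypothetical inventory: applications that require SAST vs. those that adopted it.
    requires_sast = {"billing-api", "mobile-backend", "intranet-wiki"}
    adopted_sast = {"billing-api", "mobile-backend"}

    adoption_pct = 100.0 * len(adopted_sast & requires_sast) / len(requires_sast)
    print(f"SAST adoption: {adoption_pct:.0f}%")

    # Coverage of automated test cases for exposed APIs, per service: automated
    # test cases divided by the total number of exposed API functions.
    api_inventory = {
        "billing-api": {"exposed_functions": 40, "automated_test_cases": 30},
        "mobile-backend": {"exposed_functions": 25, "automated_test_cases": 10},
    }
    for service, counts in api_inventory.items():
        coverage = 100.0 * counts["automated_test_cases"] / counts["exposed_functions"]
        print(f"{service}: {coverage:.0f}% API test coverage")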
  2. Evaluate test types to determine which are best suited for different categories of applications, based on the attributes of those in the prioritized inventory.

Considerations:

  • Development and security team sizes
  • Platform and operating systems
  • Maturity of build automation
  • Language support

Execution:

  1. Avoid approaches that cause unreasonable delays to builds and deployments or require significant process modification or resource commitment from development teams.
  2. Seek to adopt automation at multiple SDLC integration points.

Example integration points:

  • A plugin in the developer's integrated development environment (IDE).
  • Abuse case unit and integration tests, created and maintained by developers, executed during development and build cycles (see the sketch after this list).
  • Static application security testing (SAST) or software composition analysis (SCA) scans executed during automated builds.
  • Dynamic application security testing (DAST) scans executed during automated deployment.

  3. Automate patching when possible.
  4. Use metrics to drive feedback loops and continuous improvement.
  5. Maintain an accurate count of license utilization.
  6. Tune automation to reduce false positives and false negatives.
  7. Analyze gaps in support and trends for response and planning.
  8. Continue to leverage manual testing for scenarios not easily tested with automation.
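As a sketch of the abuse case tests named among the integration points above (Python with pytest; the authenticate function is a hypothetical stand-in for the application's real authentication code):

    import pytest

    def authenticate(username, password):
        """Hypothetical stand-in for the application's authentication function."""
        valid_users = {"alice": "correct horse battery staple"}
        if not username or not password:
            return False
        return valid_users.get(username) == password

    # Abuse cases an attacker might attempt; the test "passes" only when every
    # abusive input is rejected, per the example goal in the strategy above.
    @pytest.mark.parametrize("username,password", [
        ("alice", ""),                    # empty password
        ("alice", "' OR '1'='1"),         # SQL-injection-style payload
        ("", "anything"),                 # missing username
        ("alice\x00admin", "anything"),   # null-byte smuggling attempt
    ])
    def test_authentication_rejects_abuse_cases(username, password):
        assert authenticate(username, password) is False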

Auditing Guidance

  1. Examine policy and procedures for definition of testing strategies, automation of security testing, and change management.
  2. Determine security assurance and acceptance criteria for the new information system(s).
  3. Determine if the software release process is automated where applicable.

[csf.tools Note: For more information on the Cloud Controls Matrix, visit the CSA Cloud Controls Matrix Homepage.]

The Cloud Controls Matrix is Copyright 2023, Cloud Security Alliance.