This is the first in a blog series that will focus on all phases of security delivery: design, dev/build, test, deploy, operate and learn.
Many organizations are facing new cybersecurity concerns as they look to accelerate their digital transformations for themselves and their customers. At the same time, information technology (IT) and operations are being challenged to unite and innovate to meet their enterprise’s digital objectives.
Often, the company’s desire to accelerate and the security team’s need to protect the organization cause friction that slows progress. The way many enterprises deliver cybersecurity — in the form of processes, decision-making and experiences — is not keeping pace with their own transformation.
The CISO Dilemma
Chief information security officers (CISOs) have a responsibility to protect their enterprise. Typically, they do so by establishing battle-tested policies, standards and services that become governance models. The result is an operating model that attempts to balance speed to market against risk and security. Cybersecurity teams want to become enablers of innovation instead of being seen as an obstacle or barrier. However, to facilitate transformation, they must transform themselves in addition to helping others adopt a security-aware culture.
Secure Software Delivery in the Cloud
The legacy apps of yesteryear look different than their modern cloud-native counterparts. Bare metal or virtual machines managed by a hypervisor, organized into three-tier applications (web server, web application and database), have been the norm for decades. We still see this pattern with clients today, but that stronghold is giving way to cloud-native applications.
Cloud-native apps are the result of the steady march of commoditization and open-source racing up the solution stack — operating systems (Linux), cloud computing, software-defined networks and, most recently, containers. Even interface approaches have become commoditized (think application programming interfaces). The very definition of the word “application” is evolving. The monolith of the past is now broken down into a collection of microservices that are containerized and managed by Kubernetes (k8s).
With fundamental changes in the architecture and the current structuring of applications, security concerns are also evolving:
- Deterrent, preventative, detective and corrective controls are applied differently in the cloud-native model.
- Attack surfaces are different (images, namespaces and service accounts don’t exist in the heritage compute models of the past).
- Attack surfaces are larger in number (microservices inherently have more inter-service communication than their monolithic counterparts).
- Attack surfaces are ephemeral (scale-to-zero and dynamic-scaling capabilities can expose vulnerabilities that disappear before periodic scans and audits detect them).
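To make the ephemeral point concrete, consider a standard Kubernetes HorizontalPodAutoscaler: pod instances it creates under load may live for only minutes, appearing and vanishing between one periodic scan and the next. This is an illustrative sketch only — the workload name and thresholds are hypothetical, not from any specific deployment:

```yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: payments-api-hpa        # illustrative name
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: payments-api          # illustrative workload
  minReplicas: 1
  maxReplicas: 10               # pods created at peak may exist for minutes only
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```

Any instance spun up and torn down between audits is an attack surface a point-in-time scan never saw.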
Extend DevOps to Include Security
Before the prevalence of DevOps, security teams were typically engaged for reviews late in a project, offering guidance just before applications were pushed into production. Because security was considered only at the end of the development process, any issues identified at that stage caused delays and substantially higher remediation costs.
The original intent of DevOps was to bring together two disparate delivery stakeholders early in the lifecycle: development and operations. Why stop there?
Including IT security early as a third key stakeholder can not only prevent production delays but also change the perception that the security team is “the team of no.” It can transform the relationship with security into a powerful ally.
Clearly, when it comes to security, inclusiveness has a positive impact. This has given rise to terms like DevSecOps, which puts the focus on addressing security throughout the delivery process. With this approach, security is not just a trend; it is a bedrock of responsible enterprise delivery.
Security as the Fabric of Software Delivery
A Framework for Secure DevOps
The Modern DevOps Manifesto by Andrea C. Crawford and the IBM Garage lists some key concepts to apply to end-to-end secure software delivery. The first theme, “everything is code,” opens opportunities to ingrain security and secure practices deeper into the enterprise — images, k8s cluster configurations, application configurations and even pipelines themselves.
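As a hedged sketch of what “everything is code” looks like in practice, a Kubernetes Deployment manifest can carry security policy alongside application configuration, versioned and reviewed like any other source file. The service name and image reference below are hypothetical examples, not drawn from the article:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: payments-api            # example service name
spec:
  replicas: 2
  selector:
    matchLabels:
      app: payments-api
  template:
    metadata:
      labels:
        app: payments-api
    spec:
      containers:
        - name: payments-api
          # a pinned, "trusted" image from an internal registry (illustrative)
          image: registry.example.com/payments-api:1.4.2
          securityContext:
            runAsNonRoot: true              # security policy expressed as code
            readOnlyRootFilesystem: true
```

Because the policy lives in the manifest, it can be linted, diffed and gated in a pipeline just like application code.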
Coded assets will need to be established as “trusted” resources. Good candidates are those constructs that are shared across the enterprise. An example is treating container images, application templates, role-based access control policies and cluster configurations as “trusted” resources that warrant their own governance and pipeline, as well as clearly defined personas that manage those resources. Larger enterprises might consider badging and internal certifications for new roles like image engineers, cluster engineers, pipeline engineers and so on.
Organizations can amp up the separation of duties with zero trust and the principle of least privilege. As new roles and personas are defined, they should receive just enough access to trusted resources to get their jobs done, thereby mitigating risk and limiting exposure. Depending on the industry and regional compliance environment, this could mean different access and responsibilities for one role. Highly regulated enterprises might not allow a developer of an application to also be an image engineer due to the risk of having too much influence on an app stack.
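One way such least-privilege access can be expressed is through standard Kubernetes RBAC. The following is a minimal, hypothetical sketch for an “image engineer” persona — the namespace, role name and identity are all illustrative — granting access to a narrow set of resources in one namespace and nothing else:

```yaml
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  namespace: build-images       # illustrative namespace
  name: image-engineer
rules:
  - apiGroups: [""]
    resources: ["configmaps"]   # only image-build configuration, not secrets or workloads
    verbs: ["get", "list", "update"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  namespace: build-images
  name: image-engineer-binding
subjects:
  - kind: User
    name: jane@example.com      # illustrative identity
    apiGroup: rbac.authorization.k8s.io
roleRef:
  kind: Role
  name: image-engineer
  apiGroup: rbac.authorization.k8s.io
```

Because the binding is namespaced and the verbs are enumerated, the persona cannot touch workloads, secrets or cluster-wide configuration — the “just enough access” described above.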
The next entry in this series will explore how security should be infused early into the application design and delivery process.
This post appeared first on Security Intelligence
Author: John Wheeler