How to Secure Engineer Access to Cloud Workloads with Zero Trust
DevOps Access Challenges in the Era of Cloud and Remote
Now that remote work and cloud-based developer infrastructure have become the new normal, securing engineer access to cloud workloads is more challenging than ever. With DevOps and engineering environments scaling across large numbers of servers, cloud providers, and hybrid architectures, access security is a top priority for companies harnessing the power of the cloud.
When it comes to securing engineering access to cloud environments such as AWS, GCP and Azure, most enterprises mitigate access risk by using a combination of solutions, each of which has both benefits and limitations.
- VPNs: Although remote-access VPNs reduce the risk of MITM attacks, they cannot guarantee secure access to sensitive environments comprising hundreds of servers. VPN gateways are a poor fit for least privilege: once connected, a user typically has full layer-3 connectivity to every resource in the network.
- Static segmentation: Network compartmentalization may be implemented through VLANs or internal security groups. But classification schemes for segmentation are complex and take considerable effort to configure and maintain. Static segmentation also doesn’t provide truly granular controls, and commonly gives users access to broader segments than they actually need.
- IP whitelisting: If implemented properly, whitelisting IP addresses in the security group or firewall can be an effective strategy in cloud-based environments. But whitelist management creates enormous overhead: dynamic IPs and changing locations require constant updates, and if whitelists are not strictly maintained, the company’s most sensitive resources soon become overly exposed (a sketch of this kind of rule churn appears after this list).
- Key management: Per-user keys make it possible to control and attribute user activity on a server. But key management requires dedicated tools to set up, manage and control accounts. Despite all the effort, keys and passwords can still be inadvertently shared or exposed.
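To make the whitelist-maintenance burden concrete, here is a minimal sketch of the rule churn it creates, assuming an AWS environment managed with boto3; the security group ID and IP addresses are placeholders, and real tooling would add error handling and an inventory of which engineer owns each rule.

```python
# A minimal sketch, assuming boto3 and a hypothetical security group ID,
# of the per-engineer whitelist churn described above.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

SECURITY_GROUP_ID = "sg-0123456789abcdef0"  # hypothetical security group


def rotate_engineer_ip(old_ip: str, new_ip: str, port: int = 22) -> None:
    """Replace an engineer's whitelisted /32 address after their IP changes."""
    if old_ip:
        # Revoke the stale rule so the old address no longer has SSH access.
        ec2.revoke_security_group_ingress(
            GroupId=SECURITY_GROUP_ID,
            IpPermissions=[{
                "IpProtocol": "tcp",
                "FromPort": port,
                "ToPort": port,
                "IpRanges": [{"CidrIp": f"{old_ip}/32"}],
            }],
        )
    # Authorize the new address; every home or coffee-shop IP change repeats this cycle.
    ec2.authorize_security_group_ingress(
        GroupId=SECURITY_GROUP_ID,
        IpPermissions=[{
            "IpProtocol": "tcp",
            "FromPort": port,
            "ToPort": port,
            "IpRanges": [{"CidrIp": f"{new_ip}/32", "Description": "engineer laptop"}],
        }],
    )


if __name__ == "__main__":
    rotate_engineer_ip("198.51.100.10", "203.0.113.25")
```

Multiply this by every engineer, every protocol, and every environment, and it becomes clear why unmaintained whitelists drift toward over-exposure.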
Zero Trust – Trust Nothing, Verify Everything
These established solutions can no longer be relied upon for secure DevOps access. With companies adopting newer infrastructure, methodologies, and tools, security measures and access standards need to change accordingly.
A new approach to security has emerged: Zero Trust Network Access (ZTNA), which centers on preventing data breaches by not trusting anyone inside or outside an organization’s network architecture. ZTNA denies network connectivity to all users, machines and applications until they are explicitly verified.
This can be achieved through:
- Enforcing least privilege access that restricts access to the application layer only (layer 7 of the OSI model) and granting access only after strong, contextual user authentication has been performed
- Preventing visibility of internal resources through “datacenter blackening”
- Using an end-to-end encrypted tunnel with mutual TLS (a minimal sketch of a mutually authenticated connection appears after this list)
- Monitoring every user session from access attempt to in-session activity
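The mutual TLS requirement is worth seeing in miniature. The sketch below uses only Python’s standard ssl module to show a server that refuses clients without a valid certificate and a client that presents its own; the certificate file names are placeholders, and a ZTNA service would layer certificate issuance, rotation, and access policy on top of this handshake.

```python
# A minimal mutual-TLS sketch using Python's standard ssl module.
# "ca.pem", "server.pem" and "client.pem" are placeholder files for illustration.
import socket
import ssl

CA_CERT = "ca.pem"          # CA that signed both peer certificates (assumed)
SERVER_CERT = "server.pem"  # server certificate + private key (assumed)
CLIENT_CERT = "client.pem"  # client certificate + private key (assumed)


def serve_once(host: str = "127.0.0.1", port: int = 8443) -> None:
    """Accept one connection, rejecting clients that present no valid certificate."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain(SERVER_CERT)
    ctx.load_verify_locations(CA_CERT)
    ctx.verify_mode = ssl.CERT_REQUIRED  # this is what makes the TLS "mutual"
    with socket.create_server((host, port)) as sock:
        with ctx.wrap_socket(sock, server_side=True) as tls:
            conn, _ = tls.accept()  # handshake fails here if the client has no cert
            print("client cert subject:", conn.getpeercert()["subject"])
            conn.close()


def connect(host: str = "127.0.0.1", port: int = 8443) -> None:
    """Connect and prove the client's identity with its own certificate."""
    # The server certificate must list the host name/IP in its SAN for this check to pass.
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile=CA_CERT)
    ctx.load_cert_chain(CLIENT_CERT)
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            print("negotiated:", tls.version())

# Run serve_once() in one process and connect() in another to exercise the handshake.
```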
The ZTNA approach solves many remote access challenges by ensuring secure, agile, and seamless connectivity. However, technology-oriented companies with large engineering, support, and IT teams may still struggle to support this way of working at scale across dynamic servers, while keeping administration overhead reasonable and the user experience seamless.
Securing DevOps Access to Dynamic Cloud Workloads
To enable zero trust access for DevOps and engineers, and fully leverage the agility and flexibility of cloud-based development environments, look for solutions that offer:
- Built-in Privileged Access Management (PAM): Embedded PAM allows users to connect directly and securely to the end server without a machine key/password. Once a user is authenticated to the system with the organization’s identity provider (IDP), the solution extends the user’s identity to accessed resources, creating a kind of single sign-on to production.
- Automated server onboarding: Integrated cloud-resource discovery is crucial for organizations with large numbers of servers that are auto-scaled and require dynamic policy enforcement. To this end, seek a service that continuously syncs with your organization’s cloud resources and fetches them according to their region or tag, so that policies can be dynamically enforced as resources are created in the cloud or data center (a discovery sketch appears after this list).
- Tag-based management: Tags simplify access management to cloud workloads by enabling administrators to categorize and label them in a way that is intuitive to them. Policies defined for an individual tag or group of tags are automatically enforced on any number of resources and dynamic services as they’re spun up and down. Tags also facilitate access for the end user by grouping applications together.
- Instant cloud deployment: A fully cloud-based platform lets you deploy zero trust access to cloud workloads in no time. Moreover, some solutions require no agents or clients on the user device, slashing deployment time and ongoing management effort even further.
- Monitoring, alerting and full audit trail: In line with the zero trust framework, leave no security blind spots: monitor sessions for unusual behavior at the access, command and query level. If needed, consider solutions that also provide session video recordings. Simplify reporting by integrating logs from your IDP where possible, so you gain a single SaaS interface for all user activity, or export logs to other monitoring tools such as a SIEM (a log-forwarding sketch appears after this list).
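As an illustration of the automated onboarding and tag-based management bullets above, the following sketch (assuming boto3 and an illustrative env=production tag) shows how a service might discover running instances by region and tag and map them onto an access policy; the policy structure here is invented for the example, not any product’s API.

```python
# A minimal sketch, assuming boto3 and an illustrative "env: production" tag,
# of tag-based discovery feeding a hypothetical access policy.
import boto3


def discover_by_tag(region: str, tag_key: str, tag_value: str) -> list[dict]:
    """Return running instances in a region that carry the given tag."""
    ec2 = boto3.client("ec2", region_name=region)
    instances = []
    paginator = ec2.get_paginator("describe_instances")
    for page in paginator.paginate(
        Filters=[
            {"Name": f"tag:{tag_key}", "Values": [tag_value]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    ):
        for reservation in page["Reservations"]:
            for inst in reservation["Instances"]:
                instances.append(
                    {"id": inst["InstanceId"], "private_ip": inst.get("PrivateIpAddress")}
                )
    return instances


if __name__ == "__main__":
    # Hypothetical policy: anything tagged env=production is reachable only by
    # the "sre" group over SSH; newly spun-up instances inherit it automatically
    # on the next discovery cycle, with no per-server configuration.
    policy = {"tag": ("env", "production"), "allowed_group": "sre", "protocol": "ssh"}
    for server in discover_by_tag("eu-west-1", *policy["tag"]):
        print(f"enforcing {policy['allowed_group']}/{policy['protocol']} on {server['id']}")
```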
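And as a rough illustration of log export, the sketch below forwards a single access-audit event to a SIEM’s HTTP collector using only the standard library; the endpoint, token, and event fields are placeholders, since a real integration would follow the specific SIEM’s ingestion API.

```python
# A minimal sketch of shipping an access-audit event to a SIEM collector.
# The endpoint, token and event schema are hypothetical placeholders.
import json
import urllib.request
from datetime import datetime, timezone

SIEM_ENDPOINT = "https://siem.example.com/ingest"   # hypothetical collector URL
SIEM_TOKEN = "replace-with-real-token"              # hypothetical auth token


def ship_audit_event(user: str, resource: str, action: str, allowed: bool) -> int:
    """POST one JSON audit record (who accessed what, and whether it was allowed)."""
    event = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "resource": resource,
        "action": action,
        "allowed": allowed,
    }
    req = urllib.request.Request(
        SIEM_ENDPOINT,
        data=json.dumps(event).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {SIEM_TOKEN}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status


if __name__ == "__main__":
    ship_audit_event("alice@example.com", "prod-db-eu-1", "ssh", allowed=True)
```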
Redefining Secure DevOps Access
To shift your DevOps and engineering access to a Zero Trust model, get started by exploring Harmony Connect Remote Access, Check Point’s ZTNA-as-a-service for on-prem and cloud environments.
Harmony Connect Remote Access takes only five minutes to deploy and secures access to any internal corporate application residing in the data center, IaaS, public or private clouds. With intuitive clientless access to Web, RDP, SSH and SQL-based resources, the service is both user- and management-friendly, while offering DevOps a wealth of cloud-native capabilities such as privileged access management (PAM) and automated server onboarding. The service protects against DDoS attacks by hiding resources behind a secure cloud, while preventing application-targeted threats.
Check out the following resources to learn more: