For most companies undertaking an AWS cloud adoption, the main concern is public cloud security. They often ask, “How can we be sure that our data is properly protected?”
Sometimes, SoftServe experts join a client in the middle of their cloud journey, after the security foundations have already been built, only to discover that the AWS security design principles haven’t been followed or the AWS Shared Responsibility Model has been misinterpreted.
According to the Shared Responsibility Model, AWS is responsible for security of the cloud, while the customer is responsible for security in the cloud. Customers often misunderstand this “security IN the cloud” component of the model and the areas it comprises.
Here’s an example. A client that migrated their file-hosting solution to the AWS cloud used S3 buckets as file storage. After migration, they used the AWS Well-Architected Tool to conduct a self-assessment and found their S3 buckets were not encrypted at rest. They then asked if SoftServe could correct that situation.
It looks like a straightforward fix, doesn’t it? First, sign in to the AWS Management Console and open the Amazon S3 console. Navigate to the Buckets list, go to the desired bucket, and open its Properties tab. Then go to Default encryption and select the “Enable” radio button instead of “Disable.” Repeat these steps for all of the remaining S3 buckets.
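Those console steps map to a single API call, so the same change can also be scripted. Here is a minimal boto3 sketch, assuming SSE-S3 (AES-256) is sufficient; the bucket name is a placeholder:

```python
import boto3

s3 = boto3.client("s3")

# Enable SSE-S3 (AES-256) default encryption on one bucket: the
# programmatic equivalent of switching the radio button to "Enable".
s3.put_bucket_encryption(
    Bucket="example-bucket",  # placeholder bucket name
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
        ]
    },
)
```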
But why was encryption disabled on the client’s buckets in the first place? What if they have thousands of buckets? What if someone later disables the default server-side encryption behavior again? And what about new buckets? Will they be created with encryption enabled, too?
Instead of simply “enabling data encryption at rest,” the client needed additional security enhancements. Here’s how SoftServe proceeded.
Implement a strong identity foundation
First, we identified the client experts’ roles and their job responsibilities. That helped us determine which permissions they needed, so we could implement the principle of least privilege and enforce separation of duties.
We’ve found the ideal approach is to limit the operations team to read-only permissions for reviewing S3-related metrics and logs, and to responding to the issues they observe. If they need to create, reconfigure, or delete S3 buckets, we recommend an infrastructure-as-code (IaC) approach with service roles: either an AWS CloudFormation service role or an IAM role for an EC2 instance acting as a CI/CD deployment agent that applies Terraform code.
This approach prevents human error and provides a single source of truth for the S3 configuration: the IaC repository.
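As a sketch of what the operations team’s read-only permissions might look like (the policy name and exact action list are illustrative assumptions, not the client’s actual policy):

```python
import json

import boto3

iam = boto3.client("iam")

# Read-only access for the operations team: they can review S3
# configuration, CloudWatch metrics, and logs, but change nothing.
read_only_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "S3ReadOnlyForOps",
            "Effect": "Allow",
            "Action": [
                "s3:Get*",
                "s3:List*",
                "cloudwatch:GetMetricData",
                "cloudwatch:ListMetrics",
                "logs:FilterLogEvents",
            ],
            "Resource": "*",
        }
    ],
}

iam.create_policy(
    PolicyName="ops-s3-read-only",  # hypothetical policy name
    PolicyDocument=json.dumps(read_only_policy),
)
```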
Enable traceability
Having established a proper IaC implementation, we could track changes to the S3 configuration by reviewing commits to the IaC repository or, where CI/CD tools were integrated, the logs of triggered IaC deployment jobs.
Additionally, we configured CloudTrail to capture API calls for Amazon S3, so that if someone obtained unintended IAM permissions, we could identify any configuration changes that had been applied to the S3 buckets.
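A minimal boto3 sketch of that CloudTrail configuration, assuming a trail already exists; the trail and bucket names are placeholders:

```python
import boto3

cloudtrail = boto3.client("cloudtrail")

# Record S3 management events and object-level (data) events on an
# existing trail, so every configuration change and object access can
# be traced back to a principal.
cloudtrail.put_event_selectors(
    TrailName="s3-audit-trail",  # placeholder trail name
    EventSelectors=[
        {
            "ReadWriteType": "All",
            "IncludeManagementEvents": True,
            "DataResources": [
                {
                    "Type": "AWS::S3::Object",
                    "Values": ["arn:aws:s3:::example-bucket/"],
                }
            ],
        }
    ],
)
```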
Apply security at all layers
Next, we confirmed the client’s application was deployed on EC2 instances that managed and provided access to S3 objects. Rather than routing traffic from EC2 to S3 over the public internet and introducing unnecessary security risks, it made sense to configure gateway VPC endpoints for S3 in the VPC where the EC2 application runs, and to control access from that VPC by using the aws:sourceVpce condition key in the S3 bucket policy. If the client uses S3 buckets as origins for Amazon CloudFront, access to those buckets should be restricted by using an origin access identity (OAI).
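A sketch of such a bucket policy applied with boto3. The bucket name and VPC endpoint ID are placeholders, and a broad Deny like this should be tested carefully, since it also blocks console and administrative access from outside the VPC:

```python
import json

import boto3

s3 = boto3.client("s3")

# Deny all access to the bucket unless the request arrives through
# the gateway VPC endpoint.
vpce_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowVPCEndpointOnly",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::example-bucket",
                "arn:aws:s3:::example-bucket/*",
            ],
            "Condition": {
                "StringNotEquals": {"aws:sourceVpce": "vpce-0123456789abcdef0"}
            },
        }
    ],
}

s3.put_bucket_policy(Bucket="example-bucket", Policy=json.dumps(vpce_policy))
```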
Automate security best practices
We had already introduced an IaC approach using version-controlled templates integrated into deployment pipelines. However, it also makes sense to add security tests to these pipelines: if those tests fail, the configuration deployment job is terminated. The test can be an AWS CloudFormation hook or a Terraform check using Open Policy Agent (OPA) that verifies S3 encryption is enabled in the IaC templates.
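As one simplified illustration of such a gate (a stand-in for the OPA or CloudFormation hook checks named above), a pipeline step could inspect the Terraform plan JSON and fail when a bucket lacks a default-encryption resource. The plan file name is an assumption, and the count-based comparison is deliberately coarse:

```python
import json
import subprocess
import sys

# Render the Terraform plan as machine-readable JSON.
plan = json.loads(
    subprocess.check_output(["terraform", "show", "-json", "tfplan"])
)

# Resources in the root module of the planned state (child modules are
# omitted here to keep the sketch short).
resources = (
    plan.get("planned_values", {}).get("root_module", {}).get("resources", [])
)

bucket_count = sum(1 for r in resources if r["type"] == "aws_s3_bucket")
sse_count = sum(
    1
    for r in resources
    if r["type"] == "aws_s3_bucket_server_side_encryption_configuration"
)

# Terminate the deployment job if any bucket has no matching
# default-encryption resource in the plan.
if bucket_count > sse_count:
    sys.exit("FAIL: some S3 buckets have no default-encryption configuration")
print("OK: every S3 bucket has a default-encryption configuration")
```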
Protect data in transit and at rest
“Protection at rest” is why we began this discussion. But to achieve “protection in transit” for the client, we needed to configure the S3 buckets to accept HTTPS requests only, using the aws:SecureTransport condition key in the S3 bucket policy.
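A minimal sketch of that policy statement, with a placeholder bucket name. In practice it would be merged into the same bucket policy as the aws:sourceVpce statement above, since put_bucket_policy replaces the entire policy document:

```python
import json

import boto3

s3 = boto3.client("s3")

# Deny any request that does not use TLS.
https_only_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyInsecureTransport",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::example-bucket",
                "arn:aws:s3:::example-bucket/*",
            ],
            "Condition": {"Bool": {"aws:SecureTransport": "false"}},
        }
    ],
}

s3.put_bucket_policy(Bucket="example-bucket", Policy=json.dumps(https_only_policy))
```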
Keep people away from data
We’ve already discussed how important it is to configure IAM permissions properly for the operations team and to limit their access to S3 bucket content by using a mix of IAM and S3 bucket policies. The IaC deployment role can create, update, and delete S3 buckets, but it cannot access the data stored in them.
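A sketch of that separation for the deployment role; the policy name and exact action lists are illustrative assumptions:

```python
import json

import boto3

iam = boto3.client("iam")

# The deployment role may manage bucket configuration, but is
# explicitly denied object-level (data) access.
deploy_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ManageBucketsOnly",
            "Effect": "Allow",
            "Action": [
                "s3:CreateBucket",
                "s3:DeleteBucket",
                "s3:PutBucketPolicy",
                "s3:PutEncryptionConfiguration",
                "s3:PutBucketTagging",
            ],
            "Resource": "*",
        },
        {
            "Sid": "DenyDataAccess",
            "Effect": "Deny",
            "Action": ["s3:GetObject*", "s3:PutObject*", "s3:DeleteObject*"],
            "Resource": "*",
        },
    ],
}

iam.create_policy(
    PolicyName="iac-deploy-s3-no-data",  # hypothetical policy name
    PolicyDocument=json.dumps(deploy_policy),
)
```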
Be prepared for security events
Even with the other design principles implemented, we must always be ready for security events.
We had already configured a CloudTrail trail and a log S3 bucket to store its logs. At this point, we added a Lambda function with log-parsing logic, triggered whenever a new CloudTrail log file was copied to the log bucket. This Lambda filtered S3-related configuration changes and sent a notification via SNS. By testing different scenarios in non-production environments, we were able to prepare playbooks that the client’s operations team will use during actual security events involving their S3 buckets in the production environment.
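A minimal sketch of such a handler, assuming an S3 event trigger on the log bucket; the watched event names and the SNS topic ARN are illustrative:

```python
import gzip
import json
import urllib.parse

import boto3

s3 = boto3.client("s3")
sns = boto3.client("sns")

# S3 management events worth alerting on; extend the set as needed.
WATCHED_EVENTS = {
    "PutBucketEncryption",
    "DeleteBucketEncryption",
    "PutBucketPolicy",
    "DeleteBucketPolicy",
    "PutBucketAcl",
}
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:s3-config-alerts"  # placeholder


def handler(event, context):
    """Invoked by S3 when CloudTrail delivers a new gzipped log file."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        log = json.loads(gzip.decompress(body))
        for entry in log.get("Records", []):
            if (
                entry.get("eventSource") == "s3.amazonaws.com"
                and entry.get("eventName") in WATCHED_EVENTS
            ):
                sns.publish(
                    TopicArn=TOPIC_ARN,
                    Subject="S3 configuration change detected",
                    Message=json.dumps(entry, indent=2),
                )
```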
Another option is to use AWS Config and its managed rules, such as s3-bucket-server-side-encryption-enabled, to check whether the client’s S3 buckets either have S3 default encryption enabled or have bucket policies that explicitly deny put-object requests without server-side encryption. We can then configure AWS Config to automatically remediate noncompliant S3 buckets by enabling default encryption.
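A boto3 sketch of that setup, assuming the AWS-owned AWS-EnableS3BucketEncryption SSM automation document is used for remediation; the role ARN is a placeholder:

```python
import boto3

config = boto3.client("config")

# Managed rule that flags buckets without default encryption.
config.put_config_rule(
    ConfigRule={
        "ConfigRuleName": "s3-bucket-server-side-encryption-enabled",
        "Source": {
            "Owner": "AWS",
            "SourceIdentifier": "S3_BUCKET_SERVER_SIDE_ENCRYPTION_ENABLED",
        },
    }
)

# Auto-remediate noncompliant buckets via an SSM automation document.
config.put_remediation_configurations(
    RemediationConfigurations=[
        {
            "ConfigRuleName": "s3-bucket-server-side-encryption-enabled",
            "TargetType": "SSM_DOCUMENT",
            "TargetId": "AWS-EnableS3BucketEncryption",
            "Automatic": True,
            "MaximumAutomaticAttempts": 3,
            "RetryAttemptSeconds": 60,
            "Parameters": {
                "AutomationAssumeRole": {
                    "StaticValue": {
                        "Values": [
                            "arn:aws:iam::123456789012:role/config-remediation"
                        ]
                    }
                },
                # Pass the noncompliant bucket's resource ID to the document.
                "BucketName": {"ResourceValue": {"Value": "RESOURCE_ID"}},
            },
        }
    ]
)
```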
Your guide on an unfamiliar journey
AWS simplifies cloud security by handling the “security of the cloud” portion of the AWS Shared Responsibility Model. At the same time, AWS provides the design principles, best-practice guidance, and tools for improving a customer’s security posture. Unfortunately, many companies that begin an AWS cloud adoption have difficulty planning and following the route of this unfamiliar journey.
SoftServe can be your trusted partner in sharing this journey and can help you improve your security posture. Let’s talk today about how we can help you design your architecture and create the security solution your business needs.