
AWS (Amazon Web Services) Tips and Cheatsheet

Updated: at 10:12 AM

Regions and Availability Zones

References

  1. reference
  2. services by region

Region Examples

  1. North America
    1. us-east-1, Northern Virginia
    2. us-west-2, Oregon
    3. us-west-1, Northern California
    4. us-east-2, Ohio

Commercial Regions

  1. AMER: North, Central, and South America
  2. APAC: Asia and Pacific
  3. EMEA: Europe, the Middle East, and Africa. AWS Inc, AWS EMEA, AWS South Africa, AWS Brazil, Korea

References

  1. country groupings
  2. ISO 3166 countries by region

Amazon DCV

How to use a GUI client with AWS EC2?

We could use the Amazon DCV client as mentioned here.

API Gateway

API Doc Generation

  1. Redoc: generate API doc for OpenAPI and Swagger, https://github.com/Redocly/redoc

Auth

Credential Configuration

How to authenticate to run AWS CLI commands?

We can get the AWS access key, secret access key, and session token and use them as environment variables in the current shell session. See this blog post on where to find and update the access keys. Alternatively, you can use a config file for frequently used credentials; see this doc for configuration.

# get access key, secret, and token from aws console
export AWS_ACCESS_KEY_ID=<>
export AWS_SECRET_ACCESS_KEY=<>
export AWS_SESSION_TOKEN=<>
# assume role, optional
aws sts assume-role --role-arn "arn:aws:iam::547*****681:role/Admin" --role-session-name <session_name>
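The `assume-role` call only prints temporary credentials; they still have to be exported before subsequent CLI calls can use them. A small sketch, assuming `jq` is installed (the credential values below are made-up samples; in practice capture the output of `aws sts assume-role` into `CREDS_JSON`):

```shell
# shape of the `aws sts assume-role` response (sample values shown);
# in practice: CREDS_JSON="$(aws sts assume-role --role-arn ... --role-session-name ...)"
CREDS_JSON='{"Credentials":{"AccessKeyId":"ASIAEXAMPLE","SecretAccessKey":"secretEXAMPLE","SessionToken":"tokenEXAMPLE","Expiration":"2024-01-01T00:00:00Z"}}'

# extract each field with jq and export for subsequent aws cli calls
export AWS_ACCESS_KEY_ID="$(printf '%s' "$CREDS_JSON" | jq -r '.Credentials.AccessKeyId')"
export AWS_SECRET_ACCESS_KEY="$(printf '%s' "$CREDS_JSON" | jq -r '.Credentials.SecretAccessKey')"
export AWS_SESSION_TOKEN="$(printf '%s' "$CREDS_JSON" | jq -r '.Credentials.SessionToken')"
```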

Refreshing Credentials

How to refresh credentials so a long-running program can continue to get access?

Sometimes a program may run for an extended time until something else finishes. The default expiration time for an IAM role session is not long enough, so we need to refresh the credentials.

  1. refresh credentials in python autorefresh_session
  2. RefreshableCredentials
  3. python3 script running indefinitely stackoverflow question
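Besides the Python refresh helpers above, the AWS CLI and SDKs can also refresh automatically through the `credential_process` setting in `~/.aws/config`: the CLI re-invokes the configured process whenever the credentials expire. The process must print a Version-1 JSON document. A minimal sketch, assuming `jq` (the function name and sample values are hypothetical):

```shell
# hypothetical helper: convert `aws sts assume-role` output into the
# Version-1 JSON shape that the `credential_process` setting expects
to_credential_process_json() {
  jq '{Version: 1,
       AccessKeyId: .Credentials.AccessKeyId,
       SecretAccessKey: .Credentials.SecretAccessKey,
       SessionToken: .Credentials.SessionToken,
       Expiration: .Credentials.Expiration}'
}

# sample assume-role output (illustrative values only)
SAMPLE='{"Credentials":{"AccessKeyId":"ASIAEXAMPLE","SecretAccessKey":"s","SessionToken":"t","Expiration":"2024-01-01T00:00:00Z"}}'
printf '%s' "$SAMPLE" | to_credential_process_json
```

A script built around this would go into the config file under a profile, e.g. `credential_process = /path/to/refresh-creds.sh`; since the emitted document carries an `Expiration`, the CLI knows when to call the script again.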

CloudFormation

How to validate a CloudFormation template?

aws cloudformation validate-template --template-body file://sampletemplate.json

# output
{
    "Description": "AWS CloudFormation Sample Template S3_Bucket: Sample template showing how to create a publicly accessible S3 bucket. **WARNING** This template creates an S3 bucket. You will be billed for the AWS resources used if you create a stack from this template.",
    "Parameters": [],
    "Capabilities": []
}

How to import an existing (perhaps manually created) resource into CloudFormation?

We can use the IaC generator or manually import the resource.
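For the manual route, the import happens through a change set of type IMPORT. A sketch with a hypothetical S3 bucket (stack, bucket, and file names are placeholders; each imported resource must be declared in the template with a DeletionPolicy):

```shell
# resources-to-import.json describes the live resource to adopt
# (bucket and logical id are hypothetical examples)
cat > resources-to-import.json <<'EOF'
[
  {
    "ResourceType": "AWS::S3::Bucket",
    "LogicalResourceId": "MyBucket",
    "ResourceIdentifier": { "BucketName": "my-existing-bucket" }
  }
]
EOF

# create and execute an IMPORT change set; template.json must declare
# MyBucket with DeletionPolicy: Retain
# aws cloudformation create-change-set \
#   --stack-name my-stack --change-set-name import-bucket \
#   --change-set-type IMPORT \
#   --resources-to-import file://resources-to-import.json \
#   --template-body file://template.json
# aws cloudformation execute-change-set \
#   --stack-name my-stack --change-set-name import-bucket
```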

Cloudwatch

Agent user guide

  1. user guide doc

How to monitor across multiple AWS accounts?

  1. aws doc user guide
  2. youtube video

Logs Insights search tips

# search for failures (status code >= 500)
fields @message, @timestamp, @logStream, requestId, status
| filter status >= 500
| display @message, @logStream, requestId, status

# search for request log failures and count by status
fields @timestamp, @message
| filter responseStatus >= 500
| stats count(*) by responseStatus

# parse the status out of the message body, then filter and display
fields @message, @timestamp
| parse @message "{ \"requestId\": \"*\", *, \"status\": \"*\", *}" as request_id, ignore1, status, ignore2
| filter abs(status) >= 500
| display @timestamp, request_id, status, @logStream
| sort by @timestamp desc

For example, we may find the AWS integration endpoint request ID for a 200 request in the log group API-Gateway-Execution-Logs/beta:

fields @timestamp, @message
| filter @message like /cd6e1d24-8b5c-4434-969d-00db430cd03b/
| display @logStream, @message
| sort @timestamp desc

(cd6e1d24-8b5c-4434-969d-00db430cd03b) AWS Integration Endpoint RequestId : 2a3f76ac-1744-4e6c-91e2-7c1951808d1b

Then search in the Lambda log group /aws/lambda/<lambda_name>-beta-us-west-2:

fields @timestamp, @message
| filter @message like /2a3f76ac-1744-4e6c-91e2-7c1951808d1b/
| sort @timestamp desc
| display @message

DynamoDB

Integration with Elasticsearch

  1. aws blog post
  2. design global secondary indexes (GSI) blog post

Data Science and Machine Learning on AWS

  1. the “Data Science on AWS” book; the same authors also have the “Generative AI on AWS” book.

EC2

How to find EC2 image information by AMI id?

See the “Query for the latest Amazon Linux AMI IDs using AWS Systems Manager Parameter Store” reference for more.

# find ec2 image info by AMI id
aws ec2 describe-images \
  --image-ids ami-0c7d8678e345b414c \
  --query "Images[*].Description[]" \
  --output text \
  --region us-east-1

# query list of the AMI name properties
aws ec2 describe-images --owners amazon --filters "Name=name,Values=amzn*" \
  --query 'sort_by(Images, &CreationDate)[].Name'

# query latest AMI with ssm
aws ssm get-parameters --names /aws/service/ami-amazon-linux-latest/amzn2-ami-hvm-x86_64-gp2 \
  --region us-east-1

# use latest in cloud formation
# Use public Systems Manager Parameter
Parameters:
  LatestAmiId:
    Type: 'AWS::SSM::Parameter::Value<AWS::EC2::Image::Id>'
    Default: '/aws/service/ami-amazon-linux-latest/amzn2-ami-hvm-x86_64-gp2'

Resources:
  Instance:
    Type: 'AWS::EC2::Instance'
    Properties:
      ImageId: !Ref LatestAmiId

ECS

Autoscaling

  1. cdk (php/Yii2 application) post
  2. autoscale developer guide
  3. ECS task placement post

How to connect (ssh) to the containers with AWS ECS/Fargate/EC2?

We could use the AWS CLI ecs execute-command, as mentioned in this blog post and this stackoverflow question.
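As a sketch of the execute-command call those posts describe (cluster, task, and container names below are placeholders; the task must have been launched with `--enable-execute-command`, and the SSM Session Manager plugin must be installed on the client machine):

```shell
# placeholders for an interactive shell into a running container
CLUSTER=my-cluster
TASK=0123456789abcdef0    # the running task's id
CONTAINER=my-container

# requires: task launched with --enable-execute-command, plus the
# SSM Session Manager plugin installed locally
# aws ecs execute-command \
#   --cluster "$CLUSTER" --task "$TASK" --container "$CONTAINER" \
#   --interactive --command "/bin/sh"
```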

Glue

  1. build data lake with AWS Glue and S3 blog post

OpenSearch

  1. developer guide doc
  2. quick start guide post

Redshift

  1. SQL COPY from data files (in S3, EMR cluster, or a remote host accessed with ssh) or DynamoDB table.
  2. COPY from parquet and orc file formats blog post
  3. redshift node type details doc

S3

# delete all objects under a prefix, recursively
aws s3 rm --profile <profile_name> s3://<path/prefix>/ --recursive

cross-account access

AWS re:Post article

  1. Create an S3 bucket in Account A.
  2. Create an IAM role or user in Account B.
  3. Give the IAM role in Account B permission to download (GetObject) and upload (PutObject) objects to and from a specific bucket. Use the following IAM policy to also grant the IAM role in Account B permissions to call PutObjectAcl, granting object permissions to the bucket owner:

{
   "Version": "2012-10-17",
   "Statement": [
      { "Effect": "Allow",
         "Action": [ "s3:GetObject", "s3:PutObject", "s3:PutObjectAcl" ],
         "Resource": "arn:aws:s3:::AccountABucketName/*"}
   ]
}

for read-only access

{
   "Version": "2012-10-17",
   "Statement": [
      { "Effect": "Allow",
         "Action": [ "s3:Get*", "s3:List*" ],
         "Resource": [ "arn:aws:s3:::AccountABucketName/folder/*", "arn:aws:s3:::AccountABucketName/folder" ]
      }
   ]
}

Note: Make sure to update the policy to include your user variables (such as account ID, bucket name, and ARN). Also, you can limit access to a specific bucket folder in Account A. To limit access to a specific bucket folder, define the folder name in the resource element, such as “arn:aws:s3:::AccountABucketName/FolderName/*”. For more information, see How can I use IAM policies to grant user-specific access to specific folders? You can also create an IAM identity-based policy using the AWS CLI command: example create-policy.

  4. Configure the bucket policy for Account A to grant permissions to the IAM role or user that you created in Account B. Use this bucket policy to grant a user the permissions to GetObject and PutObject for objects in a bucket owned by Account A:

{
   "Version": "2012-10-17",
   "Statement": [
      { "Effect": "Allow",
         "Principal": {"AWS": "arn:aws:iam::AccountB:user/AccountBUserName" },
         "Action": [ "s3:GetObject", "s3:PutObject", "s3:PutObjectAcl" ],
         "Resource": [ "arn:aws:s3:::AccountABucketName/*" ]
      }
   ]
}
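The policies above can be attached from the CLI as well; a sketch reusing the placeholder names from the policy examples (write each JSON document to a file first):

```shell
# write the Account A bucket policy shown above to a file
cat > bucket-policy.json <<'EOF'
{
   "Version": "2012-10-17",
   "Statement": [
      { "Effect": "Allow",
         "Principal": {"AWS": "arn:aws:iam::AccountB:user/AccountBUserName" },
         "Action": [ "s3:GetObject", "s3:PutObject", "s3:PutObjectAcl" ],
         "Resource": [ "arn:aws:s3:::AccountABucketName/*" ]
      }
   ]
}
EOF

# attach it to the bucket in Account A
# aws s3api put-bucket-policy --bucket AccountABucketName \
#   --policy file://bucket-policy.json

# attach the user/role policy in Account B (inline policy variant)
# aws iam put-user-policy --user-name AccountBUserName \
#   --policy-name cross-account-s3 --policy-document file://user-policy.json
```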

SQS

Redrive from Dead-Letter Queue to Source Queue

  1. blog post1
  2. blog post2
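Besides the console's redrive button, SQS has a managed message-move API for this. A sketch (the queue ARN below is a made-up example):

```shell
# DLQ to redrive from (hypothetical account id and queue name)
SOURCE_ARN=arn:aws:sqs:us-west-2:123456789012:my-queue-dlq
QUEUE_NAME="${SOURCE_ARN##*:}"   # queue-name portion of the ARN

# start moving messages from the DLQ back to the original source queue
# aws sqs start-message-move-task --source-arn "$SOURCE_ARN"

# check progress
# aws sqs list-message-move-tasks --source-arn "$SOURCE_ARN"
```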

VPC

Reference: VPC quota limits doc.

By default, each AWS account is allowed five (5) VPCs and five (5) Elastic IP addresses per Region; both quotas can be increased on request.

References

  1. How formal methods helped AWS to design amazing services post
