Regions and Availability Zones
Region Examples
- North America
  - us-east-1, Northern Virginia
  - us-west-2, Oregon
  - us-west-1, Northern California
  - us-east-2, Ohio
Commercial Regions
- AMER: North, Central, and South America
- APAC: Asia and the Pacific
- EMEA: Europe, the Middle East, and Africa
Regions are served by different AWS seller-of-record entities, e.g. AWS Inc., AWS EMEA, AWS South Africa, AWS Brazil, and AWS Korea.
Amazon DCV
How to use a GUI client with AWS EC2?
We could use the Amazon DCV client as mentioned here.
API Gateway
API Doc Generation
- Redoc: generates API reference docs from OpenAPI and Swagger specs, https://github.com/Redocly/redoc
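To render a spec locally, one option is the Redocly CLI; a minimal sketch, assuming a local openapi.yaml (the spec file and output name are placeholders):
# generate a static HTML reference page from an OpenAPI spec
npx @redocly/cli build-docs openapi.yaml --output api-docs.html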
Auth
Credential Configuration
How to perform auth to run aws cli commands?
We can get the AWS access key ID, secret access key, and session token and use them as environment variables in the current shell session. See this blog post on where to find and update the access keys. Alternatively, you can use a config file for frequently used credentials; see this doc for configuring it.
# get access key, secret, and token from aws console
export AWS_ACCESS_KEY_ID=<>
export AWS_SECRET_ACCESS_KEY=<>
export AWS_SESSION_TOKEN=<>
# assume role, optional
aws sts assume-role --role-arn "arn:aws:iam::547*****681:role/Admin" --role-session-name <session_name>
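To export the temporary credentials that assume-role returns, a small jq pipeline works; a sketch, assuming jq is installed (the role ARN and session name are placeholders):
# capture the credentials and export them for subsequent aws commands
creds=$(aws sts assume-role --role-arn <role_arn> \
  --role-session-name <session_name> --query Credentials --output json)
export AWS_ACCESS_KEY_ID=$(echo "$creds" | jq -r .AccessKeyId)
export AWS_SECRET_ACCESS_KEY=$(echo "$creds" | jq -r .SecretAccessKey)
export AWS_SESSION_TOKEN=$(echo "$creds" | jq -r .SessionToken)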
Refreshing Credentials
How to refresh credentials so a long-running program can continue to get access?
Sometimes a program may run for an extended time until something else finishes. The default expiration for assumed-role credentials is not long enough, so we need to refresh them; see the sketch after this list.
- refresh credentials in python autorefresh_session
- RefreshableCredentials
- python3 script running indefinitely stackoverflow question
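A rough CLI-side sketch of the same idea (not boto3's RefreshableCredentials): re-assume the role on a timer and rewrite a named profile that the long-running job reads; jq and all the <> names are assumptions.
# periodically re-assume the role and rewrite a named profile (requires jq)
while true; do
  creds=$(aws sts assume-role --role-arn <role_arn> \
    --role-session-name <session_name> --query Credentials --output json)
  aws configure set aws_access_key_id "$(echo "$creds" | jq -r .AccessKeyId)" --profile <profile_name>
  aws configure set aws_secret_access_key "$(echo "$creds" | jq -r .SecretAccessKey)" --profile <profile_name>
  aws configure set aws_session_token "$(echo "$creds" | jq -r .SessionToken)" --profile <profile_name>
  sleep 2700 # refresh every 45 minutes, before the default 1-hour expiry
done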
Cloudformation
How to validate a cloudformation template?
aws cloudformation validate-template --template-body file://sampletemplate.json
# output
{
"Description": "AWS CloudFormation Sample Template S3_Bucket: Sample template showing how to create a publicly accessible S3 bucket. **WARNING** This template creates an S3 bucket. You will be billed for the AWS resources used if you create a stack from this template.",
"Parameters": [],
"Capabilities": []
}
How to import an existing (perhaps manually created) resource into CloudFormation?
We can use the IaC generator, or manually import the resource with a change set of type IMPORT, as sketched below.
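A minimal sketch of the manual import path, assuming the template already declares the resource with DeletionPolicy: Retain; the stack name, logical id, and bucket name are placeholders:
# create and execute a change set of type IMPORT for an existing S3 bucket
aws cloudformation create-change-set \
  --stack-name <stack_name> \
  --change-set-name import-bucket \
  --change-set-type IMPORT \
  --resources-to-import '[{"ResourceType": "AWS::S3::Bucket", "LogicalResourceId": "MyBucket", "ResourceIdentifier": {"BucketName": "<bucket_name>"}}]' \
  --template-body file://template.yaml
aws cloudformation execute-change-set \
  --stack-name <stack_name> --change-set-name import-bucket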
Cloudwatch
Agent user guide
- user guide doc
How to monitor across multiple AWS accounts?
We can use CloudWatch cross-account observability: link the source accounts to a central monitoring account, then query logs and metrics across accounts from the monitoring account.
Logs Insights search tips
# search for failures (status code >= 500)
fields @message, @timestamp, @logStream, requestId, status
| filter status >= 500
| display @message, @logStream, requestId, status
# search for request log failures
fields @timestamp, @message
| filter responseStatus >= 500
| stats count(*) by responseStatus

# parse the status out of the message body and filter on it
fields @message, @timestamp
| parse @message "{ \"requestId\": \"*\", *, \"status\": \"*\", *}" as request_id, ignore1, status, ignore2
| filter abs(status) >= 500
| display @timestamp, request_id, status, @logStream
| sort @timestamp desc
For example, we may find the AWS integration endpoint request id for a 200 request in the log group API-Gateway-Execution-Logs/beta.
fields @timestamp, @message
| filter @message like /cd6e1d24-8b5c-4434-969d-00db430cd03b/
| display @logStream, @message
| sort @timestamp desc
# output
(cd6e1d24-8b5c-4434-969d-00db430cd03b) AWS Integration Endpoint RequestId : 2a3f76ac-1744-4e6c-91e2-7c1951808d1b
Then search in the Lambda log group /aws/lambda/<lambda_name>-beta-us-west-2:
fields @timestamp, @message
| filter @message like /2a3f76ac-1744-4e6c-91e2-7c1951808d1b/
| sort @timestamp desc
| display @message
DynamoDB
Integration with Elasticsearch
Data Science and Machine Learning on AWS
- the “Data Science on AWS” book; its authors also have a “Generative AI on AWS” book
EC2
How to find EC2 image information by AMI id?
For more, see the “Query for the latest Amazon Linux AMI IDs using AWS Systems Manager Parameter Store” reference.
# find ec2 image info by AMI id
aws ec2 describe-images \
--image-ids ami-0c7d8678e345b414c \
--query "Images[*].Description[]" \
--output text \
--region us-east-1
# query list of the AMI name properties
aws ec2 describe-images --owners amazon --filters "Name=name,Values=amzn*" \
--query 'sort_by(Images, &CreationDate)[].Name'
# query latest AMI with ssm
aws ssm get-parameters --names /aws/service/ami-amazon-linux-latest/amzn2-ami-hvm-x86_64-gp2 \
--region us-east-1
# use the latest AMI in a CloudFormation template via the public SSM parameter
Parameters:
  LatestAmiId:
    Type: 'AWS::SSM::Parameter::Value<AWS::EC2::Image::Id>'
    Default: '/aws/service/ami-amazon-linux-latest/amzn2-ami-hvm-x86_64-gp2'
Resources:
  Instance:
    Type: 'AWS::EC2::Instance'
    Properties:
      ImageId: !Ref LatestAmiId
ECS
Autoscaling
How to connect (ssh) to the containers with AWS ECS/Fargate/EC2?
We could use the aws cli ecs execute-command, as mentioned in this blog post and this stackoverflow question.
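A sketch, assuming the task was launched with execute-command enabled and the Session Manager plugin is installed locally; the cluster, task id, and container name are placeholders:
# open an interactive shell in a running container
aws ecs execute-command \
  --cluster <cluster_name> \
  --task <task_id> \
  --container <container_name> \
  --interactive \
  --command "/bin/sh"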
Glue
- build a data lake with AWS Glue and S3 blog post
OpenSearch
Redshift
- SQL COPY from data files (in S3, an EMR cluster, or a remote host accessed over ssh) or from a DynamoDB table; see the sketch after this list
- COPY from parquet and orc file formats blog post
- redshift node type details doc
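A sketch of running COPY from parquet files in S3 through the Redshift Data API; the cluster id, database, user, table, bucket, and IAM role are placeholders:
# run COPY via the Redshift Data API instead of a SQL client
aws redshift-data execute-statement \
  --cluster-identifier <cluster_id> \
  --database <database_name> \
  --db-user <db_user> \
  --sql "COPY <table_name> FROM 's3://<bucket>/data/' IAM_ROLE '<iam_role_arn>' FORMAT AS PARQUET;"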
S3
# delete files and folders under a prefix
aws s3 rm --profile <profile_name> s3://<path/prefix>/ --recursive
cross-account access
AWS repost article
1. Create an S3 bucket in Account A.
2. Create an IAM role or user in Account B.
3. Give the IAM role in Account B permission to download (GetObject) and upload (PutObject) objects to and from the specific bucket. Use the following IAM policy, which also grants the role in Account B permission to call PutObjectAcl so it can grant object permissions to the bucket owner:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:PutObjectAcl"],
      "Resource": "arn:aws:s3:::AccountABucketName/*"
    }
  ]
}
For read-only access:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:Get*", "s3:List*"],
      "Resource": [
        "arn:aws:s3:::AccountABucketName/folder/*",
        "arn:aws:s3:::AccountABucketName/folder"
      ]
    }
  ]
}
Note: Make sure to update the policy with your own values (such as the account ID, bucket name, and ARN). You can also limit access to a specific bucket folder in Account A by naming the folder in the resource element, such as “arn:aws:s3:::AccountABucketName/FolderName/*”. For more information, see How can I use IAM policies to grant user-specific access to specific folders? You can also create an IAM identity-based policy using the AWS CLI: example create-policy.
4. Configure the bucket policy for Account A to grant permissions to the IAM role or user created in Account B. Use this bucket policy to grant the user permissions to GetObject and PutObject for objects in a bucket owned by Account A:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {"AWS": "arn:aws:iam::AccountB:user/AccountBUserName"},
      "Action": ["s3:GetObject", "s3:PutObject", "s3:PutObjectAcl"],
      "Resource": ["arn:aws:s3:::AccountABucketName/*"]
    }
  ]
}
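With both policies in place, Account B can exercise the access; a sketch (the local file name is a placeholder), where --acl bucket-owner-full-control uses the s3:PutObjectAcl permission granted above:
# upload from Account B, granting the bucket owner full control
aws s3 cp <local_file> s3://AccountABucketName/ --acl bucket-owner-full-control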
SQS
Redrive from Dead-Letter Queue to Source Queue
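A sketch using the SQS message move task API (the DLQ ARN is a placeholder); omitting --destination-arn redrives each message back to its original source queue:
# start an asynchronous redrive from the DLQ
aws sqs start-message-move-task --source-arn <dlq_arn>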
VPC
Reference: VPC quota limits doc.
By default, each AWS account is allowed five (5) VPCs and five (5) Elastic IPs per region; both are adjustable quotas.
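To check the current value, a sketch using Service Quotas (L-F678F1CE is the quota code for “VPCs per Region”):
# look up the VPCs-per-region quota for the account
aws service-quotas get-service-quota --service-code vpc --quota-code L-F678F1CE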
References
- How formal methods helped AWS to design amazing services post