Amazon Web Services - Cloud provider
Install aws cli v2
Install client
$ curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
$ unzip awscliv2.zip
$ sudo ./aws/install
Configure with new access keys created in IAM, and the default region
aws configure
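`aws configure` just prompts for the keys and region and writes them to two files; roughly this (placeholder values):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = AKIAEXAMPLE
aws_secret_access_key = wJalrEXAMPLEKEY

# ~/.aws/config
[default]
region = ap-southeast-2
```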
2021 get ami images
Using aws cli
$ aws ec2 describe-images --region ap-southeast-2 --owners self amazon
or
$ aws ec2 describe-images --region ap-southeast-2 --filters "Name=name,Values=Windows_Server-2019-English-Full-Base*" --owners self amazon | grep "\"Name\""
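To get just the newest matching AMI ID instead of grepping, the same call can sort on CreationDate with --query (untested sketch; region and name pattern as above):

```shell
# Newest Windows Server 2019 base AMI in ap-southeast-2
aws ec2 describe-images --region ap-southeast-2 --owners amazon \
  --filters "Name=name,Values=Windows_Server-2019-English-Full-Base*" \
  --query 'sort_by(Images, &CreationDate)[-1].ImageId' --output text
```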
2018 CloudWatch syslog
Install agent
$ sudo dpkg -i -E amazon-cloudwatch-agent.deb
$ sudo systemctl enable amazon-cloudwatch-agent.service
$ sudo systemctl start amazon-cloudwatch-agent.service
$ sudo systemctl status amazon-cloudwatch-agent.service
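The agent needs a JSON config before it ships anything; one documented way is the bundled wizard plus fetch-config (paths are the package defaults, adjust if installed elsewhere):

```shell
# Generate a config interactively, then load it into the running agent
sudo /opt/aws/amazon-cloudwatch-agent/bin/amazon-cloudwatch-agent-config-wizard
sudo /opt/aws/amazon-cloudwatch-agent/bin/amazon-cloudwatch-agent-ctl \
  -a fetch-config -m ec2 \
  -c file:/opt/aws/amazon-cloudwatch-agent/bin/config.json -s
```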
syslog
$ curl https://s3.amazonaws.com/aws-cloudwatch/downloads/latest/awslogs-agent-setup.py -O
$ sudo python3 awslogs-agent-setup.py
aws cli cloudwatch log groups
$ aws logs describe-log-groups --profile nonprod
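--query can trim that output to just the group names (untested sketch; profile name from above):

```shell
# List only log group names for the nonprod profile
aws logs describe-log-groups --profile nonprod \
  --query 'logGroups[].logGroupName' --output text
```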
2018 aws tool, setup for s3 upload on Raspberry Pi 3
- $ sudo pip3 install -U aws
- If the install fails with a build error, install the headers it needs first:
- $ sudo apt install libffi-dev libssl-dev
2016
- On Linux, s3cmd can be used to back up to S3 storage in AWS.
- $ s3cmd --configure
  get keys from http://aws.amazon.com/ (User Name, Access Key Id, Secret Access Key)
- $ s3cmd mb s3://backupVigor
Bucket 's3://backupVigor/' created
- Use tar with the incremental option to back up files, pipe through xz to compress (and gpg to encrypt if needed), then pipe straight to s3cmd to write the backup file.
- 20160322 - This works great, and no need for a local copy of the file while creating the backup.
- Then set an AWS lifecycle policy on the S3 bucket to move older files to Glacier, and even delete very old files, e.g. after 700 days.
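The lifecycle policy can also be applied from the aws cli rather than the console; a minimal sketch (bucket and 700-day expiry from the note above; the rule ID and 90-day Glacier transition are assumptions):

```shell
aws s3api put-bucket-lifecycle-configuration --bucket backupVigor \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "archive-then-expire",
      "Status": "Enabled",
      "Filter": {"Prefix": ""},
      "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
      "Expiration": {"Days": 700}
    }]
  }'
```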
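The tar-to-s3cmd pipeline above can be sketched like this; here the stream lands in a local file so it can be tried without credentials, with the direct-to-S3 variant in a comment (bucket name from the note above, all paths are examples):

```shell
#!/bin/sh
set -e
SRC=/tmp/backup-demo/src                       # directory to back up (example)
SNAP=/tmp/backup-demo/snapshot.snar            # tar's incremental state file
OUT=/tmp/backup-demo/backup-$(date +%Y%m%d).tar.xz

mkdir -p "$SRC"
echo "hello" > "$SRC/file.txt"

# Incremental tar, compressed with xz, all streamed -- no uncompressed local
# copy is ever written. Insert `gpg --symmetric` after xz to encrypt if needed.
tar --create --listed-incremental="$SNAP" --file=- -C /tmp/backup-demo src \
  | xz -9 > "$OUT"

# To stream straight to S3 instead of a local file:
#   tar --create --listed-incremental="$SNAP" --file=- -C /tmp/backup-demo src \
#     | xz -9 | s3cmd put - s3://backupVigor/backup-$(date +%Y%m%d).tar.xz

xz -t "$OUT" && echo "archive OK"
```

On the next run the same --listed-incremental snapshot file makes tar emit only files changed since the previous backup.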
Linux commandline bash upload to aws s3
...