 * Links [[AWS/SSO]], [[AWS/CloudWatch/FlowLog]], [[https://www.apptio.com/blog/aws-ebs-performance-confused/|2021 apptio EBS performance]], [[AWS/LinuxNetwork]]

Amazon Web Services - Cloud provider

Install AWS CLI v2

  • Install client

    curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
    unzip awscliv2.zip
    sudo ./aws/install
  • Configure with access keys created in IAM, and a default region

    aws configure
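`aws configure` writes its answers into two small INI files under `~/.aws/`; a sketch of what they end up containing (all values below are placeholders, not real keys):

```ini
; ~/.aws/credentials  (placeholder example values)
[default]
aws_access_key_id = AKIAEXAMPLEKEYID
aws_secret_access_key = exampleSecretAccessKey

; ~/.aws/config
[default]
region = ap-southeast-2
output = json
```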

2021 get ami images

  • Using aws cli

    $ aws ec2 describe-images --region ap-southeast-2 --owners self amazon
    # or, filtered to Windows Server 2019 base images by name:
    $ aws ec2 describe-images --region ap-southeast-2 --filters "Name=name,Values=Windows_Server-2019-English-Full-Base*" --owners self amazon | grep "\"Name\""
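The grep above only lists image names; to pick the newest matching AMI you can sort on `CreationDate`. A local sketch using a made-up saved response (the AMI IDs and dates are invented; ISO 8601 timestamps sort lexically, so the maximum is the newest):

```shell
# /tmp/images.json stands in for a saved `aws ec2 describe-images` response
cat > /tmp/images.json <<'EOF'
{"Images": [
  {"ImageId": "ami-aaa", "Name": "Windows_Server-2019-English-Full-Base-2021.01.13", "CreationDate": "2021-01-13T00:00:00.000Z"},
  {"ImageId": "ami-bbb", "Name": "Windows_Server-2019-English-Full-Base-2021.03.10", "CreationDate": "2021-03-10T00:00:00.000Z"}
]}
EOF
# newest image = the one with the greatest CreationDate
newest=$(python3 -c "import json; imgs=json.load(open('/tmp/images.json'))['Images']; print(max(imgs, key=lambda i: i['CreationDate'])['ImageId'])")
echo "$newest"
```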

2018 CloudWatch syslog

  • Install agent

    $ sudo dpkg -iE amazon-cloudwatch-agent.deb
    $ sudo systemctl enable amazon-cloudwatch-agent.service
    $ sudo systemctl start amazon-cloudwatch-agent.service
    $ sudo systemctl status amazon-cloudwatch-agent.service
  • syslog

    $ curl https://s3.amazonaws.com/aws-cloudwatch/downloads/latest/awslogs-agent-setup.py -O
    $ sudo python3 awslogs-agent-setup.py

  • aws cli cloudwatch log groups

    $ aws logs describe-log-groups --profile nonprod
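For the newer amazon-cloudwatch-agent installed above, which log files get shipped is driven by a JSON config file; a minimal sketch for syslog (the log group and stream names here are assumed example values):

```json
{
  "logs": {
    "logs_collected": {
      "files": {
        "collect_list": [
          {
            "file_path": "/var/log/syslog",
            "log_group_name": "syslog",
            "log_stream_name": "{instance_id}"
          }
        ]
      }
    }
  }
}
```

Typically saved as /opt/aws/amazon-cloudwatch-agent/etc/amazon-cloudwatch-agent.json before starting the service.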

2018 aws tool, setup for S3 upload on Raspberry Pi 3

  • sudo pip3 install -U aws
    • If the install fails with missing-header errors, first install:
      • sudo apt install libffi-dev libssl-dev

2016

  • 2016-aws-live

  • AWS-Partner-infor

  • On Linux, s3cmd can be used to back up to S3 storage in AWS.
    • $ s3cmd --configure
      • get keys from http://aws.amazon.com/ (User Name, Access Key Id, Secret Access Key)

      • s3cmd mb s3://backupVigor
        •  Bucket 's3://backupVigor/' created 

    • Use tar with the incremental option to back up files, pipe through xz to compress (and gpg to encrypt if needed), then pipe straight to s3cmd into the backup file.
      • 20160322 - This works great, and there is no need for a local copy of the file while creating the backup.
  • Then set an AWS lifecycle policy on the S3 bucket to move older files to Glacier, and even delete very old files, e.g. after 700 days.
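A minimal sketch of the incremental tar pipeline described above (paths are illustrative; the actual s3cmd upload is left commented out so the sketch runs without AWS access):

```shell
# level-0 incremental backup: the .snar snapshot file records state for later increments
mkdir -p /tmp/demo && echo "hello" > /tmp/demo/file.txt
tar --listed-incremental=/tmp/demo.snar -cf - -C /tmp demo | xz -c > /tmp/backup-level0.tar.xz
# in practice, pipe straight to S3 instead of a local file (bucket name from the notes above):
# tar --listed-incremental=/tmp/demo.snar -cf - -C /tmp demo | xz | gpg -c | s3cmd put - s3://backupVigor/backup-$(date +%Y%m%d).tar.xz.gpg
```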

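The Glacier-then-delete policy mentioned above corresponds to an S3 lifecycle configuration; a sketch (the 90-day transition is an assumed value, the 700-day expiry comes from the notes):

```json
{
  "Rules": [
    {
      "ID": "archive-then-expire",
      "Status": "Enabled",
      "Filter": {"Prefix": ""},
      "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
      "Expiration": {"Days": 700}
    }
  ]
}
```

Applied with `aws s3api put-bucket-lifecycle-configuration --bucket backupVigor --lifecycle-configuration file://lifecycle.json`.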
Linux command-line bash upload to AWS S3

...


CategoryStorage CategoryDevelopement CategorySecurity

AWS (last edited 2022-05-05 00:26:05 by PieterSmit)