
Amazon Web Services - Cloud provider

== 2018 aws tool, setup for s3 upload on Raspberry Pi3 ==
 * sudo pip3 install -U aws
   * If the install fails with a build error, install the development headers first and retry:
   * sudo apt install libffi-dev libssl-dev

== 2016 ==
 * [[https://live.awsevents.com/?sc_channel=em&sc_campaign=chicagosummit2016&sc_publisher=aws&sc_medium=em_13869&sc_content=launch_t1launch_tier1&sc_country=global&sc_geo=global&sc_category=mult&sc_outcome=launch&trk=ema_13869&mkt_tok=eyJpIjoiTWpZMk1XTmtPVGRrTmpnMSIsInQiOiJVb3VIVURraVRGNTRKNEtWNzNjNTlJWmlPUmRwSDRyWFhzaG1PSHY1YXJcL0swRnpKd1BhUEFMdzNGMU53UTd4Mkd6WlJyM1htWGladlNBNHpZMk1sUTIxR3NzTlB4RDdOYnVvaDlZRHErXC9RPSJ9|2016-aws-live]]
 * [[https://aws.amazon.com/partners/success/infor/|AWS-Partner-infor]]
 * On Linux, s3cmd can be used to back up to S3 storage in AWS.
   * $ s3cmd --configure
     * Get the keys (User Name, Access Key Id, Secret Access Key) from http://aws.amazon.com/
   * s3cmd mb s3://backupVigor
     * Bucket 's3://backupVigor/' created
   * Use tar with the incremental option to back up files, pipe through xz to compress and gpg to encrypt if needed, then pipe straight into s3cmd to write the backup file.
     * 20160322 - This works great, and no local copy of the file is needed while creating the backup.
 * Then set an AWS lifecycle policy on the s3 bucket to move older files to Glacier, and even delete very old files, e.g. after 700 days.

== Linux commandline bash upload to aws s3 ==
 * https://geek.co.il/2014/05/26/script-day-upload-files-to-amazon-s3-using-bash
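The incremental tar pipeline described above can be sketched as follows. The source directory, snapshot path, output name, GPG recipient, and the `backupVigor` key name are illustrative; the commented-out final form assumes a configured `~/.s3cfg` and an s3cmd version that accepts `-` (stdin) as the upload source.

```shell
# Demo source tree and tar's incremental state file (illustrative paths).
SRC=/tmp/aws-backup-demo
SNAPSHOT=/tmp/aws-backup-demo.snar
mkdir -p "$SRC" && echo "hello" > "$SRC/file1"

# Level-0 dump; re-running with the same $SNAPSHOT produces incremental dumps.
tar --create --listed-incremental="$SNAPSHOT" -C / "${SRC#/}" \
  | xz -9 > /tmp/backup-level0.tar.xz

# With encryption and direct upload (no local copy), pipe instead:
#   tar --create --listed-incremental="$SNAPSHOT" -C / "${SRC#/}" \
#     | xz -9 \
#     | gpg --encrypt --recipient backup@example.com \
#     | s3cmd put - s3://backupVigor/backup-level0.tar.xz.gpg
```

Because the archive is streamed, nothing larger than the pipe buffers ever sits on local disk in the piped variant.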

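The lifecycle rule mentioned above (move older objects to Glacier, delete very old ones) could look like the JSON below, applied with `aws s3api put-bucket-lifecycle-configuration`. The rule ID and the 90-day transition age are illustrative; only the 700-day expiry comes from the note above.

```json
{
  "Rules": [
    {
      "ID": "archive-then-expire",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Transitions": [
        { "Days": 90, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 700 }
    }
  ]
}
```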

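The script linked in the last section builds a Signature Version 2 `PUT` request by hand with openssl and curl. A minimal sketch of the same idea is below; the key, secret, bucket, and file names are placeholders, and the request is printed rather than sent (drop the final `echo` to actually upload). Note that newer AWS regions require Signature Version 4, so this style only works against older endpoints.

```shell
#!/bin/sh
# Placeholders only - substitute real credentials and paths.
s3Key="AKIAEXAMPLE"
s3Secret="secretEXAMPLE"
bucket="backupVigor"
file="backup.tar.xz"
contentType="application/octet-stream"

dateValue=$(date -R)
resource="/${bucket}/${file}"
stringToSign="PUT\n\n${contentType}\n${dateValue}\n${resource}"

# HMAC-SHA1 over the canonical string, base64-encoded (Signature V2).
signature=$(printf '%b' "$stringToSign" \
  | openssl sha1 -hmac "$s3Secret" -binary | base64)

# Print the request instead of sending it.
echo curl -X PUT -T "$file" \
  -H "Date: $dateValue" \
  -H "Content-Type: $contentType" \
  -H "Authorization: AWS ${s3Key}:${signature}" \
  "https://${bucket}.s3.amazonaws.com/${file}"
```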
CategoryStorage CategoryDevelopement CategorySecurity

AWS (last edited 2022-05-05 00:26:05 by PieterSmit)