NAME

       s3cmd - tool for managing Amazon S3 storage space and Amazon CloudFront
       content delivery network

SYNOPSIS

       s3cmd [OPTIONS] COMMAND [PARAMETERS]

DESCRIPTION

       s3cmd is a command line client for  copying  files  to/from  Amazon  S3
       (Simple  Storage  Service)  and  performing  other  related  tasks, for
       instance creating and removing buckets, listing objects, etc.

       s3cmd can do several actions specified by the following commands.

       mb s3://BUCKET
              Make bucket

       rb s3://BUCKET
              Remove bucket

       ls [s3://BUCKET[/PREFIX]]
              List objects or buckets

       la     List all objects in all buckets

       put FILE [FILE...] s3://BUCKET[/PREFIX]
              Put file into bucket (i.e. upload to S3)

       get s3://BUCKET/OBJECT LOCAL_FILE
              Get file from bucket (i.e. download from S3)

       del s3://BUCKET/OBJECT
              Delete file from bucket

       sync LOCAL_DIR s3://BUCKET[/PREFIX]
              Backup a directory tree to S3

       sync s3://BUCKET[/PREFIX] LOCAL_DIR
              Restore a tree from S3 to local directory

       cp s3://BUCKET1/OBJECT1 s3://BUCKET2[/OBJECT2], mv s3://BUCKET1/OBJECT1
       s3://BUCKET2[/OBJECT2]
              Make a copy of a file (cp) or move a file (mv).  Destination can
              be in the same bucket with a different name or in another bucket
              with the same or different name.  Adding --acl-public will  make
              the destination object publicly accessible (see below).

       setacl s3://BUCKET[/OBJECT]
              Modify the Access Control List for a bucket or objects. Use
              with --acl-public or --acl-private

       info s3://BUCKET[/OBJECT]
              Get various information about a Bucket or Object

       du [s3://BUCKET[/PREFIX]]
              Disk usage - amount of data stored in S3
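
       A typical session using these commands might look like the following
       (the bucket name my-bucket and the file names are examples only):

            s3cmd mb s3://my-bucket
            s3cmd put backup.tar.gz s3://my-bucket/backups/
            s3cmd ls s3://my-bucket/backups/
            s3cmd get s3://my-bucket/backups/backup.tar.gz restored.tar.gz
            s3cmd del s3://my-bucket/backups/backup.tar.gz
            s3cmd rb s3://my-bucket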

       Commands for CloudFront management

       cflist List CloudFront distribution points

       cfinfo [cf://DIST_ID]
              Display CloudFront distribution point parameters

       cfcreate s3://BUCKET
              Create CloudFront distribution point

       cfdelete cf://DIST_ID
              Delete CloudFront distribution point

       cfmodify cf://DIST_ID
              Change CloudFront distribution point parameters
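
       For illustration, a CloudFront distribution for an existing bucket
       could be created and inspected like this (my-bucket and
       cdn.example.com are example names; replace DIST_ID with the
       identifier printed by cfcreate or cflist):

            s3cmd cfcreate s3://my-bucket
            s3cmd cflist
            s3cmd cfinfo cf://DIST_ID
            s3cmd cfmodify --cf-add-cname=cdn.example.com cf://DIST_ID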

OPTIONS

       Some of the options below can have their default values set in the
       s3cmd config file (by default $HOME/.s3cfg). As it’s a simple text
       file feel free to open it with your favorite text editor and make any
       changes you like.

       Config file related options.

       --configure
              Invoke  interactive  (re)configuration  tool.  Don’t  worry, you
              won’t lose your settings on subsequent runs.

       -c FILE, --config=FILE
              Config file name. Defaults to $HOME/.s3cfg

       --dump-config
              Dump  current  configuration  after  parsing  config  files  and
              command line options and exit.
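
       For example, credentials can be set up interactively and the
       resulting configuration verified like this (the alternative config
       file path is just an example):

            s3cmd --configure
            s3cmd --dump-config
            s3cmd -c /path/to/other.s3cfg ls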

       Options specific for file transfer commands (sync, put and get):

       -n, --dry-run
               Only show what would be uploaded or downloaded but don’t
               actually do it. May still perform S3 requests to get bucket
               listings and other information though.

       --delete-removed
              Delete  remote  objects  with  no  corresponding local file when
              syncing to S3 or delete local files with no corresponding object
              in S3 when syncing from S3.

       --no-delete-removed
              Don’t delete remote objects. Default for sync command.

       -p, --preserve
              Preserve  filesystem  attributes  (mode, ownership, timestamps).
              Default for sync command.

       --no-preserve
              Don’t store filesystem attributes with uploaded files.

       --exclude GLOB
              Exclude files matching GLOB (a.k.a. shell-style  wildcard)  from
              sync.  See  FILE TRANSFERS section and http://s3tools.org/s3cmd-
              sync for more information.

       --exclude-from FILE
              Same as --exclude but reads GLOBs from the given FILE instead of
              expecting them on the command line.

       --rexclude REGEXP
              Same  as --exclude but works with REGEXPs (Regular expressions).

       --rexclude-from FILE
              Same as --exclude-from but works with REGEXPs.

       --include=GLOB,         --include-from=FILE,         --rinclude=REGEXP,
       --rinclude-from=FILE
              Filenames and paths matching GLOB or  REGEXP  will  be  included
              even  if  previously  excluded  by  one  of  --(r)exclude(-from)
              patterns

       --continue
               Continue getting a partially downloaded file (only for the get
               command). This comes in handy when a download of a large file,
               say an ISO image, from an S3 bucket fails and a partially
               downloaded file is left on the disk. Unfortunately the put
               command doesn’t support restarting a failed upload due to
               Amazon S3 limitations.
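
               For example, to resume an interrupted download (bucket and
               file names are examples):

                    s3cmd get --continue s3://my-bucket/dvd.iso ./dvd.iso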

       --skip-existing
              Skip  over files that exist at the destination (only for get and
              sync commands).

       -m MIME/TYPE, --mime-type=MIME/TYPE
              Default MIME-type to be set for objects stored.

       -M, --guess-mime-type
               Guess MIME-type of files by their extension. Falls back to the
               default MIME-type as specified by the --mime-type option.

       --add-header=NAME:VALUE
               Add a given HTTP header to the upload request. Can be used
               multiple times with different header names. For instance set
               ’Expires’ or ’Cache-Control’ headers (or both) using this
               option if you like.
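
               For example, to store an object with a Cache-Control header
               (the bucket name and header value are examples):

                    s3cmd put --add-header=Cache-Control:max-age=3600 \
                          index.html s3://my-bucket/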

       -P, --acl-public
              Store objects with permissions allowing  read  for  anyone.  See
              http://s3tools.org/s3cmd-public   for   details  and  hints  for
              storing publicly accessible files.

       --acl-private
              Store objects with default ACL allowing access for you only.

       -e, --encrypt
              Use GPG encryption to protect stored objects  from  unauthorized
              access.  See  http://s3tools.org/s3cmd-public  for details about
              encryption.
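
               For example, an encrypted upload (assuming GPG was set up
               during s3cmd --configure; the names are examples):

                    s3cmd put --encrypt secrets.tar.gz s3://my-bucket/private/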

       --no-encrypt
              Don’t encrypt files.

       Options for CloudFront commands:

       See http://s3tools.org/s3cmd-cloudfront for more details.

       --enable
              Enable given CloudFront distribution (only for cfmodify command)

       --disable
               Disable given CloudFront distribution (only for cfmodify
               command)

       --cf-add-cname=CNAME
              Add given CNAME to a CloudFront distribution (only for  cfcreate
              and cfmodify commands)

       --cf-remove-cname=CNAME
              Remove  given  CNAME  from  a  CloudFront distribution (only for
              cfmodify command)

       --cf-comment=COMMENT
              Set COMMENT  for  a  given  CloudFront  distribution  (only  for
              cfcreate and cfmodify commands)

       Options common to all commands (where it makes sense):

       -r, --recursive
              Recursive upload, download or removal. When used with del it can
              remove all the files in a bucket.

       -f, --force
               Force overwrite and other dangerous operations. Can be used to
               remove a non-empty bucket with s3cmd rb --force s3://bkt

       --bucket-location=BUCKET_LOCATION
               Specify the datacentre in which to create the bucket. Possible
               values are US (default) or EU.

       -H, --human-readable-sizes
              Print sizes in human readable form.

       --list-md5
              Include MD5 sums in bucket listings (only for ls command).

       --progress, --no-progress
               Display or don’t display the progress meter. When running on a
               TTY (e.g. console or xterm) the default is to display the
               progress meter. If not on a TTY (e.g. output is redirected
               somewhere or running from cron) the default is not to display
               it.
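
               For example, a backup run from cron can suppress the progress
               meter explicitly (paths and bucket name are examples):

                    s3cmd sync --no-progress /var/backups/ s3://my-bucket/backups/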

       --encoding=ENCODING
              Override   autodetected   terminal   and   filesystem   encoding
              (character set).

       -v, --verbose
              Enable verbose output.

       -d, --debug
              Enable debug output.

       -h, --help
              Show the help message and exit

       --version
              Show s3cmd version and exit.

FILE TRANSFERS

       One  of  the  most  powerful  commands  of s3cmd is s3cmd sync used for
       synchronising complete directory trees to or from remote S3 storage. To
       some  extent  s3cmd  put  and  s3cmd get share a similar behaviour with
       sync.

       Basic usage common in backup scenarios is as simple as:
            s3cmd sync /local/path/ s3://test-bucket/backup/

       This command will find all files under the /local/path directory and
       copy them to corresponding paths under s3://test-bucket/backup on the
       remote side.  For example:
            /local/path/file1.ext         ->  s3://bucket/backup/file1.ext
            /local/path/dir123/file2.bin  ->  s3://bucket/backup/dir123/file2.bin

       However, if the local path doesn’t end with a slash, the last
       directory’s name is used on the remote side as well. Compare this
       with the previous example:
            s3cmd sync /local/path s3://test-bucket/backup/
       will sync:
            /local/path/file1.ext         ->  s3://bucket/backup/path/file1.ext
            /local/path/dir123/file2.bin  ->  s3://bucket/backup/path/dir123/file2.bin

       To retrieve the files back from S3 use inverted syntax:
            s3cmd sync s3://test-bucket/backup/ /tmp/restore/
       that will download files:
            s3://bucket/backup/file1.ext         ->  /tmp/restore/file1.ext
            s3://bucket/backup/dir123/file2.bin  ->  /tmp/restore/dir123/file2.bin

       Without the trailing slash on source the behaviour is similar  to  what
       has been demonstrated with upload:
            s3cmd sync s3://test-bucket/backup /tmp/restore/
       will download the files as:
            s3://bucket/backup/file1.ext         ->  /tmp/restore/backup/file1.ext
            s3://bucket/backup/dir123/file2.bin  ->  /tmp/restore/backup/dir123/file2.bin

       All source file names are matched against exclude rules and those
       that match are then re-checked against include rules to see whether
       they should be excluded or kept in the source list.

       For the purpose of --exclude and --include matching only the relative
       part of each source file name (the part appended to the destination
       in the examples above) is used. For instance only path/file1.ext is
       tested against the patterns, not /local/path/file1.ext

       Both --exclude and --include work with shell-style wildcards (a.k.a.
       GLOB). For greater flexibility s3cmd provides regular-expression
       versions of the two exclude options named --rexclude and --rinclude.
       The options with a ...-from suffix (e.g. --rinclude-from) expect a
       filename as an argument. Each line of such a file is treated as one
       pattern.

       There is only one set of patterns built from all --(r)exclude(-from)
       options and similarly for the include variant. Any file excluded with
       e.g. --exclude can be put back with a pattern found in an
       --rinclude-from list.

       Run s3cmd with --dry-run to verify that your rules work as expected.
       Use it together with --debug to get detailed information about
       matching file names against exclude and include rules.

       For example, to exclude all files with the ".jpg" extension except
       those beginning with a number, use:

            --exclude '*.jpg' --rinclude '[0-9].*.jpg'
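
       A dry run of such a sync, with --debug showing how each file name is
       matched against the rules, could look like this (paths and bucket
       name are examples):

            s3cmd sync --dry-run --debug --exclude '*.jpg' \
                  --rinclude '[0-9].*.jpg' /photos/ s3://my-bucket/photos/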

SEE ALSO

       For the most up-to-date list of options run s3cmd --help
       For more info about usage, examples and other related info visit the
       project homepage at
       http://s3tools.org

AUTHOR

       Written by Michal Ludvig <michal@logix.cz>

CONTACT, SUPPORT

       The preferred way to get support is our mailing list:
       s3tools-general@lists.sourceforge.net

REPORTING BUGS

       Report bugs to s3tools-bugs@lists.sourceforge.net

COPYRIGHT

       Copyright © 2007,2008,2009 Michal Ludvig <http://www.logix.cz/michal>
       This  is  free  software.   You may redistribute copies of it under the
       terms   of   the    GNU    General    Public    License    version    2
       <http://www.gnu.org/licenses/gpl.html>.   There  is NO WARRANTY, to the
       extent permitted by law.

                                                                      s3cmd(1)