Consistent control over more AWS services with aws-cli, a single, powerful command line tool from Amazon
Readers of this tech blog know that I am a fan of the power of the command line. I enjoy presenting functional command line examples that can be copied and pasted to experience services and features.
The Old World
Users of the various AWS legacy command line tools know that, though they get the job done, they are often inconsistent in where you get them, how you install them, how you pass options, how you provide credentials, and more. Plus, tool sets exist for only a limited number of AWS services.
I wrote an article that demonstrated the simplest approach I use to install and configure the legacy AWS command line tools, and it ended up being extraordinarily long.
I’ve been using the term “legacy” when referring to the various old AWS command line tools, which must mean that there is something to replace them, right?
The New World
The future of the AWS command line tools is aws-cli, a single, unified, consistent command line tool that works with almost all of the AWS services.
Here is a quick list of the services that aws-cli currently supports: Auto Scaling, CloudFormation, CloudSearch, CloudWatch, Data Pipeline, Direct Connect, DynamoDB, EC2, ElastiCache, Elastic Beanstalk, Elastic Transcoder, ELB, EMR, Identity and Access Management, Import/Export, OpsWorks, RDS, Redshift, Route 53, S3, SES, SNS, SQS, Storage Gateway, Security Token Service, Support API, SWF, VPC.
Support for the following appears to be planned: CloudFront, Glacier, SimpleDB.
aws-cli is currently in “developer preview” as it is still being built and has some rough edges, but progress is being made steadily and I’ve already found it extremely useful, especially as it supports AWS services that have no other command line tools.
The aws-cli software is being actively developed as an open source project on GitHub, with a lot of support from Amazon. You’ll note that the biggest contributors to aws-cli are Amazon employees, with Mitch Garnaat leading. Mitch is also the author of boto, the amazing Python library for AWS.
I recommend reading the aws-cli documentation as it has complete instructions for various ways to install and configure the tool, but for convenience, here are the steps I use on Ubuntu:
sudo apt-get install -y python-pip
sudo pip install awscli
Add your Access Key ID and Secret Access Key to $HOME/.aws-config using this format:

[default]
aws_access_key_id = <access key id>
aws_secret_access_key = <secret access key>
region = us-east-1
Protect the config file:
chmod 600 $HOME/.aws-config
Set an environment variable pointing to the config file. For future convenience, also add this line to your $HOME/.bashrc:

export AWS_CONFIG_FILE=$HOME/.aws-config
Now, wasn’t that a lot easier than installing and configuring all of the old tools?
Test your installation and configuration:
aws ec2 describe-regions
The default output is in JSON. You can try out other output formats:
aws ec2 describe-regions --output text
aws ec2 describe-regions --output table
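The text output is handy for scripting, since it slices cleanly with standard Unix tools. Here is a minimal sketch of pulling out just the region names; the tab-separated column layout is an assumption based on sample output and may change between aws-cli versions, so the parsing step is shown against a captured sample that runs without AWS credentials:

```shell
# On a configured system you would pipe the live command instead:
#   aws ec2 describe-regions --output text | awk -F'\t' '{print $3}'
# Below, a captured sample stands in for the live output (layout assumed).
{
  printf 'REGIONS\tec2.us-east-1.amazonaws.com\tus-east-1\n'
  printf 'REGIONS\tec2.eu-west-1.amazonaws.com\teu-west-1\n'
} | awk -F'\t' '{print $3}'
# us-east-1
# eu-west-1
```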
I posted this brief mention of aws-cli because I expect some of my future articles are going to make use of it instead of the legacy command line tools.
So go ahead and install aws-cli, read the docs, and start to get familiar with this valuable tool.
Some folks might already have a command line tool installed with the name “aws”. This is likely Tim Kay’s “aws” tool. I recommend renaming it so that you don’t run into conflicts and confusion with the “aws” command from the aws-cli software.
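A rename along these lines does the job. The sketch below runs against a scratch directory so it is safe to try as-is; on a real system the old tool usually lives wherever `command -v aws` points, the move will typically need sudo, and the new name timkay-aws is just a suggestion:

```shell
# Real-system version (paths and name are assumptions, adjust to taste):
#   sudo mv "$(command -v aws)" /usr/local/bin/timkay-aws
# Demonstration on a scratch directory:
bindir=$(mktemp -d)
touch "$bindir/aws"                     # stand-in for Tim Kay's tool
mv "$bindir/aws" "$bindir/timkay-aws"   # move it out of aws-cli's way
ls "$bindir"
# timkay-aws
```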
[Update 2013-10-09: Rename awscli to aws-cli as that seems to be the direction it’s heading.]