When Are Your SSL Certificates Expiring on AWS?

If you uploaded SSL certificates to Amazon Web Services for ELB (Elastic Load Balancing) or CloudFront (CDN), then you will want to keep an eye on the expiration dates and renew the certificates well before they expire to ensure uninterrupted service.

If you uploaded the SSL certificates yourself, then of course at that time you set an official reminder to make sure that you remembered to renew the certificate. Right?

However, if you inherited an AWS account and want to review your company or client’s configuration, then here’s an easy command to get a list of all SSL certificates in IAM, sorted by expiration date.

aws iam list-server-certificates \
  --output text \
  --query 'ServerCertificateMetadataList[*].[Expiration,ServerCertificateName]' \
  | sort

To get more information on an individual certificate, you might use something like:
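# CERTIFICATE_NAME is a placeholder for one of the names
# returned by the list command above
aws iam get-server-certificate \
  --server-certificate-name CERTIFICATE_NAME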

Throw Away The Password To Your AWS Account

reduce the risk of losing control of your AWS account by not knowing the root account password

As Amazon states, one of the best practices for using AWS is:

We strongly recommend that you do not use the root user for your everyday tasks, even the administrative ones. Instead, use your root user credentials only to create your IAM admin user. Then securely lock away the root user credentials and use them to perform only a few account and service management tasks.

The root account credentials are the email address and password that you used when you first registered for AWS. These credentials have the ultimate authority to create and delete IAM users, change billing, close the account, and perform all other actions on your AWS account.

You can create a separate IAM user with near-full permissions for use when you need to perform admin tasks, instead of using the AWS root account. If the credentials for the admin IAM user are compromised, you can use the AWS root account to disable those credentials to prevent further harm, and create new credentials for ongoing use.
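Here is a rough sketch of that setup using aws-cli (the user name and password are placeholders, and the AWS managed AdministratorAccess policy stands in for whatever permissions you actually want to grant):

# Create a separate IAM user for day-to-day administration
aws iam create-user \
  --user-name admin

# Grant it broad permissions via the AWS managed policy
aws iam attach-user-policy \
  --user-name admin \
  --policy-arn arn:aws:iam::aws:policy/AdministratorAccess

# Give the user a console password (use a strong, unique value)
aws iam create-login-profile \
  --user-name admin \
  --password 'REPLACE-WITH-A-STRONG-PASSWORD'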

However, if the credentials for your AWS root account are compromised, the person who stole them can take over complete control of your account, change the associated email address, and lock you out.

I have consulted for companies that lost control of the root AWS account containing their assets. You want to avoid this.

Proposal

Given:

  • The AWS root account is not required for regular use as long as you have created an IAM user with admin privileges

  • Amazon recommends not using your AWS root account

  • You can’t accidentally expose your AWS root account password if you don’t know it and haven’t saved it anywhere

  • You can always reset your AWS root account password as long as you have access to the email address associated with the account

Consider this approach to improving security:

AWS Community Heroes Program

Amazon Web Services recently announced an AWS Community Heroes Program where they are starting to recognize publicly some of the many individuals around the world who contribute in so many ways to the community that has grown up around the services and products provided by AWS.

It is fun to be part of this community and to share the excitement that so many have experienced as they discover and promote new ways of working and more efficient ways of building projects and companies.

Here are some technologies I have gotten the most excited about over the decades. Each of these changed my life in a significant way as I invested serious time and effort learning and using the technology. The year represents when I started sharing the “good news” of the technology with people around me, who at the time usually couldn’t have cared less.

Finding the Region for an AWS Resource ID

use concurrent AWS command line requests to search the world for your instance, image, volume, snapshot, …

Background

Amazon EC2 and many other AWS services are divided up into various regions across the world. Each region is a separate geographic area and is completely independent of other regions.

Though this is a great architecture for preventing global meltdown, it can occasionally make life more difficult for customers, as we must interact with each region separately.

One example of this is when we have the id for an AMI, instance, or other EC2 resource and want to do something with it but don’t know which region it is in.

This happens on ServerFault when a poster presents a problem with an instance, provides the initial AMI id, but forgets to specify the EC2 region. In order to find and examine the AMI, you need to look in each region to discover where it is.
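Here is a rough sketch of that search using aws-cli, querying every region concurrently (the AMI id is a placeholder; the same pattern works for instance, volume, or snapshot ids with the matching describe-* command):

ami_id=ami-12345678   # placeholder: the AMI you are trying to locate

for region in $(aws ec2 describe-regions \
                  --query 'Regions[*].RegionName' \
                  --output text); do
  ( aws ec2 describe-images \
      --region "$region" \
      --image-ids "$ami_id" \
      --output text >/dev/null 2>&1 &&
    echo "$ami_id found in $region" ) &
done
wait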

New c3.* Instance Types on Amazon EC2 - Nice!

Worth switching.

Amazon shared that the new c3.* instance types have been in high demand on EC2 since they were released.

I finally had a minute to take a look at the specs for the c3.* instances which were just announced at AWS re:Invent, and it is obvious why they are popular and why they should probably be even more popular than they are.

Let’s just take a look at the cheapest of these, the c3.large, and compare it to the older generation c1.medium, which is similar in price:

Using aws-cli --query Option To Simplify Output

My favorite session at AWS re:Invent was James Saryerwinnie’s clear, concise, and informative tour of the aws-cli (command line interface), which according to GitHub logs he is enhancing like crazy.

I just learned about a recent addition to aws-cli: The --query option lets you specify what parts of the response data structure you want output.

Instead of wading through pages of JSON output, you can select a few specific values and output them as JSON, table, or simple text. The new --query option is far easier to use than jq, grep+cut, or Perl, my other fallback tools for parsing the output.

aws --query Examples

The following sample aws-cli commands use the --query and --output options to extract the desired output fields so that we can assign them to shell variables:
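Here is one such pattern (a sketch; the filter and the fields pulled out are just illustrative):

# Id of the first running instance, as plain text with no quotes
instance_id=$(aws ec2 describe-instances \
  --filters Name=instance-state-name,Values=running \
  --query 'Reservations[0].Instances[0].InstanceId' \
  --output text)

# Public IP address of that instance
public_ip=$(aws ec2 describe-instances \
  --instance-ids "$instance_id" \
  --query 'Reservations[0].Instances[0].PublicIpAddress' \
  --output text)

echo "instance: $instance_id  public ip: $public_ip"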

Installing aws-cli, the New AWS Command Line Tool

consistent control over more AWS services with aws-cli, a single, powerful command line tool from Amazon

Readers of this tech blog know that I am a fan of the power of the command line. I enjoy presenting functional command line examples that can be copied and pasted to experience services and features.

The Old World

Users of the various AWS legacy command line tools know that, though they get the job done, they are often inconsistent in where you get them, how you install them, how you pass options, how you provide credentials, and more. Plus, there are only tool sets for a limited number of AWS services.

I wrote an article that demonstrated the simplest approach I use to install and configure the legacy AWS command line tools, and it ended up being extraordinarily long.

I’ve been using the term “legacy” when referring to the various old AWS command line tools, which must mean that there is something to replace them, right?

The New World

The future of the AWS command line tools is aws-cli, a single, unified, consistent command line tool that works with almost all of the AWS services.
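If you want to kick the tires, here is a minimal install sketch (aws-cli is a Python package, so pip works on most systems; the bootstrap commands for your platform may differ):

# Install pip, then aws-cli (Ubuntu shown; adjust for your platform)
sudo apt-get install -y python-pip
sudo pip install awscli

# Enter your access keys and default region interactively
aws configure

# Quick smoke test
aws ec2 describe-regions --output table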

Cost of Transitioning S3 Objects to Glacier

how I was surprised by a large AWS charge and how to calculate the break-even point

Glacier Archival of S3 Objects

Amazon recently introduced a fantastic new feature where S3 objects can be automatically migrated over to Glacier storage based on the S3 bucket, the key prefix, and the number of days after object creation.

This makes it trivially easy to drop files in S3, have fast access to them for a while, then have them automatically saved to long-term storage where they can’t be accessed as quickly, but where the storage charges are around a tenth of the price.

…or so I thought.
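For reference, the kind of lifecycle rule described above can be expressed with today's aws-cli roughly like this (a sketch; the bucket name, prefix, and day count are placeholders):

aws s3api put-bucket-lifecycle-configuration \
  --bucket MY_BUCKET \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "archive-logs-to-glacier",
      "Filter": {"Prefix": "logs/"},
      "Status": "Enabled",
      "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}]
    }]
  }'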

Running Ubuntu on Amazon EC2 in Sydney, Australia

Amazon has announced a new AWS region in Sydney, Australia with the name ap-southeast-2.

The official Ubuntu AMI lookup pages (1, 2) don’t seem to be showing the new location yet, but the official Ubuntu AMI query API does seem to be working, so the new ap-southeast-2 Ubuntu AMIs are available for lookup on Alestic.com.

[Update 2012-11-13: Canonical has fixed the primary Ubuntu AMI lookup page and I understand it should remain more up to date going forward, but the other page is still missing ap-southeast-2]

Point and Click

At the top right of most pages on Alestic.com is an “Ubuntu AMIs” section. Simply select the EC2 region from the pulldown (say “ap-southeast-2” for Sydney, Australia) and you will see a list of the official 64-bit Ubuntu AMI ids for the various active Ubuntu releases.

Installing AWS Command Line Tools from Amazon Downloads
This article describes how to install the old generation of AWS command line tools. For the most part, these have been replaced by the newer aws-cli, which is easier to install and more comprehensive:

When you need an AWS command line toolset not provided by Ubuntu packages, you can download the tools directly from Amazon and install them locally.

In a previous article I provided instructions on how to install AWS command line tools using Ubuntu packages. That method is slightly easier to set up and easier to upgrade when Ubuntu releases updates. However, the Ubuntu packages aren’t always up to date with the latest from Amazon, and there are not yet Ubuntu packages published for every AWS command line tool set you might want to use.

Unfortunately, Amazon does not have one single place where you can download all the command line tools for the various services, nor are all of the tools installed in the same way, nor do they all use the same format for accessing the AWS credentials.

The following steps show how I install and configure the AWS command line tools provided by Amazon when I don’t use the packages provided by Ubuntu.
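As a sketch of the general pattern, here is roughly what the process looks like for one tool set, the EC2 API tools (the download URL and paths reflect what Amazon published at the time and are shown only as an example; each tool set has its own archive and environment variables):

# Prerequisites: Java and unzip
sudo apt-get install -y unzip default-jre

# Download and unpack the tool set from Amazon
wget http://s3.amazonaws.com/ec2-downloads/ec2-api-tools.zip
sudo mkdir -p /usr/local/aws
sudo unzip -d /usr/local/aws ec2-api-tools.zip

# Point the tools at Java, their install directory, and your credentials
export JAVA_HOME=$(dirname $(dirname $(readlink -f $(which java))))
export EC2_HOME=$(ls -d /usr/local/aws/ec2-api-tools-*)
export PATH=$PATH:$EC2_HOME/bin
export AWS_ACCESS_KEY=YOUR_ACCESS_KEY_ID
export AWS_SECRET_KEY=YOUR_SECRET_ACCESS_KEY

# Quick test
ec2-describe-regions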

Which EC2 Availability Zone is Affected by an Outage?

Did you know that Amazon includes status messages about the health of availability zones in the output of the ec2-describe-availability-zones command, the associated API call, and the AWS console?

Right now, Amazon is restoring power to a “large number of instances” in one availability zone in the us-east-1 region due to “electrical storms in the area”.

Since the names used for specific availability zones differ between AWS accounts, Amazon can’t just say that the affected zone is us-east-1c as it might be us-east-1e in another account.

During this outage, you can find out what the name of the affected availability zone is in your AWS account by running this command (installation instructions):
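For example (showing both the legacy command mentioned above and its newer aws-cli equivalent):

# Legacy EC2 API tools
ec2-describe-availability-zones --region us-east-1

# Newer aws-cli
aws ec2 describe-availability-zones \
  --region us-east-1 \
  --output text

Any zone status messages Amazon has posted should show up alongside the affected zone in the output.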

Installing AWS Command Line Tools Using Ubuntu Packages

See also: Installing AWS Command Line Tools from Amazon Downloads

Here are the steps for installing the AWS command line tools that are currently available as Ubuntu packages. These include:

  • EC2 API tools
  • EC2 AMI tools
  • IAM - Identity and Access Management
  • RDS - Relational Database Service
  • CloudWatch
  • Auto Scaling
  • ElastiCache

Starting with Ubuntu 12.04 LTS Precise, these are also available:

  • CloudFormation
  • ELB - Elastic Load Balancer

Install Packages
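As a sketch, here is how the first two tool sets on the list above can be installed (the remaining tool sets are installed the same way under their own package names):

# ec2-api-tools and ec2-ami-tools live in the multiverse repository,
# so make sure multiverse is enabled in /etc/apt/sources.list first
sudo apt-get update
sudo apt-get install -y ec2-api-tools ec2-ami-tools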

Ubuntu Developer Summit, May 2012 (Oakland)

I will be attending the Ubuntu Developer Summit (UDS) next week in Oakland, CA. This event brings people from around the world together in one place every six months to discuss and plan for the next release of Ubuntu. The May 2012 UDS is for Ubuntu-Q, which will eventually be named and released as Ubuntu 12.10 in October 2012.

Seeding Torrents with Amazon S3 and s3cmd on Ubuntu

Amazon Web Services is such a huge, complex service with so many products and features that sometimes very simple but powerful features fall through the cracks when you’re reading the extensive documentation.

One of these features, which has been around for a very long time, is the ability to use AWS to seed (serve) downloadable files using the BitTorrent™ protocol. You don’t need to run EC2 instances and set up software. In fact, you don’t need to do anything except upload your files to S3 and make them publicly available.

Any file available for normal HTTP download in S3 is also available for download through a torrent. All you need to do is append the string ?torrent to the end of the URL and Amazon S3 takes care of the rest.

Steps

Let’s walk through uploading a file to S3 and accessing it with a torrent client using Ubuntu as our local system. This approach uses s3cmd to upload the file to S3, but any other S3 software can get the job done, too.
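A rough sketch of those steps (the bucket and file names are placeholders; any torrent client can open the resulting .torrent file):

# Create a bucket and upload the file with public-read access
s3cmd mb s3://my-demo-bucket
s3cmd put --acl-public big-download.iso s3://my-demo-bucket/big-download.iso

# Grab the torrent by appending ?torrent to the normal download URL
wget -O big-download.iso.torrent \
  "http://my-demo-bucket.s3.amazonaws.com/big-download.iso?torrent"

# Open big-download.iso.torrent in a torrent client; S3 seeds the data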

CloudCamp

There are a number of CloudCamp events coming up in cities around the world. These are free events, organized around the various concepts, technologies, and services that fall under the “cloud” term.

There’s always some discussion about my favorite topic, Amazon AWS and EC2, but there are sure to be experts and beginners for every other cloud-related flavor as well. You can attend presentations, join in discussions, or hang out in the hallway and make connections with local folks who are interested in the same things you are.

CloudCamp follows somewhat of an unconference format, though the couple I’ve been to in LA tended to have more pre-planned elements than, say, a BarCamp. Glancing through the schedules, it looks like each city also has its own twist and personality for CloudCamp.

Here are two upcoming CloudCamps that are of particular interest to me: