Install and Configure Boto3 with AWS

Introduction

AWS (Amazon Web Services) is an ecosystem with an abundance of services to fulfill many of our development needs. There are also some great ways for us to interact with these services. We can simply log in to the web browser console (https://aws.amazon.com). Or we can use the command line with AWS-CLI to execute commands on services from the terminal.

There are also many APIs that allow code to interact with AWS programmatically, and that is what we are going to do in the following article: create a Python script that uses the Boto3 library to connect to an AWS service. We will use the following objectives to reach this goal:

  1. Install the Python libraries needed to connect to AWS within a virtual environment
  2. Configure credentials to gain access to AWS
  3. Execute code to interact with AWS

Prerequisites

Virtual Environment

What is a Virtual Environment?

A Virtual Environment is a self-contained directory tree that contains a Python installation for a particular version of Python, plus a number of additional packages.

Why should I use a Virtual Environment?

A Virtual Environment keeps all dependencies for the Python project separate from dependencies of other projects. This has a few advantages:

  1. It makes dependency management for the project easy.
  2. It enables using and testing different library versions by quickly spinning up a new environment and verifying the compatibility of the code with each version.
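
A quick way to see this isolation in practice is to print the interpreter's prefix from inside Python; with an environment active it points into that environment's directory tree (for example the sample-env created below), and otherwise it points to the system installation:

import sys

# With a virtual environment active, sys.prefix points inside it
# (e.g. ~/python-envs/sample-env); without one it points to the
# system-wide Python installation.
print(sys.prefix)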

Mac OS X Setup

  1. Create a folder where the virtual environments will reside $ mkdir ~/python-envs
  2. To create a new environment named sample-env execute $ python3 -m venv ~/python-envs/sample-env
  3. To activate the environment execute $ source ~/python-envs/sample-env/bin/activate
  4. Install the boto3 package using $ pip3 install boto3 (a quick check of the install follows this list)
  5. To deactivate the environment execute $ deactivate
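
As a quick check that boto3 landed inside the virtual environment rather than in the system-wide Python, a sketch like the one below can be run while the environment is still active:

import boto3

# Prints the installed boto3 version; an ImportError here usually means
# the package was installed outside the active virtual environment.
print(boto3.__version__)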

Configuration

To use Boto3 you will need a credentials file to verify your identity so that you may access your account services. We previously covered this in an article for setting up AWS-CLI. AWS-CLI will create this credentials file for you during the setup, so head over here if you haven't set up AWS-CLI yet.
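
To confirm that Boto3 can pick up those credentials, a minimal sketch like the one below asks AWS STS which identity the credentials resolve to (it assumes the credentials file written by AWS-CLI is in place):

import boto3

# STS reports the account and identity the current credentials belong to;
# an error here usually means the credentials file is missing or invalid.
identity = boto3.client('sts').get_caller_identity()
print(identity['Account'], identity['Arn'])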

Code

We will create a simple Python script to connect to S3, create a bucket, and list the buckets in the account.

Begin by creating a directory to write your files to. I will be using the one created earlier in this article, ~/python-envs/sample-env.

Now create a new file named 'storage_service.py' inside the 'sample-env' directory.

Add the following lines of code to the new file:

import boto3                                                # 1

client = boto3.client('s3')                                 # 2
client.create_bucket(Bucket="your.first.boto.s3.bucket")    # 3

buckets = client.list_buckets()                             # 4
for i in buckets['Buckets']:                                # 5
    print(i['Name'])
  1. Import the boto3 library into the program
  2. Create a client object that allows for interaction with S3
  3. Use the newly acquired S3 client to create a bucket (bucket names must be globally unique, so replace this one with a name of your own)
  4. Return a list of the buckets accessible to the user credentials that were set up in the configuration section
  5. Iterate through the results and print the name of each bucket
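
The script stops at creating and listing buckets; if you also want to upload a file into the new bucket, a minimal sketch like the following could be appended to the script, assuming a local file named sample.txt exists next to it:

import boto3

client = boto3.client('s3')
# Upload the local file into the bucket created above, under the key
# 'sample.txt' (assumes sample.txt exists in the current working directory).
client.upload_file('sample.txt', 'your.first.boto.s3.bucket', 'sample.txt')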

Finally, we will run the script inside the virtual environment.

Start up the virtual environment
$ source ~/python-envs/sample-env/bin/activate

If you haven't done so already, install the boto3 package into the environment: $ pip3 install boto3

Run the script
$ python3 storage_service.py

If everything works out properly, you should see a list of your S3 bucket names:

your.first.boto.s3.bucket

Conclusion

We have taken the first steps of opening up our Python applications to an ever-expanding set of services through AWS.

We used a Python Virtual Environment to create an independent development space for our code and installed the boto3 library. Using AWS-CLI we were able to set up the credentials we needed to access AWS. Finally, we created a script that executed commands to create and query resources on S3.