Create an EC2 inventory report in a multi-account AWS environment

Problem statement

When a company has ~100 AWS accounts spread across 12 AWS regions, with many different departments, environments, and sandboxes, it becomes difficult to keep track of budgets and resource usage. In our case, we had to scan all accounts and regions weekly and produce a CSV report with the following fields: Account ID, Account name, Region, EC2 Name, EC2 ID, and Public IP. The first idea was to use a Lambda function, but it was dismissed because of the 15-minute execution limit, which is not enough for so many accounts and regions. The second idea was ECS Fargate, which has no duration limit, but it requires a VPC, subnets, and other networking components, which is overkill for this task. The accepted design was the following.

Solution design

AWS CodeBuild was chosen to run the cron job: it is serverless, cost-effective, and does not require a VPC. Here is how it works:

  1. Amazon EventBridge launches the job weekly.
  2. When CodeBuild starts, it downloads the Python script from the S3 bucket and executes it.
  3. The Python script loops account by account, assumes an IAM role in each one, then loops region by region, calling the EC2 API and gathering the required information.
  4. The Python script generates a CSV file and uploads it to the S3 bucket.
  5. The Python script creates a pre-signed URL and sends it to a dedicated email address via SNS.
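The loop in steps 3 and 4 can be sketched as pure Python. Here, `collect_instances` is a hypothetical stand-in for the assume-role and EC2 API calls performed by the full script below:

```python
import csv
import io

HEADER = ["Account ID", "Account name", "Region", "EC2 Name", "EC2 ID", "Public IP"]

def build_report(accounts, regions, collect_instances):
    # accounts: iterable of (account_id, account_name) pairs
    # collect_instances: callable returning report rows for one account/region pair
    rows = [HEADER]
    for account_id, account_name in accounts:
        for region in regions:
            rows.extend(collect_instances(account_id, account_name, region))
    return rows

def rows_to_csv(rows):
    # Render the collected rows as CSV text (step 4)
    buf = io.StringIO()
    csv.writer(buf).writerows(rows)
    return buf.getvalue()
```

With a stub callback returning a single instance row, `build_report` yields the header plus one data row per account/region pair.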

Python code

Two Python files must be uploaded to the S3 bucket. The S3 bucket name is provided as input to the CloudFormation template.
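A minimal sketch for uploading the two scripts follows; the bucket name is a placeholder, and the `scripts/` prefix matches the CodeBuild source location configured in the template below:

```python
def script_keys(filenames, prefix="scripts/"):
    # CodeBuild's S3 source points at <bucket>/scripts/, so both
    # files must live under that prefix
    return [prefix + name for name in filenames]

def upload_scripts(bucket, filenames):
    import boto3  # imported here so script_keys stays dependency-free
    s3 = boto3.client("s3")
    for name, key in zip(filenames, script_keys(filenames)):
        s3.upload_file(name, bucket, key)
```

Usage (with a hypothetical bucket name): `upload_scripts("my-report-bucket", ["ec2_report_public_ip.py", "awshelpers.py"])`.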

ec2_report_public_ip.py

import time
import csv
import logging
import boto3
import awshelpers
from botocore.exceptions import ClientError
from botocore.config import Config

config = Config(
    retries={'max_attempts': 3, 'mode': 'legacy'},
    connect_timeout=30
)

fmt = '[%(asctime)s] %(lineno)d %(levelname)s - %(message)s'
logging.basicConfig(format=fmt, datefmt='%m/%d/%Y %I:%M:%S')
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)

accounts_without_connection = []
exceptions_accounts = ['*********053']
filename = 'alogosec_publicIP_report_'+time.strftime("%Y%m%d-%H%M")+'.csv'
regions = [
    "us-east-1",
    "us-east-2",
    "us-west-2",
    "eu-west-1",
    "ap-southeast-1",
    "ap-southeast-2",
    "eu-central-1",
    "eu-west-2",
    "ca-central-1",
    "ap-south-1",
    "ap-northeast-1",
    "sa-east-1"
]


def role_arn_to_session(**args):
    session = boto3.Session()
    client = session.client('sts', config=config)
    try:
        response = client.assume_role(**args)
        return boto3.Session(
            aws_access_key_id=response['Credentials']['AccessKeyId'],
            aws_secret_access_key=response['Credentials']['SecretAccessKey'],
            aws_session_token=response['Credentials']['SessionToken'],
            region_name='us-east-1')
    except ClientError as error:
        logger.error(error.response['Error'])
        return None


def find_IPs(region, session, account_id, account_name):
    client = session.client('ec2', region_name=region, config=config)
    api_request = awshelpers.AwsApiRequest(client=client, filters=[{
        'Name': "instance-state-name", 'Values': ["running", "stopped", "shutting-down", "stopping", "pending"]
    }])
    instances_ids = (api_request.create_instances_list_and_ids(region, account_id, account_name))
    return instances_ids


def main(accountid, region, account_name):
    session = role_arn_to_session(RoleArn='arn:aws:iam::%s:role/AWSControlTowerExecution' % accountid,
                                  RoleSessionName='EC2_PUBLIC_IP_SCAN')
    if session is None:
        logger.error(f"COULD NOT ASSUME ROLE IN ACCOUNT {accountid}")
        return None
    try:
        return find_IPs(region, session, accountid, account_name)
    except ClientError as exc:
        logger.error(f"ERROR IN CREATING INVENTORY {exc}")
        return None


def write_to_csv(list_to_csv):
    with open(filename, 'a', newline='') as file:
        writer = csv.writer(file)
        writer.writerows(list_to_csv)


def generate_report(csv_list):
    # Convenience wrapper: write each account's rows to the CSV file
    for accounts_info in csv_list:
        write_to_csv(accounts_info)


def generate_file_capture():
    with open(filename, 'w', newline='') as file:
        writer = csv.writer(file)
        writer.writerow(
            ["Account ID", "Account name", "Region", "EC2 Name", "EC2 ID", "Public IP"])


# --- S3 upload helpers (use if needed) ---

def upload_file(filename, bucket, bucket_accountid, object_name=None):
    """Upload a file to an S3 bucket
    :param filename: File to upload
    :param bucket: Bucket to upload to
    :param bucket_accountid: Account ID that owns the bucket (the role is assumed there)
    :param object_name: S3 object name. If not specified then filename is used
    """
    session = role_arn_to_session(RoleArn='arn:aws:iam::%s:role/AWSControlTowerExecution' % bucket_accountid,
                                  RoleSessionName='EC2_PUBLIC_IP_SCAN')
    if session is None:
        logger.error(f"COULD NOT ESTABLISH SESSION WITH {bucket_accountid}")
        return
    s3client = session.client('s3')

    # If S3 object_name was not specified, use filename
    if object_name is None:
        object_name = filename

    # Upload the file
    logger.info(f"Uploading report {object_name} to {bucket} bucket")
    try:
        s3client.upload_file(filename, bucket, object_name)
    except ClientError as exc:
        logger.error(exc)


def create_presigned_url(bucket, object_name, bucket_accountid, expiration=43200):
    """Generate a presigned URL to share an S3 object
    :param bucket: Bucket that holds the object
    :param object_name: S3 object key
    :param bucket_accountid: Account ID that owns the bucket (the role is assumed there)
    :param expiration: Time in seconds for the presigned URL to remain valid
    :return: Presigned URL as string. If error, returns None.
    """
    logger.info(f"Creating presigned URL for {object_name} in {bucket} bucket")
    session = role_arn_to_session(RoleArn='arn:aws:iam::%s:role/AWSControlTowerExecution' % bucket_accountid,
                                  RoleSessionName='EC2_PUBLIC_IP_SCAN')
    if session is None:
        logger.error(f"COULD NOT ESTABLISH SESSION WITH {bucket_accountid}")
        return None
    s3client = session.client('s3')
    try:
        response = s3client.generate_presigned_url('get_object',
                                                    Params={'Bucket': bucket,
                                                            'Key': object_name},
                                                    ExpiresIn=expiration)
    except ClientError as e:
        logger.error(e)
        return None
    return response

def send_sns_message(sns_topic_arn, bucket, object_name, url):
    sns_client = boto3.client('sns')
    logger.info(f"Sending notification to {sns_topic_arn} SNS topic")
    try:
        response = sns_client.publish(
                TopicArn=sns_topic_arn,
                Message='New EC2 public IP\'s report for AWS Organization is ready. \n\nHere is the link for download: \n\n {} \n\nPlease, note, link will expire in 12 hours. \nReport could also be found in S3 bucket {}/{}, in "*********053" AWS account. \nThank you. \n'.format(url, bucket, object_name),
                Subject='AWS Organization public IP\'s report - {}'.format(object_name)
        )
    except ClientError as e:
        logger.error(e)


if __name__ == "__main__":
    generate_file_capture()
    csv_list = []
    account = boto3.session.Session()
    client = account.client('organizations', config=config)
    account_list = awshelpers.AwsApiRequest(client).create_account_list()
    for account_id in account_list:
        # Fetch account details once per account, not once per region
        account_details = client.describe_account(AccountId=account_id)
        account_name = account_details['Account']['Name']
        if account_id in exceptions_accounts or account_details['Account']['Status'] != 'ACTIVE':
            continue
        for region in regions:
            logger.info(f"Running for account {account_id} in {region}")
            try:
                result = main(account_id, region, account_name)
                logger.info(f"Result for account {account_id} {region} {result}")
                if result:
                    write_to_csv(result)
            except Exception as exc:
                logger.error(f"EXCEPTION IN MAIN {exc}")
                accounts_without_connection.append(account_id)
    logger.info(f"Accounts_without_connection {accounts_without_connection}")
    

    bucket_accountid = boto3.client("sts").get_caller_identity()["Account"]
    bucket = "*******-ec2-public-ips-report"
    sns_topic_arn = "arn:aws:sns:eu-west-1:*********053:AWSOrgEC2PublicIPsReportSNSTopic"
    object_name = "reports/" + filename
    upload_file(filename=filename, bucket=bucket, bucket_accountid=bucket_accountid, object_name=object_name)
    url = create_presigned_url(bucket=bucket, object_name=object_name, bucket_accountid=bucket_accountid)
    send_sns_message(sns_topic_arn=sns_topic_arn, bucket=bucket, object_name=object_name, url=url)

awshelpers.py

def paginate_resources(arg):
    # Decorator: follow NextToken pagination and return a list of
    # per-page values of the given response key
    def out(func):
        def wrap(*args, **kwargs):
            response = func(*args, **kwargs)
            result = [response.get(arg)]
            while 'NextToken' in response:
                response = func(*args, **kwargs, NextToken=response['NextToken'])
                result.append(response.get(arg))
            return result
        return wrap
    return out


class AwsApiRequest:
    def __init__(self, client, filters=None):
        self.client = client
        self.filters = filters

    def create_account_list(self):
        return [
            account['Id'] for accounts in self.__map_accounts() for account in accounts if
            account['Status'] == 'ACTIVE'
        ]

    def create_instances_list_and_ids(self, region, account_id, account_name):
        instances_list = []
        for sublist in self.__map_instances():
            for items in sublist:
                for item in items["Instances"]:
                    try:
                        PublicIp = item["PublicIpAddress"]
                    except KeyError:
                        continue
                    instance_name = "Instance is not tagged"
                    for tag in item.get('Tags', []):
                        if tag['Key'] == 'Name':
                            instance_name = tag['Value']
                    instances_list.append(
                        (account_id, account_name, region, instance_name, item["InstanceId"], PublicIp)
                    )
        return instances_list

    @paginate_resources("Accounts")
    def __map_accounts(self, NextToken=''):
        if NextToken:
            return self.client.list_accounts(NextToken=NextToken)
        return self.client.list_accounts()

    @paginate_resources("Reservations")
    def __map_instances(self, NextToken=''):
        if NextToken:
            return self.client.describe_instances(Filters=self.filters, NextToken=NextToken)
        return self.client.describe_instances(Filters=self.filters)
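The `paginate_resources` decorator can be exercised without AWS by wrapping any function that returns `NextToken`-style pages. Here is a hypothetical sketch with a fake two-page "API" (the decorator is repeated so the snippet is self-contained):

```python
def paginate_resources(arg):
    # Same decorator as in awshelpers.py above
    def out(func):
        def wrap(*args, **kwargs):
            response = func(*args, **kwargs)
            result = [response.get(arg)]
            while 'NextToken' in response:
                response = func(*args, **kwargs, NextToken=response['NextToken'])
                result.append(response.get(arg))
            return result
        return wrap
    return out

# Fake paginated API: the first page carries a NextToken, the second does not
PAGES = {
    '': {'Accounts': [{'Id': '1'}], 'NextToken': 't1'},
    't1': {'Accounts': [{'Id': '2'}]},
}

@paginate_resources('Accounts')
def list_accounts(NextToken=''):
    return PAGES[NextToken]

# list_accounts() -> [[{'Id': '1'}], [{'Id': '2'}]]
```

Note that the result is a list of pages, which is why the callers in `awshelpers.py` iterate over nested lists.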

CloudFormation template

The solution was deployed by CloudFormation:

AWSTemplateFormatVersion: '2010-09-09'
Description: CodeBuild CloudFormation template

Parameters:
  CodeBuildProjectName:
    Description: CodeBuild Project to get organization public IPs list
    Type: String
    MinLength: 1
    MaxLength: 255
    AllowedPattern: ^[a-zA-Z][-a-zA-Z0-9]*$
    Default: AWSOrgEC2PublicIPsReport
  S3BuckeName:
    Description: S3 Bucket Name
    Type: String
    MinLength: 1
    MaxLength: 255
    AllowedPattern: ^[a-zA-Z][-a-zA-Z0-9]*$
    Default: *******-ec2-public-ips-report
  SNSSubscriptionEmail:
    Type: String
    Description: Email address to notify
    Default: user@example.com

Resources:
  CodeBuildProject:
    Type: AWS::CodeBuild::Project
    Properties:
      Name: !Ref CodeBuildProjectName
      Source:
        Type: S3
        Location: !Sub "${S3BuckeName}/scripts/"
        BuildSpec: !Sub |
          version: 0.2
          phases:
            install:
              commands:
                - pip3 install boto3
            build:
              commands:
                - python3 ec2_report_public_ip.py
      Environment:
        Type: LINUX_CONTAINER
        Image: public.ecr.aws/sam/build-python3.9
        ComputeType: BUILD_GENERAL1_SMALL
      ServiceRole: !Ref AWSOrgPublicIPsReportCodeBuildServiceRole
      Artifacts:
        Type: NO_ARTIFACTS
      LogsConfig:
        CloudWatchLogs:
          Status: ENABLED
          GroupName: !Sub ${CodeBuildProjectName}

  AWSOrgPublicIPsReportCodeBuildServiceRole:
    Type: AWS::IAM::Role
    Properties:
      Path: /
      AssumeRolePolicyDocument:
        Statement:
          - Effect: Allow
            Principal:
              Service:
                - codebuild.amazonaws.com
            Action:
              - sts:AssumeRole
      MaxSessionDuration: 43200
      Policies:
        - PolicyName: !Sub ${CodeBuildProjectName}CodebuildBasePolicy
          PolicyDocument:
            Version: 2012-10-17
            Statement:
              - Effect: Allow
                Resource:
                  - !Sub arn:aws:logs:${AWS::Region}:${AWS::AccountId}:log-group:${CodeBuildProjectName}
                  - !Sub arn:aws:logs:${AWS::Region}:${AWS::AccountId}:log-group:${CodeBuildProjectName}:*
                Action:
                  - logs:CreateLogGroup
                  - logs:CreateLogStream
                  - logs:PutLogEvents
              - Effect: Allow
                Resource:
                  - !Sub arn:aws:s3:::codepipeline-${AWS::Region}-*
                Action:
                  - s3:PutObject
                  - s3:GetObject
                  - s3:GetObjectVersion
                  - s3:GetBucketAcl
                  - s3:GetBucketLocation
        - PolicyName: !Sub ${CodeBuildProjectName}S3Policy
          PolicyDocument:
            Version: 2012-10-17
            Statement:
              - Effect: Allow
                Resource:
                  - !Sub arn:aws:s3:::${S3BuckeName}
                  - !Sub arn:aws:s3:::${S3BuckeName}/*
                Action:
                  - s3:*
        - PolicyName: !Sub ${CodeBuildProjectName}OrganizationReadonlyPolicy
          PolicyDocument:
            Version: 2012-10-17
            Statement:
              - Effect: Allow
                Resource:
                  - "*"
                Action:
                  - organizations:Describe*
                  - organizations:List*
        - PolicyName: !Sub ${CodeBuildProjectName}AllowAssumeAWSControlTowerExecutionRole
          PolicyDocument:
            Version: 2012-10-17
            Statement:
              - Effect: Allow
                Resource:
                  - arn:aws:iam::*:role/AWSControlTowerExecution
                Action:
                  - sts:AssumeRole
        - PolicyName: !Sub ${CodeBuildProjectName}AllowSNSNotification
          PolicyDocument:
            Version: 2012-10-17
            Statement:
              - Effect: Allow
                Resource:
                  - !GetAtt SNSTopic.TopicArn
                Action:
                  - sns:Publish


  EventBridgeRule:
    Type: AWS::Scheduler::Schedule
    Properties: 
      Description: Event Bridge cronjob for CodeBuild Project
      FlexibleTimeWindow: 
        Mode: "OFF"
      Name: !Sub ${CodeBuildProjectName}
      ScheduleExpression: cron(0 10 ? * MON *)
      ScheduleExpressionTimezone: UTC
      State: ENABLED
      Target: 
        Arn: !GetAtt CodeBuildProject.Arn
        RoleArn: !GetAtt EventBridgeRole.Arn
  EventBridgeRole:
    Type: AWS::IAM::Role
    Properties:
      Path: /
      AssumeRolePolicyDocument:
        Statement:
          - Effect: Allow
            Principal:
              Service:
                - scheduler.amazonaws.com
            Action:
              - sts:AssumeRole
      Policies:
        - PolicyName: !Sub ${CodeBuildProjectName}AllowTriggerCodebuild
          PolicyDocument:
            Version: 2012-10-17
            Statement:
              - Effect: Allow
                Resource:
                  - !GetAtt CodeBuildProject.Arn
                Action:
                  - codebuild:StartBuild
  SNSTopic:
    Type: AWS::SNS::Topic
    Properties:
      DisplayName: !Sub ${CodeBuildProjectName}SNSTopic
      Subscription:
        - Endpoint: !Ref SNSSubscriptionEmail
          Protocol: email
      TopicName: !Sub ${CodeBuildProjectName}SNSTopic
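A hypothetical deployment sketch with boto3 follows; the stack name, bucket, and email are placeholders, while the parameter keys mirror the template above:

```python
def stack_parameters(bucket, email):
    # Keys must match the template's Parameters section
    return [
        {"ParameterKey": "S3BuckeName", "ParameterValue": bucket},
        {"ParameterKey": "SNSSubscriptionEmail", "ParameterValue": email},
    ]

def deploy(template_body, bucket, email):
    import boto3
    cfn = boto3.client("cloudformation")
    cfn.create_stack(
        StackName="AWSOrgEC2PublicIPsReport",
        TemplateBody=template_body,
        Parameters=stack_parameters(bucket, email),
        Capabilities=["CAPABILITY_IAM"],  # the template creates IAM roles
    )
```

`CAPABILITY_IAM` is required because the template creates IAM roles without custom names.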

Results and conclusion

Here is an example of an SNS notification sent to the responsible person’s email. It contains a pre-signed URL that is valid for 12 hours. If the recipient cannot use the URL before it expires, the notification also contains the file location in S3:

Here is an example of the report that is generated and sent weekly:

In this post, we demonstrated a serverless and cost-effective solution for collecting information from many AWS accounts and regions within an AWS Organization. This solution helps identify overlooked instances, understand cost and security risks, and remediate them in time.