LocalStack Pro Project

Overview

This project demonstrates how to develop and test AWS applications locally using LocalStack Pro. It provides a simple workflow for uploading CSV files to an emulated S3 bucket and retrieving them, avoiding the costs and delays of working against the real AWS cloud.

Features

  • Local AWS S3 emulation using LocalStack Pro
  • Command-line tools for uploading and downloading CSV files
  • Environment-based configuration for easy switching between environments
  • Clean project structure following Python best practices

Prerequisites

  • Python 3.12 or higher
  • Docker and Docker Compose
  • LocalStack Pro subscription (and authentication token)
  • Poetry (for dependency management)

Installation

  1. Clone this repository:

    git clone https://github.com/yourusername/localstack-pro-project.git
    cd localstack-pro-project
  2. Set up Python environment with Poetry:

    poetry install
  3. Create a .env file in the project root with your configuration (the scripts read these values; see the client sketch after these steps):

    LOCALSTACK_ENDPOINT=http://localhost:4566
    LOCALSTACK_AUTH_TOKEN=your_localstack_pro_token
    AWS_ACCESS_KEY_ID=test
    AWS_SECRET_ACCESS_KEY=test
    AWS_REGION=us-east-1
    S3_BUCKET_NAME=todo-list-bucket
    S3_DEFAULT_KEY=todo-list.csv
    CSV_FILE_PATH=./data/sample-tasks.csv
  4. Start LocalStack Pro using Docker Compose:

    docker compose up -d localstack
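
The scripts build their S3 client from the values in step 3, pointing boto3 at the LocalStack endpoint instead of real AWS. Below is a minimal sketch of that pattern, assuming boto3 and python-dotenv are installed; the module layout and function name are illustrative, not the project's exact code:

    # Illustrative sketch: load .env values and return a LocalStack-bound S3 client
    import os

    import boto3
    from dotenv import load_dotenv  # python-dotenv

    load_dotenv()  # read .env from the project root

    def make_s3_client():
        """Return an S3 client that talks to LocalStack rather than real AWS."""
        return boto3.client(
            "s3",
            endpoint_url=os.environ["LOCALSTACK_ENDPOINT"],        # http://localhost:4566
            aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],     # "test" works locally
            aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
            region_name=os.environ["AWS_REGION"],
        )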

Usage

Setting up LocalStack

Initialize the S3 bucket in LocalStack:

poetry run setup-localstack
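
Internally this only needs to create the bucket named by S3_BUCKET_NAME. A rough sketch of the idea, reusing the client helper from the Installation section (hypothetical names, not the repository's exact code):

    from botocore.exceptions import ClientError

    def create_bucket(s3, bucket_name):
        """Create the bucket, treating an already-existing bucket as success."""
        try:
            s3.create_bucket(Bucket=bucket_name)
            print(f"Created bucket {bucket_name}")
        except ClientError as err:
            if err.response["Error"]["Code"] in ("BucketAlreadyOwnedByYou", "BucketAlreadyExists"):
                print(f"Bucket {bucket_name} already exists -- nothing to do")
            else:
                raise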

Uploading a CSV file

Upload a todo list CSV file to the S3 bucket:

poetry run upload-todo

You can also specify a different CSV file:

poetry run upload-todo /path/to/your/file.csv
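
The upload itself boils down to a single upload_file call, with the object key taken from S3_DEFAULT_KEY. A hedged sketch of that flow (argument handling simplified, helper names illustrative):

    import os
    import sys

    def upload_todo(s3):
        """Upload the CSV given on the command line, or the default from .env."""
        csv_path = sys.argv[1] if len(sys.argv) > 1 else os.environ["CSV_FILE_PATH"]
        bucket = os.environ["S3_BUCKET_NAME"]
        key = os.environ["S3_DEFAULT_KEY"]
        s3.upload_file(csv_path, bucket, key)
        print(f"Uploaded {csv_path} to s3://{bucket}/{key}")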

Downloading and viewing a CSV file

Download and display the todo list from S3:

poetry run read-todo
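
Reading it back is the mirror image: fetch the object and print its rows. A minimal sketch of the approach (not the project's exact implementation):

    import csv
    import io
    import os

    def read_todo(s3):
        """Download the todo CSV from the emulated bucket and print it row by row."""
        bucket = os.environ["S3_BUCKET_NAME"]
        key = os.environ["S3_DEFAULT_KEY"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        for row in csv.reader(io.StringIO(body)):
            print(" | ".join(row))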

Project Structure

localstack-pro-project/
├── .env                  # Environment variables (create this)
├── .gitignore            # Git ignore file
├── data/                 # Sample data files
│   └── sample-tasks.csv  # Example todo list
├── pyproject.toml        # Poetry configuration
├── README.md             # This file
├── src/                  # Source code
│   └── localstack_pro_project/
│       ├── __init__.py
│       ├── downloader.py # Download CSV from S3
│       ├── setup_localstack.py # Set up LocalStack
│       └── uploader.py   # Upload CSV to S3
└── tests/                # Test files

Development

Adding new dependencies

poetry add package-name

Running tests

poetry run pytest
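
Because LocalStack behaves like real S3, tests can exercise a full upload/download round trip without mocking. A hypothetical example (not a file in this repository), assuming LocalStack is running and the bucket exists:

    # tests/test_roundtrip.py -- hypothetical example
    import os

    import boto3

    def test_upload_then_download(tmp_path):
        s3 = boto3.client(
            "s3",
            endpoint_url=os.environ.get("LOCALSTACK_ENDPOINT", "http://localhost:4566"),
            aws_access_key_id="test",
            aws_secret_access_key="test",
            region_name="us-east-1",
        )
        bucket = os.environ.get("S3_BUCKET_NAME", "todo-list-bucket")
        sample = tmp_path / "tasks.csv"
        sample.write_text("task,done\nwrite tests,yes\n")

        s3.upload_file(str(sample), bucket, "test-tasks.csv")
        body = s3.get_object(Bucket=bucket, Key="test-tasks.csv")["Body"].read()
        assert body.decode("utf-8").startswith("task,done")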

How It Works

This project demonstrates the local-first development workflow described in the chapter "The Local Cloud Revolution: Rethinking AWS Development Workflows":

  1. LocalStack Pro emulates AWS S3 on your machine
  2. The setup_localstack.py script creates the necessary S3 bucket
  3. The uploader.py script pushes a CSV file to the emulated S3 bucket
  4. The downloader.py script retrieves and displays the CSV data

This workflow provides instant feedback, costs nothing to run, and allows risk-free experimentation with AWS services.

Troubleshooting

  • "Bucket already exists" error: This is normal if you've already created the bucket. The script handles this gracefully.
  • Connection issues: Ensure LocalStack is running (check with docker ps). The container should be running with port 4566 exposed; see the health-check sketch after this list.
  • Authentication errors: Check that your LOCALSTACK_AUTH_TOKEN is correctly set in the .env file.
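
For connection issues, recent LocalStack versions also expose a health endpoint you can query directly; a quick check (the /_localstack/health path may differ on older versions):

    import json
    import os
    import urllib.request

    endpoint = os.environ.get("LOCALSTACK_ENDPOINT", "http://localhost:4566")
    with urllib.request.urlopen(f"{endpoint}/_localstack/health") as resp:
        health = json.load(resp)
    print(health.get("services", {}).get("s3"))  # expect "available" or "running"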

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • LocalStack team for providing the local AWS emulation
  • AWS for their comprehensive cloud services and SDKs
  • Docker and Docker Compose for making development simple and efficient
