# LocalStack Pro Project

This project demonstrates how to develop and test AWS applications locally using LocalStack Pro. It provides a simple workflow for uploading and retrieving CSV files from an emulated S3 bucket, avoiding the costs and delays of working against the real AWS cloud.
## Features

- Local AWS S3 emulation using LocalStack Pro
- Command-line tools for uploading and downloading CSV files
- Environment-based configuration for easy switching between environments
- Clean project structure following Python best practices
## Prerequisites

- Python 3.12 or higher
- Docker and Docker Compose
- LocalStack Pro subscription (and authentication token)
- Poetry (for dependency management)
## Installation

1. Clone this repository:

   ```bash
   git clone https://github.com/yourusername/localstack-pro-project.git
   cd localstack-pro-project
   ```

2. Set up the Python environment with Poetry:

   ```bash
   poetry install
   ```

3. Create a `.env` file in the project root with your configuration:

   ```ini
   LOCALSTACK_ENDPOINT=http://localhost:4566
   LOCALSTACK_AUTH_TOKEN=your_localstack_pro_token
   AWS_ACCESS_KEY_ID=test
   AWS_SECRET_ACCESS_KEY=test
   AWS_REGION=us-east-1
   S3_BUCKET_NAME=todo-list-bucket
   S3_DEFAULT_KEY=todo-list.csv
   CSV_FILE_PATH=./data/sample-tasks.csv
   ```

4. Start LocalStack Pro using Docker Compose:

   ```bash
   docker compose up -d localstack
   ```
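The project's scripts can resolve these settings and talk to LocalStack through a standard boto3 client pointed at the emulated endpoint. A minimal sketch (the helper names are illustrative, not the project's actual API; loading of the `.env` file itself, e.g. via python-dotenv, is omitted):

```python
import os

# Defaults mirror the .env values above; real environment variables win when set.
_DEFAULTS = {
    "LOCALSTACK_ENDPOINT": "http://localhost:4566",
    "AWS_ACCESS_KEY_ID": "test",
    "AWS_SECRET_ACCESS_KEY": "test",
    "AWS_REGION": "us-east-1",
    "S3_BUCKET_NAME": "todo-list-bucket",
    "S3_DEFAULT_KEY": "todo-list.csv",
    "CSV_FILE_PATH": "./data/sample-tasks.csv",
}


def setting(name: str) -> str:
    """Read one setting from the environment, falling back to the defaults."""
    return os.environ.get(name, _DEFAULTS[name])


def make_s3_client():
    """Build a boto3 S3 client that targets LocalStack instead of real AWS.

    The dummy "test" credentials are all LocalStack needs.
    """
    import boto3  # assumed project dependency; imported lazily here

    return boto3.client(
        "s3",
        endpoint_url=setting("LOCALSTACK_ENDPOINT"),
        aws_access_key_id=setting("AWS_ACCESS_KEY_ID"),
        aws_secret_access_key=setting("AWS_SECRET_ACCESS_KEY"),
        region_name=setting("AWS_REGION"),
    )
```

Because only `endpoint_url` differs from a real AWS setup, switching between LocalStack and the cloud is just a matter of changing one environment variable.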
## Usage

Initialize the S3 bucket in LocalStack:

```bash
poetry run setup-localstack
```

Upload a todo list CSV file to the S3 bucket:

```bash
poetry run upload-todo
```

You can also specify a different CSV file:

```bash
poetry run upload-todo /path/to/your/file.csv
```

Download and display the todo list from S3:

```bash
poetry run read-todo
```

## Project Structure

```
localstack-pro-project/
├── .env                         # Environment variables (create this)
├── .gitignore                   # Git ignore file
├── data/                        # Sample data files
│   └── sample-tasks.csv         # Example todo list
├── pyproject.toml               # Poetry configuration
├── README.md                    # This file
├── src/                         # Source code
│   └── localstack_pro_project/
│       ├── __init__.py
│       ├── downloader.py        # Download CSV from S3
│       ├── setup_localstack.py  # Set up LocalStack
│       └── uploader.py          # Upload CSV to S3
└── tests/                       # Test files
```
## Development

To add a new dependency:

```bash
poetry add package-name
```

To run the tests:

```bash
poetry run pytest
```

## How It Works

This project demonstrates the local-first development workflow described in "The Local Cloud Revolution: Rethinking AWS Development Workflows" chapter:

- We use LocalStack Pro to emulate AWS S3 on your machine
- The `setup_localstack.py` script creates the necessary S3 bucket
- The `uploader.py` script pushes a CSV file to the emulated S3 bucket
- The `downloader.py` script retrieves and displays the CSV data
This workflow provides instant feedback, costs nothing to run, and allows risk-free experimentation with AWS services.
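The download step described above can be sketched as follows. This is a simplified illustration, not the actual contents of `downloader.py`: the function names are hypothetical, and only the `get_object` call is assumed from boto3's S3 client API.

```python
import csv
import io


def parse_todo_csv(data: bytes) -> list[list[str]]:
    """Turn raw CSV bytes (e.g. the Body of an S3 get_object response)
    into a list of rows."""
    return list(csv.reader(io.StringIO(data.decode("utf-8"))))


def read_todo(s3_client, bucket: str, key: str) -> list[list[str]]:
    """Fetch the todo list from S3 (or LocalStack) and parse it.

    `s3_client` is assumed to be a boto3 S3 client; only get_object is used.
    """
    body = s3_client.get_object(Bucket=bucket, Key=key)["Body"].read()
    return parse_todo_csv(body)
```

Because the client is the only AWS-aware piece, the same code runs unchanged against LocalStack or the real cloud.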
## Troubleshooting

- **"Bucket already exists" error**: This is normal if you've already created the bucket. The script handles it gracefully.
- **Connection issues**: Ensure LocalStack is running with `docker ps`. The container should be up and port 4566 should be exposed.
- **Authentication errors**: Check that `LOCALSTACK_AUTH_TOKEN` is correctly set in the `.env` file.
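The graceful handling of an already-existing bucket might look like the following sketch (hypothetical helper; the real `setup_localstack.py` may differ, and in practice you would catch `botocore.exceptions.ClientError` rather than a bare `Exception`):

```python
def ensure_bucket(s3_client, name: str) -> bool:
    """Create the bucket if needed; return True if it was newly created.

    `s3_client` is assumed to be a boto3 S3 client; only create_bucket is used.
    """
    try:
        s3_client.create_bucket(Bucket=name)
        return True
    except Exception as exc:  # botocore.exceptions.ClientError in practice
        # S3 signals an existing bucket with these error codes.
        if "BucketAlreadyOwnedByYou" in str(exc) or "BucketAlreadyExists" in str(exc):
            return False
        raise
```

Re-running the setup command is therefore always safe: a second call simply reports that the bucket already exists.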
## License

This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgements

- LocalStack team for providing the local AWS emulation
- AWS for their comprehensive cloud services and SDKs
- Docker and Docker Compose for making development simple and efficient