Enterprise-grade, production-ready NestJS boilerplate with modern architecture patterns
- Architecture Overview
- Architecture Comparison
- Layer Communication Rules
- Project Structure
- Quick Start
- User Flow
- Documentation Guides
- Key Features
- Tech Stack
- Contributing
- License
This project implements a pragmatic architecture that combines the best ideas from Clean Architecture, Domain-Driven Design (DDD), and Hexagonal Architecture. Rather than strictly following one pattern, it takes a practical approach: powerful enough for enterprise applications, yet simple enough for any developer to understand and maintain.
The architecture is built around one fundamental principle: protect your business logic. Your domain rules should never depend on frameworks, databases, or external services. If you decide to switch from PostgreSQL to MongoDB, or from Redis to Memcached, your core business logic remains untouched.
| Pattern | This Project | Key Difference |
|---|---|---|
| Clean Architecture | ✅ Implements | Simplified layers without over-engineering |
| Domain-Driven Design | ✅ Implements | Entities and Use Cases without complex aggregates |
| Hexagonal Architecture | ✅ Implements | Ports (interfaces) and Adapters (implementations) |
| Onion Architecture | ✅ Implements | Core at center, dependencies point inward |
Clean Architecture organizes code into concentric circles where dependencies point inward. The innermost circle contains business rules, and outer circles contain implementation details.
How we implement it:
- Entities live in `src/core/*/entity` → pure business objects
- Use Cases live in `src/core/*/use-cases` → application-specific business rules
- Interfaces live in `src/core/*/repository` → contracts for external dependencies
- Frameworks live in `src/modules` and `src/infra` → NestJS controllers and database implementations
DDD focuses on modeling your business domain. It introduces concepts like Entities, Value Objects, Aggregates, and Repositories.
How we implement it:
- Entities: Objects with identity that persist over time (`UserEntity`, `RoleEntity`)
- Repository Pattern: Abstract interfaces defining data access contracts
- Use Cases: Encapsulate business operations (similar to Application Services in DDD)
- Bounded Contexts: Each module represents a bounded context
What we simplified:
- Simplified Aggregates → entities can be grouped but without strict root enforcement
- No Domain Events infrastructure → use the event system in `libs/` when needed
- No Value Objects as separate classes → Zod schemas handle validation (sketched below)
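For illustration, here is a minimal sketch of a Zod schema standing in for what classic DDD would model as a Value Object; the `EmailSchema` and field names are hypothetical, not taken from the project:

```ts
import { z } from 'zod';

// Where classic DDD would define an Email Value Object class,
// a Zod schema captures the same invariant declaratively
export const EmailSchema = z.string().email().max(320);

// The entity schema composes it, so an invalid email never reaches the domain
export const UserEntitySchema = z.object({
  id: z.string().uuid(),
  name: z.string().min(1),
  email: EmailSchema
});

export type User = z.infer<typeof UserEntitySchema>;
```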
Hexagonal Architecture separates the application from external concerns through Ports (interfaces) and Adapters (implementations).
How we implement it:
```
┌────────────────────────────────────────────────────────────────┐
│                            ADAPTERS                            │
│  ┌─────────────┐  ┌──────────────┐  ┌─────────────────────┐    │
│  │ Controllers │  │ Repositories │  │  External Services  │    │
│  │ (modules/)  │  │  (modules/)  │  │      (infra/)       │    │
│  └──────┬──────┘  └──────┬───────┘  └──────────┬──────────┘    │
│         │                │                     │               │
│         ▼                ▼                     ▼               │
│  ┌──────────────────────────────────────────────────────────┐  │
│  │                          PORTS                           │  │
│  │              (core/*/repository interfaces)              │  │
│  └────────────────────────────┬─────────────────────────────┘  │
│                               │                                │
│                               ▼                                │
│  ┌──────────────────────────────────────────────────────────┐  │
│  │                           CORE                           │  │
│  │             Entities + Use Cases + Interfaces            │  │
│  │                       (src/core/)                        │  │
│  └──────────────────────────────────────────────────────────┘  │
└────────────────────────────────────────────────────────────────┘
```
Ports (Interfaces):

- `ICatRepository` → defines what operations are available
- `IHttpAdapter` → defines the HTTP client contract
- `ICacheAdapter` → defines the caching contract

Adapters (Implementations):

- `CatRepository` in `modules/` → implements `ICatRepository` with TypeORM/Mongoose
- `HttpService` in `infra/` → implements `IHttpAdapter` with Axios
- `RedisService` in `infra/` → implements `ICacheAdapter` with Redis
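A minimal sketch of that pairing, using illustrative method names rather than the project's actual contracts:

```ts
// Port - the contract the core depends on (an abstract class, so it can double as a DI token)
export abstract class ICacheAdapter {
  abstract get(key: string): Promise<string | null>;
  abstract set(key: string, value: string): Promise<void>;
}

// Adapter - one possible implementation; a Redis-backed service in infra/
// would satisfy the same contract without the core ever noticing
export class InMemoryCacheAdapter implements ICacheAdapter {
  private readonly store = new Map<string, string>();

  async get(key: string): Promise<string | null> {
    return this.store.get(key) ?? null;
  }

  async set(key: string, value: string): Promise<void> {
    this.store.set(key, value);
  }
}
```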
⚠️ Important Notes About This Architecture

This section explains some deliberate choices that differ from traditional implementations. Understanding these decisions will help you work with the codebase effectively.
You may notice that some interfaces in this project use the word "Adapter" (e.g., IHttpAdapter, ICacheAdapter). In traditional Hexagonal Architecture:
- Port: An interface that defines a contract (what operations are available)
- Adapter: A concrete implementation that fulfills that contract (how it's done)
The academic distinction:
```
┌───────────────────────────────────────────────────────────────┐
│                     HEXAGONAL (Traditional)                   │
├───────────────────────────────────────────────────────────────┤
│                                                               │
│   Port (Interface)           Adapter (Implementation)        │
│   ─────────────────          ─────────────────────────       │
│   IUserRepository     ◄───   PostgresUserRepository          │
│   IEmailService       ◄───   SendGridEmailService            │
│   ICacheService       ◄───   RedisCacheService               │
│                                                               │
└───────────────────────────────────────────────────────────────┘
```
Our simplified approach:
We use "Adapter" in interface names because, conceptually, both are abstractions. The fundamental principle is the same: decouple your core business logic from implementation details. Whether you call the interface a "Port" or "Adapter" doesn't change how the pattern works.
```ts
// Traditional naming
interface IUserRepository { }        // Port
class PostgresUserRepository { }     // Adapter

// Our naming (simplified)
interface IHttpAdapter { }           // Still an abstraction (contract)
class HttpService { }                // Still an implementation
```

Why this simplification?

- Reduced cognitive load → One less concept to explain to new developers
- Practical focus → The behavior is identical regardless of naming
- Consistency → All abstractions follow the same `I*Adapter` pattern
The key takeaway: if it's an interface, it's a contract. If it's a class implementing that interface, it's the implementation. The names are just labels.
You may notice that repository contracts use abstract class instead of TypeScript interface:
```ts
// What we use
export abstract class ICatRepository extends IRepository<CatEntity> {
  abstract findByBreed(breed: string): Promise<CatEntity[]>
}

// Instead of
export interface ICatRepository extends IRepository<CatEntity> {
  findByBreed(breed: string): Promise<CatEntity[]>
}
```

Why? This is a NestJS/Node.js limitation.
TypeScript interfaces are erased at runtime β they don't exist in the compiled JavaScript. NestJS dependency injection relies on runtime tokens to resolve providers. If we used interfaces, we would need to pass a string token:
```ts
// ❌ With interface → requires a string token
@Module({
  providers: [
    {
      provide: 'ICatRepository', // String token (error-prone, no type safety)
      useClass: CatRepository,
    },
  ],
})

// ✅ With abstract class → the class itself is the token
@Module({
  providers: [
    {
      provide: ICatRepository, // Class reference (type-safe, refactorable)
      useClass: CatRepository,
    },
  ],
})
```

Benefits of abstract classes:
- Type safety → No magic strings, refactoring tools work correctly
- Runtime existence → The class exists in compiled JavaScript
- Same behavior → Acts as a contract just like an interface
- Better DX → IDE autocomplete and "Go to Definition" work properly
The trade-off: Abstract classes can have implementation details (which interfaces cannot). We simply don't use that feature; our abstract classes are pure contracts.
Yes, we know. The src/middlewares/ folder contains:
- Middlewares (authentication)
- Guards (authorization)
- Interceptors (logging, tracing)
- Filters (exception handling)
Why didn't we split them?
Honestly? We couldn't find a better name. 🤷
We tried:
- `http-pipeline/` → too generic
- `request-handlers/` → not quite right
- `cross-cutting/` → sounds like a buzzword bingo winner
- `stuff-that-runs-before-and-after-your-code/` → accurate but... no
So we stuck with middlewares/ because:
- They all operate in the HTTP request/response lifecycle
- They're all "things that wrap your controller logic"
- Everyone knows where to find them
If you have a better name, PRs are welcome! Until then, just accept that middlewares/ is a "creative interpretation" of the term. 😄
This is a fundamental difference from many Clean Architecture implementations.
The traditional approach (Clean Architecture):
Controller → Validates Input → Use Case → Business Logic
In traditional Clean Architecture, input validation happens in the Controller or a dedicated Validation layer before reaching the Use Case. The Use Case assumes it receives valid data.
Our approach:
Controller → Use Case (Validates + Business Logic)
We validate inputs inside the Use Case using Zod schemas.
Why we made this choice:
- Testability

  When you test a Use Case, you should test the complete behavior, including validation. It's unacceptable to have a Use Case that passes tests but fails in production because validation was bypassed.

  ```ts
  // Our tests validate the complete use case behavior
  it('should throw validation error for invalid email', async () => {
    const input = { email: 'invalid-email', name: 'John' };
    await expect(useCase.execute(input)).rejects.toThrow(ValidationException);
  });
  ```

- Use Case Integrity

  A Use Case is a complete unit of business logic. If `CreateUserUseCase` requires a valid email, that validation IS part of the use case, not something external to it.

- Self-Documenting Code

  Looking at a Use Case, you immediately see what inputs it expects and how they're validated. No need to hunt through multiple layers.

- Reduced Duplication

  If multiple controllers call the same Use Case, validations are automatically applied. No risk of one controller forgetting to validate.
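As a rough sketch of what this looks like in code (schema and field names are illustrative; the project's own use cases apply schemas via the `@ValidateSchema` decorator shown later):

```ts
import { randomUUID } from 'node:crypto';
import { z } from 'zod';

const UserCreateSchema = z.object({
  name: z.string().min(1),
  email: z.string().email()
});

type UserCreateInput = z.infer<typeof UserCreateSchema>;

export class UserCreateUseCase {
  async execute(input: UserCreateInput): Promise<{ id: string; email: string }> {
    // Validation is part of the use case: invalid input never reaches the business rules
    const validated = UserCreateSchema.parse(input);

    // ...business logic with validated data...
    return { id: randomUUID(), email: validated.email };
  }
}
```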
Comparison with other approaches:
| Approach | Validation Location | Pros | Cons |
|---|---|---|---|
| Traditional Clean | Controller/Validator layer | Thin use cases | Validation can be bypassed, harder to test |
| DDD | Domain entities (Value Objects) | Rich domain model | Complex, verbose |
| Our Approach | Inside Use Case | Complete testability, self-contained | See trade-offs below |
The trade-off:
If you need to consume a Use Case from multiple entry points with different validation rules, the Use Case validations might be too restrictive.
Solution: For those cases, move specific validations to the Application layer (Controller/Adapter). The Use Case can have minimal validations (or none), and each consumer applies its own rules:
```ts
// Controller A - Web API (strict validation)
@Post()
async create(@Body() input: CreateUserInput): Promise<UserCreateOutput> {
  // Validate for web context
  const validated = WebUserSchema.parse(input);
  return this.useCase.execute(validated);
}

// Controller B - Internal service (different validation)
async createFromInternal(input: InternalUserInput): Promise<UserCreateOutput> {
  // Validate for internal context
  const validated = InternalUserSchema.parse(input);
  return this.useCase.execute(validated);
}
```

Our recommendation: Start with validations inside Use Cases. Only move them out when you have a concrete need for different validation rules per consumer.
The `core/` folder is sacred: it contains your business logic and must remain pure and independent. Here are the key rules to follow:

An anemic entity is just a data container with no behavior, essentially a DTO. This is an anti-pattern because business logic ends up scattered across use cases and services.
| ❌ Avoid | ✅ Prefer |
|---|---|
| Entity with only properties | Entity with properties and behavior |
| Business logic in Use Cases | Business logic in the Entity when it relates to state |
| Calculations outside entity | Calculations as entity methods |
Ask yourself: "Does this logic relate to the entity's state?" If yes, it belongs in the entity.
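For instance, a small sketch of the difference (the `isSenior` rule and its age threshold are made up for illustration):

```ts
// Anemic: data only, so the "senior cat" rule ends up duplicated across use cases
export class AnemicCatEntity {
  name!: string;
  age!: number;
}

// Rich: the rule lives next to the state it depends on
export class RichCatEntity {
  name!: string;
  age!: number;

  isSenior(): boolean {
    return this.age >= 10;
  }
}
```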
📖 See detailed examples: Entity Guide → includes Rich Entity vs Anemic Entity comparison
Every Entity must extend the BaseEntity class. This is mandatory in this project.
| ❌ Avoid | ✅ Prefer |
|---|---|
| `class UserEntity { }` | `class UserEntity extends BaseEntity<UserEntity>() { }` |
| `export class CatEntity { }` | `export class CatEntity extends BaseEntity<CatEntity>() { }` |
```ts
// ❌ WRONG - Not extending BaseEntity
export class CatEntity {
  id!: string
  name!: string
  breed!: string
  age!: number
  createdAt?: Date
  updatedAt?: Date
  deletedAt?: Date

  constructor(entity: Cat) {
    Object.assign(this, entity)
  }
}

// ✅ CORRECT - Extends BaseEntity
import { BaseEntity } from '@/utils/entity'

export class CatEntity extends BaseEntity<CatEntity>() {
  name!: Cat['name']
  breed!: Cat['breed']
  age!: Cat['age']

  constructor(entity: Cat) {
    super(CatEntitySchema)
    this.validate(entity)
    this.ensureID()
  }
}
```

What BaseEntity provides:
- Common properties → `id`, `createdAt`, `updatedAt`, `deletedAt` are inherited
- Validation → `validate(entity)` method validates input against the Zod schema
- ID generation → `ensureID()` generates a UUID if not provided
- Status methods → `isActive()`, `isDeleted()`, `activate()`, `deactivate()`
- Serialization → `toObject()` returns a plain object, `clone()` creates a copy
- Type safety → `nameOf()` provides type-safe property names
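A rough usage sketch of these inherited helpers, building on the `CatEntity` above; the exact signatures may differ slightly from this outline:

```ts
const cat = new CatEntity({ name: 'Mia', breed: 'Siamese', age: 3 });

const id = cat.id;               // UUID generated by ensureID() when none was provided
const deleted = cat.isDeleted(); // false until deletedAt is set
cat.deactivate();                // inherited status helper
const plain = cat.toObject();    // plain object, e.g. for persistence or HTTP responses
```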
Constructor pattern:
Every entity constructor must follow this pattern:
```ts
constructor(entity: Cat) {
  super(CatEntitySchema)  // 1. Pass Zod schema to parent
  this.validate(entity)   // 2. Validate and assign properties
  this.ensureID()         // 3. Generate ID if not provided
}
```

📖 See detailed examples: Entity Guide → includes full entity implementation
A Use Case must never, absolutely never know about concrete implementations. It should only work with abstractions (interfaces).
This is the most important rule: the Use Case receives abstractions, never implementations.
A Use Case may only import:

- ✅ Entities (`core/*/entity`)
- ✅ Repository interfaces (`core/*/repository`)
- ✅ Adapter interfaces (`IHttpAdapter`, `ICacheAdapter`, etc.)
- ✅ Utils and decorators (`utils/`)
- ✅ Types and interfaces
| ❌ Avoid | ✅ Prefer |
|---|---|
| `import { Controller } from '@nestjs/common'` | No framework imports |
| `import { UserRepository } from 'modules/user/repository'` | `import { IUserRepository } from 'core/user/repository'` |
| `import { HttpService } from 'infra/http'` | `import { IHttpAdapter } from 'infra/http'` (interface only) |
| Direct database calls (TypeORM, Mongoose) | Repository interface methods |
| `new RedisService()` | Receive `ICacheAdapter` via constructor |
The golden rule:
```ts
// ❌ WRONG - Use Case knows the implementation
import { HttpService } from '@/infra/http/service';

class MyUseCase {
  constructor(private http: HttpService) {} // Concrete class!
}

// ✅ CORRECT - Use Case only knows the abstraction
import { IHttpAdapter } from '@/infra/http/adapter';

class MyUseCase implements IUsecase {
  constructor(private http: IHttpAdapter) {} // Interface!
}
```

Why? The Use Case should work identically whether:
- Called from a REST controller, GraphQL resolver, CLI, or message queue
- Using Redis or Memcached for cache
- Using Axios or Fetch for HTTP
- Running in tests with mocks
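That last point is easy to see in a test: because the use case receives only the abstraction, a plain object is a valid test double. A sketch assuming Jest and the `GetExternalDataUseCase` shown later in this document:

```ts
// The use case depends on IHttpAdapter, so any object with the right shape will do
const httpMock = {
  get: jest.fn().mockResolvedValue({ data: { ok: true } })
} as unknown as IHttpAdapter;

it('fetches data through the abstraction, not a concrete HTTP client', async () => {
  const useCase = new GetExternalDataUseCase(httpMock);
  await expect(useCase.execute()).resolves.toEqual({ ok: true });
});
```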
Every Use Case must implement the IUsecase interface. This is mandatory in this project.
| ❌ Avoid | ✅ Prefer |
|---|---|
| `class MyUseCase { }` | `class MyUseCase implements IUsecase { }` |
| `export class CreateUserUseCase { }` | `export class CreateUserUseCase implements IUsecase { }` |
```ts
// ❌ WRONG - Not implementing IUsecase
export class CatCreateUsecase {
  constructor(private readonly catRepository: ICatRepository) {}

  async execute(input: CatCreateInput): Promise<CatCreateOutput> {
    // ...
  }
}

// ✅ CORRECT - Implements IUsecase
import { IUsecase } from '@/utils/usecase';

export class CatCreateUsecase implements IUsecase {
  constructor(private readonly catRepository: ICatRepository) {}

  @ValidateSchema(CatCreateSchema)
  async execute(input: CatCreateInput): Promise<CatCreateOutput> {
    // ...
  }
}
```

Why this matters:
- Contract enforcement → Ensures all Use Cases have the same structure
- Dependency injection → NestJS can properly inject and resolve Use Cases
- Type safety → TypeScript validates that the `execute()` method exists
- Consistency → Every Use Case follows the same pattern across the project
📖 See detailed patterns: Use Case Guide → includes architecture diagrams and testing patterns
The repository interface should only declare methods that don't exist in the generic IRepository<T>. The generic repository already provides 20+ methods:
| ❌ Avoid | ✅ Prefer |
|---|---|
| Declaring `create()`, `findById()`, `update()` | Already inherited from `IRepository<T>` |
| Duplicating generic query methods | Only add domain-specific queries |
```ts
// ❌ Wrong - These already exist in IRepository
export abstract class ICatRepository extends IRepository<CatEntity> {
  abstract create(entity: CatEntity): Promise<CatEntity>  // Already exists!
  abstract findById(id: string): Promise<CatEntity>       // Already exists!
}

// ✅ Correct - Only domain-specific methods
export abstract class ICatRepository extends IRepository<CatEntity> {
  abstract paginate(input: CatListInput): Promise<CatListOutput>
  abstract findByBreed(breed: string): Promise<CatEntity[]>
}
```

📖 See full method list: Repository Guide → includes `IRepository<T>` generic methods and examples
Controllers and Adapters must never contain business logic. Their responsibility is limited to:
- Orchestration → Receive request, call use case, return response
- Input standardization → Transform and normalize inputs for the use case
| ❌ Avoid | ✅ Prefer |
|---|---|
| Calculations in controller | Move to Use Case or Entity |
| Conditional business rules | Move to Use Case |
| Data manipulation | Move to Use Case |
| Multiple repository calls | Move to Use Case |
When input standardization is OK:
We standardize listing inputs (pagination, sorting, search) in the Controller before calling the Use Case:
```ts
// ✅ OK - Standardizing pagination inputs (not business logic)
@Get()
@Version('1')
@Permission('cat:list')
async list(@Req() { query }: ApiRequest): Promise<CatListOutput> {
  const input: CatListInput = {
    sort: SortHttpSchema.parse(query.sort),
    search: SearchHttpSchema.parse(query.search),
    limit: Number(query.limit),
    page: Number(query.page)
  }

  return await this.listUsecase.execute(input)
}

// ❌ WRONG - Business logic in controller
@Post()
@Version('1')
@Permission('cat:create')
async create(@Req() { body }: ApiRequest): Promise<CatCreateOutput> {
  // DON'T DO THIS - business logic belongs in Use Case
  if (body.age > 10) {
    body.status = 'senior';
  }
  const discount = body.price * 0.1; // Business calculation!
  return await this.createUsecase.execute({ ...body, discount });
}
```

📖 See detailed patterns: Controller Guide and Adapter Guide → includes examples and best practices
The core/ folder must remain pure and framework-agnostic. Never import external libraries directly into entities or use cases.
| ❌ Avoid in Core | ✅ Prefer |
|---|---|
| `import axios from 'axios'` | Use the `IHttpAdapter` interface |
| `import { Repository } from 'typeorm'` | Use the `IRepository<T>` interface |
| `import moment from 'moment'` | Use `utils/date` or native `Date` |
| `import _ from 'lodash'` | Use `utils/collection` or native methods |
| `import Redis from 'ioredis'` | Use the `ICacheAdapter` interface |
Why?
If you import axios directly into a Use Case:
- You can't easily test it (you need to mock axios globally)
- You can't swap to `fetch` or another HTTP client
- Your core business logic is coupled to a specific library
```ts
// ❌ WRONG - External library in Use Case
import axios from 'axios';

export class GetExternalDataUseCase {
  async execute(): Promise<ExternalData> {
    const response = await axios.get('https://api.example.com/data');
    return response.data;
  }
}

// ✅ CORRECT - Use abstraction
import { IHttpAdapter } from '@/infra/http/adapter';
import { IUsecase } from '@/utils/usecase';

export class GetExternalDataUseCase implements IUsecase {
  constructor(private readonly http: IHttpAdapter) {}

  async execute(): Promise<ExternalData> {
    const response = await this.http.get({ url: 'https://api.example.com/data' });
    return response.data;
  }
}
```

Allowed in Core:
- ✅ Zod (validation is part of domain logic)
- ✅ Native Node.js/JavaScript APIs
- ✅ Your own `utils/` functions
Need an external library? If you need functionality from an external library, create a centralized wrapper in libs/ or utils/:
```ts
// ❌ WRONG - Using lodash directly in Use Case
import _ from 'lodash';

export class MyUseCase {
  execute(data: Product[]): Record<string, Product[]> {
    return _.groupBy(data, 'category'); // Direct lodash usage
  }
}

// ✅ CORRECT - Create a centralized wrapper
// utils/collection.ts
import _ from 'lodash';

export const CollectionUtil = {
  groupBy: <T>(array: T[], key: keyof T) => _.groupBy(array, key),
  uniqBy: <T>(array: T[], key: keyof T) => _.uniqBy(array, key),
  // ... expose only what you need
};

// Then in Use Case
import { CollectionUtil } from '@/utils/collection';
import { IUsecase } from '@/utils/usecase';

export class MyUseCase implements IUsecase {
  execute(data: Product[]): Record<string, Product[]> {
    return CollectionUtil.groupBy(data, 'category'); // ✅ Uses wrapper
  }
}
```

Benefits of centralization:
- Single point of change if you need to swap libraries
- Easier to mock in tests
- Controls which functions are exposed
- Documents which external libs are used in the project
When creating Input/Output types, always derive them from the Entity. This is mandatory to avoid property duplication.
```ts
// ❌ WRONG - Duplicating properties that exist in Entity
type UserCreateInput = {
  name: string;     // Already in UserEntity!
  email: string;    // Already in UserEntity!
  password: string; // Already in UserEntity!
};

// ✅ CORRECT - Compose from Entity
type UserCreateInput = Pick<UserEntity, 'name' | 'email' | 'password'>;

// ✅ CORRECT - Extend when needed
type UserUpdateInput = Pick<UserEntity, 'id'> & Partial<Pick<UserEntity, 'name' | 'email'>>;

// ✅ CORRECT - Omit sensitive fields for output
type UserOutput = Omit<UserEntity, 'password' | 'deletedAt'>;
```

When you need runtime validation, use a Zod schema with `z.infer`. Zod has built-in `pick` and `omit` methods for composition:
```ts
// ✅ Schema with validation using pick (cleaner)
const UserCreateSchema = UserEntitySchema.pick({
  name: true,
  email: true,
  password: true,
});

// ✅ Schema using omit (exclude fields)
const UserOutputSchema = UserEntitySchema.omit({
  password: true,
  deletedAt: true,
});

// ✅ Infer types from schemas
type UserCreateInput = z.infer<typeof UserCreateSchema>;
type UserOutput = z.infer<typeof UserOutputSchema>;
```

📖 See detailed patterns: Entity Guide → includes schema composition examples
Never use prefixes or suffixes like DTO, ViewModel, Request, Response. The standard naming convention is:
| ❌ Avoid | ✅ Use |
|---|---|
| `CreateUserDTO` | `UserCreateInput` |
| `UserResponseDTO` | `UserCreateOutput` |
| `UserViewModel` | `UserOutput` |
| `GetUserRequest` | `UserGetInput` |
| `UserListResponse` | `UserListOutput` |
Pattern: {Entity}{Action}{Input|Output}
```ts
// Naming examples
type UserCreateInput = Pick<UserEntity, 'name' | 'email' | 'password'>;
type UserCreateOutput = Pick<UserEntity, 'id' | 'name' | 'email' | 'createdAt'>;

type UserUpdateInput = Pick<UserEntity, 'id'> & Partial<Pick<UserEntity, 'name'>>;
type UserUpdateOutput = Pick<UserEntity, 'id' | 'name' | 'updatedAt'>;

type UserListInput = { pagination: PaginationInput; search?: string };
type UserListOutput = { data: UserOutput[]; pagination: PaginationOutput };
```

The `any` type defeats the purpose of TypeScript. Always provide explicit types when it makes sense and doesn't create unnecessary complexity.
| ❌ Avoid | ✅ Prefer |
|---|---|
| `function process(data: any)` | `function process(data: UserEntity)` |
| `const result: any = await fetch()` | `const result: ApiResponse = await fetch()` |
| `items.map((item: any) => ...)` | `items.map((item: OrderItem) => ...)` |
When to type:
- Function parameters → Always type them
- Function return types → Type when not obvious from the implementation
- Variables → Type when TypeScript can't infer correctly
- Generics → Use generics instead of `any` for flexible types
```ts
// ❌ WRONG - Using any
const processItems = (items: any[]): any => {
  return items.map((item: any) => item.value);
}

// ✅ CORRECT - Properly typed
const processItems = <T extends { value: number }>(items: T[]): number[] => {
  return items.map((item) => item.value);
}

// ✅ CORRECT - Using unknown when the type is truly unknown
const parseJson = (json: string): unknown => {
  return JSON.parse(json);
}
```

When `any` is unavoidable:
Sometimes you genuinely can't type something properly (third-party libraries, complex dynamic types, etc.). In these cases, use eslint-disable to acknowledge the exception:
```ts
// ✅ OK - Acknowledged exception with eslint-disable
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const handleLegacyApi = (response: any): ProcessedData => {
  // Legacy API with unpredictable structure
  return transformLegacyResponse(response);
}

// ✅ OK - Type assertion after validation
const processExternalData = (data: unknown): UserData => {
  if (!isValidUserData(data)) {
    throw new Error('Invalid data');
  }
  return data as UserData;
}
```

The rule of thumb: If you're reaching for `any`, ask yourself:
- Can I use a specific type? → Use it
- Can I use a generic? → Use `<T>`
- Can I use `unknown`? → Safer than `any`
- None of the above work? → Use `any` with `eslint-disable`
This is a project standard: every function must have an explicit return type. TypeScript can infer return types, but explicit declarations improve code readability and catch errors earlier.
| ❌ Avoid | ✅ Prefer |
|---|---|
| `async getUser()` | `async getUser(): Promise<UserEntity>` |
| `const sum = (a, b) =>` | `const sum = (a: number, b: number): number =>` |
| `execute(input)` | `execute(input: CreateInput): Promise<void>` |
```ts
// ❌ WRONG - No explicit return type
async getById(id: string) {
  return await this.repository.findById(id)
}

// ❌ WRONG - Missing Promise<void>
async delete(id: string) {
  await this.repository.delete(id)
}

// ✅ CORRECT - Explicit return types
async getById(id: string): Promise<UserEntity> {
  return await this.repository.findById(id)
}

async delete(id: string): Promise<void> {
  await this.repository.delete(id)
}

// ✅ CORRECT - Even for simple functions
const calculateTotal = (items: OrderItem[]): number => {
  return items.reduce((sum, item) => sum + item.price, 0)
}
```

Why this matters:
- Self-documentation → Reading the function signature tells you exactly what to expect
- Earlier error detection → TypeScript catches mismatches at compile time
- Refactoring safety → Changing the implementation won't accidentally change the return type
- API contracts → Makes interfaces and abstractions crystal clear
Common return types:
| Scenario | Return Type |
|---|---|
| Async operation that returns data | `Promise<EntityType>` |
| Async operation with no return | `Promise<void>` |
| Sync function returning a value | `string`, `number`, `boolean`, etc. |
| Function returning nothing | `void` |
| Function that may return null | `Promise<Entity \| null>` |
In DDD, an Aggregate is a cluster of related entities that are treated as a single unit. When entities belong to the same aggregate, they can live together in the same folder.
Example: User Aggregate
If User and Address are always created/updated together and Address has no meaning without a User, they belong to the same aggregate:
```
core/
└── user/
    ├── entity/
    │   ├── user.ts           # Aggregate Root
    │   └── address.ts        # Belongs to User aggregate
    ├── repository/
    │   ├── user.ts           # Main repository
    │   └── address.ts        # Can have its own repository if needed
    └── use-cases/
        ├── user-create.ts    # May create User + Address together
        └── address-update.ts # Can update Address independently
```
When to use aggregates:
| Scenario | Same Folder (Aggregate) | Separate Folders |
|---|---|---|
| Entities always created together | ✅ | ❌ |
| Child has no meaning without parent | ✅ | ❌ |
| Shared business rules | ✅ | ❌ |
| Entities are independent | ❌ | ✅ |
| Different lifecycles | ❌ | ✅ |
Key rules:
- Aggregate Root → One entity is the "root" (e.g., `User`). External access should go through it
- Transactional consistency → Operations within an aggregate should be atomic
- Own rules → Each entity can still have its own validation and behavior
- Separate repositories are OK → `Address` can have its own repository for specific queries
Practical example:
```ts
// user-create.ts - Creates User with Address in the same transaction
import { IUsecase } from '@/utils/usecase';

export class UserCreateUseCase implements IUsecase {
  constructor(
    private readonly userRepository: IUserRepository,
    private readonly addressRepository: IAddressRepository,
  ) {}

  async execute(input: UserCreateInput): Promise<UserCreateOutput> {
    // Create both as part of the same aggregate operation
    const user = new UserEntity(input.user);
    const address = new AddressEntity({ ...input.address, userId: user.id });

    await this.userRepository.create(user);
    await this.addressRepository.create(address);

    return new UserCreateOutput(user);
  }
}
```

Don't over-engineer: Not everything needs to be an aggregate. Start simple; if you notice entities are always manipulated together, then group them.
Understanding which layers can communicate with which is crucial for maintaining the architecture.
Dependencies always point inward. Inner layers never know about outer layers.
| Layer | Can Access | Cannot Access |
|---|---|---|
| Core (Entities) | Nothing | Everything else |
| Core (Use Cases) | Entities, Repository Interfaces | Modules, Infra, Libs |
| Core (Repositories) | Entities | Everything else (it's just an interface) |
| Modules | Core (all), Infra, Libs | - |
| Infra | Core Interfaces | Core Use Cases, Modules |
| Libs | Nothing from src/ | - |
```
                      ┌─────────────────────┐
                      │       MODULES       │
                      │    (Controllers,    │
                      │      Adapters)      │
                      └──────────┬──────────┘
                                 │ uses
                                 ▼
┌──────────────┐      ┌─────────────────────┐      ┌──────────────┐
│    INFRA     │◄─────│        CORE         │─────►│     LIBS     │
│  (Database,  │      │   (Entities, Use    │      │   (Tokens,   │
│   Cache,     │      │    Cases, Repo      │      │   Events,    │
│    HTTP)     │      │    Interfaces)      │      │    i18n)     │
└──────┬───────┘      └──────────▲──────────┘      └──────────────┘
       │                         │
       └─────────────────────────┘
               implements
```
When a user creates a new cat:
```
1. Controller (modules/cat/controller.ts)
   └── receives HTTP request

2. Adapter (modules/cat/adapter.ts)
   └── transforms request, calls use case

3. Use Case (core/cat/use-cases/cat-create.ts)
   ├── contains business logic
   └── calls repository interface

4. Repository Interface (core/cat/repository/cat.ts)
   └── defines contract (what, not how)

5. Repository Implementation (modules/cat/repository.ts)
   ├── implements the interface
   └── uses TypeORM/Mongoose to persist data
```
The use case never knows if data goes to PostgreSQL, MongoDB, or a mock. It only knows it has a repository that can create(), update(), delete(), and findById().
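To make the wiring concrete, here is a hedged sketch of how a module might bind these pieces together with NestJS providers; the import paths and factory arguments are illustrative rather than copied from the project:

```ts
import { Module } from '@nestjs/common';

import { ICatRepository } from '@/core/cat/repository/cat';
import { CatCreateUsecase } from '@/core/cat/use-cases/cat-create';
import { CatController } from './controller';
import { CatRepository } from './repository';

@Module({
  controllers: [CatController],
  providers: [
    // The abstract class is the DI token; the concrete repository fulfils it
    { provide: ICatRepository, useClass: CatRepository },
    // The use case receives only the abstraction
    {
      provide: CatCreateUsecase,
      useFactory: (repository: ICatRepository): CatCreateUsecase => new CatCreateUsecase(repository),
      inject: [ICatRepository]
    }
  ]
})
export class CatModule {}
```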
```
src/
├── core/                   # 🧠 Business Logic (Framework-agnostic)
│   └── [module]/
│       ├── entity/         # Domain entities with Zod validation
│       ├── repository/     # Repository interfaces (contracts)
│       ├── use-cases/      # Business rules and operations
│       └── __tests__/      # Unit tests for use cases
│
├── modules/                # 🌐 NestJS Application Layer
│   └── [module]/
│       ├── adapter.ts      # Connects controllers to use cases
│       ├── controller.ts   # HTTP endpoints
│       ├── module.ts       # NestJS module definition
│       ├── repository.ts   # Repository implementation
│       └── swagger.ts      # API documentation
│
├── infra/                  # 🔧 Infrastructure Layer
│   ├── database/           # Database connections and schemas
│   ├── cache/              # Redis and in-memory cache
│   ├── http/               # HTTP client with circuit breaker
│   ├── logger/             # Pino logger configuration
│   ├── secrets/            # Environment variables management
│   └── repository/         # Base repository implementations
│
├── libs/                   # 📚 Shared Libraries
│   ├── event/              # Event emitter system
│   ├── i18n/               # Internationalization
│   ├── token/              # JWT management
│   └── metrics/            # Prometheus metrics
│
└── utils/                  # 🛠️ Utility Functions
    ├── decorators/         # Custom decorators
    ├── middlewares/        # HTTP middlewares
    ├── interceptors/       # NestJS interceptors
    └── filters/            # Exception filters
```
| Folder | Responsibility | Can Import From |
|---|---|---|
| `core/` | Pure business logic, entities, use cases, repository contracts | Only itself |
| `modules/` | NestJS controllers, dependency injection, route handling | `core/`, `infra/`, `libs/` |
| `infra/` | External services, databases, cache, HTTP clients | `core/` (interfaces only) |
| `libs/` | Reusable libraries, framework-agnostic utilities | Nothing from `src/` |
| `utils/` | Helper functions, decorators, middlewares | Anything |
- Node.js >= 22.0.0
- Docker >= 20.x
- Docker Compose >= 2.x
```bash
git clone https://github.com/mikemajesty/nestjs-microservice-boilerplate-api.git
cd nestjs-microservice-boilerplate-api

# Use correct Node version
nvm install && nvm use

# Install dependencies
npm install
```

```bash
npm run setup
```

This starts PostgreSQL, MongoDB (replica set), Redis, Zipkin, Prometheus, Grafana, and more.
```bash
npm run start:dev
```

The API will be available at http://localhost:5000
Login with default credentials:
```bash
curl -X 'POST' \
  'http://localhost:5000/api/v1/login' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{
  "email": "admin@admin.com",
  "password": "admin"
}'
```

Open the Swagger documentation: http://localhost:5000/api-docs
The following diagram illustrates how a request flows through the system:
Flow explanation:
- Client sends HTTP request
- Controller receives and validates input
- Adapter transforms request and calls use case
- Use Case executes business logic
- Repository (via interface) persists/retrieves data
- Response flows back through the same layers
Complete documentation for every aspect of this project is available in the guides/ folder. Each guide provides in-depth explanations, examples, and best practices.
📖 Core
Business logic layer documentation.
| Guide | Description |
|---|---|
| Entity | Domain entities with Zod validation |
| Use Case | Business rules and operations |
| Repository | Repository interface patterns |
| Test | Testing use cases |
📖 Modules
NestJS application layer documentation.
| Guide | Description |
|---|---|
| Module | NestJS module structure |
| Controller | HTTP endpoints |
| Adapter | Use case adapters |
| Repository | Repository implementations |
| Test | Module testing |
📖 Infrastructure
External services and integrations.
| Guide | Description |
|---|---|
| Database | PostgreSQL and MongoDB setup |
| Cache | Redis and in-memory caching |
| HTTP | HTTP client with circuit breaker |
| Logger | Pino logging configuration |
| Secrets | Environment variables |
| Repository | Base repository patterns |
| Email | Email sending with templates |
📖 Libraries
Shared libraries and utilities.
| Guide | Description |
|---|---|
| Token | JWT management |
| Event | Event emitter system |
| i18n | Internationalization |
| Metrics | Prometheus metrics |
📖 Decorators
Custom decorators for common patterns.
| Guide | Description |
|---|---|
| Circuit Breaker | Resilience pattern |
| Permission | Authorization decorator |
| Validate Schema | Input validation |
| Log Execution Time | Performance logging |
| Request Timeout | Timeout handling |
| Process | Background processing |
| Thread | Worker threads |
📖 Middlewares
HTTP middleware components.
| Guide | Description |
|---|---|
| Authentication | JWT authentication |
| Authorization | Role-based access |
| HTTP Logger | Request/response logging |
| Tracing | Distributed tracing |
| Exception Handler | Error handling |
📖 Tests
Testing utilities and patterns.
| Guide | Description |
|---|---|
| Mock | Mock data generation |
| Containers | Testcontainers setup |
| Util | Test utilities |
📖 Setup
Project configuration and setup.
| Guide | Description |
|---|---|
| Environment | Environment variables |
| Docker | Docker configuration |
| Husky | Git hooks |
| Package | NPM scripts |
📖 Deploy
Deployment and CI/CD documentation.
| Guide | Description |
|---|---|
| Readme | Complete deployment guide |
| Action | GitHub Actions workflows |
📖 Utils
Utility functions and helpers.
| Guide | Description |
|---|---|
| Pagination | Pagination utilities |
| Exception | Exception handling |
| Crypto | Encryption utilities |
| Date | Date manipulation |
| Validator | Validation helpers |
| Collection | Array utilities |
| Search | Search utilities |
- JWT-based authentication with refresh tokens
- Role-Based Access Control (RBAC)
- Permission system with granular control
- Password reset flow with email
- PostgreSQL with TypeORM for relational data
- MongoDB with Mongoose (3-node replica set)
- Automatic migrations
- Distributed Tracing with OpenTelemetry and Zipkin
- Logging with Pino and Loki
- Metrics with Prometheus and Grafana
- Health Checks for all services
- CRUD Scaffolding → generate complete modules with `npm run scaffold`
- 100% Test Coverage → comprehensive test suites
- Type Safety → full TypeScript with Zod validation
- API Documentation → Swagger UI with TypeSpec
- Circuit Breaker pattern for external calls
- Retry Logic with exponential backoff
- Request Timeout handling
| Category | Technologies |
|---|---|
| Framework | NestJS 11.x, TypeScript 5.9.3 |
| Databases | PostgreSQL (TypeORM), MongoDB (Mongoose), Redis |
| Observability | OpenTelemetry, Zipkin, Pino, Prometheus, Grafana, Loki |
| Testing | Jest, Supertest, Testcontainers |
| Code Quality | ESLint, Prettier, Husky, Commitlint |
| DevOps | Docker, Docker Compose, PM2, GitHub Actions |
| Documentation | Swagger, TypeSpec |
Contributions are welcome! Please read our contributing guidelines before submitting a PR.
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes using conventional commits (`git commit -m 'feat: add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- ⭐ Star this repository if you find it useful
- 🐛 Report bugs
- 💡 Request features
- 📖 Read the guides
Built with ❤️ by Mike Lima

