Behat
Vortex uses Behat for Behavior-Driven Development (BDD) testing. Behat allows you to write human-readable stories that describe the behavior of the application. Behat tests focus primarily on critical user journeys, serving as comprehensive end-to-end validations.
Vortex provides full Behat support, including configuration in behat.yml
and a browser container to run tests interactively in a real browser with
a VNC viewer.
Additional features include:
- Behat Drupal Extension - an extension to work with Drupal.
- Behat Steps - a library of reusable Behat steps.
- Behat Screenshot - an extension to capture screenshots on demand and on failure.
- Behat Progress formatter - an extension to show progress as TAP output and failures inline.
- Parallel profiles - configuration to allow running tests in parallel.
Running tests
- Ahoy
- Docker Compose
# Run all Behat tests
ahoy test-bdd
# Run specific feature file
ahoy test-bdd tests/behat/features/homepage.feature
# Run scenarios with specific tag
ahoy test-bdd -- --tags=@smoke
# Run all Behat tests
docker compose exec cli vendor/bin/behat
# Run specific feature file
docker compose exec cli vendor/bin/behat tests/behat/features/homepage.feature
# Run scenarios with specific tag
docker compose exec cli vendor/bin/behat --tags=@smoke
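Tag filters can also be combined, and a single scenario can be run by its line number; both are standard Behat CLI features (the @smoke tag and the line number below are placeholders):
# Run scenarios tagged @smoke but not @skipped (quotes keep && away from the shell)
docker compose exec cli vendor/bin/behat --tags='@smoke&&~@skipped'
# Run a single scenario by its starting line number
docker compose exec cli vendor/bin/behat tests/behat/features/homepage.feature:12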
Discovering available step definitions
- Ahoy
- Docker Compose
# Generate step definitions reference
ahoy test-bdd -- --definitions=l
# Generate step definitions reference
docker compose exec cli vendor/bin/behat --definitions=l
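The definitions option also accepts a search term, which is handy for finding a step without scrolling the full list ('login' below is just an example keyword):
# Find step definitions matching a keyword
docker compose exec cli vendor/bin/behat --definitions='login'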
FeatureContext
The FeatureContext.php file comes with steps included from the Behat Steps package.
You can add your custom steps to this file.
Profiles
Behat's default profile is configured with sensible defaults to allow running it
with the provided extensions.
In a continuous integration environment, the profile can be overridden using
the $VORTEX_CI_BEHAT_PROFILE environment variable.
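Locally, the same effect can be achieved by selecting a profile explicitly with Behat's standard --profile option (p1 refers to one of the parallel profiles described below):
# Run tests using a specific Behat profile
docker compose exec cli vendor/bin/behat --profile=p1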
Parallel runs
In a continuous integration pipeline, Behat tests can run across multiple runners
to speed up the test suite. To achieve this, Behat tags are used to
mark features and scenarios with @p* tags.
Out of the box, Vortex supports an unlimited number of parallel
runners, but provides only 2 parallel profiles, p0 and p1: a feature can be tagged
with either @p0 or @p1 to run in a dedicated runner, or with both tags to run in
both runners.
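Each runner then executes the profile matching its tag. A sketch of what the two runners effectively execute (the exact invocation depends on your CI configuration):
# Runner 1 executes features tagged @p0
docker compose exec cli vendor/bin/behat --profile=p0
# Runner 2 executes features tagged @p1
docker compose exec cli vendor/bin/behat --profile=p1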
Note that you can easily add more p* profiles to your behat.yml by copying the
existing p1 profile and changing a few lines of configuration.
Features without @p* tags will always run in the first CI pipeline runner, so
even if you forget to tag a feature, it will still be allocated to a runner.
If the CI pipeline has only one runner, the default profile will be used and all tests
(except those tagged with @skipped) will be run.
Skipping tests
Add the @skipped tag to a feature or scenario to exclude it from the test run.
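The same exclusion can be expressed manually with a negated tag filter, which is standard Behat tag expression syntax:
# Run everything except scenarios tagged @skipped
ahoy test-bdd -- --tags='~@skipped'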
Screenshots
Test screenshots are stored in the .logs/screenshots directory by default,
which can be overridden using the $BEHAT_SCREENSHOT_DIR variable (courtesy of
the Behat Screenshot package).
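For example, to redirect screenshots for a single run (assuming /app is the project root inside the cli container; the target path is a placeholder):
# Override the screenshot directory for one run
docker compose exec -e BEHAT_SCREENSHOT_DIR=/app/.logs/my-screenshots cli vendor/bin/behat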
In a continuous integration pipeline, screenshots are stored as build artifacts.
In GitHub Actions, they can be downloaded from the Summary tab.
In CircleCI, they are accessible in the Artifacts tab.
Format
Out of the box, Vortex comes with the Behat Progress formatter to show progress as TAP output and failures inline. This allows test runs to continue after a failure while keeping the output minimal.
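If you need more verbose output while debugging, you can switch to Behat's built-in pretty formatter for a single run:
# Use Behat's verbose built-in formatter while debugging a single feature
docker compose exec cli vendor/bin/behat --format=pretty tests/behat/features/homepage.feature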
Reporting
Test reports are stored in the .logs/behat directory.
The continuous integration pipeline typically uses them to track test performance and stability.
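Assuming the reports are JUnit XML produced by Behat's built-in junit formatter, an equivalent manual run would look like this (the output directory mirrors the default report location):
# Generate a JUnit XML report manually
docker compose exec cli vendor/bin/behat --format=junit --out=.logs/behat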
Boilerplate test features
Vortex provides BDD test boilerplate for the homepage and user login journeys.
These boilerplate tests run in the continuous integration pipeline when you install Vortex and can be used as a starting point for writing your own.
Writing tests
For project-specific test writing conventions (user story format, standard user
types, test data conventions), see your project's docs/testing.md file.
The docs/testing.md file is scaffolded when you install Vortex and should
be maintained by your project team to document agreed-upon testing practices.