When creating a new test, we want to quickly understand what it covers. For that purpose, we use the following convention for nesting `describe` blocks:

- Class
  - `#method`
    - conditions for the result
      - expected result (the `it`)
For example:
```js
describe('User', () => {
  describe('#create', () => {
    describe('with valid data', () => {
      it('creates a user', async () => {});
    });

    describe('with invalid data', () => {
      it('throws an error with invalid name', async () => {});
    });
  });
});
```
- In the case of controllers, `#method` is replaced with `VERB /endpoint`, e.g. `GET /users/:id`.
- Avoid wrapping too many definitions under the same `describe`.
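For controllers, the same convention might look like the sketch below. `UsersController` and the endpoint are hypothetical names, not taken from this project, and the two stand-in functions at the top only exist so the skeleton is self-contained; in a real spec file, Jest provides `describe` and `it` as globals.

```javascript
// Stand-ins for Jest's globals so this skeleton runs on its own;
// they simply record the nesting instead of scheduling tests.
const seen = [];
const describe = (name, fn) => { seen.push(name); fn(); };
const it = (name, fn) => { seen.push(name); };

describe('UsersController', () => {
  describe('GET /users/:id', () => {
    describe('when the user exists', () => {
      it('responds with the user', async () => {});
    });

    describe('when the user does not exist', () => {
      it('responds with a 404', async () => {});
    });
  });
});
```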
To run integration tests, execute from the command line:

```shell
npm run test:integration
```
Currently, the initial configuration wipes the database before each test, isolating each test's data so it is never affected by data left over from other tests. Each test must therefore create the data it needs for the specific behavior it verifies.
- Keep in mind that tests run with `NODE_ENV=test`, which could affect the way classes are initialized. All changes will be reflected in the database configured in `env.test`.
- When running integration tests outside Docker, you will need to set `DB_HOST` to `localhost`, since the database won't be reachable under the `db` hostname, which is the database service's name inside the Docker network.
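Putting the two points above together, an `env.test` file could look roughly like this (only `NODE_ENV` and `DB_HOST` come from this document; the layout is illustrative):

```shell
# env.test -- environment for the integration test run
NODE_ENV=test
# set DB_HOST to "localhost" when running outside Docker
DB_HOST=db
```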
Factories work as fixtures for tests: they centralize Model building and creation, generating fake random data for the necessary properties.

All factories should define their model through the `factory` object from the `factory-bot` dependency, which sets up the property builders. As an example, check out `Property.definition`.
```
.
├── controllers    # Tests for controllers
├── factories      # Model factories
├── models         # Tests for Models
├── setup          # Configuration for tests
│   └── index      # Entrypoint for the Jest configuration
└── README.md
```
- Test Specs - Testing specs definitions.
- Faker - Data faker.