Integration testing Node.js apps

January 25, 2023

Integration testing means testing a component together with its sub-components and the way they interact. Sub-components can be external services, databases, message queues, and similar.

External services are kept running, but their business logic is mocked based on the received parameters (request headers, query parameters, etc.). Databases and message queues are spun up using test containers.

This post covers testing a service as a component through its API endpoints. The approach can be used with any framework and language; NestJS and Express are used in the examples below.

API endpoints

Below is the controller with two endpoints. The first one communicates with an external service and retrieves data based on the sent query parameter. The second one retrieves the data from the database.

// users.controller.ts
import { Controller, Get, Param, ParseUUIDPipe, Query } from '@nestjs/common';
import { UsersService } from './users.service';

@Controller('users')
export class UsersController {
  constructor(private userService: UsersService) {}

  @Get()
  async getAll(@Query('type') type: string) {
    return this.userService.findAll(type);
  }

  @Get(':id')
  async getById(@Param('id', new ParseUUIDPipe()) id: string) {
    return this.userService.findById(id);
  }
}
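
The UsersService behind this controller is not shown in the post. A minimal sketch of what it might look like is below, assuming the external service is called with axios and the database is accessed through a TypeORM repository; the entity import path and the error-handling details are illustrative, not taken from the boilerplate.

// users.service.ts (illustrative sketch)
import { HttpException, Injectable, NotFoundException } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';
import { InjectRepository } from '@nestjs/typeorm';
import axios from 'axios';
import { Repository } from 'typeorm';
import { UsersEntity } from './users.entity';

@Injectable()
export class UsersService {
  constructor(
    private configService: ConfigService,
    @InjectRepository(UsersEntity)
    private usersRepository: Repository<UsersEntity>,
  ) {}

  // forwards the type query parameter to the external user service
  async findAll(type: string) {
    const baseUrl = this.configService.get('USER_SERVICE_URL');
    try {
      const { data } = await axios.get(`${baseUrl}/users`, { params: { type } });
      return data;
    } catch (error: any) {
      // propagate the upstream status code (e.g., 403 for a forbidden type)
      throw new HttpException(
        error.response?.data ?? 'Upstream error',
        error.response?.status ?? 500,
      );
    }
  }

  // reads the user from the database by its UUID
  async findById(id: string) {
    const user = await this.usersRepository.findOneBy({ id });
    if (!user) throw new NotFoundException();
    return user;
  }
}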

External dependencies

The external service is mocked to return data based on the received query parameter.

export const createDummyUserServiceServer = async (): Promise<DummyServer> => {
  return createDummyServer((app) => {
    app.get('/users', (req, res) => {
      if (req.query.type !== 'user') {
        return res.status(403).send('User type is not valid');
      }
      res.json(usersResponse);
    });
  });
};
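
The createDummyServer helper is part of the test utilities and is not shown here. A possible Express-based sketch is below; the DummyServer shape with url and close properties is assumed from how it is used above, and the rest is illustrative.

// test/dummy-server.ts (illustrative sketch)
import express, { Express } from 'express';
import { Server } from 'http';

export interface DummyServer {
  url: string;
  close: () => void;
}

export const createDummyServer = async (
  configure: (app: Express) => void,
): Promise<DummyServer> => {
  const app = express();
  configure(app);

  // listen on a random free port so parallel test runs do not collide
  return new Promise((resolve) => {
    const server: Server = app.listen(0, () => {
      const { port } = server.address() as { port: number };
      resolve({
        url: `http://localhost:${port}`,
        close: () => server.close(),
      });
    });
  });
};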

Tests setup

Tests for the endpoints can be split into two parts. The first part is the setup of the external dependencies.

The example below creates the mocked service and spins up the database using test containers. Environment variables are set to point to those dependencies, and the tested service is started.

The database is cleaned before every test run. The external dependencies (the mocked service and the database) are shut down after the tests finish.

// test/users.spec.ts
describe('UsersController (integration)', () => {
  let app: INestApplication;
  let dummyUserServiceServerClose: () => void;
  let postgresContainer: StartedTestContainer;
  let usersRepository: Repository<UsersEntity>;

  const databaseConfig = {
    databaseName: 'nestjs-starter-db',
    databaseUsername: 'user',
    databasePassword: 'some-r4ndom-pasS',
    databasePort: 5432,
  };

  beforeAll(async () => {
    const dummyUserServiceServer = await createDummyUserServiceServer();
    dummyUserServiceServerClose = dummyUserServiceServer.close;

    postgresContainer = await new GenericContainer('postgres:15-alpine')
      .withEnvironment({
        POSTGRES_USER: databaseConfig.databaseUsername,
        POSTGRES_PASSWORD: databaseConfig.databasePassword,
        POSTGRES_DB: databaseConfig.databaseName,
      })
      .withExposedPorts(databaseConfig.databasePort)
      .start();

    const moduleFixture: TestingModule = await Test.createTestingModule({
      imports: [AppModule],
    })
      .overrideProvider(ConfigService)
      .useValue({
        get: (key: string): string => {
          const map: Record<string, string | undefined> = process.env;
          map.USER_SERVICE_URL = dummyUserServiceServer.url;
          map.DATABASE_HOSTNAME = postgresContainer.getHost();
          map.DATABASE_PORT = `${postgresContainer.getMappedPort(databaseConfig.databasePort)}`;
          map.DATABASE_NAME = databaseConfig.databaseName;
          map.DATABASE_USERNAME = databaseConfig.databaseUsername;
          map.DATABASE_PASSWORD = databaseConfig.databasePassword;
          return map[key] || '';
        },
      })
      .compile();

    app = moduleFixture.createNestApplication();
    usersRepository = app.get(getRepositoryToken(UsersEntity));
    await app.init();
  });

  beforeEach(async () => {
    await usersRepository.delete({});
  });

  afterAll(async () => {
    await app.close();
    dummyUserServiceServerClose();
    await postgresContainer.stop();
  });

  // ...
});

Tests

The second part covers the tests for the implemented endpoints. The first test suite asserts retrieving the data from the external service based on the type sent as a query parameter.

// test/users.spec.ts
describe('/users (GET)', () => {
  it('should return list of users', async () => {
    return request(app.getHttpServer())
      .get('/users?type=user')
      .expect(HttpStatus.OK)
      .then((response) => {
        expect(response.body).toEqual(usersResponse);
      });
  });

  it('should throw an error when type is forbidden', async () => {
    return request(app.getHttpServer())
      .get('/users?type=admin')
      .expect(HttpStatus.FORBIDDEN);
  });
});

The second test suite asserts retrieving the data from the database.

// test/users.spec.ts
describe('/users/:id (GET)', () => {
  it('should return found user', async () => {
    const userId = 'b618445a-0089-43d5-b9ca-e6f2fc29a11d';
    const userDetails = {
      id: userId,
      firstName: 'tester',
    };
    const newUser = await usersRepository.create(userDetails);
    await usersRepository.save(newUser);

    return request(app.getHttpServer())
      .get(`/users/${userId}`)
      .expect(HttpStatus.OK)
      .then((response) => {
        expect(response.body).toEqual(userDetails);
      });
  });

  it('should return 404 error when user is not found', async () => {
    const userId = 'b618445a-0089-43d5-b9ca-e6f2fc29a11d';
    return request(app.getHttpServer())
      .get(`/users/${userId}`)
      .expect(HttpStatus.NOT_FOUND);
  });
});

Boilerplate

Here is the link to the boilerplate with the mentioned examples.


Debugging Node.js apps with Visual Studio Code debugger

December 28, 2022

Rather than relying on console logs, debugging with a debugger and breakpoints is recommended. VSCode provides a built-in debugger for JavaScript-based apps. This post covers configuring and running the debugger for different kinds of Node.js apps in VSCode.

Configuration basics

VSCode configurations can use runtime executables like npm and ts-node, which should be installed globally before running the configurations. To avoid installing packages globally, a configuration can instead use the program field to point to the executable package inside the node_modules directory. Runtime executables and programs can have arguments defined in the runtimeArgs and args fields, respectively.

A configuration can have one of two request types: attach, where the debugger is attached to an already running process, and launch, where the debugger launches a new process and wraps it. There are also multiple configuration types: node runs the program from the program field and shows the logs in the debug console, while node-terminal runs the command from the command field and shows the logs in the terminal.

The configuration file is .vscode/launch.json. The selected configuration on the Run and Debug tab is used as the default one.

Launch configs

Node

Below are examples of configurations for running Node processes with the debugger.

{
  "version": "0.2.0",
  "configurations": [
    // ...
    {
      "name": "Launch script in debug console",
      "program": "index.js", // update entry point
      "request": "launch",
      "type": "node",
      "skipFiles": [
        "<node_internals>/**"
      ]
    },
    {
      "name": "Launch script in the terminal",
      "command": "node index.js", // update entry point
      "request": "launch",
      "type": "node-terminal",
      "skipFiles": [
        "<node_internals>/**"
      ]
    }
  ]
}

Running npm scripts in debug mode

Both configurations below launch the given npm script with the debugger, the dev script in this case.

{
  "version": "0.2.0",
  "configurations": [
    // ...
    {
      "name": "Launch dev script in debug console",
      "runtimeExecutable": "npm",
      "runtimeArgs": [
        "run",
        "dev"
      ],
      "request": "launch",
      "type": "node",
      "skipFiles": [
        "<node_internals>/**"
      ]
    },
    {
      "name": "Launch dev script in the terminal",
      "command": "npm run dev",
      "request": "launch",
      "type": "node-terminal"
    }
  ]
}

ts-node

The following configurations wrap the debugger around the ts-node entry point.

{
  "version": "0.2.0",
  "configurations": [
    // ...
    {
      "name": "Launch ts-node script in debug console",
      "program": "node_modules/.bin/ts-node",
      "args": ["index.ts"], // update entry point
      "request": "launch",
      "type": "node",
      "skipFiles": [
        "<node_internals>/**"
      ]
    },
    {
      "name": "Launch ts-node script in the terminal",
      "command": "ts-node index.ts", // update entry point
      "request": "launch",
      "type": "node-terminal",
      "skipFiles": [
        "<node_internals>/**"
      ]
    }
  ]
}

@babel/node

The following configuration wraps the debugger around the babel-node entry point.

{
  "version": "0.2.0",
  "configurations": [
    // ...
    {
      "name": "Launch babel-node script in debug console",
      "program": "node_modules/.bin/babel-node",
      "args": ["src"], // update entry point
      "request": "launch",
      "type": "node",
      "skipFiles": [
        "<node_internals>/**"
      ]
    }
  ]
}

Nodemon

Add a new configuration with the Run → Add Configuration option, select Node.js: Nodemon Setup, update the program field to point to the nodemon executable package, and add arguments via the args field to point to the entry point.

{
  "version": "0.2.0",
  "configurations": [
    // ...
    {
      "name": "Launch nodemon script in debug console",
      "program": "node_modules/.bin/nodemon",
      "args": ["-r", "dotenv/config", "--exec", "babel-node", "src/index.js"], // update entry point
      "request": "launch",
      "type": "node",
      "console": "integratedTerminal",
      "internalConsoleOptions": "neverOpen",
      "restart": true,
      "skipFiles": [
        "<node_internals>/**"
      ]
    }
  ]
}

Mocha

Add a new configuration, choose the Node.js: Mocha Tests configuration, replace tdd with bdd for the -u parameter, and update the program field to point to the mocha executable package.

{
  "version": "0.2.0",
  "configurations": [
    // ...
    {
      "name": "Launch mocha tests",
      "program": "node_modules/.bin/mocha",
      "args": [
        "-u",
        "bdd",
        "--timeout",
        "999999",
        "--colors",
        "test"
      ],
      "request": "launch",
      "type": "node",
      "internalConsoleOptions": "openOnSessionStart",
      "skipFiles": [
        "<node_internals>/**"
      ]
    }
  ]
}

Attach configs

Auto Attach should be activated in the settings with the Only With Flag value. In that case, auto-attaching happens whenever the --inspect flag is given to a process started from VSCode's integrated terminal.
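
The corresponding entry in .vscode/settings.json (or the user settings) is the debug.javascript.autoAttachFilter option:

{
  // attach the debugger only to processes started with the --inspect flag
  "debug.javascript.autoAttachFilter": "onlyWithFlag"
}

With Auto Attach enabled, the debugger is attached when any of the following scripts is executed.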

{
  // ...
  "scripts": {
    // ...
    "start:debug": "node --inspect index.js", // update entry point
  }
}
Jest
{
  // ...
  "scripts": {
    // ...
    "test:debug": "node --inspect -r tsconfig-paths/register -r ts-node/register node_modules/.bin/jest --runInBand",
  }
}
NestJS
{
  // ...
  "scripts": {
    // ...
    "start:debug": "nest start --debug --watch"
  }
}
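
Alternatively to Auto Attach, a manual attach configuration can be added to launch.json and started from the Run and Debug tab once the process is already running with the --inspect flag (9229 is the default inspector port):

{
  "version": "0.2.0",
  "configurations": [
    // ...
    {
      "name": "Attach to running process",
      "type": "node",
      "request": "attach",
      "port": 9229, // default --inspect port
      "restart": true,
      "skipFiles": [
        "<node_internals>/**"
      ]
    }
  ]
}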

Debugging basics

During debugging, the Variables tab shows the local variables. The step over option moves to the next statement in the codebase, while the step into option goes deeper into the current statement. Logpoints can write logs to the debug console when a certain part of the codebase is executed, without pausing the process.

Timeout with Fetch API

November 2, 2022

Setting up a timeout for HTTP requests can prevent the connection from hanging forever while waiting for the response. It can be set on the client side to improve the user experience, and on the server side to improve inter-service communication. The Fetch API is also fully available in Node.js from version 18.

AbortController can be utilized to set up timeouts. An instantiated abort controller has a signal property, which is a reference to its associated AbortSignal object. The abort signal object is passed as the signal parameter of the Fetch API request, so the HTTP request is aborted when the abort method is called.

const HTTP_TIMEOUT = 3000;
const URL = 'https://www.google.com:81';

(async () => {
  const controller = new AbortController();
  const timeoutId = setTimeout(() => controller.abort(), HTTP_TIMEOUT);

  try {
    const response = await fetch(URL, {
      signal: controller.signal
    }).then((res) => res.json());
    console.log(response);
  } catch (error) {
    console.error(error);
  } finally {
    clearTimeout(timeoutId);
  }
})();
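
In newer runtimes (Node.js 17.3+ and modern browsers), the same behavior can be written more tersely with the AbortSignal.timeout helper, without the manual setTimeout/clearTimeout bookkeeping:

const HTTP_TIMEOUT = 3000;
const URL = 'https://www.google.com:81';

(async () => {
  try {
    // AbortSignal.timeout creates a signal that aborts automatically after the given milliseconds
    const response = await fetch(URL, { signal: AbortSignal.timeout(HTTP_TIMEOUT) })
      .then((res) => res.json());
    console.log(response);
  } catch (error) {
    // a timeout surfaces as a TimeoutError
    console.error(error);
  }
})();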


Push notifications with Firebase

May 29, 2022

Push notifications are a great alternative to email notifications. There is no need for a verification step, UX is improved, and user engagement with the app is increased. This post covers both front-end and back-end setups.

Requirements for the push notifications

  • firebase package installed
  • A created Firebase project
  • Firebase configuration, found under Project Overview → Project settings → General → Your apps
  • Public VAPID key, found under Project Overview → Project settings → Cloud Messaging → Web Push certificates (used on the front end)
  • Firebase messaging service worker
  • Project ID, found on the Project Overview → Project settings → General tab
  • Server key for sending the push notifications (used on the back end)
  • HTTPS connection (localhost for local development)

Front-end setup

Firebase messaging service worker

The following service worker should be registered to handle background notifications. A custom notificationclick handler should be implemented before importing the Firebase libraries. The implementation below opens a new window with the defined URL if it is not already open, or focuses the existing one. Firebase automatically looks for the service worker at /firebase-messaging-sw.js, so it should be publicly available.

// /firebase-messaging-sw.js
/* eslint-disable no-unused-vars */
self.addEventListener('notificationclick', (event) => {
  event.notification.close();
  const DEFAULT_URL = '<URL>';
  const url =
    event.notification?.data?.FCM_MSG?.notification?.click_action ||
    DEFAULT_URL;
  event.waitUntil(
    clients.matchAll({ type: 'window' }).then((clientsArray) => {
      const hadWindowToFocus = clientsArray.some((windowClient) =>
        windowClient.url === url ? (windowClient.focus(), true) : false
      );
      if (!hadWindowToFocus)
        clients
          .openWindow(url)
          .then((windowClient) => (windowClient ? windowClient.focus() : null));
    })
  );
});

importScripts('https://www.gstatic.com/firebasejs/9.15.0/firebase-app-compat.js');
importScripts('https://www.gstatic.com/firebasejs/9.15.0/firebase-messaging-compat.js');

// the compat scripts expose the global firebase namespace
const firebaseApp = firebase.initializeApp({
  apiKey: 'xxxxxx',
  authDomain: 'xxxxxx',
  projectId: 'xxxxxx',
  storageBucket: 'xxxxxx',
  messagingSenderId: 'xxxxxx',
  appId: 'xxxxxx',
  measurementId: 'xxxxxx'
});
const messaging = firebase.messaging();
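
Notification payloads are displayed automatically by Firebase. If the background payload needs custom handling (e.g., data-only messages), the compat messaging instance also exposes an onBackgroundMessage hook; a minimal sketch, with an illustrative title and body:

// optional: customize how background messages are displayed
messaging.onBackgroundMessage((payload) => {
  const title = payload.notification?.title || 'New notification';
  self.registration.showNotification(title, {
    body: payload.notification?.body,
    icon: payload.notification?.icon
  });
});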

Helper functions

getToken
  • generates a unique registration token for the browser or gets an already generated token
  • requests permission to receive push notifications
  • triggers the Firebase messaging service worker

There are multiple types of errors in the response:

  • code messaging/permission-blocked - user blocks the notifications
  • code messaging/unsupported-browser - user's browser doesn't support the APIs required to use the Firebase SDK
  • code messaging/failed-service-worker-registration - there's an issue with the Firebase messaging service worker

The registration token is invalidated when the user manually blocks the notifications in the browser settings.
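
As a sketch of how the listed error codes might be handled when requesting the token (assuming Firebase is already initialized, as in the helpers shown below; the function name is illustrative):

import { getMessaging, getToken } from 'firebase/messaging';

// illustrative error handling around getToken, based on the error codes above
export const requestPushToken = async (vapidKey: string): Promise<string | null> => {
  try {
    return await getToken(getMessaging(), { vapidKey });
  } catch (error) {
    switch ((error as { code?: string }).code) {
      case 'messaging/permission-blocked':
        console.warn('The user blocked the notifications');
        break;
      case 'messaging/unsupported-browser':
        console.warn('The browser does not support the required APIs');
        break;
      case 'messaging/failed-service-worker-registration':
        console.warn('The Firebase messaging service worker could not be registered');
        break;
      default:
        console.error(error);
    }
    return null;
  }
};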

isSupported
  • checks if all required APIs for push notifications are supported
  • returns Promise<boolean>

It should be used in useEffect hooks.

import { isSupported } from 'firebase/messaging';
// ...
useEffect(() => {
  isSupported()
    .then((isAvailable) => {
      if (isAvailable) {
        // ...
      }
    })
    .catch(console.error);
}, []);
// ...
initializeApp
  • should be called before the app starts
import { initializeApp } from 'firebase/app';
import { getMessaging, getToken } from 'firebase/messaging';
import { firebaseConfig } from 'constants/config';

export const initializeFirebase = () => initializeApp(firebaseConfig);

export const getTokenForPushNotifications = async () => {
  const messaging = getMessaging();
  const token = await getToken(messaging, {
    vapidKey: process.env.NEXT_PUBLIC_VAPID_KEY
  });
  return token;
};
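
As a usage sketch, the helpers above could be wired up like this; the utils/firebase import path and the /api/push-subscriptions endpoint are illustrative, not part of the post:

// illustrative wiring on the client
import { initializeFirebase, getTokenForPushNotifications } from 'utils/firebase';

// initialize Firebase once, before any messaging call
initializeFirebase();

export const registerForPushNotifications = async (): Promise<void> => {
  const token = await getTokenForPushNotifications();
  // send the registration token to the back end, which uses it to target this browser
  await fetch('/api/push-subscriptions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ token })
  });
};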

Back-end setup

Server keys

The authorization token for the API v1 can be derived from the service account key JSON file. The JSON file should be encoded and stored in an environment variable to avoid exposing the credentials in the repository. The service account key JSON file can be downloaded by clicking Generate new private key on the Project Overview → Project settings → Service accounts tab.

// encoding the service account key (run once, store the output in the environment variable)
import * as serviceAccountKey from './service-account-key.json';

const encodedServiceAccountKey = Buffer.from(
  JSON.stringify(serviceAccountKey)
).toString('base64');

process.env.SERVICE_ACCOUNT_KEY = encodedServiceAccountKey;

// deriving the authorization header from the encoded key
import 'dotenv/config';
import * as googleAuth from 'google-auth-library';

(async () => {
  const serviceAccountKeyEncoded = process.env.SERVICE_ACCOUNT_KEY;
  const serviceAccountKeyDecoded = JSON.parse(
    Buffer.from(serviceAccountKeyEncoded, 'base64').toString('ascii')
  );
  const jwt = new googleAuth.JWT(
    serviceAccountKeyDecoded.client_email,
    null,
    serviceAccountKeyDecoded.private_key,
    ['https://www.googleapis.com/auth/firebase.messaging'],
    null
  );
  const tokens = await jwt.authorize();
  const authorizationHeader = `Bearer ${tokens.access_token}`;
  console.log(authorizationHeader);
})();

Manually sending the push notification

The icon URL should be served over HTTPS so the notification shows it correctly.

  • API v1
// ...
try {
  const response = await axios.post(
    `https://fcm.googleapis.com/v1/projects/${projectId}/messages:send`,
    {
      message: {
        notification: {
          title: 'Push notifications with Firebase',
          body: 'Push notifications with Firebase body'
        },
        webpush: {
          fcmOptions: {
            link: 'http://localhost:3000'
          },
          notification: {
            icon: 'https://picsum.photos/200'
          }
        },
        token: registrationToken
      }
    },
    {
      headers: {
        Authorization: authorizationHeader
      }
    }
  );
  console.log(response.data);
} catch (error) {
  console.error(error?.response?.data?.error);
}
// ...

A successful response returns an object with a name key, which represents the notification ID in the format projects/{project_id}/messages/{message_id}.
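
For example, with placeholder values:

{
  "name": "projects/<project-id>/messages/<message-id>"
}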

There are multiple types of errors in the response:

  • code 400 - request body is not valid
  • code 401 - the derived token is expired
  • code 404 - registration token was not found


Sending e-mails in different environments

March 8, 2022

For local testing, or testing in general, there is no need to send e-mails to real e-mail addresses. The Mailtrap service can be used to preview the e-mails instead; the inbox page shows the credentials (username and password), and the list of inboxes is available on the Projects page. In a production environment, use a service that actually delivers the e-mails (e.g., SendGrid); the SendGrid API key can be created on its API keys page.

require('dotenv/config');
const nodemailer = require('nodemailer');

(async () => {
  const emailConfiguration = {
    auth: {
      user: process.env.EMAIL_USERNAME, // 'apikey' for Sendgrid
      pass: process.env.EMAIL_PASSWORD
    },
    host: process.env.EMAIL_HOST, // 'smtp.mailtrap.io' for Mailtrap, 'smtp.sendgrid.net' for Sendgrid
    port: process.env.EMAIL_PORT, // 2525 for Mailtrap, 465 for Sendgrid
    secure: process.env.EMAIL_SECURE // true for Sendgrid
  };
  const transport = nodemailer.createTransport(emailConfiguration);
  const info = await transport.sendMail({
    from: '"Sender" <sender@example.com>',
    to: 'recipient1@example.com, recipient2@example.com',
    subject: 'Subject',
    text: 'Text',
    html: '<b>Text</b>'
  });
  console.log('Message sent: %s', info.messageId);
})();

Spies and mocking with Jest

August 19, 2021

Unit testing, in addition to asserting outputs, includes the usage of spies and mocking. Spies are functions that let you track the behavior of functions called indirectly by some other code. A spy can be created using jest.fn(). Mocking injects test values into the code during the tests. Some of the use cases are presented below.

  • Async function and its resolved value can be mocked using mockResolvedValue. Another way to mock it is by using mockImplementation and providing a function as an argument.
const calculationService = {
  calculate: jest.fn()
};

jest.spyOn(calculationService, 'calculate').mockResolvedValue(value);

jest
  .spyOn(calculationService, 'calculate')
  .mockImplementation(async (a) => Promise.resolve(a));
  • Rejected async function can be mocked using mockRejectedValue and mockImplementation.
jest
  .spyOn(calculationService, 'calculate')
  .mockRejectedValue(new Error(errorMessage));

jest
  .spyOn(calculationService, 'calculate')
  .mockImplementation(async () => Promise.reject(new Error(errorMessage)));

await expect(calculateSomething(calculationService)).rejects.toThrowError(
  Error
);
  • Sync function and its return value can be mocked using mockReturnValue and mockImplementation.
jest.spyOn(calculationService, 'calculate').mockReturnValue(value);
jest.spyOn(calculationService, 'calculate').mockImplementation((a) => a);
  • Chained methods can be mocked using mockReturnThis.
// calculationService.get().calculate();
jest.spyOn(calculationService, 'get').mockReturnThis();
  • Async and sync functions called multiple times can be mocked with different values using mockResolvedValueOnce and mockReturnValueOnce, respectively, and mockImplementationOnce.
jest
  .spyOn(calculationService, 'calculate')
  .mockResolvedValueOnce(value)
  .mockResolvedValueOnce(otherValue);

jest
  .spyOn(calculationService, 'calculate')
  .mockReturnValueOnce(value)
  .mockReturnValueOnce(otherValue);

jest
  .spyOn(calculationService, 'calculate')
  .mockImplementationOnce((a) => a + 3)
  .mockImplementationOnce((a) => a + 5);
  • External modules can be mocked similarly to spies. For the following example, let's suppose the axios package is already used in one of the tested functions. The example represents a test file where axios is mocked using jest.mock().
import axios from 'axios';
jest.mock('axios');

// within the test case
axios.get.mockResolvedValue(data);
  • Manual mocks are defined by writing the corresponding module in a __mocks__ directory, e.g., the fs/promises mock is stored in the __mocks__/fs/promises.js file (a sketch of such a mock follows after this list). The mock is then resolved using jest.mock() in the test file.
jest.mock('fs/promises');
  • To assert called arguments for a mocked function, an assertion can be done using toHaveBeenCalledWith matcher.
const spy = jest.spyOn(calculationService, 'calculate');
expect(spy).toHaveBeenCalledWith(firstArgument, secondArgument);
  • To assert skipped call for a mocked function, an assertion can be done using not.toHaveBeenCalled matcher.
const spy = jest.spyOn(calculationService, 'calculate');
expect(spy).not.toHaveBeenCalled();
  • To assert called arguments for the exact call when a mocked function is called multiple times, an assertion can be done using toHaveBeenNthCalledWith matcher.
const argumentsList = [0, 1];
argumentsList.forEach((argumentList, index) => {
  expect(calculationService.calculate).toHaveBeenNthCalledWith(
    index + 1,
    argumentList
  );
});
  • Methods should be restored to their initial implementation before each test case.
// package.json
"jest": {
  // ...
  "restoreMocks": true
}
// ...
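
As referenced in the manual mocks item above, a manual mock for fs/promises could look roughly like the following sketch; the mocked functions depend on what the tested code actually uses.

// __mocks__/fs/promises.js (illustrative)
module.exports = {
  readFile: jest.fn().mockResolvedValue('file content'),
  writeFile: jest.fn().mockResolvedValue(undefined)
};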


Server-Sent Events 101

August 18, 2021

Server-Sent Events (SSE) provide unidirectional communication from the server to the client. The client initiates the connection with the server using the EventSource API. The same API can also listen to the events from the server, listen for errors, and close the connection.

const eventSource = new EventSource(url);

eventSource.onmessage = ({ data }) => {
  const eventData = JSON.parse(data);
  // handling the data from the server
};

eventSource.onerror = () => {
  // error handling
};

eventSource.close();

A server can send events in the text/event-stream format to the client once the client establishes the connection. A server can also filter clients by a query parameter and send them only the appropriate events. In the following example, the NestJS server sends the events only to a specific client distinguished by its e-mail address.

import { Controller, Query, Sse } from '@nestjs/common';
import { EventEmitter2 } from '@nestjs/event-emitter';
import { Observable, Subject } from 'rxjs';
import { map } from 'rxjs/operators';
import { MessageEvent, MessageEventData } from './message-event.interface';
import { SseQueryDto } from './sse-query.dto';

@Controller()
export class AppController {
  constructor(private readonly eventService: EventEmitter2) {}

  @Sse('sse')
  sse(@Query() sseQuery: SseQueryDto): Observable<MessageEvent> {
    const subject$ = new Subject();

    this.eventService.on(FILTER_VERIFIED, (data) => {
      if (sseQuery.email !== data.email) return;
      subject$.next({ isVerifiedFilter: data.isVerified });
    });

    return subject$.pipe(
      map((data: MessageEventData): MessageEvent => ({ data })),
    );
  }
  // ...
}

Emitting the event mentioned above is done in the following way.

const filterVerifiedEvent = new FilterVerifiedEvent();
filterVerifiedEvent.email = user.email;
filterVerifiedEvent.isVerified = true;
this.eventService.emit(FILTER_VERIFIED, filterVerifiedEvent);
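
The FilterVerifiedEvent class and the FILTER_VERIFIED constant are not shown in the post; they could be as simple as the following sketch.

// filter-verified.event.ts (illustrative sketch)
export const FILTER_VERIFIED = 'filter.verified';

export class FilterVerifiedEvent {
  email: string;
  isVerified: boolean;
}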
