Streaming binary and base64 files

June 25, 2023

Streaming is useful when dealing with big files in web apps. Instead of loading the entire file into memory before sending it to the client, streaming allows you to send it in small chunks, improving memory efficiency and reducing response time.

The code snippet below shows how to stream a binary CSV file and a base64-encoded PDF file with NestJS. The same approach works for other file types, like JSON files.

Set the content type and filename headers so files are streamed and downloaded correctly. The base64-encoded file is converted to a buffer and then streamed. Files can be read from the file system or retrieved via API calls.

import { Controller, Get, Param, Res } from '@nestjs/common';
import { Response } from 'express';
import { createReadStream } from 'fs';
import { readFile } from 'fs/promises';
import { join } from 'path';
import { Readable } from 'stream';
@Controller('templates')
export class TemplatesController {
@Get('csv')
getCsvTemplate(@Res() res: Response): void {
const file = createReadStream(join(process.cwd(), 'template.csv'));
res.set({
'Content-Type': 'text/csv',
'Content-Disposition': 'attachment; filename="template.csv"'
});
file.pipe(res);
}
@Get('pdf/:id')
async getPdfTemplate(
@Param('id') id: string,
@Res() res: Response
): Promise<void> {
const fileBase64 = await readFile(
join(process.cwd(), 'template.pdf'),
'base64'
);
// const fileBase64 = await apiCall();
const fileBuffer = Buffer.from(fileBase64, 'base64');
const fileStream = Readable.from(fileBuffer);
res.set({
'Content-Type': 'application/pdf',
'Content-Disposition': `attachment; filename="template_${id}.pdf"`
});
fileStream.pipe(res);
}
}
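
For completeness, here is a minimal client-side sketch for downloading the streamed CSV in the browser; the endpoint path matches the controller above, while the rest is an illustrative assumption rather than part of the boilerplate.

// a hedged browser-side sketch; assumes the API is served from the same origin
async function downloadCsvTemplate(): Promise<void> {
  const response = await fetch('/templates/csv');
  // the streamed chunks are assembled into a Blob on the client
  const blob = await response.blob();
  const link = document.createElement('a');
  link.href = URL.createObjectURL(blob);
  link.download = 'template.csv';
  link.click();
  URL.revokeObjectURL(link.href);
}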

Boilerplate

Here is the link to the boilerplate I use for development.

Async API documentation 101

May 21, 2023

AsyncAPI documentation is used to document events in event-driven systems, such as Kafka events. All event DTOs are kept in one place, and the specification supports both YAML and JSON formats.

The document describes channels and components: channels are defined with their messages, and components with their DTO schemas.

{
"asyncapi": "2.6.0",
"info": {
"title": "Events docs",
"version": "1.0.0"
},
"channels": {
"topic_name": {
"publish": {
"message": {
"schemaFormat": "application/vnd.oai.openapi;version=3.0.0",
"payload": {
"type": "object",
"properties": {
"counter": {
"type": "number"
}
},
"required": ["counter"]
}
}
}
}
},
"components": {
"schemas": {
"EventDto": {
"type": "object",
"properties": {
"counter": {
"type": "number"
}
},
"required": ["counter"]
}
}
}
}

Autogeneration

AsyncAPI docs can be autogenerated in a few steps:

  • define DTOs and their required and optional fields with ApiProperty and ApiPropertyOptional decorators (from the @nestjs/swagger package), respectively
  • generate OpenAPI docs from the defined DTOs
  • parse and reuse component schemas from the generated OpenAPI documentation to build channel messages and component schemas for the AsyncAPI docs (see the sketch below)
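
A rough sketch of the last two steps is shown below, assuming an EventDto decorated with ApiProperty and an existing Nest application instance; the helper name and import path are illustrative assumptions.

// a hedged sketch: reuse OpenAPI component schemas to build an AsyncAPI document
import { INestApplication } from '@nestjs/common';
import { DocumentBuilder, SwaggerModule } from '@nestjs/swagger';
import { EventDto } from './event.dto'; // hypothetical DTO decorated with @ApiProperty

export const buildAsyncApiDoc = (app: INestApplication): Record<string, unknown> => {
  // generate OpenAPI docs from the defined DTOs
  const openApiDoc = SwaggerModule.createDocument(
    app,
    new DocumentBuilder().build(),
    { extraModels: [EventDto] }
  );
  // parse and reuse the component schemas for channel messages and components
  const schemas = openApiDoc.components?.schemas ?? {};
  return {
    asyncapi: '2.6.0',
    info: { title: 'Events docs', version: '1.0.0' },
    channels: {
      topic_name: {
        publish: {
          message: {
            schemaFormat: 'application/vnd.oai.openapi;version=3.0.0',
            payload: schemas['EventDto']
          }
        }
      }
    },
    components: { schemas }
  };
};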

Validation

Use AsyncAPI Studio to validate the written specification.

Preview

There are multiple options:

  • AsyncAPI Studio

  • VS Code extension asyncapi-preview: open the command palette and run the Preview AsyncAPI command.

UI generation

  • Install @asyncapi/cli and the corresponding template package (e.g., @asyncapi/html-template, @asyncapi/markdown-template)
  • Update package.json with the following scripts (a usage example follows the snippet)
{
"scripts": {
// ...
"generate-docs:html": "asyncapi generate fromTemplate ./asyncapi/asyncapi.json @asyncapi/html-template --output ./docs/html",
"generate-docs:markdown": "asyncapi generate fromTemplate ./asyncapi/asyncapi.json @asyncapi/markdown-template --output ./docs/markdown"
}
}
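
The HTML documentation, for example, is then generated by running the corresponding script.

npm run generate-docs:html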

Boilerplate

Here is the link to the boilerplate I use for development.

Health checks with Terminus

April 14, 2023

Monitoring tools use health checks to verify that a service and its external dependencies (like a database) are up and running, and to take action (like sending alerts) when something is unhealthy.

Terminus provides a set of health indicators.
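
The indicators become available once TerminusModule is imported into a health module; below is a minimal sketch, where the module and file names are assumptions.

// health.module.ts
import { Module } from '@nestjs/common';
import { TerminusModule } from '@nestjs/terminus';
import { HealthController } from './health.controller';

// CustomConfigService used by the controller is assumed to come from a global config module
@Module({
  imports: [TerminusModule],
  controllers: [HealthController]
})
export class HealthModule {}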

Liveness probe

An HTTP endpoint checks if the service is up and running.

// health.controller.ts
import { Controller, Get } from '@nestjs/common';
import { ApiTags } from '@nestjs/swagger';
import {
HealthCheck,
HealthCheckResult,
HealthCheckService,
HealthIndicatorResult,
TypeOrmHealthIndicator
} from '@nestjs/terminus';
import { CustomConfigService } from 'common/config/custom-config.service';
@ApiTags('health')
@Controller('health')
export class HealthController {
constructor(
private readonly healthCheckService: HealthCheckService,
private readonly configService: CustomConfigService,
private readonly database: TypeOrmHealthIndicator
) {}
@Get('liveness')
@HealthCheck()
async check(): Promise<HealthCheckResult> {
return this.healthCheckService.check([
async (): Promise<HealthIndicatorResult> => ({
[this.configService.SERVICE_NAME]: { status: 'up' }
})
]);
}
// ...
}

A successful response looks like the one below.

{
"status": "ok",
"info": {
"nestjs-starter": {
"status": "up"
}
},
"error": {},
"details": {
"nestjs-starter": {
"status": "up"
}
}
}

Readiness probe

An HTTP endpoint checks if the service is ready to receive traffic, i.e., if all external dependencies are up and running.

// health.controller.ts
import { Controller, Get } from '@nestjs/common';
import { ApiTags } from '@nestjs/swagger';
import {
HealthCheck,
HealthCheckResult,
HealthCheckService,
HealthIndicatorResult,
TypeOrmHealthIndicator
} from '@nestjs/terminus';
import { CustomConfigService } from 'common/config/custom-config.service';
@ApiTags('health')
@Controller('health')
export class HealthController {
constructor(
private readonly healthCheckService: HealthCheckService,
private readonly configService: CustomConfigService,
private readonly database: TypeOrmHealthIndicator
) {}
// ...
@Get('readiness')
@HealthCheck()
async checkReadiness(): Promise<HealthCheckResult> {
return this.healthCheckService.check([
async (): Promise<HealthIndicatorResult> =>
this.database.pingCheck('postgres')
]);
}
}

Responses

  • Successful response
{
"status": "ok",
"info": {
"postgres": {
"status": "up"
}
},
"error": {},
"details": {
"postgres": {
"status": "up"
}
}
}
  • Response when the database is down
{
"status": "error",
"info": {},
"error": {
"postgres": {
"status": "down"
}
},
"details": {
"postgres": {
"status": "down"
}
}
}

Demo

The demo with the mentioned examples is available here.

Documenting REST APIs with OpenAPI specs (NestJS/Swagger)

March 16, 2023

OpenAPI is a language-agnostic specification for declaring API documentation for REST APIs. It contains the following information:

  • API information like title, description, version
  • endpoints definitions with request and response parameters
  • DTOs and security schemas
openapi: 3.0.0
paths:
  /users:
    post:
      operationId: UsersController_createUser
      summary: Create user
      description: Create a new user
      parameters: []
      requestBody:
        required: true
        content:
          application/json:
            schema:
              $ref: '#/components/schemas/CreateUserDto'
      responses:
        '201':
          description: 'User is created'
info:
  title: nestjs-starter
  description: Minimal NestJS boilerplate
  version: 0.1.0
  contact: {}
tags: []
servers: []
components:
  securitySchemes:
    token:
      type: apiKey
      scheme: api_key
      in: header
      name: auth-token
  schemas:
    CreateUserDto:
      type: object
      properties:
        firstName:
          type: string
          example: tester
          description: first name of the user
      required:
        - firstName

NestJS provides a Swagger plugin for generating the API docs.

Setup

Configure the API documentation to be served at a specified endpoint, like /api-docs, which shows the generated docs.

const SWAGGER_API_ENDPOINT = '/api-docs';
// ...
export const setupApiDocs = (app: INestApplication): void => {
const options = new DocumentBuilder()
.setTitle(SWAGGER_API_TITLE)
.setDescription(SWAGGER_API_DESCRIPTION)
.setVersion(SWAGGER_API_VERSION)
.addSecurity('token', {
type: 'apiKey',
scheme: 'api_key',
in: 'header',
name: 'auth-token'
})
.addBearerAuth()
.build();
const document = SwaggerModule.createDocument(app, options);
SwaggerModule.setup(SWAGGER_API_ENDPOINT, app, document);
};
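
setupApiDocs is then called during bootstrap, before the app starts listening; a minimal sketch is below, where the import path is an assumption.

// main.ts
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';
import { setupApiDocs } from './common/api-docs'; // path is an assumption

async function bootstrap(): Promise<void> {
  const app = await NestFactory.create(AppModule);
  setupApiDocs(app);
  await app.listen(8081); // port matches the Postman import example below
}
bootstrap();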

Enable the Swagger plugin in the NestJS CLI config file (nest-cli.json).

{
"compilerOptions": {
"plugins": ["@nestjs/swagger"]
}
}

The JSON and YAML versions of the docs are available at the /api-docs-json and /api-docs-yaml endpoints, respectively.

Decorators

  • ApiTags groups endpoints
@ApiTags('users')
@Controller('users')
export class UsersController {
// ...
}
  • ApiOperation provides more details like a summary and description of the endpoint
@ApiOperation({
summary: 'Get user',
description: 'Get user by id',
})
@Get(':id')
async getById(
@Param('id', new ParseUUIDPipe()) id: string,
): Promise<UserDto> {
// ...
}
  • ApiOperation can be used to mark an endpoint as deprecated
@ApiOperation({ deprecated: true })
  • @ApiProperty and @ApiPropertyOptional should be used for required and optional request and response DTO fields, respectively. The example and description values are shown in the generated documentation.
export class CreateUserDto {
@ApiProperty({ example: 'John', description: 'first name of the user' })
// ...
public firstName: string;
@ApiPropertyOptional({ example: 'Doe', description: 'last name of the user' })
// ...
public lastName?: string;
}
  • ApiHeader documents endpoint headers
@ApiHeader({
name: 'correlation-id',
required: false,
description: 'unique id for correlated logs',
example: '7ea2c7f7-8b46-475d-86f8-7aaaa9e4a35b',
})
@Get()
getHello(): string {
// ...
}
  • ApiResponse specifies which responses are expected, like error responses. NestJS' Swagger package provides decorators for specific status codes like ApiBadRequestResponse.
// ...
@ApiResponse({ type: NotFoundException, status: HttpStatus.NOT_FOUND })
@ApiBadRequestResponse({ type: BadRequestException })
@Get(':id')
async getById(
@Param('id', new ParseUUIDPipe()) id: string,
): Promise<UserDto> {
return this.userService.findById(id);
}
// ...
  • ApiSecurity('token') uses a custom-defined security strategy, token in this case. Another option is to use already defined strategies like ApiBearerAuth.
@ApiSecurity('token')
@Controller()
export class AppController {
// ...
}
// ...
@ApiBearerAuth()
@Controller()
export class AppController {
// ...
}
  • ApiExcludeEndpoint and ApiExcludeController exclude one endpoint and the whole controller, respectively.
export class AppController {
@ApiExcludeEndpoint()
@Get()
getHello(): string {
// ...
}
}
// ...
@ApiExcludeController()
@Controller()
export class AppController {
// ...
}
  • ApiBody with ApiExtraModels adds an example for the request body
const CreateUserDtoExample = {
firstName: 'Tester',
};
@ApiExtraModels(CreateUserDto)
@ApiBody({
schema: {
oneOf: refs(CreateUserDto),
example: CreateUserDtoExample,
},
})
@Post()
async createUser(@Body() newUser: CreateUserDto): Promise<UserDto> {
// ...
}

Importing API to Postman

Import the JSON version of the API docs into Postman as an API with the Import Link option (e.g., the URL http://localhost:8081/api-docs-json). The imported API collection will be available in the APIs tab.

Boilerplate

Here is the link to the boilerplate I use for development. It contains the examples mentioned above with more details.

Integration testing Node.js apps

January 25, 2023

Integration testing means testing a component together with its sub-components and how they interact. Sub-components can be external services, databases, and message queues.

External services are running, but their business logic is mocked based on received parameters (request headers, query parameters, etc.). Databases and message queues are spun up using test containers.

This post covers testing a service as a component through its API endpoints. The approach can be used with any framework and language; NestJS and Express are used in the examples below.

API endpoints

Below is a controller with two endpoints. The first communicates with an external service and retrieves data based on the sent query parameter. The second retrieves data from the database.

// users.controller.ts
@Controller('users')
export class UsersController {
constructor(private userService: UsersService) {}
@Get()
async getAll(@Query('type') type: string) {
return this.userService.findAll(type);
}
@Get(':id')
async getById(@Param('id', new ParseUUIDPipe()) id: string) {
return this.userService.findById(id);
}
}

External dependencies

The external service is mocked to return data based on the received parameter.

export const createDummyUserServiceServer = async (): Promise<DummyServer> => {
return createDummyServer((app) => {
app.get('/users', (req, res) => {
if (req.query.type !== 'user') {
return res.status(403).send('User type is not valid');
}
res.json(usersResponse);
});
});
};
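
The createDummyServer helper is not shown above; here is a minimal sketch of what it might look like, assuming Express and a random free port.

// dummy-server.ts - a hedged sketch of the helper used above
import express, { Express } from 'express';
import { Server } from 'http';
import { AddressInfo } from 'net';

export interface DummyServer {
  url: string;
  close: () => void;
}

export const createDummyServer = (
  setupRoutes: (app: Express) => void
): Promise<DummyServer> => {
  const app = express();
  setupRoutes(app);
  return new Promise((resolve) => {
    // listening on port 0 lets the OS pick a free port
    const server: Server = app.listen(0, () => {
      const { port } = server.address() as AddressInfo;
      resolve({
        url: `http://localhost:${port}`,
        close: (): void => {
          server.close();
        }
      });
    });
  });
};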

Tests setup

Tests for the endpoints can be split into two parts. The first part is the setup of external dependencies.

The example below creates the mocked service and spins up the database using test containers. Environment variables are set for the aforementioned dependencies, and then the service under test starts.

The database is cleaned before every test run. External dependencies (mocked service and database) are closed after tests finish.

// test/users.spec.ts
describe('UsersController (integration)', () => {
let app: INestApplication;
let dummyUserServiceServerClose: () => void;
let postgresContainer: StartedTestContainer;
let usersRepository: Repository<UsersEntity>;
const databaseConfig = {
databaseName: 'nestjs-starter-db',
databaseUsername: 'user',
databasePassword: 'some-r4ndom-pasS',
databasePort: 5432,
}
beforeAll(async () => {
const dummyUserServiceServer = await createDummyUserServiceServer();
dummyUserServiceServerClose = dummyUserServiceServer.close;
postgresContainer = await new GenericContainer('postgres:15-alpine')
.withEnvironment({
POSTGRES_USER: databaseConfig.databaseUsername,
POSTGRES_PASSWORD: databaseConfig.databasePassword,
POSTGRES_DB: databaseConfig.databaseName,
})
.withExposedPorts(databaseConfig.databasePort)
.start();
const moduleFixture: TestingModule = await Test.createTestingModule({
imports: [AppModule],
})
.overrideProvider(ConfigService)
.useValue({
get: (key: string): string => {
const map: Record<string, string | undefined> = process.env;
map.USER_SERVICE_URL = dummyUserServiceServer.url;
map.DATABASE_HOSTNAME = postgresContainer.getHost();
map.DATABASE_PORT = `${postgresContainer.getMappedPort(databaseConfig.databasePort)}`;
map.DATABASE_NAME = databaseConfig.databaseName;
map.DATABASE_USERNAME = databaseConfig.databaseUsername;
map.DATABASE_PASSWORD = databaseConfig.databasePassword;
return map[key] || '';
},
})
.compile();
app = moduleFixture.createNestApplication();
usersRepository = app.get(getRepositoryToken(UsersEntity));
await app.init();
});
beforeEach(async () => {
await usersRepository.delete({});
});
afterAll(async () => {
await app.close();
dummyUserServiceServerClose();
await postgresContainer.stop();
});
// ...
});

Tests

The second part covers tests for the implemented endpoints. The first test suite asserts that data is retrieved from the external service based on the type query parameter.

// test/users.spec.ts
describe('/users (GET)', () => {
it('should return list of users', async () => {
return request(app.getHttpServer())
.get('/users?type=user')
.expect(HttpStatus.OK)
.then((response) => {
expect(response.body).toEqual(usersResponse);
});
});
it('should throw an error when type is forbidden', async () => {
return request(app.getHttpServer())
.get('/users?type=admin')
.expect(HttpStatus.FORBIDDEN);
});
});

The second test suite asserts that data is retrieved from the database.

// test/users.spec.ts
describe('/users/:id (GET)', () => {
it('should return found user', async () => {
const userId = 'b618445a-0089-43d5-b9ca-e6f2fc29a11d';
const userDetails = {
id: userId,
firstName: 'tester',
};
const newUser = await usersRepository.create(userDetails);
await usersRepository.save(newUser);
return request(app.getHttpServer())
.get(`/users/${userId}`)
.expect(HttpStatus.OK)
.then((response) => {
expect(response.body).toEqual(userDetails);
});
});
it('should return 404 error when user is not found', async () => {
const userId = 'b618445a-0089-43d5-b9ca-e6f2fc29a11d';
return request(app.getHttpServer())
.get(`/users/${userId}`)
.expect(HttpStatus.NOT_FOUND);
});
});

Boilerplate

Here is the link to the boilerplate I use for development. It contains the examples mentioned above with more details.

2022

TypeORM with NestJS

December 1, 2022

This post covers TypeORM examples with the NestJS framework, from setting up the connection with the Postgres database to working with transactions. The following snippets can be adjusted and reused with other frameworks like Express. The same applies to other SQL databases.

Prerequisites

  • NestJS app bootstrapped
  • Postgres database running
  • @nestjs/typeorm, typeorm and pg packages installed

Database connection

The connection requires initializing a DataSource with the configuration below.

// app.module.ts
const typeOrmConfig = {
imports: [
ConfigModule.forRoot({
load: [databaseConfig]
})
],
inject: [ConfigService],
useFactory: async (configService: ConfigService) =>
configService.get('database'),
dataSourceFactory: async (options) => new DataSource(options).initialize()
};
@Module({
imports: [TypeOrmModule.forRootAsync(typeOrmConfig)]
})
export class AppModule {}

The DataSource configuration contains the connection details, migration settings, entity paths, etc.

// config/database.ts
import path from 'path';
import { registerAs } from '@nestjs/config';
import { PostgresConnectionOptions } from 'typeorm/driver/postgres/PostgresConnectionOptions';
export default registerAs(
'database',
(): PostgresConnectionOptions =>
({
logging: false,
entities: [path.resolve(`${__dirname}/../../**/**.entity{.ts,.js}`)],
migrations: [
path.resolve(`${__dirname}/../../../database/migrations/*{.ts,.js}`)
],
migrationsRun: true,
migrationsTableName: 'migrations',
keepConnectionAlive: true,
synchronize: false,
type: 'postgres',
host: process.env.DATABASE_HOSTNAME,
port: Number(process.env.DATABASE_PORT),
username: process.env.DATABASE_USERNAME,
password: process.env.DATABASE_PASSWORD,
database: process.env.DATABASE_NAME
} as PostgresConnectionOptions)
);

Migrations and seeders

Migrations are handled with the following scripts for generation, running, and reverting.

// package.json
{
"scripts": {
"migration:generate": "npm run typeorm -- migration:create",
"migrate": "npm run typeorm -- migration:run -d src/common/config/ormconfig-migration.ts",
"migrate:down": "npm run typeorm -- migration:revert -d src/common/config/ormconfig-migration.ts",
"typeorm": "ts-node -r tsconfig-paths/register ./node_modules/typeorm/cli.js"
}
}

A new migration is generated at the provided path with the following command. Its filename follows the format <TIMESTAMP>-<MIGRATION_NAME>.ts.

npm run migration:generate database/migrations/<MIGRATION_NAME>

Here is an example of a migration that creates a new table; the table is dropped when the migration is reverted.

// database/migrations/1669833880587-create-users.ts
import { MigrationInterface, QueryRunner, Table } from 'typeorm';
export class CreateUsers1669833880587 implements MigrationInterface {
public async up(queryRunner: QueryRunner): Promise<void> {
await queryRunner.createTable(
new Table({
name: 'users',
columns: [
{
name: 'id',
type: 'uuid',
default: 'uuid_generate_v4()',
generationStrategy: 'uuid',
isGenerated: true,
isPrimary: true
},
{
name: 'first_name',
type: 'varchar'
}
]
})
);
}
public async down(queryRunner: QueryRunner): Promise<void> {
await queryRunner.dropTable('users');
}
}

Scripts for running and reverting the migrations require a separate DataSource configuration; the migrations table name is migrations in this case. Running a migration adds a new row with the migration name, while reverting removes it.

// config/ormconfig-migration.ts
import 'dotenv/config';
import * as path from 'path';
import { DataSource } from 'typeorm';
const config = new DataSource({
type: 'postgres',
host: process.env.DATABASE_HOSTNAME,
port: Number(process.env.DATABASE_PORT),
username: process.env.DATABASE_USERNAME,
password: process.env.DATABASE_PASSWORD,
database: process.env.DATABASE_NAME,
entities: [path.resolve(`${__dirname}/../../**/**.entity{.ts,.js}`)],
migrations: [
path.resolve(`${__dirname}/../../../database/migrations/*{.ts,.js}`)
],
migrationsTableName: 'migrations',
logging: true,
synchronize: false
});
export default config;

A seeder is a type of migration; seeders are handled with the following scripts for generation, running, and reverting.

// package.json
{
"scripts": {
"seed:generate": "npm run typeorm -- migration:create",
"seed": "npm run typeorm -- migration:run -d src/common/config/ormconfig-seeder.ts",
"seed:down": "npm run typeorm -- migration:revert -d src/common/config/ormconfig-seeder.ts"
}
}

A new seeder is generated at the provided path with the following command. Its filename follows the format <TIMESTAMP>-<SEEDER_NAME>.ts.

npm run seed:generate database/seeders/<SEEDER_NAME>

Here is an example of a seeder that inserts some data; the inserted data is removed when the seeder is reverted.

// database/seeders/1669834539569-add-users.ts
import { UsersEntity } from '../../src/modules/users/users.entity';
import { MigrationInterface, QueryRunner } from 'typeorm';
export class AddUsers1669834539569 implements MigrationInterface {
public async up(queryRunner: QueryRunner): Promise<void> {
await queryRunner.manager.insert(UsersEntity, [
{
firstName: 'tester'
}
]);
}
public async down(queryRunner: QueryRunner): Promise<void> {
await queryRunner.manager.clear(UsersEntity);
}
}

Scripts for running and reverting the seeders require a separate DataSource configuration; the seeders table name is seeders in this case. Running a seeder adds a new row with the seeder name, while reverting removes it.

// config/ormconfig-seeder.ts
import 'dotenv/config';
import * as path from 'path';
import { DataSource } from 'typeorm';
const config = new DataSource({
type: 'postgres',
host: process.env.DATABASE_HOSTNAME,
port: Number(process.env.DATABASE_PORT),
username: process.env.DATABASE_USERNAME,
password: process.env.DATABASE_PASSWORD,
database: process.env.DATABASE_NAME,
entities: [path.resolve(`${__dirname}/../../**/**.entity{.ts,.js}`)],
migrations: [
path.resolve(`${__dirname}/../../../database/seeders/*{.ts,.js}`)
],
migrationsTableName: 'seeders',
logging: true,
synchronize: false
});
export default config;

Entities

Entities are defined with their columns and the Entity decorator.

// users.entity.ts
import { Column, Entity, PrimaryGeneratedColumn } from 'typeorm';
@Entity({ name: 'users' })
export class UsersEntity {
@PrimaryGeneratedColumn('uuid')
public id: string;
@Column({ name: 'first_name' })
public firstName: string;
}

Entities should be registered with the forFeature method.

// users.module
@Module({
imports: [TypeOrmModule.forFeature([UsersEntity])],
// ...
})
export class UsersModule {}

Custom repositories

Custom repositories extend the base repository class and enrich it with several additional methods.

// users.repository.ts
@Injectable()
export class UsersRepository extends Repository<UsersEntity> {
constructor(private dataSource: DataSource) {
super(UsersEntity, dataSource.createEntityManager());
}
async getById(id: string) {
return this.findOne({ where: { id } });
}
// ...
}

Custom repositories should be registered as providers.

// users.module
@Module({
// ...
providers: [UsersService, UsersRepository],
// ...
})
export class UsersModule {}
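
Once registered, the custom repository is injected like any other provider; a minimal sketch:

// users.service.ts - a minimal sketch of injecting the custom repository
import { Injectable } from '@nestjs/common';
import { UsersEntity } from './users.entity';
import { UsersRepository } from './users.repository';

@Injectable()
export class UsersService {
  constructor(private readonly usersRepository: UsersRepository) {}

  async findById(id: string): Promise<UsersEntity | null> {
    return this.usersRepository.getById(id);
  }
}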

Testing custom repositories

The Testing custom repositories (NestJS/TypeORM) post covers unit and integration testing in more detail.

Transactions

The typeorm-transactional library uses CLS (Continuation Local Storage) to handle and propagate transactions across different repositories and service methods.

@Injectable()
export class PostService {
constructor(
private readonly authorRepository: AuthorRepository,
private readonly postRepository: PostRepository
) {}
@Transactional() // will open a transaction if one doesn't already exist
async createPost(authorUsername: string, message: string): Promise<Post> {
const author = await this.authorRepository.create({
username: authorUsername
});
return this.postRepository.save({ message, author_id: author.id });
}
}

Initialization of the transactional context should happen before starting the app.

// main.ts
async function bootstrap(): Promise<void> {
initializeTransactionalContext();
// ...
}

The DataSource instance should be added to the transactional context.

const typeOrmConfig = {
// ...
dataSourceFactory: async (options) =>
addTransactionalDataSource(new DataSource(options)).initialize()
};

Boilerplate

Here is the link to the boilerplate I use for development. It contains the examples mentioned above with more details.

2021

Redis as custom storage for NestJS rate limiter

September 14, 2021

A rate limiter is a standard protection technique against brute-force and DDoS attacks. NestJS provides a module for it, with in-memory storage by default. Custom storage, Redis in this case, should be injected into the ThrottlerModule configuration.

Configuration

The configuration should contain:

  • TTL (time to live) in seconds
  • maximum number of requests within TTL
  • Redis URL
// app.module.ts
import { APP_GUARD } from '@nestjs/core';
import { ThrottlerGuard, ThrottlerModule } from '@nestjs/throttler';
import { ThrottlerStorageRedisService } from 'nestjs-throttler-storage-redis';
// ...
@Module({
imports: [
ThrottlerModule.forRootAsync({
inject: [CustomConfigService],
useFactory: (configService: CustomConfigService) => ({
ttl: configService.THROTTLER_TTL_SECONDS,
limit: configService.THROTTLER_LIMIT,
storage: new ThrottlerStorageRedisService(configService.REDIS_URL),
}),
}),
// ...
],
})
export class AppModule {}
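
CustomConfigService is not shown here; below is a hedged sketch of what it might expose, assuming it wraps @nestjs/config and that the property names match the usage above.

// custom-config.service.ts - a sketch; the getters are assumptions based on the config above
import { Injectable } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';

@Injectable()
export class CustomConfigService {
  constructor(private readonly configService: ConfigService) {}

  get THROTTLER_TTL_SECONDS(): number {
    return Number(this.configService.get('THROTTLER_TTL_SECONDS'));
  }

  get THROTTLER_LIMIT(): number {
    return Number(this.configService.get('THROTTLER_LIMIT'));
  }

  get REDIS_URL(): string {
    return this.configService.get('REDIS_URL') ?? '';
  }
}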

API endpoints setup

Binding the throttler guard can be done in multiple ways.

  • the guard is bound globally for every API endpoint.
// app.module
import { ThrottlerGuard } from '@nestjs/throttler';
// ...
@Module({
// ...
providers: [{
provide: APP_GUARD,
useClass: ThrottlerGuard,
}],
})
export class AppModule {}
  • the global guard is overridden for a specific API endpoint with the Throttle decorator.
import { Throttle } from '@nestjs/throttler';
// ...
@Controller('users')
export class UsersController {
@Throttle(USERS_THROTTLER_LIMIT, USERS_THROTTLER_TTL_SECONDS)
@Get()
async getUsers() {}
}
  • the global guard is skipped for a specific API endpoint with the SkipThrottle decorator.
import { SkipThrottle } from '@nestjs/throttler';
// ...
@Controller('posts')
export class PostsController {
@SkipThrottle()
@Get()
async getPosts() {}
}

Boilerplate

Here is the link to the template I use for development.

Testing custom repositories (NestJS/TypeORM)

September 5, 2021

Custom repositories extend the base repository class and enrich it with several additional methods. This post covers the unit and integration testing.

// user.repository.ts
@Injectable()
export class UserRepository extends Repository<UserEntity> {
constructor(private dataSource: DataSource) {
super(UserEntity, dataSource.createEntityManager());
}
async getById(id: string) {
return this.findOne({ where: { id } });
}
// ...
}

Setup

Inject a custom repository into the service.

// user.service.ts
export class UserService {
constructor(private readonly userRepository: UserRepository) {}
async getById(id: string): Promise<User> {
return this.userRepository.getById(id);
}
// ...
}

Pass the entity class to the forFeature method.

// user.module.ts
@Module({
imports: [
TypeOrmModule.forFeature([UserEntity]),
// ...
],
providers: [UserService, UserRepository],
// ...
})
export class UserModule {}

Unit testing

To unit-test the custom repository, mock the DataSource and spy on the inherited methods (like findOne).

// user.repository.spec.ts
describe('UserRepository', () => {
let userRepository: UserRepository;
const dataSource = {
createEntityManager: jest.fn()
};
beforeEach(async () => {
const module: TestingModule = await Test.createTestingModule({
providers: [
UserRepository,
{
provide: DataSource,
useValue: dataSource
}
]
}).compile();
userRepository = module.get<UserRepository>(UserRepository);
});
describe('getById', () => {
it('should return found user', async () => {
const id = 'id';
const user = {
id
};
const findOneSpy = jest
.spyOn(userRepository, 'findOne')
.mockResolvedValue(user as UserEntity);
const foundUser = await userRepository.getById(id);
expect(foundUser).toEqual(user);
expect(findOneSpy).toHaveBeenCalledWith({ where: user });
});
});
});

Integration testing

Integration testing is more suitable when working with databases. Read more about it in the Integration testing Node.js apps post.

Boilerplate

Here is the link to the boilerplate I use for development. It contains the examples mentioned above with more details.

Server-Sent Events 101

August 18, 2021

Server-Sent Events (SSE) is a unidirectional communication channel from the server to the client. The client initiates the connection using the EventSource API.

The same API is used to listen for events from the server, handle errors, and close the connection.

const eventSource = new EventSource(url);
eventSource.onmessage = ({ data }) => {
const eventData = JSON.parse(data);
// handling the data from the server
};
eventSource.onerror = () => {
// error handling
};
eventSource.close();

The server can send events in the text/event-stream format once the client establishes the connection. It can also filter clients by a query parameter and send each of them only the appropriate events.

In the following example, the NestJS server sends events only to a specific client, identified by its email address.

import { Controller, Query, Sse } from '@nestjs/common';
import { EventEmitter2 } from '@nestjs/event-emitter';
import { Observable, Subject } from 'rxjs';
import { map } from 'rxjs/operators';
import { MessageEvent, MessageEventData } from './message-event.interface';
import { SseQueryDto } from './sse-query.dto';
@Controller()
export class AppController {
constructor(private readonly eventService: EventEmitter2) {}
@Sse('sse')
sse(@Query() sseQuery: SseQueryDto): Observable<MessageEvent> {
const subject$ = new Subject();
this.eventService.on(FILTER_VERIFIED, data => {
if (sseQuery.email !== data.email) return;
subject$.next({ isVerifiedFilter: data.isVerified });
});
return subject$.pipe(
map((data: MessageEventData): MessageEvent => ({ data })),
);
}
// ...
}

Emitting the event mentioned above is done in the following way.

const filterVerifiedEvent = new FilterVerifiedEvent();
filterVerifiedEvent.email = user.email;
filterVerifiedEvent.isVerified = true;
this.eventService.emit(FILTER_VERIFIED, filterVerifiedEvent);
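
The event class and the SSE interfaces/DTO referenced above are not shown in the snippets; here is a minimal sketch of what they might look like, with field names assumed from the controller.

// a hedged sketch of the referenced types; names and fields are assumptions
// message-event.interface.ts
export interface MessageEventData {
  isVerifiedFilter: boolean;
}
export interface MessageEvent {
  data: MessageEventData;
}

// sse-query.dto.ts
export class SseQueryDto {
  email: string;
}

// filter-verified.event.ts
export const FILTER_VERIFIED = 'filter.verified';
export class FilterVerifiedEvent {
  email: string;
  isVerified: boolean;
}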

Boilerplate

Here is the link to the template I use for development.