
Angular has been designed with testing and mocking in mind. This includes mocking HTTP services in general, and mocking REST services in particular. The Angular team even provides a fairly generic mock HTTP service. That service is a horn of plenty of good ideas. The only problem is its limited scope: it's meant to serve the needs of the Angular documentation, and the team even warns that the service may break at any point in time.

In other words: it pays to implement your own generic mock HTTP service. That's what we'll do today.

The challenge

More often than not, developers have to work with a slow and unreliable back-end. Of course, the back-end is rock-solid in production. But the development environment is... well, it's developing. Sometimes it's fast, sometimes it's slow, and sometimes it's not available at all. For instance, it's not available while Jenkins deploys a new version. That's the definition of "development environment". So it's a good idea to decouple our front-end development from the back-end. This also gives you the option to develop the front-end and the back-end services simultaneously.

What do we want to achieve?

When we started our project, we implemented the mock HTTP service in a very static way. Basically, the mock HTTP service is a hash table consisting of URLs and static HTTP responses. However, I wanted to implement a more flexible approach. Mind you: typically, the URLs of REST calls form a tree. Why don't we map this tree to the file system of your development PC?

This idea works out great for static resources. It covers the GET requests of the REST paradigm. Just in case you want to mock PUT, POST and DELETE requests: My solution doesn't support write accesses yet. That would require a database. That, in turn, is beyond the current post's scope. However, it may be the topic of a follow-up post. So all we support today is static REST services, possibly including parameters, but without real write accesses.

Project setup

Let's start with the boiler-plate code. The classic approach to implementing a mock HTTP service in Angular is to implement it in a module. In theory, the service should be implemented in a dedicated service class, and the module should simply declare the service. However, I found that it's difficult to do so, and there's not enough code to make the effort worth the pain. We can live with the small violation of separation of concerns.

So we implement a mock HTTP module like so:

export function mockHttpFactory(mockBackend, options) {
  return new Http(mockBackend, options);
}

@NgModule({
  imports: [CommonModule, HttpModule],
  providers: [
    { provide: Http, useFactory: mockHttpFactory, deps: [MockBackend, BaseRequestOptions] },
    MockBackend,
    BaseRequestOptions
  ]
})
export class MockHttpModule {
  constructor(mockBackend: MockBackend) {
    ... // see below
  }
}

For the moment, I suggest you just take this code for granted. It's more or less boiler-plate code. It exposes the Http token, but replaces the standard Http implementation with an implementation using Angular's MockBackend class instead of the standard XHRBackend class.

By the way, I've published a running version of this code at GitHub, so you can clone this repository and debug it.

The constructor of the module contains the service implementation:

constructor(mockBackend: MockBackend) {
  mockBackend.connections
    .delay(500)
    .map((connection: MockConnection) => {
      return this.getMockRequest(connection);
    })
    .subscribe();
}

We hook into the MockBackend and observe the incoming requests. Instead of answering the request immediately, we add a small delay. That's a good idea to make sure that the code works asynchronously. After half a second, we return the response like so:

private getMockRequest(connection: MockConnection): void {
  let url = connection.request.url;
  // to test the idea, we simply return a static response:
  let result = `This is the server's response to ${url}.`;
  connection.mockRespond(new Response(new ResponseOptions({
    status: 200,
    statusText: "",
    url: url,
    body: result
  })));
}

First approach: get the response using require.js

The first approach uses require.js in a clever way:

let responseTable = {
  '/rest/tictactoe/highscore': JSON.stringify(require('./rest/tictactoe/highscore.json')),
  '/rest/tictactoe/users': JSON.stringify(require('./rest/tictactoe/users.json'))
};

It's a bit unusual to use require.js directly in TypeScript. Most of the time, you simply use the import statement and leave it to the TypeScript compiler to choose and implement the module loader. But of course, it's still possible to load files directly using require.js. You don't even need to import the require function in the header of the TypeScript file. Neither do you need to add require.js to the package.json. It's just there.

Actually, when I said that the files are loaded using require.js, I wasn't entirely honest. In reality, webpack is responsible for loading the files.[1] Putting it in simpler terms, the Angular build performs its magic here. The JSON files are included in the compiled binary: they are part of the main.bundle.js file of the distribution package. So it's not entirely correct to say the files are loaded when the hash table is created. In reality, the files are compiled into the JavaScript code.

Pros and cons of using require.js

This makes the mock HTTP service extremely fast. However, this is one of the few cases in which performance is not important. Mind you, we've even added a delay because real HTTP requests take some time, and we don't want to introduce errors by assuming HTTP requests are served without delay.

Another advantage is that the entire mock HTTP module and the JSON files are eliminated from the production build. The tree shaking algorithm detects that the mock HTTP module and its dependencies are not needed and removes them from the deployment package.

However, simple as this approach is, it turned out to be tedious to add one REST service after another. When the back-end team starts with the mass production of REST services, you end up looking for a simpler way to add services.

A more flexible approach

So I started looking for a way to simply put the JSON file into a directory matching the URL. That's possible using the real Http module and its XHRBackend. They can access files in the assets folder at runtime, so you don't have to declare the complete list of URLs beforehand.

Basically, reading such a JSON file is very simple:

constructor(private http: Http) { }

public readHighscore(): Observable<HighScore[]> {
  return this.http.get("/assets/rest/tictactoe/highscore.json")
    .map(x => x.json() as HighScore[]);
}

The problem is we want to use this Http module in the mock Http module. So we want to both use and redefine Http at the same time. This requires a few extra lines:

import {OpaqueToken} from "@angular/core";

export const REAL_HTTP = new OpaqueToken("real http service");

export function httpFactory(mockBackend, options) {
  return new Http(mockBackend, options);
}

@NgModule({
  imports: [CommonModule, HttpModule],
  providers: [
    { provide: Http, useFactory: httpFactory, deps: [MockBackend, BaseRequestOptions] },
    { provide: REAL_HTTP, useFactory: httpFactory, deps: [XHRBackend, BaseRequestOptions] },
    MockBackend,
    XHRBackend,
    BaseRequestOptions
  ]
})
export class MockHttpModule {
  ... // see below
}

What's new is the token REAL_HTTP. We use this as a qualifier to distinguish between the mock Http module we're exposing and the real Http module we need to access the assets folder. After these preparations, the rest is more or less straightforward:

export class MockHttpModule {
  constructor(mockBackend: MockBackend, @Inject(REAL_HTTP) realHttp: Http) {
    mockBackend.connections
      .delay(500)
      .flatMap((connection: MockConnection) => {
        return this.serveRequestFromAssetsFolder(connection, realHttp);
      })
      .subscribe();
  }

  private serveRequestFromAssetsFolder(connection: MockConnection, realHttp: Http) {
    let url = connection.request.url;
    if (!!url) {
      if (url.startsWith("https://example.com/rest/")) {
        url = "/assets/mock/" + url.substring("https://example.com/rest/".length);
      }
    }
    const response$: Observable<void> = realHttp.get(url).map(
      response => { connection.mockRespond(response); }
    );
    return response$;
  }
}

Pros and cons of reading the json files from the asset folder

Our second approach results in smaller JavaScript files. The JSON files are copied into the assets folder of the distribution package instead.

At first glance, this sounds like a disadvantage. The tree shaking algorithm of the AOT compiler doesn't detect that the JSON files aren't necessary. So the assets folder includes these files even when building the application with the --prod flag. However, there's a simple way to solve the problem: just don't copy the rest folder to the production server.

What about PUT, POST and DELETE?

When we're talking about REST calls, we usually distinguish between the HTTP methods GET, POST, PUT and DELETE. We can simulate this by using the URL as a folder name, and by putting up to four files into the folder: get.json, put.json, post.json, and delete.json. As mentioned above, this doesn't really cover the use case of POST, PUT and DELETE. It's only a first approximation: it doesn't modify the result of the next GET request. But in many cases, that's already enough to be useful.
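That mapping can be sketched as a small helper. Mind you, this function is my own invention, not part of the project; the numeric method codes follow Angular's RequestMethod enum (Get = 0, Post = 1, Put = 2, Delete = 3):

```typescript
// Hypothetical helper: map a request method to the matching mock file
// inside the folder that corresponds to the URL.
// The numeric codes mirror Angular's RequestMethod enum:
// Get = 0, Post = 1, Put = 2, Delete = 3.
function mockFileForMethod(folderUrl: string, method: number): string {
  const fileNames = ["get", "post", "put", "delete"];
  const fileName = fileNames[method] || "get"; // fall back to get.json
  return folderUrl + "/" + fileName + ".json";
}
```

A DELETE request to "/assets/mock/users" would then be answered from "/assets/mock/users/delete.json".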

Parameters

Orthodoxy teaches you to map every parameter of a REST call to a URL fragment. "http://example.com/rest/users" gets the list of all users, and "http://example.com/rest/user/42" yields the user with primary key 42. Following this rule, it's easy to implement parameters. They are just directory names.

In reality, most projects stray from orthodoxy and add query parameters like "http://example.com/rest/users?name=John". We've solved this challenge by replacing the question mark and the "=" with slashes. In other words, queries are translated into folder structures. This approach isn't perfect. Just consider multiple parameters: now the result of the REST call depends on the order of the parameters. But in practice, it's good enough to be useful.
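A minimal sketch of that translation, assuming we also replace "&" for the multi-parameter case (the function name is hypothetical, not taken from the project):

```typescript
// Hypothetical sketch: turn a URL with query parameters into a folder path
// by replacing "?", "=" and "&" with slashes.
function queryToPath(url: string): string {
  return url.replace(/[?=&]/g, "/");
}
```

With this rule, "http://example.com/rest/users?name=John" maps to the folder "http://example.com/rest/users/name/John" — which also illustrates the ordering problem: swapping two parameters yields a different path.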

Switching between the real and the mock Http service

Maybe I should conclude the article by telling you how to switch between the real and the simulated Http mode. The natural place to select the Http module is the central AppModule of your application (or the SharedModule if your application has grown enough to split it into multiple modules). A clever way to do this is to use the environment files. In the simplest case, the simulation is always used in development mode, and the real Http calls are only made if the application has been compiled with the --prod flag:

@NgModule({
  declarations: [ ... ],
  imports: [
    ...
    environment.production ? HttpModule : MockHttpModule
  ],
  providers: [ ... ],
  bootstrap: [AppComponent]
})
export class AppModule { }

The nice thing about Angular's AOT compiler is that it recognizes that the environment is a constant. This, in turn, allows it to tree-shake the MockHttpModule out of the production deployment package.

Truth be told, it's a bit cumbersome to compile the application with the production flag each time you want to test the program with the real back-end service, so you'll probably want to add a third environment. If you're using the Angular CLI, simply add it to the environments section of the .angular-cli.json file:

"environments": { dev": "environments/environment.ts", "mock": "environments/environment.mock.ts", "prod": "environments/environment.prod.ts" }

Wrapping it up

It takes some preparation to use both the real Http service and to redefine it. However, Angular's dependency injection is powerful enough to allow us to do just that. This, in turn, enables us to implement a mock HTTP service that's more flexible than the mock HTTP service described in most tutorials.

Next steps might be to implement the methods PUT, POST and DELETE. I'll leave this to a follow-up post. Stay tuned!


Dig deeper

Tic-tac-toe game using our mock Http service


  1. At least using the standard setup of the Angular CLI. If you've configured your project differently, things may look different.↩
