Ben Nadel at Scotch On The Rock (SOTR) 2010 (London) with: David Phipps

Learning Node.js: Building A Simple API Powered By MongoDB


For the next step in my Node.js learning journey, I wanted to try to build a simple API powered by MongoDB. As per my usual approach to learning, I am purposefully avoiding established libraries like Express (for routing and rendering) and Mongoose (for MongoDB connection management and object mapping); I like to feel the pain so that I may better understand what problems are being solved. To that end, I've created a simple CRUD (Create, Read, Update, Delete) application in which you can manage a collection of friends using a client-side AngularJS application.

View this code in my Learning Node.js project on GitHub.

The first major hurdle that I ran into was connecting to the MongoDB instance. The native Node.js driver for MongoDB provides a fairly simple way of connecting via the MongoClient; but, you have to connect to the database before you can start issuing commands. This means that every part of the application that uses MongoDB has to either connect explicitly (using all the right connection settings); or, it has to use some sort of shared service that manages the MongoDB connection.

I went with the "shared service" approach so as to better encapsulate all of the connection information. But, this brought up another interesting question - when any Node.js module can simply require any other module, where and when do you handle dependencies that need to be explicitly configured? For me, this turned out to be the "server.js" file. The server.js file bootstraps the application and, as the first executed file, it has the opportunity to require and configure modules before they are exercised within the application.

Before we look at the server.js file, however, let's take a look at the mongo-gateway.js file. This is the file that houses the shared MongoClient instance that maintains the MongoDB connection pool for the application:

// Require the core node modules.
var MongoClient = require( "mongodb" ).MongoClient;
var Q = require( "q" );

// Require our core application modules.
var appError = require( "./app-error" ).createAppError;


// ----------------------------------------------------------------------------------- //
// ----------------------------------------------------------------------------------- //


// I am the shared MongoClient instance for this process.
var sharedMongoClient = null;

// Export the public methods.
exports.connect = connect;
exports.getResource = getResource;


// ---
// PUBLIC METHODS.
// ---


// I connect to the given MongoDB and store the database instance for use by any context
// that requires this module. Returns a promise.
function connect( connectionString ) {

	var deferred = Q.defer();

	MongoClient.connect(
		connectionString,
		function handleConnected( error, mongo ) {

			if ( error ) {

				// Exit early so that we don't also fulfill the promise below.
				return( deferred.reject( error ) );

			}

			deferred.resolve( sharedMongoClient = mongo );

		}
	);

	return( deferred.promise );

}


// I get the shared MongoClient resource.
function getResource() {

	if ( ! sharedMongoClient ) {

		throw(
			appError({
				type: "App.DatabaseNotConnected",
				message: "The MongoDB connection pool has not been established."
			})
		);

	}

	return( Q( sharedMongoClient ) );

}

Notice that there are two public methods for this module: connect() and getResource(). The connect() method is the one that our server.js file will call during the bootstrap and configuration phase of the Node.js application. This will establish the connection to the MongoDB instance and then cache the shared database object (which manages the connection pool). Other modules that need to communicate with MongoDB can then require this module and call getResource().
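The same shared-resource pattern can be sketched in miniature without any MongoDB dependency at all (the names here are hypothetical, and native promises stand in for Q). It makes the bootstrap ordering explicit: consumers fail fast if connect() has not yet resolved:

```javascript
// A miniature of the shared-resource pattern (no MongoDB involved): the
// module caches a single resource, and consumers fail fast if connect()
// has not been called yet.
var sharedResource = null;

// I "connect" by creating the resource asynchronously and caching it.
// Returns a promise that resolves to the cached resource.
function connect( createResource ) {

	return Promise.resolve( createResource() ).then(
		function handleResolve( resource ) {

			return ( sharedResource = resource );

		}
	);

}

// I return the cached resource, throwing if connect() never completed.
function getResource() {

	if ( ! sharedResource ) {

		throw new Error( "The resource pool has not been established." );

	}

	return Promise.resolve( sharedResource );

}
```

The real mongo-gateway.js follows the same shape, with MongoClient.connect() supplying the cached resource.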

In order to further ensure that the application doesn't try to communicate with MongoDB before the connection is established, we can actually ignore HTTP requests until the MongoClient is instantiated and connected. In many server examples, you'll often see an HTTP server created and then immediately bound to a port (in order to listen for requests). In my server.js file, however, we'll create the HTTP server immediately; but, we won't bind to a port until we know that the MongoClient is connected. This way, we won't have to worry about fulfilling requests until we know that we can communicate with the database.

In the following server.js file, notice that we are requiring the mongo-gateway.js module. The server.js file then invokes the connect() method (at the bottom); and, only after the connection is established, does the server bind to a port:

// Require the core node modules.
var _ = require( "lodash" );
var http = require( "http" );
var url = require( "url" );
var querystring = require( "querystring" );
var Q = require( "q" );
var util = require( "util" );

// Require our core application modules.
var appError = require( "./lib/app-error" ).createAppError;
var friendController = require( "./lib/friend-controller" );
var friendService = require( "./lib/friend-service" );
var mongoGateway = require( "./lib/mongo-gateway" );
var requestBodyStream = require( "./lib/request-body-stream" );


// ----------------------------------------------------------------------------------- //
// ----------------------------------------------------------------------------------- //


// Create our server request / response handler.
// --
// NOTE: We are deferring the .listen() call until after we know that we have
// established a connection to the Mongo database instance.
var httpServer = http.createServer(
	function handleRequest( request, response ) {

		// Always set the CORS (Cross-Origin Resource Sharing) headers so that our client-
		// side application can make AJAX calls to this node app (I am letting Apache serve
		// the client-side app so as to keep this demo as simple as possible).
		response.setHeader( "Access-Control-Allow-Origin", "*" );
		response.setHeader( "Access-Control-Allow-Methods", "OPTIONS, GET, POST, DELETE" );
		response.setHeader( "Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept" );

		// If this is the CORS "pre-flight" test, just return a 200 and halt processing.
		// This is the browser checking to see if it has permission to make CORS AJAX
		// requests to the node app.
		if ( request.method === "OPTIONS" ) {

			return(
				response.writeHead( 200, "OK" ),
				response.end()
			);

		}

		// For non-GET requests, we will need to accumulate and parse the request body. The
		// request-body-stream will emit a "body" event when the incoming request has been
		// accumulated and parsed.
		var bodyWriteStream = requestBodyStream.createWriteStream()
			.on(
				"body",
				function handleBodyEvent( body ) {

					// Now that we have the body, we're going to merge it together with
					// query-string (ie, search) values to provide a unified "request
					// collection" that can be passed-around.
					var parsedUrl = url.parse( request.url );

					// Ensure that the search is defined. If there is no query string,
					// search will be null and .slice() won't exist.
					var search = querystring.parse( ( parsedUrl.search || "" ).slice( 1 ) );

					// Merge the search and body collections into a single collection.
					// --
					// CAUTION: For this exploration, we are assuming that all POST
					// requests contain a serialized hash in JSON format.
					processRequest( _.assign( {}, search, body ) );

				}
			)
			.on( "error", processError )
		;

		request.pipe( bodyWriteStream );


		// Once both the query-string and the incoming request body have been
		// successfully parsed and merged, route the request into the core application
		// (via the Controllers).
		function processRequest( requestCollection ) {

			var route = ( request.method + ":" + ( requestCollection.action || "" ) );

			console.log( "Processing route:", route );

			// Default to a 200 OK response. Each route may override this when processing
			// the response from the Controller(s).
			var statusCode = 200;
			var statusText = "OK";

			// Since anything inside of the route handling may throw an error, catch any
			// error and parlay it into an error response.
			try {

				if ( route === "GET:list" ) {

					var apiResponse = friendController.getFriends( requestCollection );

				} else if ( route === "GET:get" ) {

					var apiResponse = friendController.getFriend( requestCollection );

				} else if ( route === "POST:add" ) {

					var apiResponse = friendController.createFriend( requestCollection )
						.tap(
							function handleControllerResolve() {

								statusCode = 201;
								statusText = "Created";

							}
						)
					;

				} else if ( route === "POST:update" ) {

					var apiResponse = friendController.updateFriend( requestCollection );

				} else if ( route === "POST:delete" ) {

					var apiResponse = friendController.deleteFriend( requestCollection )
						.tap(
							function handleControllerResolve() {

								statusCode = 204;
								statusText = "No Content";

							}
						)
					;

				// If we made it this far, then we did not recognize the incoming request
				// as one that we could route to our core application.
				} else {

					throw(
						appError({
							type: "App.NotFound",
							message: "The requested route is not supported.",
							detail: util.format( "The route action [%s] is not supported.", route ),
							errorCode: "server.route.missing"
						})
					);

				}

				// Render the controller response.
				// --
				// NOTE: If the API response is rejected, it will be routed to the error
				// processor as the fall-through reject-binding.
				apiResponse
					.then(
						function handleApiResolve( result ) {

							var serializedResponse = JSON.stringify( result );

							response.writeHead(
								statusCode,
								statusText,
								{
									"Content-Type": "application/json",
									// Use the byte-length (not the character count) so that
									// multi-byte characters don't truncate the response.
									"Content-Length": Buffer.byteLength( serializedResponse )
								}
							);

							response.end( serializedResponse );

						}
					)
					.catch( processError )
				;

			// Catch any top-level controller and routing errors.
			} catch ( controllerError ) {

				processError( controllerError );

			}

		}


		// I try to render any errors that occur during the API request routing.
		// --
		// CAUTION: This method assumes that the header has not yet been committed to the
		// response. Since the HTTP response stream never seems to cause an error, I think
		// it's OK to assume that any server-side error event would necessarily be thrown
		// before the response was committed.
		// --
		// Read More: http://www.bennadel.com/blog/2823-does-the-http-response-stream-need-error-event-handlers-in-node-js.htm
		function processError( error ) {

			console.error( error );
			console.log( error.stack );

			response.setHeader( "Content-Type", "application/json" );

			switch ( error.type ) {

				case "App.InvalidArgument":

					response.writeHead( 400, "Bad Request" );

				break;

				case "App.NotFound":

					response.writeHead( 404, "Not Found" );

				break;

				default:

					response.writeHead( 500, "Server Error" );

				break;

			}

			// We don't want to accidentally leak proprietary information back to the
			// user. As such, we only want to send back simple error information that
			// the client-side application can use to formulate its own error messages.
			response.end(
				JSON.stringify({
					type: ( error.type || "" ),
					code: ( error.errorCode || "" )
				})
			);

		}

	}
);

// Establish a connection to our database. Once that is established, we can start
// listening for HTTP requests on the API.
// --
// CAUTION: mongoGateway is a shared-resource module in our node application. Other
// modules will require("mongo-gateway") which exposes methods for getting resources
// out of the connection pool (which is managed automatically by the underlying
// MongoClient instance). It's important that we establish a connection before other
// parts of the application try to use the shared connection pool.
mongoGateway.connect( "mongodb://127.0.0.1:27017/node_mongodb" )
	.then(
		function handleConnectResolve( mongo ) {

			// Start listening for incoming HTTP requests.
			httpServer.listen( 8080 );

			console.log( "MongoDB connected, server now listening on port 8080." );

		},
		function handleConnectReject( error ) {

			console.log( "Connection to MongoDB failed." );
			console.log( error );

		}
	)
;

There's a lot going on in this file; but, the general workflow is that the server listens for a request, routes it to a controller, and then renders the response (as JavaScript Object Notation, or JSON). For each request, I am accumulating the request body, which I am assuming to be JSON, parsing it, and then merging it with the query-string to create a "request collection" object. This object is then passed off to the controller:

// Require our core node modules.
var Q = require( "q" );

// Require our core application modules.
var friendService = require( "./friend-service" );


// ----------------------------------------------------------------------------------- //
// ----------------------------------------------------------------------------------- //


// Export the public methods.
exports.createFriend = createFriend;
exports.deleteFriend = deleteFriend;
exports.getFriend = getFriend;
exports.getFriends = getFriends;
exports.updateFriend = updateFriend;


// ---
// PUBLIC METHODS.
// ---


// I create a new friend.
function createFriend( requestCollection ) {

	var name = requestCollection.name;
	var description = requestCollection.description;

	return( friendService.createFriend( name, description ) );

}


// I delete the given friend.
function deleteFriend( requestCollection ) {

	var id = requestCollection.id;

	return( friendService.deleteFriend( id ) );

}


// I return the given friend.
function getFriend( requestCollection ) {

	var id = requestCollection.id;

	return( friendService.getFriend( id ) );

}


// I return all of the friends.
function getFriends( requestCollection ) {

	return( friendService.getFriends() );

}


// I update the given friend.
function updateFriend( requestCollection ) {

	var id = requestCollection.id;
	var name = requestCollection.name;
	var description = requestCollection.description;

	return( friendService.updateFriend( id, name, description ) );

}

The controller, in this demo, is incredibly lightweight. It does nothing more than turn around and pass the request off to a service-layer object, friend-service.js, which interacts with the shared MongoDB instance.

In the friend-service.js module, you'll see that I am relying quite heavily on the use of Promises - in this case, the Q library. Promises are amazing and, for me, bring some sanity to all of the asynchronous workflows in Node.js. Plus, promises provide protection against errors. If an error is thrown within a promise, the error won't crash your Node.js process; rather, the promise will catch the error and turn it into a promise rejection. In the following code, you'll see that I make good use of this feature by throwing custom Error objects that contain a good deal of information about the error context.
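That error-trapping behavior is easy to demonstrate in isolation. In this sketch (native promises are used for brevity, but Q behaves the same way), a thrown error surfaces as a rejection instead of crashing the process:

```javascript
// A thrown error inside a promise handler does not crash the process;
// it is caught by the promise machinery and surfaces as a rejection,
// flowing to the next rejection handler in the chain.
Promise.resolve()
	.then(
		function handleResolve() {

			throw new Error( "Something went wrong." );

		}
	)
	.catch(
		function handleReject( error ) {

			console.log( "Rejected with:", error.message );

		}
	)
;
```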

// Require our core node modules.
var ObjectID = require( "mongodb" ).ObjectID;
var Q = require( "q" );
var util = require( "util" );

// Require our core application modules.
var appError = require( "./app-error" ).createAppError;
var mongoGateway = require( "./mongo-gateway" );


// ----------------------------------------------------------------------------------- //
// ----------------------------------------------------------------------------------- //


// Export the public methods.
exports.createFriend = createFriend;
exports.deleteFriend = deleteFriend;
exports.getFriend = getFriend;
exports.getFriends = getFriends;
exports.updateFriend = updateFriend;


// ---
// PUBLIC METHODS.
// ---


// I create a new friend with the given properties. Returns a promise that will resolve
// to the newly inserted friend ID.
function createFriend( name, description ) {

	// Test the inputs (throws an error if any of them are invalid).
	testName( name );
	testDescription( description );

	var promise = getDatabase()
		.then(
			function handleDatabaseResolve( mongo ) {

				var deferred = Q.defer();

				mongo.collection( "friend" ).insertOne(
					{
						name: name,
						description: description
					},
					deferred.makeNodeResolver()
				);

				return( deferred.promise );

			}
		)
		// When we insert a single document, the resulting object contains metadata about
		// the insertion. We don't want that information leaking out into the calling
		// context. As such, we want to unwrap that result, and return the inserted ID.
		// --
		// - result: Contains the operation result.
		// - + ok: 1
		// - + n: 1
		// - ops: Contains the documents inserted with added _id fields.
		// - insertedCount: 1
		// - insertedId: xxxxxxxxxxxx
		// - connection: Contains the connection used to perform the insert.
		.get( "insertedId" )
	;

	return( promise );

}


// I delete the friend with the given ID. Returns a promise.
// --
// CAUTION: If the given friend does not exist, promise will be rejected.
function deleteFriend( id ) {

	// Test the inputs (throws an error if any of them are invalid).
	testId( id );

	var promise = getDatabase()
		.then(
			function handleDatabaseResolve( db ) {

				var deferred = Q.defer();

				db.collection( "friend" ).deleteOne(
					{
						_id: ObjectID( id )
					},
					deferred.makeNodeResolver()
				);

				return( deferred.promise );

			}
		)
		// When we remove a document, the resulting object contains meta information
		// about the delete operation. We don't want that information to leak out into
		// the calling context; so, let's examine the result and unwrap it.
		// --
		// - result: Contains the information about the operation:
		// - + ok: 1
		// - + n: 1
		// - connection: Contains the connection used to perform the remove.
		// - deletedCount: 1
		.then(
			function handleResultResolve( result ) {

				// If the document was successfully deleted, just echo the ID.
				if ( result.deletedCount ) {

					return( id );

				}

				throw(
					appError({
						type: "App.NotFound",
						message: "Friend could not be deleted.",
						detail: util.format( "The friend with id [%s] could not be deleted.", id ),
						extendedInfo: util.inspect( result.result )
					})
				);

			}
		)
	;

	return( promise );

}


// I get the friend with the given id. Returns a promise.
function getFriend( id ) {

	// Test the inputs (throws an error if any of them are invalid).
	testId( id );

	var promise = getDatabase()
		.then(
			function handleDatabaseResolve( mongo ) {

				var deferred = Q.defer();

				mongo.collection( "friend" ).findOne(
					{
						_id: ObjectID( id )
					},
					deferred.makeNodeResolver()
				);

				return( deferred.promise );

			}
		)
		// If the read operation was a success, the result object will be the document
		// that we retrieved from the database. Unlike the WRITE operations, the result
		// of a READ operation doesn't contain metadata about the operation.
		.then(
			function handleResultResolve( result ) {

				if ( result ) {

					return( result );

				}

				throw(
					appError({
						type: "App.NotFound",
						message: "Friend could not be found.",
						detail: util.format( "The friend with id [%s] could not be found.", id )
					})
				);

			}
		)
	;

	return( promise );

}


// I get all the friends. Returns a promise.
function getFriends() {

	var promise = getDatabase().then(
		function handleDatabaseResolve( mongo ) {

			var deferred = Q.defer();

			mongo.collection( "friend" )
				.find({})
				.toArray( deferred.makeNodeResolver() )
			;

			return( deferred.promise );

		}
	);

	return( promise );

}


// I update the given friend, assigning the given properties.
// --
// CAUTION: If the given friend does not exist, promise will be rejected.
function updateFriend( id, name, description ) {

	// Test the inputs (throws an error if any of them are invalid).
	testId( id );
	testName( name );
	testDescription( description );

	var promise = getDatabase()
		.then(
			function handleDatabaseResolve( mongo ) {

				var deferred = Q.defer();

				mongo.collection( "friend" ).updateOne(
					{
						_id: ObjectID( id )
					},
					{
						$set: {
							name: name,
							description: description
						}
					},
					deferred.makeNodeResolver()
				);

				return( deferred.promise );

			}
		)
		// When we update a document, the resulting object contains meta information
		// about the update operation. We don't want that information to leak out into
		// the calling context; so, let's examine the result and unwrap it.
		// --
		// - result: Contains the information about the operation:
		// - + ok: 0
		// - + nModified: 0
		// - + n: 0
		// - connection: Contains the connection used to perform the update.
		// - matchedCount: 0
		// - modifiedCount: 0
		// - upsertedId: null
		// - upsertedCount: 0
		.then(
			function handleResultResolve( result ) {

				// If the document was successfully modified, just echo the ID.
				// --
				// CAUTION: If the update action doesn't result in modification of the
				// document (ie, the document existed, but no values were changed), then
				// modifiedCount will be 0 while n will be 1. As such, we have to check n.
				if ( result.result.n ) {

					return( id );

				}

				throw(
					appError({
						type: "App.NotFound",
						message: "Friend could not be updated.",
						detail: util.format( "The friend with id [%s] could not be updated.", id ),
						extendedInfo: util.inspect( result.result )
					})
				);

			}
		)
	;

	return( promise );

}


// ---
// PRIVATE METHODS.
// ---


// I get a MongoDB connection from the resource pool. Returns a promise.
function getDatabase() {

	return( mongoGateway.getResource() );

}


// I test the given description for validity.
function testDescription( newDescription ) {

	if ( ! newDescription ) {

		throw(
			appError({
				type: "App.InvalidArgument",
				message: "Description must be a non-zero length.",
				errorCode: "friend.description.short"
			})
		);

	}

}


// I test the given ID for validity.
function testId( newId ) {

	if ( ! ObjectID.isValid( newId ) ) {

		throw(
			appError({
				type: "App.InvalidArgument",
				message: "Id is not valid.",
				detail: util.format( "The id [%s] is not a valid BSON ObjectID.", newId ),
				errorCode: "friend.id"
			})
		);

	}

}


// I test the given name for validity.
function testName( newName ) {

	if ( ! newName ) {

		throw(
			appError({
				type: "App.InvalidArgument",
				message: "Name must be a non-zero length.",
				errorCode: "friend.name.short"
			})
		);

	}

	if ( newName.length > 30 ) {

		throw(
			appError({
				type: "App.InvalidArgument",
				message: "Name must be less than or equal to 30-characters.",
				detail: util.format( "The name [%s] is too long.", newName ),
				errorCode: "friend.name.long"
			})
		);

	}

}

In retrospect, the friend-service.js module is probably too complicated. Having the business logic mixed in with the MongoDB logic makes for a lot of noise. What I'd like to do is refactor this so that all of the MongoDB interaction is handled inside of a "friend gateway." The friend service would then call the gateway and get promises in return. This would allow the complexity of the service layer to grow without becoming overwhelming. And, I believe it would create a much nicer separation of concerns.
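As a rough sketch of that refactoring (every name here is hypothetical, and the collection accessor is injected so the example stays self-contained), the gateway would own all of the collection calls and hand promises back to the service layer:

```javascript
// friend-gateway.js (hypothetical sketch): all "friend" collection access
// lives here. The collection accessor is injected, which keeps the module
// testable; in the real app it would come from mongo-gateway.
function createFriendGateway( getCollection ) {

	// Reveal the public methods.
	return({
		findById: findById
	});

	// I find a single friend document by ID. Returns a promise that
	// resolves to the document, or null if no match exists.
	function findById( id ) {

		return Promise.resolve( getCollection() ).then(
			function handleCollection( collection ) {

				return( collection.findOne( { _id: id } ) );

			}
		);

	}

}

exports.createFriendGateway = createFriendGateway;
```

With this shape, the friend service keeps its input validation and "not found" business rules, while every result-unwrapping detail of the driver stays behind the gateway.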

Slowly, I think I'm starting to see how Node.js applications fit together. I still have a lot of questions about how configuration should be done. But, I think that viewing the server.js file as the "bootstrapping" phase of the application is helpful. If nothing else, I'm starting to develop my own personal Node.js style. Onward and upward!
