gRPC is an open source remote procedure call (RPC) framework developed by Google. It’s built on top of HTTP/2, and it uses Protocol Buffers as the underlying data serialization format.

This tutorial looks at how to implement an API with Node, gRPC, and Postgres.

Learning Objectives

  1. Explain what gRPC is and how it compares to REST
  2. Define the service definition and payload message structure using a Protocol Buffer
  3. Describe the four types of methods–unary, client streaming, server streaming, and bidirectional streaming
  4. Create a gRPC server and client in Node
  5. Invoke the gRPC server from the client via the gRPC stubs
  6. Describe the static and dynamic approaches to Protocol Buffers in Node

gRPC

As mentioned, gRPC is an RPC framework that leverages HTTP/2 and Protocol Buffers, making it a fast, secure, and reliable communication protocol for microservices.

New to gRPC? Start with the official What is gRPC? guide.

So, how does gRPC compare to REST?

We’ll use the most common implementation of REST for this comparison: An API that uses the HTTP verbs–GET, POST, PUT, DELETE–to accept and return JSON.

  1. Speed: HTTP/2 supports bidirectional streams (and flow control), so once a connection is established the client or server can push and pull data. Check out this demo of HTTP/1.1 vs. HTTP/2 for a performance comparison. Also, you can read all about the features in HTTP/2 here.
  2. Messages (instead of resources and verbs): With gRPC you are no longer constrained to a limited set of methods. You can create your own methods, like getCustomerList or emailCustomerOrderReceipt, and invoke them directly. And, from a developer’s perspective, calling a gRPC method looks and feels just like calling any other native local method–it just makes a network call behind the scenes.
  3. Protocol Buffers: gRPC uses Protocol Buffers (strongly typed, binary-based) rather than JSON (loosely typed, text-based), which is much more efficient and introduces type safety. The service definitions themselves are defined declaratively and are used to generate client libraries. Further, Protocol Buffers, when compared to JSON, can decrease the size of the payloads being transmitted, which, at scale, can reduce bandwidth cost.
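To make the second point concrete, here's a minimal sketch of how a gRPC call reads like an ordinary local method call. The `client` object and `getCustomerList` method below are stand-ins (a real stub is generated from the service definition, which we'll build later in this tutorial):

```javascript
// Sketch: from the caller's perspective, a gRPC stub method reads like any
// other callback-style function. `client` is a stand-in for a real stub.
const client = {
  getCustomerList(request, callback) {
    // a real stub would serialize `request`, make the network call,
    // and deserialize the response before invoking the callback
    callback(null, [{ id: 1, name: 'Alice' }]);
  },
};

client.getCustomerList({}, (err, customers) => {
  if (!err) {
    console.log(`Fetched ${customers.length} customer(s)`);
  }
});
```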

For more on gRPC vs REST, review the REST vs. gRPC: Battle of the APIs blog post.

Project Setup

Create a new directory to hold the project:

$ mkdir node-grpc-crud
$ cd node-grpc-crud

If you don’t want to code along, you can find the final code on GitHub.

Then, add the following files and folders:

├── client
│   ├── app.js
│   └── package.json
├── protos
└── server
    ├── index.js
    └── package.json

Finally, spin up PostgreSQL on port 5432 and create a new database called grpc_products.

gRPC Service

Throughout this tutorial, you’ll be building a product API backed by gRPC that handles the basic CRUD functionality.

Let’s start by implementing a gRPC service, which is defined by the methods it exposes along with its associated parameters and return message types.

Add a new file to the “protos” folder called product.proto to hold the gRPC service definition and message type definitions:

syntax = "proto3";
package product;

// service definition

service ProductService {
  rpc listProducts(Empty) returns (ProductList) {}
  rpc readProduct(ProductId) returns (Product) {}
  rpc createProduct(newProduct) returns (result) {}
  rpc updateProduct(Product) returns (result) {}
  rpc deleteProduct(ProductId) returns (result) {}
}

// message type definitions

message Empty {}

message ProductList {
  repeated Product products = 1;
}

message ProductId {
  int32 id = 1;
}

message Product {
  int32 id = 1;
  string name = 2;
  string price = 3;
}

message newProduct {
  string name = 1;
  string price = 2;
}

message result {
  string status = 1;
}

Using VSCode? Check out the proto3 vscode extension.

First, we used the proto3 version of the Protocol Buffer language:

syntax = "proto3";

Then, we defined a package specifier:

package product;

The package specifier is optional, but it’s generally a good idea to use one to avoid naming clashes. Review the JavaScript Generated Code reference guide for more info.

Next, we defined five RPC methods that align to the basic CRUD operations:

CRUD    gRPC method
Create  createProduct
Read    listProducts
Read    readProduct
Update  updateProduct
Delete  deleteProduct

There are four types of methods in gRPC-land:

Type Description Example
Unary client sends a single request and gets a single response rpc getData(req) returns (rsp) {}
Server streaming client sends a single request and gets back a stream rpc getData(req) returns (stream rsp) {}
Client streaming client sends a stream and gets back a single response rpc getData(stream req) returns (rsp) {}
Bidirectional streaming both the client and server send and receive streams rpc getData(stream req) returns (stream rsp) {}

Our example uses the Unary approach.
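For comparison, here's what a hypothetical server-streaming variant of the first method could look like (illustrative only–this tutorial's service does not use it). Instead of returning a single ProductList message, the server would write each Product to the response stream individually:

```proto
// Hypothetical server-streaming variant (not part of this tutorial's service):
rpc listProductsStream(Empty) returns (stream Product) {}
```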

The first method, listProducts, takes an Empty message and returns a ProductList message, which has a repeated Product field called products:

message ProductList {
  repeated Product products = 1;
}

The repeated keyword is used to define an array of objects.
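With the dynamic loading approach used in this tutorial, a repeated field surfaces as a plain JavaScript array on the decoded message. For example, a decoded ProductList looks like this on the JavaScript side (sample data shown for illustration):

```javascript
// Sketch: a decoded ProductList message as a plain JavaScript object;
// the repeated `products` field is just an ordinary array.
const productList = {
  products: [
    { id: 1, name: 'pencil', price: '1.99' },
    { id: 2, name: 'pen', price: '2.99' },
  ],
};

console.log(Array.isArray(productList.products)); // true
```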

We also created an Empty message as the empty stub for empty requests and responses:

message Empty {}

The next method, readProduct, takes a ProductId and returns a Product:

message Product {
  int32 id = 1;
  string name = 2;
  string price = 3;
}

This has three scalar value fields, each with a unique numbered tag:

  1. id (int32)
  2. name (string)
  3. price (string)

We created a ProductId as well with a single int32 field:

message ProductId {
  int32 id = 1;
}

Next, createProduct takes a newProduct and returns a result, which is just a status message of “success” or “failure”. Review the newProduct message along with the updateProduct and deleteProduct methods on your own.

With the service defined, you can now generate interfaces in a number of different languages.

Server

Moving on, let’s create the gRPC server that will be used to serve up the remote procedure calls.

Setup

Start by updating the package.json file in the “server” folder:

{
  "name": "node-grpc-server",
  "dependencies": {
    "@grpc/proto-loader": "^0.4.0",
    "google-protobuf": "^3.6.1",
    "grpc": "^1.18.0",
    "knex": "^0.16.3",
    "pg": "^7.8.0"
  },
  "scripts": {
    "start": "node index.js"
  }
}

Take note of the dependencies:

  1. @grpc/proto-loader: loads .proto files
  2. google-protobuf: JavaScript version of the Protocol Buffers runtime library
  3. grpc: Node gRPC Library
  4. knex: SQL query builder for Node
  5. pg: PostgreSQL client for Node

Before installing, you will need to install protoc, the Protobuf Compiler. If you’re on a Mac, it’s easiest to install with Homebrew:

$ brew install protobuf

Otherwise, you can manually download and install it from here.

Now you can install the NPM dependencies:

$ cd server
$ npm install

Next, add a knexfile.js file to the “server” folder, which is used to configure knex for different environments–e.g., local, development, production:

const path = require('path');

module.exports = {
  development: {
    client: 'postgresql',
    connection: {
      host: '127.0.0.1',
      user: '',
      password: '',
      port: '5432',
      database: 'grpc_products',
    },
    pool: {
      min: 2,
      max: 10,
    },
    migrations: {
      directory: path.join(__dirname, 'db', 'migrations'),
    },
    seeds: {
      directory: path.join(__dirname, 'db', 'seeds'),
    },
  },
};

Add the appropriate username and password. Then, create a “db” directory along with “migrations” and “seeds” subdirectories. Your project structure should now look like:

├── client
│   ├── app.js
│   └── package.json
├── protos
│   └── product.proto
└── server
    ├── db
    │   ├── migrations
    │   └── seeds
    ├── index.js
    ├── knexfile.js
    ├── package-lock.json
    └── package.json
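As an aside, instead of hard-coding the database username and password in knexfile.js, you can read them from environment variables. A quick sketch (the variable names below are just one possible convention, not something this tutorial requires):

```javascript
// Sketch: pulling the Postgres connection settings from the environment,
// falling back to the values used in this tutorial. This object would
// replace the hard-coded `connection` block in knexfile.js.
const connection = {
  host: process.env.DB_HOST || '127.0.0.1',
  user: process.env.DB_USER || '',
  password: process.env.DB_PASSWORD || '',
  port: process.env.DB_PORT || '5432',
  database: process.env.DB_NAME || 'grpc_products',
};
```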

Create a new migration file:

$ ./node_modules/.bin/knex migrate:make products

Then, add the following code to the migration stub in “server/db/migrations”:

exports.up = function (knex, Promise) {
  return knex.schema.createTable('products', function (table) {
    table.increments();
    table.string('name').notNullable();
    table.string('price').notNullable();
  });
};

exports.down = function (knex, Promise) {
  return knex.schema.dropTable('products');
};

Apply the migration:

$ ./node_modules/.bin/knex migrate:latest

Create a new seed file:

$ ./node_modules/.bin/knex seed:make products

Add the code:

exports.seed = function (knex, Promise) {
  // Deletes ALL existing entries
  return knex('products').del()
    .then(function () {
      // Inserts seed entries
      return knex('products').insert([
        { name: 'pencil', price: '1.99' },
        { name: 'pen', price: '2.99' },
      ]);
    });
};

Apply the seed:

$ ./node_modules/.bin/knex seed:run

Finally, wire up the basic server in index.js:

// requirements
const path = require('path');
const protoLoader = require('@grpc/proto-loader');
const grpc = require('grpc');

// knex
const environment = process.env.ENVIRONMENT || 'development';
const config = require('./knexfile.js')[environment];
const knex = require('knex')(config);

// grpc service definition
const productProtoPath = path.join(__dirname, '..', 'protos', 'product.proto');
const productProtoDefinition = protoLoader.loadSync(productProtoPath);
const productPackageDefinition = grpc.loadPackageDefinition(productProtoDefinition).product;
/*
Using an older version of gRPC?
(1) You won't need the @grpc/proto-loader package
(2) const productPackageDefinition = grpc.load(productProtoPath).product;
*/

// knex queries
function listProducts(call, callback) {}
function readProduct(call, callback) {}
function createProduct(call, callback) {}
function updateProduct(call, callback) {}
function deleteProduct(call, callback) {}

// main
function main() {
  const server = new grpc.Server();
  // gRPC service
  server.addService(productPackageDefinition.ProductService.service, {
    listProducts: listProducts,
    readProduct: readProduct,
    createProduct: createProduct,
    updateProduct: updateProduct,
    deleteProduct: deleteProduct,
  });
  // gRPC server
  server.bind('localhost:50051', grpc.ServerCredentials.createInsecure());
  server.start();
  console.log('gRPC server running at http://127.0.0.1:50051');
}

main();

Here, we load the gRPC service definition, create a new gRPC server, add the service to the server, and then run the server on http://localhost:50051.

Take note of the addService method:

server.addService(productPackageDefinition.ProductService.service, {
  listProducts: listProducts,
  readProduct: readProduct,
  createProduct: createProduct,
  updateProduct: updateProduct,
  deleteProduct: deleteProduct,
});

This method adds the gRPC service, defined in product.proto, to the server. It takes the service definition–e.g., ProductService–along with an object that maps the gRPC service’s method names to the implementations defined above.
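Before filling in the knex queries, it helps to understand the shape of a unary handler: gRPC passes the decoded request message as call.request, plus a callback(error, response) for sending the reply. Here's a self-contained sketch (no database involved; the stub logic and sample data are illustrative):

```javascript
// Sketch of the unary handler contract: `call.request` is the decoded
// request message; `callback(err, response)` sends the reply (or an error).
function readProductStub(call, callback) {
  if (call.request.id === 1) {
    callback(null, { id: 1, name: 'pencil', price: '1.99' });
  } else {
    callback(new Error('That product does not exist'));
  }
}

// Simulate what the gRPC server does when a request arrives:
readProductStub({ request: { id: 1 } }, (err, product) => {
  if (!err) {
    console.log(product.name); // pencil
  }
});
```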

Dynamic vs Static Code Generation

There are two ways of working with Protocol Buffers in Node:

  1. Dynamically: with dynamic code generation, the Protocol Buffer is loaded and parsed at run time with Protobuf.js
  2. Statically: with the static approach, the Protocol Buffer is pre-processed into JavaScript

We used the dynamic approach above. It’s quite a bit simpler to implement, but it differs from the workflow of other gRPC-supported languages, which require static code generation.
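Note that protoLoader.loadSync also accepts an options object as its second argument, which controls how the .proto definitions map to JavaScript. We relied on the defaults above, but a commonly used configuration looks like this:

```javascript
// Sketch: options commonly passed as the second argument to
// protoLoader.loadSync to control the .proto-to-JavaScript mapping.
const loaderOptions = {
  keepCase: true,   // keep snake_case field names instead of converting to camelCase
  longs: String,    // represent 64-bit integer fields as strings
  enums: String,    // represent enum values as their string names
  defaults: true,   // set default values on missing fields
  oneofs: true,     // expose virtual oneof properties
};

// const productProtoDefinition = protoLoader.loadSync(productProtoPath, loaderOptions);
```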

Want to use the static approach?

First, install grpc-tools globally:

$ npm install -g grpc-tools

Then, run the following from the project root:

$ protoc -I=. ./protos/product.proto \
  --js_out=import_style=commonjs,binary:./server \
  --grpc_out=./server \
  --plugin=protoc-gen-grpc=`which grpc_tools_node_protoc_plugin`

The generated code should now be in the “server/protos” directory:

  1. product_grpc_pb.js
  2. product_pb.js

You’ll then import the generated code into your index.js file and use them directly rather than loading the service definition.

Sanity Check

Try running the server at this point:

$ npm start

You should see:

gRPC server running at http://127.0.0.1:50051

Let’s wire up the gRPC client before adding the knex query functions.

├── client
│   ├── app.js
│   └── package.json
├── protos
│   └── product.proto
└── server
    ├── db
    │   ├── migrations
    │   │   └── 20190131084532_products.js
    │   └── seeds
    │       └── products.js
    ├── index.js
    ├── knexfile.js
    ├── package-lock.json
    └── package.json

Client

Like before, update the package.json file in the “client” directory:

{
  "name": "node-grpc-client",
  "dependencies": {
    "@grpc/proto-loader": "^0.4.0",
    "body-parser": "^1.18.3",
    "express": "^4.16.4",
    "google-protobuf": "^3.6.1",
    "grpc": "^1.18.0"
  },
  "scripts": {
    "start": "node app.js"
  }
}

From the “client” directory, install the dependencies:

$ npm install

Update app.js:

// requirements
const express = require('express');
const bodyParser = require('body-parser');
const productRoutes = require('./routes/productRoutes');

// express
const app = express();
app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());

// routes
app.use('/api', productRoutes);

// run server
app.listen(3000, () => {
  console.log('Server listening on port 3000');
});

Here, we instantiate a new Express app, define our routes, and then run the server. Next, add a “routes” folder along with the Express router in a productRoutes.js file:

// requirements
const express = require('express');
const grpcRoutes = require('./grpcRoutes');

// new router
const router = express.Router();

// routes
router.get('/products', grpcRoutes.listProducts);
router.get('/products/:id', grpcRoutes.readProduct);
router.post('/products', grpcRoutes.createProduct);
router.put('/products/:id', grpcRoutes.updateProduct);
router.delete('/products/:id', grpcRoutes.deleteProduct);

module.exports = router;

Finally, add the boilerplate for the gRPC client in client/routes/grpcRoutes.js:

// requirements
const path = require('path');
const protoLoader = require('@grpc/proto-loader');
const grpc = require('grpc');

// gRPC client
const productProtoPath = path.join(__dirname, '..', '..', 'protos', 'product.proto');
const productProtoDefinition = protoLoader.loadSync(productProtoPath);
const productPackageDefinition = grpc.loadPackageDefinition(productProtoDefinition).product;
const client = new productPackageDefinition.ProductService(
  'localhost:50051', grpc.credentials.createInsecure());
/*
Using an older version of gRPC?
(1) You won't need the @grpc/proto-loader package
(2) const productPackageDefinition = grpc.load(productProtoPath).product;
(3) const client = new productPackageDefinition.ProductService(
  'localhost:50051', grpc.credentials.createInsecure());
*/

// handlers
const listProducts = (req, res) => {};
const readProduct = (req, res) => {};
const createProduct = (req, res) => {};
const updateProduct = (req, res) => {};
const deleteProduct = (req, res) => {};

module.exports = {
  listProducts,
  readProduct,
  createProduct,
  updateProduct,
  deleteProduct,
};

Test it out:

$ npm start

You should see:

Server listening on port 3000

Kill the server before moving on.

├── client
│   ├── app.js
│   ├── package-lock.json
│   ├── package.json
│   └── routes
│       ├── grpcRoutes.js
│       └── productRoutes.js
├── protos
│   └── product.proto
└── server
    ├── db
    │   ├── migrations
    │   │   └── 20190131084532_products.js
    │   └── seeds
    │       └── products.js
    ├── index.js
    ├── knexfile.js
    ├── package-lock.json
    └── package.json

With that, we’re now ready to tie everything together.

CRUD

We’ll use the following workflow for wiring up the gRPC stubs:

  1. Add the appropriate knex query to the server (server/index.js)
  2. Add the associated handler to the client (client/routes/grpcRoutes.js)
  3. Test it via cURL

listProducts

Server:

function listProducts(call, callback) {
  /*
  Using 'grpc.load'? Send back an array: 'callback(null, { data });'
  */
  knex('products')
    .then((data) => { callback(null, { products: data }); });
}

Client:

const listProducts = (req, res) => {
  /*
  gRPC method for reference:
  listProducts(Empty) returns (ProductList)
  */
  client.listProducts({}, (err, result) => {
    res.json(result);
  });
};

To test, run the client in one terminal window and the server in another. Then, run the following command:

$ curl http://127.0.0.1:3000/api/products

You should see the products:

{
  "products": [
    {
      "id": 1,
      "name": "pencil",
      "price": "1.99"
    },
    {
      "id": 2,
      "name": "pen",
      "price": "2.99"
    }
  ]
}

readProduct

Server:

function readProduct(call, callback) {
  knex('products')
    .where({ id: parseInt(call.request.id) })
    .then((data) => {
      if (data.length) {
        callback(null, data[0]);
      } else {
        callback('That product does not exist');
      }
    });
}

Client:

const readProduct = (req, res) => {
  const payload = { id: parseInt(req.params.id) };
  /*
  gRPC method for reference:
  readProduct(ProductId) returns (Product)
  */
  client.readProduct(payload, (err, result) => {
    if (err) {
      res.json('That product does not exist.');
    } else {
      res.json(result);
    }
  });
};

Test success:

$ curl http://127.0.0.1:3000/api/products/1

{
  "id": 1,
  "name": "pencil",
  "price": "1.99"
}

Test failure:

$ curl http://127.0.0.1:3000/api/products/99999

"That product does not exist."

createProduct

Server:

function createProduct(call, callback) {
  knex('products')
    .insert({
      name: call.request.name,
      price: call.request.price,
    })
    .then(() => { callback(null, { status: 'success' }); });
}

Client:

const createProduct = (req, res) => {
  const payload = { name: req.body.name, price: req.body.price };
  /*
  gRPC method for reference:
  createProduct(newProduct) returns (result)
  */
  client.createProduct(payload, (err, result) => {
    res.json(result);
  });
};

Test:

$ curl -X POST -d '{"name":"lamp","price":"29.99"}' \
    -H "Content-Type: application/json" http://127.0.0.1:3000/api/products
{
  "status": "success"
}

$ curl http://127.0.0.1:3000/api/products
{
  "products": [
    {
      "id": 1,
      "name": "pencil",
      "price": "1.99"
    },
    {
      "id": 2,
      "name": "pen",
      "price": "2.99"
    },
    {
      "id": 3,
      "name": "lamp",
      "price": "29.99"
    }
  ]
}

updateProduct

Server:

function updateProduct(call, callback) {
  knex('products')
    .where({ id: parseInt(call.request.id) })
    .update({
      name: call.request.name,
      price: call.request.price,
    })
    .then((data) => {
      // knex resolves an update to the number of affected rows
      if (data) {
        callback(null, { status: 'success' });
      } else {
        callback('That product does not exist');
      }
    });
}

Client:

const updateProduct = (req, res) => {
  const payload = { id: parseInt(req.params.id), name: req.body.name, price: req.body.price };
  /*
  gRPC method for reference:
  updateProduct(Product) returns (result)
  */
  client.updateProduct(payload, (err, result) => {
    if (err) {
      res.json('That product does not exist.');
    } else {
      res.json(result);
    }
  });
};

Test:

$ curl -X PUT -d '{"name":"lamp","price":"49.99"}' \
    -H "Content-Type: application/json" http://127.0.0.1:3000/api/products/3

{
  "status": "success"
}


$ curl http://127.0.0.1:3000/api/products/3
{
  "id": 3,
  "name": "lamp",
  "price": "49.99"
}

deleteProduct

Server:

function deleteProduct(call, callback) {
  knex('products')
    .where({ id: parseInt(call.request.id) })
    .del()
    .then((data) => {
      // knex resolves a delete to the number of affected rows
      if (data) {
        callback(null, { status: 'success' });
      } else {
        callback('That product does not exist');
      }
    });
}

Client:

const deleteProduct = (req, res) => {
  const payload = { id: parseInt(req.params.id) };
  /*
  gRPC method for reference:
  deleteProduct(ProductId) returns (result)
  */
  client.deleteProduct(payload, (err, result) => {
    if (err) {
      res.json('That product does not exist.');
    } else {
      res.json(result);
    }
  });
};

Test:

$ curl -X DELETE http://127.0.0.1:3000/api/products/3

{
  "status": "success"
}


$ curl http://127.0.0.1:3000/api/products

{
  "products": [
    {
      "id": 1,
      "name": "pencil",
      "price": "1.99"
    },
    {
      "id": 2,
      "name": "pen",
      "price": "2.99"
    }
  ]
}

What if the product doesn’t exist?

$ curl -X DELETE http://127.0.0.1:3000/api/products/9999

"That product does not exist."

Conclusion

gRPC provides a declarative, strongly typed mechanism for defining an API.

Workflow:

  1. Define the API (service definition and message structure) using a Protocol Buffer
  2. Implement the server using the generated interface
  3. Add the message stubs to the client

Resources:

  1. Final code
  2. What is gRPC? guide
  3. Building scalable microservices with gRPC
  4. gRPC for Web Clients