Protobuf or JSON as a structured front-end communication protocol?


For a new product in our team we chose Vue as the frontend framework; the backend is written in PHP and has been running successfully for 17 years now.

When the code began to grow, I had to think about simplifying data exchange with the server, and that is what this article is about.

About the backend


The project is fairly large and the functionality is quite tangled, so the code, written following DDD, had its own data structures; they were complex and voluminous in order to remain universal across the project as a whole.

About the frontend


For the first four months of frontend development we used JSON as the server response format and mapped it into the Vuex state in a shape convenient for us. But to send data back to the server, we had to convert it in the opposite direction so that the server could read it and map it onto its DTO objects (it may seem strange, but that's how it has to be :)).

Problems


It seemed fine at first: we worked with what we had, but the state grew into large objects. They had to be broken up into ever smaller modules, each with its own state, mutations, and so on. The API kept changing as new tasks came in from managers, and it became harder and harder to manage all of this: one day something was mapped wrong, the next day the fields had changed...

That is when we started thinking about universal data structures shared by the server and the frontend, to eliminate errors in parsing, mapping, and so on.

After some searching, we came to two options:

  1. Protocol Buffers.
  2. Auto-generating JS DTOs on the server side for the frontend, with subsequent JSON processing into those DTOs.

After a trial run, we decided to use Protobuf from Google.

And that's why:

  1. There is already tooling that compiles the described structures for many platforms, including PHP and JS.
  2. There is a documentation generator for the created .proto structures.
  3. You can easily bolt on versioning for the structures.
  4. Searching for objects when refactoring becomes easier, in both PHP and JS.
  5. And there are other goodies like gRPC, etc., if needed.

Enough talk, let's see how it all looks


I won't describe how it looks on the PHP side; everything is pretty much the same there, and the objects are the same.

I will show an example of a simple JS client and a mini server on Node.js.

First, we describe the data structures that we need (see the proto3 docs).

product.proto

syntax = "proto3";

package api;

import "price.proto";

message Product {
    message Id {
        uint32 value = 1;
    }
    Id id = 1;
    string name = 2;
    string text = 3;
    string url = 4;
    Price price = 5;
}

price.proto

syntax = "proto3";
package api;

message Price {
    float value = 1;
    uint32 tax = 2;
}

service.proto

syntax = "proto3";

package api;

import "product.proto";

service ApiService {
    rpc getById (Product.Id) returns (Product);
}

A few words about the service and why it is described even though it is not actually used. In our case the service exists purely for documentation: it records what the endpoint accepts and what it returns, so that we know which objects to substitute. It is only really needed for gRPC.
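Purely for illustration: if we did switch to gRPC later, the same service block could be compiled into a client with the protoc-gen-grpc-web plugin. This is only a sketch under that assumption (the generated file and class names follow the plugin's CommonJS output, and a gRPC-Web-capable proxy such as Envoy would be needed in front of the backend):

// Hypothetical gRPC-Web client for the ApiService defined above.
// Assumes the protos were also compiled with --grpc-web_out.
import { ApiServiceClient } from '../proto/service_grpc_web_pb';
import { Product } from '../proto/product_pb';

const client = new ApiServiceClient('http://localhost:8080');
const request = new Product.Id().setValue(12345);

client.getById(request, {}, (err, product) => {
  if (err) return console.error(err.message);
  console.log(product.toObject());
});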

Next, we download the code generator for these structures.

And run the generation command for JS:

./protoc --proto_path=/Users/user/dev/habr_protobuf/public/proto --js_out=import_style=commonjs,binary:/Users/user/dev/habr_protobuf/src/proto/ /Users/user/dev/habr_protobuf/public/proto/*.proto

More details in the docs.

After generation, three JS files appear, in which everything is turned into objects with methods for serializing to and deserializing from a binary buffer.

price_pb.js
product_pb.js
service_pb.js

Next, we write the client JS code.

import { Product } from '../proto/product_pb';

// Build a Product.Id message and serialize it to a binary buffer
const instance = new Product.Id().setValue(12345);
let message = instance.serializeBinary();

let response = await fetch('http://localhost:3008/api/getById', {
    method: 'POST',
    body: message
});

let result = await response.arrayBuffer();

// Deserialize the response buffer back into a Product message;
// toObject() then gives us a plain JS object to work with.
const data = Product.deserializeBinary(result);
console.log(data.toObject());

In principle, the client is ready.
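And since the whole point was to stop hand-mapping server responses, the plain object produced by toObject() can go into the store as-is. A minimal sketch of a Vuex module (the module, mutation, and action names here are made up for illustration):

// Hypothetical Vuex module: the deserialized Product goes into state as-is,
// without a hand-written mapping layer in between.
import { Product } from '../proto/product_pb';

export default {
  state: () => ({ product: null }),
  mutations: {
    setProduct(state, product) {
      state.product = product; // plain object from toObject()
    }
  },
  actions: {
    async fetchProduct({ commit }, id) {
      const message = new Product.Id().setValue(id).serializeBinary();
      const response = await fetch('http://localhost:3008/api/getById', {
        method: 'POST',
        body: message
      });
      const buffer = await response.arrayBuffer();
      commit('setProduct', Product.deserializeBinary(buffer).toObject());
    }
  }
};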

We use Express on the server:

const express = require('express');
const cors = require('cors');

const app = express();
app.use(cors());

// Connect the generated objects on the server side as well.
const Product = require('./src/proto/product_pb').Product;
const Price = require('./src/proto/price_pb').Price;

// Middleware that collects the raw request body into a single Buffer.
app.use(function(req, res, next) {
  let data = [];
  req.on('data', function(chunk) {
    data.push(chunk);
  });
  req.on('end', function() {
    if (data.length === 0) return next();
    data = Buffer.concat(data);
    console.log('Received buffer', data);
    req.raw = data;
    next();
  })
});

app.post('/api/getById', function (req, res) {
  // Deserialize the incoming Product.Id and extract the requested id
  const prId = Product.Id.deserializeBinary(req.raw);
  const id = prId.toObject().value;

  // Fill in the "found" product with demo data for the requested id
  const product = new Product();
  product.setId(new Product.Id().setValue(id));
  product.setName('Sony PSP');
  product.setUrl('http://mysite.ru/product/psp/');

  const price = new Price();
  price.setValue(35500.00);
  price.setTax(20);

  product.setPrice(price);

  // Serialize the Product and send it back in the response
  res.send(Buffer.from(product.serializeBinary()));
});

app.listen(3008, function () {
  console.log('Example app listening on port 3008!');
});

What do we have in total


  1. A single point of truth in the form of generated objects based on structures that are described once for many platforms.
  2. There is no confusion: there is clear documentation, both as auto-generated HTML and simply by reading the .proto files.
  3. Everywhere we work with specific entities, without ad-hoc modifications, etc. (and we all know how much the frontend loves to improvise :)).
  4. The protocol is also very convenient to exchange over WebSockets (see the sketch below).
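
As an illustration of that last point, here is a minimal sketch of pushing the same binary messages through a browser WebSocket (the ws://localhost:3008 endpoint is an assumption; the server would need a WebSocket handler as well):

// Minimal sketch: the same Protobuf messages over a WebSocket.
import { Product } from '../proto/product_pb';

const socket = new WebSocket('ws://localhost:3008');
socket.binaryType = 'arraybuffer';

socket.onopen = () => {
  // Same Product.Id message as in the fetch example
  socket.send(new Product.Id().setValue(12345).serializeBinary());
};

socket.onmessage = (event) => {
  const product = Product.deserializeBinary(new Uint8Array(event.data));
  console.log(product.toObject());
};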

There is, of course, a small downside: the speed of serialization and deserialization. Here is an example.

I took 10 paragraphs of lorem ipsum, which came to about 5.5 KB of data with the Price and Product objects filled in, and ran it through both Protobuf and JSON (the same data, just filled into plain JSON structures instead of Protobuf objects).
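
For reference, here is roughly how such a measurement can be taken; this is only a minimal sketch (the placeholder payload and single run here are assumptions, not the exact harness I used):

// Rough micro-benchmark sketch: Protobuf round-trip vs JSON round-trip
// for one filled Product (the real test used ~5.5 KB of lorem ipsum).
import { Product } from '../proto/product_pb';
import { Price } from '../proto/price_pb';

const price = new Price();
price.setValue(35500.00);
price.setTax(20);

const product = new Product();
product.setId(new Product.Id().setValue(1));
product.setName('Sony PSP');
product.setText('lorem ipsum ...'); // 10 paragraphs in the real test
product.setUrl('http://mysite.ru/product/psp/');
product.setPrice(price);

// Protobuf: serialize to a buffer and deserialize back
console.time('protobuf');
Product.deserializeBinary(product.serializeBinary());
console.timeEnd('protobuf');

// JSON: stringify and parse the equivalent plain object
const plain = product.toObject();
console.time('json');
JSON.parse(JSON.stringify(plain));
console.timeEnd('json');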

   

Protobuf parsing
  client: 2.804999ms, 1.8150000ms, 0.744999ms
  server: 1.993ms, 0.495ms, 0.412ms

JSON parsing
  client: 0.654999ms, 0.770000ms, 0.819999ms
  server: 0.441ms, 0.307ms, 0.242ms

Thank you all for your attention :)
