Protobufs are amazing. Also, great explanation of how the binary format is better than textual. I remember reading that HTTP/2 also went with binary for the same reason.
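To get a feel for the size gap between textual and binary encodings, here's a stdlib-only sketch. It packs the same integers as JSON text and as raw fixed-width binary; this is not Protobuf's actual varint wire format, just an illustration of why binary encodings come out smaller.

```python
import json
import struct

# The same ten integers, encoded textually (JSON) and as raw
# fixed-width binary. Protobuf's varint encoding differs, but
# the text-vs-binary size gap is the same idea.
values = [1234567, 2345678, 3456789, 4567890, 5678901,
          6789012, 7890123, 8901234, 9012345, 123456]

text_bytes = json.dumps(values).encode("utf-8")
binary_bytes = struct.pack(f"<{len(values)}i", *values)

print(len(text_bytes))    # textual: one byte per digit plus separators
print(len(binary_bytes))  # binary: 4 bytes per int = 40 bytes
```

Real Protobuf does even better than fixed-width ints for small values, since varints use fewer bytes for smaller numbers.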
I've been using Protobuf for 5 years already, and I love the way it works.
The main reason for me to use Protobuf is when my services have to handle a large volume of data. For example, 3 years ago, we had to push messages over 10MB via REST. 10MB of JSON is a lot, and using Protobuf reduced the payload size to a few kilobytes.
Recently, I've been using Protobuf when publishing data to Kafka. One issue I run into there is that, if you use a UI tool to "see" the content of the messages, JSON is perfectly fine, but Protobuf is especially challenging because, indeed, you are looking at binary data!
Simply put, Fernando!
You nailed the versioning explanation; keeping backward compatibility is a topic often forgotten.
Thanks for the shoutout.
Nicely explained Fernando.
Also, thanks for the mention.
I used protobuf on a single project only. Usually on the systems I maintain this isn't a bottleneck. Great summary Franco!
Nice article Fernando!