

LinkedIn Adopts Protocol Buffers for Microservices Integration and Reduces Latency by up to 60%


Jul 19, 2023 · 2 min read


LinkedIn adopted Protocol Buffers to exchange data between microservices more efficiently across its platform and integrated it with Rest.li, the company's open-source REST framework. After the company-wide rollout, latency dropped by up to 60% while resource utilization improved.

The LinkedIn platform employs a microservices architecture, and JSON has long served as the serialization format for the more than 50,000 API endpoints exposed by microservices at LinkedIn. To help teams build consistent interactions between services, the company created a Java framework called Rest.li, which it later open-sourced.

The framework helps create servers and clients that communicate in the REST style and abstracts away many aspects of data exchange, including networking, serialization, and service discovery. It primarily supports Java and Python but also works with Scala, Kotlin, JavaScript, Go, and other languages.

Data and Control Flow Between a Rest.li Server and Client (Source: Rest.li Documentation)
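
To give a sense of what the framework abstracts away, here is a minimal client sketch following the patterns in the Rest.li documentation. The Greeting entity and GreetingsBuilders request builder are code-generated from a resource definition and stand in here as assumed examples, not LinkedIn's actual code:

```java
import java.util.Collections;

import com.linkedin.r2.transport.common.Client;
import com.linkedin.r2.transport.common.bridge.client.TransportClientAdapter;
import com.linkedin.r2.transport.http.client.HttpClientFactory;
import com.linkedin.restli.client.GetRequest;
import com.linkedin.restli.client.Response;
import com.linkedin.restli.client.ResponseFuture;
import com.linkedin.restli.client.RestClient;

// Assumes a Rest.li "greetings" resource with generated
// Greeting and GreetingsBuilders classes (hypothetical here).
public class GreetingsClientSketch {
    public static void main(String[] args) throws Exception {
        // The RestClient wraps an R2 transport client.
        HttpClientFactory http = new HttpClientFactory();
        Client r2Client = new TransportClientAdapter(
                http.getClient(Collections.<String, String>emptyMap()));
        RestClient restClient = new RestClient(r2Client, "http://localhost:8080/");

        // Type-safe request builder generated from the resource schema.
        GetRequest<Greeting> request = new GreetingsBuilders().get().id(1L).build();

        // Networking, serialization, and service discovery
        // all happen behind this call.
        ResponseFuture<Greeting> future = restClient.sendRequest(request);
        Response<Greeting> response = future.getResponse();
        System.out.println(response.getEntity().getMessage());
    }
}
```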

JSON is the default serialization format in Rest.li, selected for its wide language support and human readability. The latter property, however beneficial, creates problems from a performance, and particularly latency, standpoint.
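
To make the verbosity concrete, a small self-contained sketch using protobuf-java's CodedOutputStream compares the wire size of the same record in both formats. The record and its field numbers are invented for illustration:

```java
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;

import com.google.protobuf.CodedOutputStream;

// Illustrative comparison only: the record and field numbers are
// made up; real Rest.li payloads are defined by PDL schemas.
public class PayloadSizeSketch {
    public static void main(String[] args) throws Exception {
        // JSON repeats field names and spells numbers out as text
        // on every message.
        String json = "{\"memberId\":123456789,\"displayName\":\"Ada\"}";
        int jsonBytes = json.getBytes(StandardCharsets.UTF_8).length;

        // Protobuf replaces field names with small integer tags and
        // encodes numbers as compact varints.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        CodedOutputStream out = CodedOutputStream.newInstance(buf);
        out.writeInt64(1, 123456789L);  // field 1: memberId
        out.writeString(2, "Ada");      // field 2: displayName
        out.flush();
        int protoBytes = buf.size();

        System.out.printf("JSON: %d bytes, Protobuf: %d bytes%n",
                jsonBytes, protoBytes);
    }
}
```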

Karthik Ramgopal and Aman Gupta, engineers at LinkedIn, describe the challenges of using JSON for inter-service communication:

The first challenge is that JSON is a textual format, which tends to be verbose. This results in increased network bandwidth usage and higher latencies, which is less than ideal. [...] The second challenge we faced was that due to the textual nature of JSON, serialization and deserialization latency and throughput were suboptimal.

The team considered alternatives to JSON, looking for a compact payload size and high serialization efficiency to reduce latency and increase throughput. They also wanted to keep supporting a broad range of language stacks and to enable gradual migration by integrating the new serialization mechanism into Rest.li. After a comprehensive review, they settled on Protocol Buffers (Protobuf), which scored highest against the defined criteria.

The main difficulty in integrating Protocol Buffers into Rest.li was generating schemas dynamically from the framework's custom schema definition system, PDL. The solution involved generating a symbol table used to produce the Protobuf schema definition dynamically, but the method of delivering symbol tables varied by client type: backend clients fetch and cache symbol tables on demand, while for web and mobile apps, symbol tables are generated at build time and included as versioned dependencies.
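
The article does not show the mechanism, but the core idea can be sketched as a table that assigns stable integer IDs to PDL field names, letting the encoder emit compact Protobuf tags instead of field-name strings. All names below are hypothetical:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of a symbol table: PDL field names receive
// stable integer IDs that can serve as Protobuf field numbers.
public class SymbolTableSketch {
    private final Map<String, Integer> nameToId = new LinkedHashMap<>();
    private int nextId = 1;

    // Register field names in a deterministic order so every
    // client/server pair derives the same numbering.
    public int intern(String fieldName) {
        return nameToId.computeIfAbsent(fieldName, n -> nextId++);
    }

    public static void main(String[] args) {
        SymbolTableSketch table = new SymbolTableSketch();
        // Backend clients would fetch and cache such a table on demand;
        // web/mobile apps would bundle it as a versioned build artifact.
        System.out.println(table.intern("memberId"));     // 1
        System.out.println(table.intern("displayName"));  // 2
        System.out.println(table.intern("memberId"));     // 1 (stable)
    }
}
```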

After the framework changes were rolled out, the team gradually reconfigured clients to enable Protobuf instead of JSON using HTTP headers. Adopting Protocol Buffers increased average throughput by 6.25% for responses and 1.77% for requests, and the team also observed latency reductions of up to 60% for large payloads.

Latency comparison between JSON and Protobuf (Source: LinkedIn Integrates Protocol Buffers With Rest.li for Improved Microservices Performance)
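
The article does not name the specific header or media type, but per-request opt-in can be illustrated with standard HTTP content negotiation. The endpoint URL and the application/x-protobuf2 media type below are assumptions, not confirmed values from LinkedIn:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch of per-client opt-in via HTTP content negotiation.
// The URL and media type are illustrative assumptions.
public class ProtobufOptInSketch {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest
                .newBuilder(URI.create("http://localhost:8080/greetings/1"))
                // Ask the server for a Protobuf-encoded response instead of JSON.
                .header("Accept", "application/x-protobuf2")
                .build();
        HttpResponse<byte[]> response =
                client.send(request, HttpResponse.BodyHandlers.ofByteArray());
        System.out.println("Received " + response.body().length + " bytes, " +
                response.headers().firstValue("Content-Type").orElse("unknown"));
    }
}
```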

Based on the lessons learned from the Protocol Buffers rollout, the team plans to follow up by migrating from Rest.li to gRPC, which also uses Protocol Buffers but additionally supports streaming and has a large community behind it.

See also the InfoQ Podcast: API Showdown: REST vs. GraphQL vs. gRPC – Which Should You Use?

About the Author

Rafal Gancarz

