
JDTLS Rust Proxy#214

Open
tartarughina wants to merge 11 commits into zed-extensions:main from tartarughina:rust-proxy

Conversation

@tartarughina (Collaborator)

This PR introduces a new proxy written entirely in Rust.

The proxy sits between Zed and JDTLS, forwarding LSP messages bidirectionally, sorting completion responses by parameter count, and exposing an HTTP server for extension-originated requests.

Motivation

  • Eliminates the Node.js runtime dependency
  • 771 KB static binary
  • Faster cold start, since there is no V8 JIT warmup
  • Lower memory footprint, since there is no garbage-collector overhead
  • ~2.5x faster average message processing (benchmarked on Apple Silicon)
  • Cross-platform: builds for macOS, Linux, and Windows (x86_64 + aarch64)

Other changes included

  • Add justfile with dev commands (proxy-build, proxy-install, ext-build, fmt, clippy, all)
  • Add CI/CD workflow that bundles the proxy with each release

tartarughina and others added 8 commits March 8, 2026 20:32
Implements a high-performance LSP proxy in Rust to replace the Node.js
version, eliminating the 50MB runtime dependency. The proxy forwards
LSP messages between Zed and JDTLS while sorting completion items by
parameter count.

Key improvements:
- 2.5x faster message processing (13µs vs 33µs average)
- 771KB static binary vs 50MB Node.js runtime
- Cross-platform CI builds for all supported architectures
- HTTP server for extension requests with 5s timeout
- Parent process monitoring to prevent orphaned JDTLS instances
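The parent-monitoring idea from the commit above can be sketched portably. The PR moves the real implementations into per-platform modules (the actual checks are not shown; on Unix one common approach is polling whether `getppid()` has become 1), so this sketch injects the liveness check as a closure, and the function name is hypothetical:

```rust
use std::thread;
use std::time::Duration;

/// Spawn a watchdog thread that polls a parent-liveness check and runs
/// `on_orphaned` once the parent is gone (e.g. to shut down JDTLS and exit,
/// preventing orphaned instances). The OS-specific check is injected so the
/// same loop works on Unix and Windows.
fn spawn_parent_monitor<F, G>(parent_alive: F, on_orphaned: G) -> thread::JoinHandle<()>
where
    F: Fn() -> bool + Send + 'static,
    G: FnOnce() + Send + 'static,
{
    thread::spawn(move || {
        while parent_alive() {
            thread::sleep(Duration::from_millis(50));
        }
        on_orphaned();
    })
}
```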
Move Unix and Windows parent monitoring implementations into
separate `platform` modules
@tartarughina (Collaborator, Author)

tartarughina commented Mar 14, 2026

Some additional info about the new proxy.

  • I've been running it for a week on macOS and on a remote Linux machine without issues.
  • Windows has been tested as well, although feedback for it was minimal; this is a potential blocker.

A comparison between Node and Rust powered by Opus 4.6

LSP Proxy Benchmark: Node.js vs Rust

Overview

This report compares the performance of the existing Node.js LSP proxy (proxy.mjs) against the new native Rust replacement (java-lsp-proxy). Both proxies sit between Zed and JDTLS, forwarding LSP messages bidirectionally, sorting completion responses by parameter count, and exposing an HTTP server for extension-originated requests.

Methodology

  • Both proxies were instrumented with high-resolution timing (nanosecond on JS via hrtime.bigint(), microsecond on Rust via std::time::Instant)
  • Benchmarking was gated behind LSP_PROXY_BENCH=1 — zero overhead when disabled
  • Each message records: direction, LSP method, payload size, and proxy processing overhead in microseconds
  • Overhead measures only the proxy's own processing time (parse → transform → forward), excluding JDTLS response latency
  • Tests were run on the same machine (macOS, Apple Silicon) with the same Zed configuration and JDTLS version, performing typical editing workflows: navigation, completions, saves, diagnostics
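The `LSP_PROXY_BENCH=1` gating described above might look roughly like this on the Rust side; the struct and field names are illustrative, not the PR's actual instrumentation:

```rust
use std::time::Instant;

/// Per-message overhead recorder, active only when LSP_PROXY_BENCH=1.
/// When disabled, `measure` calls the closure directly with no timing,
/// matching the "zero overhead when disabled" claim above.
struct Bench {
    enabled: bool,
    samples_us: Vec<u128>,
}

impl Bench {
    fn new() -> Self {
        Bench {
            enabled: matches!(std::env::var("LSP_PROXY_BENCH").as_deref(), Ok("1")),
            samples_us: Vec::new(),
        }
    }

    /// Time one unit of proxy work (parse -> transform -> forward) in
    /// microseconds, recording the sample only when benchmarking is on.
    fn measure<T>(&mut self, f: impl FnOnce() -> T) -> T {
        if !self.enabled {
            return f(); // fast path: no Instant::now() at all
        }
        let start = Instant::now();
        let out = f();
        self.samples_us.push(start.elapsed().as_micros());
        out
    }
}
```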

Test Environment

|            | Details                        |
|------------|--------------------------------|
| Machine    | macOS, Apple Silicon (aarch64) |
| JDTLS      | 1.57.0-202602261110            |
| Node.js    | v24.11.0 (Zed-bundled)         |
| Rust proxy | Release build, 771 KB binary   |
| Zed        | Dev extension                  |

Results

Node.js Proxy (3,700 messages)

| Direction                    | Count | Min (µs) | Median (µs) | P95 (µs) | P99 (µs) | Max (µs) | Avg (µs) |
|------------------------------|-------|----------|-------------|----------|----------|----------|----------|
| client → server              | 1,399 | 3        | 16          | 81       | 147      | 429      | 28       |
| server → client              | 2,011 | 4        | 24          | 74       | 121      | 4,501    | 32       |
| server → client (completion) | 290   | 13       | 50          | 179      | 272      | 458      | 71       |
| Total                        | 3,700 |          |             |          |          |          | 33       |

Total overhead: 124,796 µs (~125 ms)

Rust Proxy (5,277 messages)

| Direction                    | Count | Min (µs) | Median (µs) | P95 (µs) | P99 (µs) | Max (µs) | Avg (µs) |
|------------------------------|-------|----------|-------------|----------|----------|----------|----------|
| client → server              | 2,093 | 0        | 7           | 32       | 58       | 269      | 10       |
| server → client              | 2,666 | 1        | 8           | 32       | 63       | 1,185    | 12       |
| server → client (completion) | 523   | 4        | 17          | 116      | 143      | 253      | 29       |
| Total                        | 5,277 |          |             |          |          |          | 13       |

Total overhead: 72,026 µs (~72 ms)

Head-to-Head Comparison (Median)

| Direction                         | Node.js | Rust  | Speedup |
|-----------------------------------|---------|-------|---------|
| client → server (passthrough)     | 16 µs   | 7 µs  | 2.3x    |
| server → client (passthrough)     | 24 µs   | 8 µs  | 3.0x    |
| server → client (completion sort) | 50 µs   | 17 µs | 2.9x    |
| Overall average                   | 33 µs   | 13 µs | 2.5x    |

Tail Latency Comparison (P99)

| Direction                         | Node.js | Rust   | Improvement |
|-----------------------------------|---------|--------|-------------|
| client → server                   | 147 µs  | 58 µs  | 2.5x        |
| server → client                   | 121 µs  | 63 µs  | 1.9x        |
| server → client (completion sort) | 272 µs  | 143 µs | 1.9x        |

Analysis

  • The Rust proxy is 2.5x faster on average across all message types
  • The completion sorting path — which involves full JSON parse, field mutation, and re-serialization — shows a 2.9x improvement at the median (17 µs vs 50 µs)
  • Tail latency (P99) is ~2x tighter in Rust, meaning more predictable performance
  • Both proxies add negligible latency compared to JDTLS response times (typically 10-500 ms), so the user-perceived difference is minimal

Personal touch

  • ONE IS WRITTEN IN RUST (therefore better)

@playdohface (Collaborator) left a comment
This all looks very reasonable to me, and works without issues on my system, including the debugger 🚀
Things I had on my mind while reviewing this:

  • it might be good to include some way to log from inside the proxy for debugging (I think sending a "window/logMessage" notification is going to make it show up in Zed's Server Logs). eprintln! output will only be shown if we fail to start, as far as I can tell
  • proxy::main could likely benefit from splitting, but for now it's perhaps better as one file, to stay in line with the old one.

Comment on lines +233 to +241
fn encode_lsp(value: &Value) -> String {
    let json = serde_json::to_string(value).unwrap();
    format!("{CONTENT_LENGTH}: {}\r\n\r\n{json}", json.len())
}

fn encode_lsp_serializable(value: &impl Serialize) -> String {
    let json = serde_json::to_string(value).unwrap();
    format!("{CONTENT_LENGTH}: {}\r\n\r\n{json}", json.len())
}
this could just be encode_lsp(value: &impl Serialize) -> String
