NordVPN Android
The JIT Tax
NordVPN's login flow was 60% slower than it had to be. Not a network problem. Not a server problem. The device was recompiling the app from scratch on every install.
The Incident
NordVPN measured their Android app against internal performance benchmarks and found cold start averaged 4.3 seconds, warm start 2.7 seconds, and the login flow — the most critical conversion path — was 60% slower than the team's target. The backend was fast. The network was fine. The problem was that every time a user installed or updated the app, the Android Runtime (ART) had to interpret the bytecode and JIT-compile hot paths from scratch before the app could run efficiently.
Evidence from the Scene
- Cold start: 4.3 seconds. Warm start: 2.7 seconds. Both above benchmark.
- The login flow consistently took 60% longer than the engineering target
- Backend response times were normal — the delay was entirely on-device
- Performance improved noticeably after a user had launched the app 5–6 times
- A new install on the same device was always the slowest session
- Android Profiler showed high JIT compilation activity during the first few launches
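Evidence like this is typically gathered with Jetpack Macrobenchmark, which measures startup on-device under a controlled compilation state. A minimal sketch of such a cold-start measurement — the package name is hypothetical, not NordVPN's actual applicationId, and this runs as an on-device instrumentation test, not on a plain JVM:

```kotlin
import androidx.benchmark.macro.CompilationMode
import androidx.benchmark.macro.StartupMode
import androidx.benchmark.macro.StartupTimingMetric
import androidx.benchmark.macro.junit4.MacrobenchmarkRule
import androidx.test.ext.junit.runners.AndroidJUnit4
import org.junit.Rule
import org.junit.Test
import org.junit.runner.RunWith

@RunWith(AndroidJUnit4::class)
class ColdStartupBenchmark {
    @get:Rule
    val benchmarkRule = MacrobenchmarkRule()

    // CompilationMode.None() approximates a fresh install with no profile:
    // everything starts interpreted and is JIT-compiled on the fly.
    @Test
    fun coldStartupNoProfile() = benchmarkRule.measureRepeated(
        packageName = "com.nordvpn.android",  // hypothetical package name
        metrics = listOf(StartupTimingMetric()),
        compilationMode = CompilationMode.None(),
        startupMode = StartupMode.COLD,
        iterations = 10,
    ) {
        pressHome()
        startActivityAndWait()  // launch the default activity, wait for first frame
    }
}
```

Swapping in `CompilationMode.Partial()` re-runs the same measurement with a Baseline Profile applied, which is how before/after numbers like the ones above are produced.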
The Suspects
One of these is the real root cause. The others are plausible-sounding distractors.
No Baseline Profiles — every install required full JIT compilation of hot code paths
R8 full mode disabled, leaving dead code in the binary and increasing JIT workload
Login form submitting credentials synchronously on the main thread
Android Keystore operations for VPN certificate verification adding login latency
VPN tunnel negotiation over HTTP/1.1 instead of HTTP/2 adding round-trip overhead
The Verdict
Real Root Causes
No Baseline Profiles — every install required full JIT compilation of hot code paths
Without Baseline Profiles, the Android runtime starts with interpreted bytecode and JIT-compiles hot paths on the fly. For the first several launches — before the device has accumulated enough profiling data — all critical startup and navigation code runs slower than it will after the device 'warms up'. A Baseline Profile ships a list of these hot classes and methods with the app, so ART can ahead-of-time compile them at install time, eliminating the JIT warmup cost for those paths.
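A Baseline Profile is produced by recording the app's critical user journeys with the Macrobenchmark library's `BaselineProfileRule`. A minimal sketch of such a generator — the package name is hypothetical and the login-flow navigation is elided; like any Macrobenchmark test, it runs on a device or emulator:

```kotlin
import androidx.benchmark.macro.junit4.BaselineProfileRule
import androidx.test.ext.junit.runners.AndroidJUnit4
import org.junit.Rule
import org.junit.Test
import org.junit.runner.RunWith

@RunWith(AndroidJUnit4::class)
class BaselineProfileGenerator {
    @get:Rule
    val baselineProfileRule = BaselineProfileRule()

    // Records which classes and methods execute during the journeys below
    // and emits a baseline-prof.txt to package with the release build.
    @Test
    fun generate() = baselineProfileRule.collect(
        packageName = "com.nordvpn.android",  // hypothetical package name
    ) {
        pressHome()
        startActivityAndWait()
        // Walk the critical journeys here (e.g. the login flow) so their
        // code paths are captured in the profile.
    }
}
```

Only the journeys exercised inside `collect { }` end up in the profile, which is why the login flow — the path this case turns on — must be driven explicitly.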
Plausible But Wrong
R8 full mode disabled, leaving dead code in the binary and increasing JIT workload
R8 would help reduce binary size and improve overall performance, but the specific pattern here — performance that worsens on new installs and improves after repeated launches — is the JIT warmup signature, not an R8 issue.
Login form submitting credentials synchronously on the main thread
A synchronous network call on the main thread would produce a consistent delay, not one that improves after repeated launches. The 'worse on first install' pattern is a dead giveaway for JIT compilation overhead.
Android Keystore operations for VPN certificate verification adding login latency
Keystore operations add a fixed latency to every login; they do not produce the 'improves after repeated launches' pattern, which is the JIT warmup signature.
VPN tunnel negotiation over HTTP/1.1 instead of HTTP/2 adding round-trip overhead
Protocol overhead would affect every launch equally. Performance that worsens on new installs and improves over repeated launches points squarely at JIT compilation behaviour.
Summary
NordVPN's performance problem was entirely caused by the absence of Baseline Profiles. Every new install began with interpreted bytecode — the Android runtime had no pre-compiled version of the startup and login critical paths. The fix: generate Baseline Profiles using Macrobenchmark with Gradle Managed Devices (no physical hardware required in CI) and ship them in every release build. The results: cold start 4.3s → 3.2s (26%), warm start 2.7s → 1.8s (33%), login flow 60% faster, overall in-app speed 29% faster. Published by Android Developers in November 2023.
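The 'no physical hardware required in CI' part comes from Gradle Managed Devices: the Android Gradle Plugin provisions a headless emulator on demand and runs the profile generator on it. A hedged sketch of the build configuration — the device name, API level, and module layout are assumptions, not NordVPN's actual setup:

```kotlin
// build.gradle.kts of the benchmark/baseline-profile module
import com.android.build.api.dsl.ManagedVirtualDevice

android {
    testOptions {
        managedDevices {
            devices {
                // Defines an emulator that Gradle creates and tears down itself
                create<ManagedVirtualDevice>("pixel6Api31") {
                    device = "Pixel 6"
                    apiLevel = 31
                    systemImageSource = "aosp"  // AOSP/ATD images run headlessly in CI
                }
            }
        }
    }
}
// Invoked via a generated task such as :benchmark:pixel6Api31<BuildType>AndroidTest
// (the exact task name depends on the module's build types).
```

Because the emulator is declared in the build itself, any CI runner that can execute Gradle can regenerate the profile on every release, which is what keeps shipped profiles from going stale.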
The Real Decision That Caused This
“Shipping an app without Baseline Profiles — leaving every user to pay the JIT compilation tax on every new install and update.”
Lesson Hint
Chapter 7 (Platform & Performance) covers Baseline Profiles, Macrobenchmark, and startup optimization. Chapter 3 (Jetpack Compose) covers the Compose-specific Baseline Profile tooling.
Want to test yourself before reading the verdict?
Open Interactive Case in Autopsy Lab