Game Server Tick Rate Explained: Gameplay Precision vs Infrastructure Cost

Key Insights

  • Tick Rate Is a Cost Decision: In competitive games, tick rate carries a significant CPU and bandwidth cost, making it one of the most consequential infrastructure choices in multiplayer development, while also having a major impact on players' experience of the game's "feel".

  • VALORANT's Commitment to 128 Hz: Riot rebuilt core engine systems from the ground up to bring server frame time from 50 ms down to roughly 2 ms, a years-long engineering investment most studios cannot easily replicate.

  • CS2's Subtick System: Rather than increasing tick rate, Valve introduced microsecond timestamping of player actions, addressing the quantization problem without multiplying infrastructure costs.

  • Marathon's Middle Ground: Bungie's extraction shooter demonstrates that the right tick rate depends on genre requirements and operational capacity, not simply on picking the highest number available.

  • No Universal Answer: Apex runs at 20 Hz, VALORANT at 128 Hz, CS2 at 64 Hz with Subtick, and Marathon at 60 Hz. Each reflects a different balance of precision requirements, player expectations, and what the studio can sustain.

Outscal’s excellent YouTube channel published a comparative breakdown of server tick rate across Counter-Strike, VALORANT, and PUBG.

That video motivated us to discuss how tick rate affects not just the player's online experience, but also the cost of hosting dedicated game servers at different points along the tick rate range. The impact can be substantial.

In this article, we'll revisit some of the examples Outscal highlighted, add a few new ones, and look at how each tick rate choice impacts costs. That raises a question every game developer has to answer: what is that precision worth?

What Is Tick Rate?

At its core, tick rate is the server's processing rhythm. Each tick is one complete cycle: the server collects every player's inputs since the last update, runs the physics simulation, checks hit detection, and broadcasts the updated world state back to all connected clients. That entire loop is one tick.

Tick rate, measured in hertz, is how many times per second that cycle repeats. A 64 Hz server runs it every 15.6 milliseconds. A 128 Hz server runs it every 7.8 milliseconds. The math is simple: 1,000 ms divided by tick rate gives you the processing window per cycle.
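To make that concrete, here is a minimal sketch of the same arithmetic in Python (the function name is ours, for illustration):

```python
# Tick interval math: 1,000 ms divided by the tick rate.
def tick_interval_ms(tick_rate_hz: float) -> float:
    """Processing window the server has per simulation cycle."""
    return 1000.0 / tick_rate_hz

for hz in (20, 60, 64, 128):
    print(f"{hz:>3} Hz -> {tick_interval_ms(hz):.1f} ms per tick")
# 20 Hz -> 50.0 ms, 60 Hz -> 16.7 ms, 64 Hz -> 15.6 ms, 128 Hz -> 7.8 ms
```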

For players, higher tick rates mean the server's picture of the game world is more current. Hit registration is tighter. Movement is more responsive. Situations where a player appears to die behind cover because the server snapshot lagged behind reality happen less often. Each tick is a snapshot in time, and whatever happens between snapshots is invisible to the server until the next cycle.

The Tick Rate Cost Equation

Here is where the decision gets hard.

Double your tick rate and you roughly double your CPU load and your bandwidth consumption.

Every player in every active match receives state updates twice as often, which means twice the outgoing data per second. That outbound data is network egress, and egress costs money (unless you use servers that include egress in the price, like Bare Metal or Private Fleet hosts).

At scale (tens of thousands of concurrent matches across global servers), the gap between 64 Hz and 128 Hz is not a rounding error; it is a significant portion of infrastructure spend. Egress typically accounts for 20-30% of overall cloud hosting spend, so doubling (or halving) the tick rate moves that line item accordingly.
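To get a feel for why, here is a back-of-envelope egress model in Python. The snapshot size, player count, and match count below are illustrative assumptions, not measured figures from any of these games:

```python
# Rough egress estimate: every tick, the server sends one state update to
# every player in every active match. All inputs are illustrative assumptions.
def monthly_egress_gb(tick_rate_hz: int, players_per_match: int,
                      concurrent_matches: int, snapshot_bytes: int = 500) -> float:
    bytes_per_second = (tick_rate_hz * players_per_match
                        * concurrent_matches * snapshot_bytes)
    seconds_per_month = 60 * 60 * 24 * 30
    return bytes_per_second * seconds_per_month / 1e9  # decimal gigabytes

for hz in (64, 128):
    gb = monthly_egress_gb(hz, players_per_match=10, concurrent_matches=10_000)
    print(f"{hz} Hz -> ~{gb:,.0f} GB of egress per month")
# Doubling the tick rate doubles the outbound bytes, and with it the egress bill.
```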

Then, the impact on vCPU usage compounds this cost.

Running the game simulation more frequently means the server's processor has more work to do per second: more physics evaluations, more hit checks, more state comparisons. Respawn put this plainly in their official developer deep dive on Apex Legends:

"20Hz servers result in about five frames of delay, and 60Hz servers result in three frames of delay. So for triple the bandwidth and CPU costs, you can save two frames worth of latency in the best-case scenario."

That math was their justification for Apex Legends staying at 20 Hz.

This is why tick rate is one of the most consequential decisions in multiplayer game design. And it is why the industry has not converged on a single answer.

Paying the Full Price: VALORANT at 128 Hz

Riot Games made their position clear from the start of VALORANT's development: every player, in every match, gets 128 Hz servers. No tiered access, no split between casual and competitive pools.

As Brent Randall, engineer on VALORANT's Gameplay Integrity team, described it in Riot's official technical blog: "'Make the server 20x faster' isn't a very tractable problem." Their starting server frame time was 50 milliseconds. At 128 Hz, the target was 2.34 milliseconds per frame. That gap required rebuilding how the engine used server resources, across almost every system.

Animation was one of the largest cost drivers. The team reduced it by calculating character animations every fourth frame and interpolating between them, cutting animation processing by 75%. During buy phase, when no combat can occur, they disabled server-side animations entirely, saving another 33%. They also replaced Unreal Engine's default networking layer with a custom RPC-based system that, in their own documented testing, delivered 100x to 10,000x performance improvements depending on the type of action. Even hardware selection came under review, with a server switch for cache architecture differences contributing another 30% gain.
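As a rough sketch of that amortization idea (ours, not Riot's actual code, with a pose reduced to a single float for brevity), evaluating the full animation only every fourth server frame and interpolating in between might look like this:

```python
# Hypothetical sketch of amortized server-side animation, in the spirit of
# VALORANT's "evaluate every fourth frame and interpolate" optimization.
# A real pose is a full skeleton; a single float stands in for it here.
class AmortizedAnimator:
    SAMPLE_EVERY = 4  # run the expensive animation evaluation every 4th tick

    def __init__(self) -> None:
        self.prev_pose = 0.0
        self.next_pose = 0.0
        self.frame = 0

    def evaluate_full_pose(self, time_s: float) -> float:
        # Stand-in for the expensive animation-graph evaluation.
        return time_s * 0.1

    def tick(self, time_s: float) -> float:
        if self.frame % self.SAMPLE_EVERY == 0:
            # Expensive path: sample a fresh pose, keep the old one around.
            self.prev_pose = self.next_pose
            self.next_pose = self.evaluate_full_pose(time_s)
        # Cheap path: linear interpolation covers the frames in between,
        # at the cost of one sample interval of animation lag.
        alpha = (self.frame % self.SAMPLE_EVERY) / self.SAMPLE_EVERY
        self.frame += 1
        return self.prev_pose + alpha * (self.next_pose - self.prev_pose)
```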

And even after all of that, a 2022 engine update broke their performance during the competitive season, requiring a full team response days before Champions. They fixed it. The system came back better than before. After patch 5.07, Riot reported that 99.3% or more of server frames met their strict 128 Hz budget.

That is the real cost of committing to 128 Hz. Not just higher compute and egress bills, but an ongoing engineering obligation to defend that budget against every new feature, every engine update, every seasonal patch.

It is achievable. Riot proved it. But it requires sustained investment, and not every studio is positioned to replicate such a massive endeavour.

Rethinking the Premise: Counter-Strike 2's Subtick System

Valve looked at the 128 Hz engineering commitment and asked a different question. What if higher tick rate is solving the wrong problem?

The root issue with fixed-interval tick systems is quantization. At 64 Hz, two players who fire at almost the same instant are processed within the same tick. Whoever's action lands closer to the tick boundary gets processed first. The other player can lose a duel partly due to timing chance rather than skill. Raising tick rate narrows the window but does not eliminate the problem.

CS2's Subtick system addresses this differently. Instead of processing all inputs at fixed tick boundaries, the server records a precise microsecond timestamp for every player action.

A shot fired partway into a tick cycle gets logged at its exact moment. A counter-shot fired later in the same cycle gets its own timestamp. The server then processes these events in true chronological order, independent of the base 64 Hz broadcast rate.

Valve described the goal directly on the CS2 announcement page: "Tick rate no longer matters for moving, shooting, or throwing. Sub-tick updates are the heart of Counter-Strike 2." The server still runs at 64 Hz for state broadcasting, but the events that occur inside each tick are resolved with microsecond accuracy. Temporal precision, decoupled from infrastructure cost.
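A minimal sketch of the concept (ours, not Valve's implementation): buffer every input with its high-resolution timestamp, then resolve the buffered events in chronological order when the tick fires.

```python
import heapq

# Hypothetical sketch of subtick-style input resolution, not Valve's code.
# The tick only decides *when* the batch is resolved; the microsecond
# timestamps decide the *order* in which the events are applied.
class SubtickServer:
    def __init__(self) -> None:
        self.pending: list[tuple[int, str, str]] = []  # min-heap by timestamp

    def on_input(self, timestamp_us: int, player: str, action: str) -> None:
        heapq.heappush(self.pending, (timestamp_us, player, action))

    def run_tick(self) -> None:
        # Resolve everything received this tick in true chronological order.
        while self.pending:
            timestamp_us, player, action = heapq.heappop(self.pending)
            print(f"t={timestamp_us}us  {player}: {action}")

server = SubtickServer()
server.on_input(15_420, "player_b", "fire")  # arrived later in the tick...
server.on_input(15_390, "player_a", "fire")  # ...but player_a fired first
server.run_tick()  # player_a's shot resolves before player_b's
```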

Valve also locked third-party services, including FACEIT, to the same 64 Hz base rate, eliminating the 128-tick community servers the competitive scene had relied on for years. The technical case for Subtick is solid. The friction it generated was real. It is a useful reminder that architectural decisions in multiplayer do not live in isolation from player expectation and community history.

Finding the Middle Ground: Marathon at 60 Hz

Not every studio lands at one extreme.

Bungie's Marathon, an extraction shooter released in 2026, runs on 60 Hz servers. In a genre where losing a firefight means losing your entire loadout for a run, server precision is not an academic concern. One reviewer described the 60 Hz tick rate as "easily Marathon's most underrated feature", noting that tight hit registration is core to the game's competitive integrity in high-stakes duels.

By comparison, ARC Raiders, a competing extraction shooter, runs at 20 Hz. The difference shows most in intense PvP encounters, where latency or desync would otherwise punish players unfairly. Marathon's choice sits well above the baseline most extraction shooters operate at, without committing to the full engineering overhead that 128 Hz demands.

It is a deliberate middle point. For a genre built around high-consequence encounters with low time-to-kill, 60 Hz provides enough precision to make gunfights feel fair while keeping infrastructure costs below what a VALORANT-style commitment would require. The right tick rate is not always the highest one available. It is the one that fits your game's precision requirements against what your infrastructure can actually sustain.

There Is No Right Answer: Just the Right Trade-Off

Apex Legends runs at 20 Hz, VALORANT at 128 Hz, CS2 at 64 Hz with Subtick, and Marathon at 60 Hz. ARC Raiders runs at 20 Hz, and reportedly Hunt: Showdown runs at 30 Hz and Escape from Tarkov at 12-16 Hz.

Each choice reflects a different calculation of what the game demands and what the studio can support operationally.

The underlying logic is consistent across all of them. Higher tick rates send more data to more players more often. They demand more vCPU time per simulation cycle. They cost more at scale.

For some games and genres, that precision is worth the expense. For others, the marginal improvement in hit registration does not justify doubling the egress bill.

It is also worth noting that tick rate is not the only lever. Deploying servers closer to players reduces the baseline latency that tick rate decisions are partly trying to compensate for. A well-placed 64 Hz server can likely outperform a 128 Hz server deployed far from its player base. Platforms like Edgegap's game server orchestration network deploy across 615+ locations globally and deliver an average latency reduction of 58%. Latency, after all, remains the #1 driver of player frustration with online games.

In other words, tick rate and network placement work together. Neither tells the full story alone.

The decision is a trade-off. Make it deliberately, with a clear view of your game's genre, your player base's expectations, and your team's capacity to maintain whatever standard you commit to.

If you want to see how tick rate decisions interact with your actual server costs, Edgegap's pricing calculator lets you model this against your player count and regions.

This article is based on and cites the video "64 vs 128 Tick Servers: CS2 vs Valorant vs PUBG" by Outscal, published on YouTube, alongside primary technical documentation from Riot Games, Respawn Entertainment, and Valve for Counter-Strike 2. All rights in the original content are owned by their respective owners.

Get your Game Online Easily & in Minutes

Start Integrating Now!
