
Multiplayer Game Hosting Deep Dive: Destiny 2

Destiny 2 stands out among online shooters for blending several networking approaches into one seamless experience. This article explores how it combines peer-to-peer (P2P) networking with authoritative servers to power its shared-world multiplayer.
Hybrid Server Technologies in Gaming: A Deep Dive into Destiny 2's Networking Model
Destiny 2, developed by Bungie, utilizes a blend of two networking technologies, creating a hybrid model that distinguishes it from most online shooters. Their solution combines the strengths of both traditional approaches while mitigating some of their inherent flaws.
This article draws on two GDC talks we've covered in depth: Justin Truman (now Studio Head at Bungie) on Destiny's networked mission architecture at GDC 2015 (Destiny 2's Network Architecture - Multiplayer Game Deep Dive), and John Chu (now Lead Technical Program Manager at Bungie) on building cross-play across seven platforms at GDC 2022 (Developing Destiny 2's Crossplay - Game Backend Deep Dive).
Understanding Peer-to-Peer and Client-Server Models
To appreciate Bungie's approach, it helps to understand the two building blocks they combined. Peer-to-peer networking is a form of direct communication where clients (players) interact with one another without an intermediary. Its main advantage lies in lower latency. However, it often leads to security and privacy concerns, as we'll explore later.
The client-server model, on the other hand, operates via a dedicated server that hosts the game, providing stability and consistency but potentially introducing latency due to geographical distance between players and servers.
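The scaling difference between the two models can be made concrete with a little arithmetic. This is an illustrative sketch, not Bungie's code: a client-server topology needs one connection per player, while a full P2P mesh needs a link between every pair of peers.

```python
def client_server_connections(n_players: int) -> int:
    # Each client holds exactly one connection: to the server.
    return n_players

def p2p_connections(n_players: int) -> int:
    # Every client connects directly to every other client:
    # n * (n - 1) / 2 unique links in a full mesh.
    return n_players * (n_players - 1) // 2

for n in (4, 8, 12):
    print(n, client_server_connections(n), p2p_connections(n))
```

At 8 players the mesh already carries 28 direct links versus 8 server connections, which is part of why P2P latency wins come bundled with bandwidth and privacy costs.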
Destiny 2's Unique Hybrid Model
Bungie's goal from the start was, as Truman described it, to "blur the line between singleplayer and multiplayer experiences," letting players encounter others simply by hitting "play Destiny" without traditional matchmaking lobbies. Accomplishing that required a networking model unlike anything a traditional dedicated server or pure P2P approach could provide on its own.
The solution has three distinct layers working in parallel.
Shared Zones (P2P): The first is the P2P layer. Players within a shared zone communicate directly with each other, forming what Bungie called peer-to-peer "bubbles": direct client-to-client connections that keep movement and moment-to-moment interaction responsive, and that let Bungie stitch those bubbles together so strangers can encounter each other organically mid-mission, without a lobby or loading screen.
Physics Host (Dedicated Servers): The second is the physics host, which in Destiny 2 runs on a dedicated server rather than a player's machine. This was a deliberate fix for one of Destiny 1's most persistent problems: host migration. When the player hosting physics left a session, the game stuttered or broke entirely. Moving that responsibility to a dedicated server removed the dependency on any individual player's connection and made sessions stable regardless of who joined or left.
Activity Host (Purpose-built Authoritative Process for Mission State): The third is the activity host, a stripped-down cloud process that Truman's team designed to run only the state that mission logic strictly required. Not bullet trajectories, not animation state, not world-space positions. Just the discrete facts that activity scripts needed: whether a squad had spawned, how many enemies remained alive, whether an objective had triggered. The result was an executable of around 45MB, running at 10Hz, that allowed a single 40-core server to handle close to 5,000 simultaneous instances. As Truman explained, that translated to roughly 1 million concurrent users supported by just a few hundred machines. A full-simulation dedicated server approach would have required something closer to half a million headless processes running the entire game.
The key discipline that made this possible was what Bungie called sensors: a declaration-first system where a script had to formally declare what state it needed before it could reference it.
Nothing drifted onto the server by accident. If a script didn't declare a dependency, that state didn't persist server-side. It kept the activity host footprint small, bandwidth predictable, and the architecture honest.
In the transition from Destiny 1 to Destiny 2, those three layers together replaced the single fragile point of the player-hosted physics model. When Bungie eventually shipped cross-play, John Chu described the underlying networking changes required as "some of the biggest and scariest bone-breaking changes" in the engine, precisely because that layered architecture had to be rebuilt to carry consistent state across platforms that had previously been developed and optimized in isolation.
Challenges of Destiny 2's Hybrid Model
Bungie's own engineers have described their networking topology as "uniquely complicated." The player-facing consequences bear that out.
IP exposure is a structural consequence of P2P, not a bug. In any peer-to-peer setup, clients communicate directly with each other, which means every player you encounter can see your IP address. In Destiny 2, that's every player in both PvE and PvP. As Chris "Battle(non)sense" observed in his 2017 PC Gamer analysis, the PC version still carried a clear set of trade-offs that dedicated server games simply don't face:
"The game shouts out your IP address to everyone you encounter, overlays from third party tools don't work, the functionality of gameplay capture tools is in some cases limited, the responsiveness of the hit registration will vary on a player to player basis, and some players will run into issues with the required upstream bandwidth, a problem that rarely happens with dedicated servers."
For studios considering a hybrid model, IP exposure is worth treating as a first-class design constraint. P2P and IP privacy are fundamentally incompatible without a relay layer between peers.
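To show why a relay layer is the only real fix, here's a toy sketch (purely illustrative, unrelated to Bungie's or Edgegap's implementations): peers register with a relay and address each other by opaque IDs, so neither peer ever learns the other's IP address.

```python
class Relay:
    """Toy relay: the relay alone maps IDs to IP addresses; peers only
    ever see each other's opaque IDs in forwarded traffic."""

    def __init__(self):
        self._peers = {}    # peer_id -> ip, known only to the relay
        self.inboxes = {}   # peer_id -> list of (sender_id, payload)

    def register(self, peer_id: str, ip: str) -> None:
        self._peers[peer_id] = ip
        self.inboxes[peer_id] = []

    def forward(self, src_id: str, dst_id: str, payload: bytes) -> None:
        # The recipient sees the sender's ID, never the sender's IP.
        self.inboxes[dst_id].append((src_id, payload))

relay = Relay()
relay.register("guardian_a", "203.0.113.10")
relay.register("guardian_b", "198.51.100.7")
relay.forward("guardian_a", "guardian_b", b"fireteam invite")
print(relay.inboxes["guardian_b"])  # [('guardian_a', b'fireteam invite')]
```

The trade-off is exactly the one P2P was chosen to avoid: every packet now takes an extra hop through the relay, adding latency and server cost.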
Security concerns compound the problem. When clients handle sensitive tasks like hit registration in a P2P environment, the game becomes harder to protect from exploitation. The guiding principle in multiplayer development is "never trust the client"; in a hybrid model, that principle gets harder to enforce cleanly.
Netcode: Where the Seams Show
Note: the following information is based on network testing by Chris "Battle(non)sense" for PC Gamer in 2017; some specifics may have shifted with subsequent backend updates.
In a dedicated server game, your hit registration delay comes down to one variable: your ping to the server. Consistent, predictable, unaffected by anyone else in the match.
P2P removes that guarantee entirely. As Chris noted, "the quality of the players' internet connections, and the distance between them, have a direct impact on how responsive the hit registration feels." A player sending updates at 15Hz instead of the expected 30-40Hz measurably increases the lag between you and them, regardless of how solid your own connection is. One player's poor upstream degrades the experience for everyone connected to them. And the number of players on residential connections with limited upload capacity is larger than most developers expect.
Dedicated servers absorb that variability centrally. P2P distributes it across every peer relationship in the match.
Bandwidth scales poorly with player count for the same reason. In one hour of 4v4 Crucible, Chris measured a single client sending 152.3MB and receiving 128.3MB. In a P2P topology, adding players multiplies traffic in every direction. That math likely informed Bungie's decision to cap Crucible at 4v4 rather than 5v5 or 6v6. Studios designing around a hybrid model should run their own bandwidth projections early, especially for larger player counts.
Finally, tick rate carries its own nuance here. Destiny 2 samples combat damage at 30Hz (the 10Hz figure that circulated as a rumor refers to the Activity Host, which handles scoring and ammo spawns at a lower frequency). On PC, clients can send updates at up to 40Hz when frame rate exceeds 40fps, a direct coupling between rendering performance and network update rate that has no equivalent in a dedicated server architecture.
Private Fleet: Hybrid Orchestration at Scale
For studios running infrastructure at Destiny 2's scale, server orchestration becomes a core engineering concern.
Edgegap's Private Fleet is a container-based orchestration solution that maintains persistent servers to optimize costs, then automatically bursts to the cloud with just-in-time deployments across 615+ locations worldwide. Proximity hosting places players on the nearest available server, reducing latency by up to 58% on average, without requiring studios to build and manage that infrastructure themselves.
Conclusion
The hybrid networking model Destiny 2 employs illustrates both the possibilities and the real costs of blending server architectures. It proves that combining P2P flexibility with the authority of dedicated servers can deliver seamless shared-world experiences at scale, but it requires careful engineering to address the security, latency, and consistency challenges that come with it. The two GDC talks linked above go considerably deeper on how Bungie solved those problems in practice, and are worth reading alongside this piece.
—
This article draws on insights from the GDC 2015 presentation by Justin Truman, the GDC 2022 presentation by John Chu, and network analysis by Chris "Battle(non)sense" published by PC Gamer in November 2017. All rights in the original content are owned by their respective owners.
Written by
the Edgegap Team
