timetocode

A blog about game development and procedural content generation.

Slowly working up to 100+ concurrent players with HTML5 and node.js

I require a fatal grandiosity in all my prototypes. Without this groundless ambition I run numerous risks. The first and most significant risk is a deep boredom of the soul. The second risk is that any prototype that is not inherently doomed may eventually grow into a job. So really, grandiosity is the only responsible avenue.

In any case, the picture above shows an evolution of my ‘learn-to-network-program’ project. There’s a boring video about the original floating around this blog somewhere. I had been trying to learn entity interpolation and clientside prediction. For those unfamiliar with the concepts, they’re basically both ways of concealing lag and making a multiplayer game look nice. I think I’ve got a grasp on both techniques now. If I still believe this after a few projects (and anyone is interested) I may whip up some tutorials.
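
For anyone who wants a concrete picture of the interpolation half, here’s a minimal sketch of the idea (illustrative, not my actual client code): buffer the server snapshots, render other players a fixed delay in the past, and blend between the two snapshots that straddle the render time. The names (`INTERP_DELAY_MS`, `snapshotBuffer`, the snapshot shape) are all made up for the example.

```javascript
// Minimal entity-interpolation sketch (illustrative, not my actual client code).
// The client buffers server snapshots and renders remote players ~100 ms in the
// past, blending between the two snapshots that straddle the render time.
// Snapshots are timestamped on arrival to sidestep clock-sync issues.

const INTERP_DELAY_MS = 100;   // render this far behind the freshest data
const snapshotBuffer = [];     // [{ time, players: { id: { x, y } } }, ...]

function onServerSnapshot(players) {
  snapshotBuffer.push({ time: Date.now(), players });
  // drop snapshots too old to ever be needed again
  while (snapshotBuffer.length > 2 &&
         snapshotBuffer[1].time < Date.now() - INTERP_DELAY_MS) {
    snapshotBuffer.shift();
  }
}

function interpolatedPosition(playerId) {
  const renderTime = Date.now() - INTERP_DELAY_MS;
  // find the pair of snapshots surrounding renderTime and blend between them
  for (let i = 0; i < snapshotBuffer.length - 1; i++) {
    const a = snapshotBuffer[i];
    const b = snapshotBuffer[i + 1];
    if (a.time <= renderTime && renderTime <= b.time) {
      const t = (renderTime - a.time) / (b.time - a.time); // 0..1
      const pa = a.players[playerId];
      const pb = b.players[playerId];
      if (!pa || !pb) break;
      return { x: pa.x + (pb.x - pa.x) * t,
               y: pa.y + (pb.y - pa.y) * t };
    }
  }
  // fall back to the newest known position if there's nothing to blend
  const latest = snapshotBuffer[snapshotBuffer.length - 1];
  return latest ? latest.players[playerId] || null : null;
}
```

Clientside prediction is the other half: the local player applies its own inputs immediately instead of waiting the round trip, then reconciles when the server’s authoritative state arrives.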

But all this is so reasonable, and as I’ve been trying to convince you: it would be highly irresponsible not to condemn this prototype by extending it beyond its means.

The above image shows 76 players in varying pastel shades connected to a single-core node.js server. To achieve this I opened 76 Google Chrome tabs and moved each player. This required quite a bit of refactoring. For the sake of taxing my poor server, each player was sending data as if it were moving around. However, to be slightly reasonable, the game and server were running their simulations at 5 fps. I imagine tuning the rate of calculations down to something like this (or lower) is an acceptable response to this clusterfuck of a situation (76 players all wanting to stand next to each other). This is equivalent to playing with 200 ping, but thanks to the aforementioned clientside prediction it was barely noticeable. At this point moving around was still very possible, and better than in many games that I’ve played. There were some occasional jumps/skips in the interpolation; I’m not sure if this was a brief spike on the server or something to do with 70+ tabs open in Chrome, each rendering an HTML canvas at 60 fps and receiving data through sockets at 5 fps. When the server was run at a fast 30 fps instead (similar to Counter-Strike, TF2, etc.) I was able to get about 27 players standing close together before the server wanted to die. These are especially interesting numbers considering:

  1. the data being sent had no optimization: it was human-readable JSON and contained much more information than was needed for movement alone
  2. I limited this node server to one thread (out of 8)
  3. at 76 players each ‘tick’ of the server took approximately 55 ms, but at 5 fps (a 200 ms budget per tick) this meant the server was idle almost 3/4 of the time
  4. each player received full updates about every player, resulting in a giant JSON object of approximately 21,600 bytes sent 5 times per second (a crazy amount of data in total: ~100 KB/s per player, meaning the server was moving ~7.6 MB/s)
  5. batching information about all other visible players into one large object, instead of sending an object per player, took the server from ~25ish concurrent players to 70+ (with consequences, but still surprisingly functional; see the sketch after this list)
  6. a JSON object fully describing a single player was about 300 bytes, unoptimized
  7. I have no idea what I’m doing
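
To make point 5 concrete, here’s roughly what the batched broadcast looks like as a sketch (using the `ws` WebSocket package here, not necessarily what I actually had wired up): one timer per tick, one JSON message per client, describing every player at once. The math in point 4 falls out of this shape: ~300 bytes per player times 76 players is a ~22 KB message, and five of those per second per client works out to roughly 100 KB/s per player.

```javascript
// Sketch of a 5 fps batched-broadcast loop (illustrative; uses the `ws` package).
const WebSocket = require('ws');

const TICK_RATE = 5;                  // ticks per second
const TICK_MS = 1000 / TICK_RATE;     // 200 ms budget per tick
const players = new Map();            // id -> { x, y, ... }
const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', (socket) => {
  const id = Math.random().toString(36).slice(2); // throwaway id for the sketch
  players.set(id, { x: 0, y: 0 });
  socket.on('message', (msg) => {
    // the client reports its own movement; the prototype trusts it blindly
    Object.assign(players.get(id), JSON.parse(msg));
  });
  socket.on('close', () => players.delete(id));
});

setInterval(() => {
  // one big object describing everyone, instead of one message per player
  const batch = JSON.stringify({ t: Date.now(), players: Object.fromEntries(players) });
  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN) client.send(batch);
  });
}, TICK_MS);
```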

All in all, I set out to break things. I set out to break my prototype server, and I set out to see my interpolation and prediction algos fail. And they all failed plenty; however, none of the failures were sufficiently convincing. Every time I revisit this code I end up with some smoothing tricks or a few more simultaneous connections before the server runs into trouble. I’ve seen some of the limits, but I doubt that I’m anywhere near the ceiling even now.

Next time I get to tinker with this, I’m going to work in the QuadTrees from recent blog posts. I’m not sure how much processing the quadtrees will consume, but they’ll offer the potential to have players ignorant of one another, as well as a way to assess which areas of the virtual world are crowded. It’d be nice if the quadtrees could make the server stable at ~40 players and a fast tickrate… or possibly 100 players and an RPGish tickrate. I suspect that if players were likely to be far apart, the quadtrees could in fact push a server into several hundreds of players. That would be impressive; I doubt many real MMOs could run off a single box. On that note, these data structures may also offer a schema for dynamically scaling a single virtual world out across numerous physical servers.
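
If it helps to picture the quadtree idea, the interest-management part would look something like this (a sketch only, assuming a quadtree with an `insert()` and a rectangular `queryRange()` along the lines of those earlier posts): each tick, rebuild the tree from player positions, then query a box around each player and send that player only the results instead of the full player list.

```javascript
// Sketch of quadtree-based interest management (illustrative; assumes a Quadtree
// class with insert() and queryRange(), as in the earlier quadtree posts).

const VIEW_RADIUS = 400; // how far a player can "see", in world units (made up)

function buildTickBatches(players, Quadtree, worldBounds) {
  // rebuild the tree from scratch each tick; cheap at these player counts
  const tree = new Quadtree(worldBounds);
  for (const [id, p] of players) {
    tree.insert({ id, x: p.x, y: p.y });
  }

  // for each player, gather only nearby players into their personal batch
  const perPlayerBatch = new Map();
  for (const [id, p] of players) {
    const nearby = tree.queryRange({
      x: p.x - VIEW_RADIUS,
      y: p.y - VIEW_RADIUS,
      width: VIEW_RADIUS * 2,
      height: VIEW_RADIUS * 2,
    });
    perPlayerBatch.set(id, nearby); // send this instead of the full player list
  }
  return perPlayerBatch;
}
```

The same tree doubles as the crowd-density check: a query that returns a huge number of players marks a region that needs a lower tickrate, or eventually its own server.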
