Server(s) Side

posted in the game for project unirule
Published October 13, 2018

Hello GameDev,

I've been busy working on Dynamic Assets.  To incorporate them successfully this time around I've had to write a whole host of code across all the servers for this game, and I thought, 'why not do a blog about the servers?'  I'd never programmed a server before this project and really have no idea what I'm doing.  I just get stuff working and I'm happy.  But Node.js seems very intuitive, and it's simple enough that you don't need to be a rocket scientist to figure stuff out.  So here's a quick overview of my servers: why I have 3 so far, and why I'll probably end up with 5 or 6.

Relay Server
The relay server hosts the website and is responsible for user authentication and what-nots.  While users are connected to the simulation the relay acts as a relay ( imagine that ) between all the clients and the back-end servers.  It relies on socket.io to communicate with the clients and zeromq to communicate with the back-end servers.
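In rough terms, the relay pattern looks something like this. This is only a sketch of the routing idea with made-up names; the real server would be holding socket.io sockets on the client side and zeromq sockets on the back-end side rather than plain callbacks:

```javascript
// Minimal sketch of the relay idea: track connected clients and forward
// back-end messages out to all of them. Every name here is hypothetical;
// in the real server the "send" callbacks would be socket.io sockets and
// fromBackend() would be fed by a zeromq subscription.
class Relay {
  constructor() {
    this.clients = new Map(); // clientId -> callback that "sends" to that client
  }
  connect(clientId, send) {
    this.clients.set(clientId, send);
  }
  disconnect(clientId) {
    this.clients.delete(clientId);
  }
  // A message arriving from a back-end server gets relayed to every client.
  fromBackend(message) {
    for (const send of this.clients.values()) send(message);
  }
}
```

The nice property of this shape is that the back-end servers never need to know how many clients are connected, or how; that bookkeeping lives entirely in the relay.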

Data Server
The data server holds a static version of the world and all its contents.  Its purpose is to provide everything a newly connected client needs, so the running simulation isn't hampered by the demands of clients joining mid-stream.  Of course it needs occasional updates as the terrain changes and the assets change.  And soon updated Simulin positions too.
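Conceptually it can be thought of as a versioned snapshot. A minimal sketch, with field names invented for illustration (the real world data is surely richer than two plain objects):

```javascript
// Sketch of the data-server idea: keep one snapshot of the world and apply
// occasional updates to it, so a newly connected client can be handed the
// whole current state in one payload. Field names are hypothetical.
class WorldSnapshot {
  constructor(terrain = {}, assets = {}) {
    this.version = 0;
    this.terrain = terrain;
    this.assets = assets;
  }
  // Occasional update from the simulation: merge the changes, bump the version.
  applyUpdate({ terrain = {}, assets = {} }) {
    Object.assign(this.terrain, terrain);
    Object.assign(this.assets, assets);
    this.version += 1;
  }
  // Everything a fresh client needs, in one go.
  forNewClient() {
    return { version: this.version, terrain: this.terrain, assets: this.assets };
  }
}
```

The version number gives clients a cheap way to tell whether the snapshot they joined with is still current.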

Simulation Server
This hosts the path-finding and the tiny bits of user functionality that I've incorporated thus far.  But basically the simulation server is the work-horse behind the simulation, or at least it will be.  I plan on breaking this up into 3 separate servers.  My wish list is to have 2 path-finding servers, each hosting the world information necessary for path-finding, with requests toggled between the two ( cutting my path-finding time in half ), and an AI server which will handle whatever it is the Simulin are doing.
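The toggling part of that wish list is just round-robin dispatch between two workers. A sketch, with plain functions standing in for what would really be requests to two separate server processes:

```javascript
// Sketch of the wish-list idea: two path-finding workers, with requests
// toggled between them so each carries half the load. The workers here are
// hypothetical stand-ins for real server endpoints.
function makeToggledDispatcher(workerA, workerB) {
  let next = 0;
  return function dispatch(request) {
    const worker = next === 0 ? workerA : workerB;
    next = 1 - next; // flip for the following request
    return worker(request);
  };
}
```

With an even stream of path-finding requests each worker sees exactly half of them, which is where the "cut my pathfinding time in half" hope comes from; in practice some requests cost more than others, so the split is only approximate.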

--------------------------------------------------

Now my assumption is that each new instance of a Node.js server will utilise its own processing core?  Am I wrong about this?  I sure hope not, because I figure that if I spread the needs of the project over multiple servers it will make better use of a computer's abilities.  Maybe each Node.js server will operate in parallel? 

Because I'm using many different servers to do all the stuff I need to do, it's taking a much longer time to program all this.  Plus, I recently switched all my THREE.js geometry over to BufferGeometry.  The servers hold world data as vertex objects and face objects, but the clients hold world data as buffer arrays, which require a little trickery to update correctly.  Keeps me on my toes.
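The trickery boils down to the fact that a BufferGeometry position attribute wants one flat typed array of floats, so updating a single vertex means writing three floats at the right offset. A sketch of both directions, with helper names of my own invention:

```javascript
// Server side: vertices as plain objects, as the post describes.
const vertices = [
  { x: 0, y: 0, z: 0 },
  { x: 1, y: 0, z: 0 },
  { x: 0, y: 1, z: 0 },
];

// Client side: the same data flattened into the typed-array layout a
// THREE.js BufferGeometry position attribute expects: x, y, z, x, y, z, ...
function toBufferArray(verts) {
  const buf = new Float32Array(verts.length * 3);
  verts.forEach((v, i) => {
    buf[i * 3] = v.x;
    buf[i * 3 + 1] = v.y;
    buf[i * 3 + 2] = v.z;
  });
  return buf;
}

// The "trickery": updating one vertex means writing three floats at the
// right offset. In THREE.js you would also set needsUpdate = true on the
// attribute afterwards so the GPU copy gets refreshed.
function updateVertex(buf, index, v) {
  buf[index * 3] = v.x;
  buf[index * 3 + 1] = v.y;
  buf[index * 3 + 2] = v.z;
}
```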

-----------------------------------------------------

Anyways, here are some videos of my website to check out.

And here is a video of three connected clients.
One client is adding stuff to the world; the other two are receiving the updated content from different perspectives.

Thanks for checking it out!


Comments

Septopus

Looking GREAT!

What kind of hosting situation are you aiming for?  That should probably be your #1 design restriction for the number of soft-servers you should be thinking about (starting with, anyhow).  Cores are about the most expensive aspect of a virtual server hosting plan (if you are going that route).  If you are looking at leasing actual hardware or co-locating your own, then cores are pretty cheap compared to the rest of the hardware/bandwidth investment. 

With that in mind, most modern "cores" can handle multiple threads.  A cursory search seems to indicate that Node.js is single threaded (old info, so don't take just my word for it), so I would consider one instance per thread that the core(s) can handle (processor specs provided by VM hosting providers will give you this info) the max, but probably just one per core would be best.  Of course, performance testing is the only way to be absolutely positive of how it will behave in the wild.  When I was a systems engineer I made extensive use of local virtual machines for testing these kinds of things out.  Local VMs are fairly easy to build in all shapes and sizes (number of cores (you can even emulate specific processors, if I recall correctly), RAM, disk, etc.).  An infinite playground for testing out software configurations.

Anyhow, that's my systems engineering input. ;)

Keep up the killer work!

October 13, 2018 05:19 PM
Septopus

As an example, on Microsoft Azure

(2) servers with 2 cores and 4GB of RAM each (month to month) is about $100 a month.  A 1-year reservation is only $600, and a 3-year is around $1300-1400.

Just some info to keep in mind ;)

October 13, 2018 05:34 PM
Awoken

Hi @Septopus, thanks for the feedback.
I intend to buy a computer to host the game.  I will then hook up a business line to handle incoming and outgoing data.  Here in Canada upload speeds are severely limited ( I have my own suspicions as to why ) and you need a business line to have decent upload speed.  If nobody plays my game then at the end of the day I've got a killer gaming rig.  But that's about as far as I've gotten with this idea.  First I'm going to see if I can get 10 people playing my game, haha.  If those ten people play and enjoy themselves ( and most importantly, want more ) then I can start thinking about the future beyond that.

October 13, 2018 07:22 PM
Septopus

Sounds like a reasonable plan to me. ;)

October 13, 2018 08:10 PM
Septopus

Pretty much the same story in the US as well, upload bandwidth is a commercial product almost everywhere.  That's how ISPs make money, it's not off of end users. ;)

October 13, 2018 08:15 PM
Rutin

This is very cool! :) 

October 16, 2018 12:01 AM