node.js - Socket.io client does not emit based on available memory/CPU in tab?


I've discovered an odd quirk using socket.io. I have a simple setup: two clients on separate computers, and a node.js server running on one of those clients. The tick rate (the time between each message) is 100ms, i.e. reasonable. Both clients communicate with the server via socket.emit, and the server sends a message back to both clients. It works great. Well, here's what's odd...
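To give a sense of the setup, here is roughly what the client loop and the server relay look like (a simplified sketch, not my exact code; the 'tick' event name, the payload, and the server address are placeholders):

    // client side (socket.io-client) – emit the game state every 100ms
    var socket = io('http://<server-ip>:3000');

    setInterval(function () {
      socket.emit('tick', { t: Date.now() });
    }, 100);

    // server side (node.js) – relay each tick to every connected client
    var io = require('socket.io')(3000);

    io.on('connection', function (socket) {
      socket.on('tick', function (data) {
        io.emit('tick', data);
      });
    });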

This is a WebGL web game, mind you, so there is a lot of stuff happening in the tab. When things start to cook in the game and the tab's thread starts using more system resources, the socket ceases to emit to the server. The game keeps running fine, perhaps with a drop in FPS, but with no serious crash or anything like that. In addition, I can tell that the socket.emit code is being executed (or at least called) every 100ms as normal (by executing a console.log directly before it). Despite that, the server doesn't get the message sent by the client's socket.emit. I know this because I'm console.logging every communication that makes it to the server.
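For reference, this is essentially how I'm checking both ends (continuing the sketch above, so socket refers to the same objects; the acknowledgement callback is an extra diagnostic I could add, since socket.io lets the server confirm receipt of each message, but it is not in my actual code):

    // client side – this log fires every 100ms even while the server goes quiet
    console.log('emitting tick at', Date.now());
    socket.emit('tick', { t: Date.now() }, function (ack) {
      // acknowledgement from the server, if it ever got the message
      console.log('server acknowledged:', ack);
    });

    // server side – this log stops appearing once the tab gets heavy
    socket.on('tick', function (data, cb) {
      console.log('tick received at', Date.now());
      if (cb) cb('ok');
    });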

I can literally increase the size of the window (which uses more resources, because the WebGL renderer has more work to do), and the messages cease to be received on the server once the window gets big enough. Bizarre! Why would using more system resources cause socket.emit messages not to reach the server? And if it does behave that way, why wouldn't I receive a message/error/warning on either the client or the server? I have a hard time believing socket.emit decides not to work based on how many system resources there are.

On the clients I'm using the latest Chrome, Phaser.io (Pixi.js renderer), and Mac OS X. What the hell is going on??? Many thanks.

