
Pushing processor cycles onto the client

  • 12-05-2011 8:39pm
    #1
Registered Users Posts: 859 ✭✭✭


    We have our own application/web servers at the moment, but we've started to look at AWS in time for the next hardware refresh, and an interesting factor has come into play: we have a primary data source which always has to be decoded from binary into text (i.e. the response from the data source is always binary).

    We have a choice: push the data to the client in binary and decode it there in the browser, or decode it on our servers and serve JSON to the client. A sketch of the client-side option is below.
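    For concreteness, here's a minimal sketch of the client-side option, assuming a browser with XMLHttpRequest level 2 and typed-array support; the record layout (a 4-byte unsigned id followed by a 4-byte float, big-endian) and the endpoint URL are invented for illustration, not our actual feed format:

    ```javascript
    // Client-side decode: fetch the raw binary and unpack it in the browser.
    // The layout (uint32 id + float32 value, big-endian) is a made-up example.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/data/feed.bin', true);   // hypothetical endpoint
    xhr.responseType = 'arraybuffer';          // XHR2 feature
    xhr.onload = function () {
        var view = new DataView(xhr.response);
        var records = [];
        for (var offset = 0; offset < view.byteLength; offset += 8) {
            records.push({
                id:    view.getUint32(offset),       // big-endian by default
                value: view.getFloat32(offset + 4)
            });
        }
        // records is now the same structure the server-side option
        // would have delivered via JSON.parse(xhr.responseText)
    };
    xhr.send();
    ```

    The server-side alternative is just the familiar JSON round trip, so the real trade is server CPU (and its billing) against client CPU, bandwidth, and browser support for the binary APIs.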

    If we move to AWS, then there are probably significant and tangible savings in moving all this computation to the client, where we don't pay for it. Could this be a game changer and make JavaScript far more important?


Comments

  • Registered Users Posts: 11,989 ✭✭✭✭ Giblet


    Older browsers will choke on heavy computation unless you abuse setTimeout or use Web Workers - the former is a hack, the latter barely supported. JSON is easier to process and more workable, depending on the dataset size. Your server should optimise the processing where possible, and can probably cache results too. If you decide to go the JavaScript route, learn your JavaScript properly and don't use any libraries. JavaScript is very important at the moment and getting more so; just be aware of its limitations on older browsers.
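    As a concrete sketch of the setTimeout approach: slice the work into chunks and yield between them, so the UI thread can repaint and handle input (processItem and done here are hypothetical callbacks, not anything from the OP's stack):

    ```javascript
    // Process a large array in slices; yield to the event loop between
    // chunks so older browsers don't lock up or show the slow-script dialog.
    function processInChunks(items, processItem, chunkSize, done) {
        var i = 0;
        (function nextChunk() {
            var end = Math.min(i + chunkSize, items.length);
            for (; i < end; i++) {
                processItem(items[i]);
            }
            if (i < items.length) {
                setTimeout(nextChunk, 0);   // give the browser a breather
            } else if (done) {
                done();
            }
        })();
    }
    ```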

    If you are doing processing with the DOM, be aware that writing to it will freeze the browser, so cache, clone, and work off-DOM where possible.
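    A sketch of that cache/clone/off-DOM advice: build the new nodes in a detached DocumentFragment and attach them in one operation, so the browser reflows once instead of once per row (the 'results' element id and the rows array are assumptions for illustration):

    ```javascript
    // Build rows off-DOM, then write to the live DOM exactly once.
    var table = document.getElementById('results');
    var fragment = document.createDocumentFragment();
    for (var i = 0; i < rows.length; i++) {
        var tr = document.createElement('tr');
        var td = document.createElement('td');
        td.appendChild(document.createTextNode(rows[i].value));
        tr.appendChild(td);
        fragment.appendChild(tr);
    }
    table.appendChild(fragment);   // single DOM write, single reflow
    ```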


  • Registered Users Posts: 3,141 ✭✭✭ ocallagh


    I'm working on a cloud-based app (hosted with Rackspace) with an 'obese' client right now.

    We sync a lot of static data to the client and store it in SQLite DBs for each client browser. The effect has been very positive: we've actually downgraded our hardware recently, as it was no longer required. That wasn't the original intention, though; clients needed instantaneous results from AJAX calls.
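    (Presumably that means the Web SQL openDatabase API, which is what exposes SQLite to the browser; a minimal sketch with an invented schema, not our actual tables:)

    ```javascript
    // Cache static reference data in the browser's SQLite store so later
    // lookups skip the server round trip. Table/column names are invented.
    var db = openDatabase('appcache', '1.0', 'Static data cache', 5 * 1024 * 1024);
    db.transaction(function (tx) {
        tx.executeSql('CREATE TABLE IF NOT EXISTS products (id, name)');
        tx.executeSql('INSERT INTO products (id, name) VALUES (?, ?)', [1, 'Widget']);
    });
    db.readTransaction(function (tx) {
        tx.executeSql('SELECT name FROM products WHERE id = ?', [1],
            function (tx, results) {
                console.log(results.rows.item(0).name);
            });
    });
    ```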

    A few things of note:
    • You need a very competent team of JS developers. Roughly 30% of our code base is now JS. We use jQuery, which helps a lot.
    • You need a very competent UI designer with experience in JS.
    • Security is a concern - I wouldn't use this approach in the public domain (both the JS and the data are out there for anyone to look at).
    • The future of Web Databases is uncertain.

    We're looking at moving away from this solution, though; SQLite has its limitations. For each client install, we'll set up a secure local DB server on the office network, and the client's browser will request data from it through a browser plugin. Again, this will only be for reading static data, and the primary reason is to speed up the AJAX calls.

    Overall, it's a bit messy, and if it's just CPU you're looking to reduce, I'd rather keep it as a traditional thin client and just beef up your server hardware. It's much more secure, and there are fewer points of failure.

    edit: If you're just processing one common task which needs a lot of CPU, you should look into NPAPI/extensions/DLLs etc. - you could push a lot of your code into C++ (which would be much faster than JS) and call it from JS through a browser plugin.
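    The JS side of that might look like the sketch below, assuming a hypothetical scriptable NPAPI plugin registered for the MIME type application/x-decoder that exposes a decode() method through NPRuntime - every name here is an assumption, and the real interface would be whatever the plugin exports:

    ```javascript
    // Hand the CPU-heavy decode to a (hypothetical) native C++ plugin.
    // The <embed> type and the decode() method are assumptions;
    // binaryAsBase64 is the raw payload, assumed already fetched.
    var plugin = document.createElement('embed');
    plugin.type = 'application/x-decoder';
    plugin.width = 0;
    plugin.height = 0;
    document.body.appendChild(plugin);

    var json = plugin.decode(binaryAsBase64);   // native code does the work
    var records = JSON.parse(json);
    ```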

