Ready to PUSH Enterprise Past the Limits?


So you’ve already hopped on the BlackBerry Enterprise Push bandwagon and built your first few apps… all is well in the world. You always suspected the 8K limit on Push payloads would come into play, but the warnings turned into concerns and all but faded from memory… until, one day, you create an app that has to push a significant amount of data. Realistically, it only takes 50 contacts in the new ECL application to break the 8K limit! Of course you could “Push and Pull”, but that is a significant amount of added effort when you just need to send a “little more data”.

So what do we do? In comes compression to the rescue… or so you’d think. First you could try CJSON – why not, right? It’s “compressed” JSON, except it’s really not compressed well – around 5% just doesn’t cut it. Strike one.

You could be bolder and try to compress the entire payload with GZIP or DEFLATE – makes sense, right? After all, you’re just using standard browser technologies. But… bummer: even though you can compress it server-side, you have no way to gunzip/inflate the payload on the device, since neither is supported in BrowserField (and hence WebWorks)! Strike two. At least your only two options, gzip and deflate, weren’t that great to begin with anyway… so no hard feelings.

Ok, now it’s time to be REALLY bold. If we could start from scratch, the best compression technology out there today is arguably LZMA – if you aren’t using it already, you should be! (Compare compression technologies for yourself at:

So now we have the end goal: if there were a way to compress the data server-side and then decompress it client-side in pure JavaScript, we could make this happen. Furthermore, since LZMA’s compression ratios are high but compression can be slow, we need a way to make it fast on these mobile devices without introducing substantial overhead or intermediate servers! Well, we’re in luck: Archie Cobbs was able to port the LZMA SDK from Java and make it available to all as gwt-lzma: Now we want it to be as fast as possible, so the use of Web Workers is mandatory to allow threads to be spawned in JavaScript; to that end we can use a nice open-source example that shrink-wraps it all at:
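To give a feel for the Web Worker side of this, here is a minimal sketch of the dispatch pattern such wrappers use: each request gets an id so that replies coming back from the worker can be matched to the right callback. `WorkerDispatcher` and the message shape are my own illustrative names, not the actual gwt-lzma API.

```javascript
// Sketch of a request/reply dispatcher around a Web Worker.
// A real wrapper would create the worker with e.g. new Worker("lzma_worker.js")
// (file name hypothetical) and the worker would run the ported LZMA code.
function WorkerDispatcher(worker) {
    var callbacks = {};
    var nextId = 0;

    // Route each reply back to the callback that made the request
    worker.onmessage = function (e) {
        var done = callbacks[e.data.id];
        delete callbacks[e.data.id];
        done(e.data.result);
    };

    // Queue a compression job; the UI thread never blocks
    this.compress = function (content, ratio, done) {
        var id = nextId++;
        callbacks[id] = done;
        worker.postMessage({ id: id, action: "compress", content: content, ratio: ratio });
    };
}
```

Because all the heavy lifting happens in the worker’s own thread, the BrowserField UI stays responsive even while a large payload is being crunched.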

I’ve now factored everything out into appropriate libraries, all elegantly tied into the latest ECL app code on GitHub. Here is what the server-side screen looks like:


And of course, the relevant code you’ll want to implement…

CODE: (server side)
// Really compress that content using Web Workers…
var compress_ratio = document.getElementById('compress_ratio').value;
var content = document.getElementById('content').value;
my_lzma.compress(content, compress_ratio, function (result) {
    var compressed = JSON.stringify(result);
    var compression = (((content.length - compressed.length) / content.length) * 100).toFixed(2);
});

Note: I’ve used the handy JSON.parse / JSON.stringify pair to preserve the blob format that gets pushed between server and client and protect the integrity of our data.
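To make that round trip concrete: the LZMA result is an array of bytes, not a string, so it gets stringified before the push and parsed back on the device. A tiny sketch (the byte values below are made up for illustration, not real LZMA output):

```javascript
// Server side: turn the byte array into text that survives the push channel
var result = [93, 0, 0, 16, 0, 72, 101, 108, 108, 111]; // illustrative bytes
var compressed = JSON.stringify(result);

// Client side: recover the identical byte array before decompressing
var byteArr = JSON.parse(compressed);
```

Since JSON is plain text, the pushed payload can’t be mangled by any string handling along the way.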

CODE: (client side)
var content = document.getElementById('content').value;
// Convert the string back into a byte array using the handy JSON util
var byteArr = JSON.parse(content);
my_lzma.decompress(byteArr, function (result) {
    // Update with the decompressed payload
    document.getElementById('content').value = result;
});

It’s that simple. All the pieces together now – check it out in action with the full ECL app in my latest repository @

Ok, and what kind of compression ratios can we expect in real life? Well, nothing like a chart to show that…


Try these yourself. Please note, it may seem odd that the HUGE list (the bottom row in the chart) compresses to a smaller size than the BIG list above it. This is because its data is similar and repeated, so it compresses far better than random data would. I maintain that this is likely the more common case in real-world datasets (everyone works for the same company, has similar telephone numbers and addresses, etc.).
