paul bennett

Archive for the ‘web applications’ Category

Twitter in the microblogging world now has the pull of Google in the search world – if you’re not ‘on Twitter’ you’re nowhere in terms of being able to connect, share and promote via microblogging (assuming, as seems likely, that Twitter now gets 80–90% of microblogging traffic and use).
This effectively locks the market into one monopoly platform – Twitter.

Imagine this applied to blogging or web publishing. Say there were many publishing services, but you could only use one at a time, and publishing service A had 90% of the total traffic and users.

If you weren’t using publishing service A, regardless of the quality of your content, you’d miss out on 90% of the traffic, connection or exposure just by being on a less popular service. Not a nice thought.

[Diagram: open messaging platform]
Would an open messaging platform – essentially a server application which lets you create a master account and associate ‘child’ accounts with it for other microblogging services (Twitter, Jaiku et al.) – make any difference in leveling the microblogging playing field?

The idea is that the platform would broadcast your posts or ‘tweets’ to all your subscribed services, but would also aggregate posts from the users you’re following and stream them back to your ‘open messaging’ client.

Rather than ‘a Twitter client’, you could have a general microblogging client offering not only the power to broadcast your posts to multiple services, but also to aggregate incoming messages regardless of which service they came from.

It wouldn’t matter what service someone you follow was using (assuming you also had an account on that service), because in your microblogging client you’d see one unified stream of updates regardless of the API they were built on.

Microblogging would become a standardised platform in its own right. Rather than having disparate APIs, an open messaging platform could serve to unify microblogging APIs and REST-based services, or at least provide a simple bridging framework between them.
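The broadcast-and-aggregate idea above can be sketched in a few lines of code. This is a minimal, hypothetical sketch – the service class, the `OpenMessagingClient` and all method names are my own inventions standing in for real service APIs (Twitter, Jaiku et al.), not anything that actually exists:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List


@dataclass
class Post:
    author: str
    body: str
    service: str
    posted_at: datetime


@dataclass
class InMemoryService:
    """Hypothetical stand-in for a real microblogging service API."""
    name: str
    posts: List[Post] = field(default_factory=list)

    def broadcast(self, body: str) -> None:
        # A real adapter would call the service's own posting API here.
        self.posts.append(Post("me", body, self.name, datetime.now(timezone.utc)))

    def timeline(self) -> List[Post]:
        # A real adapter would fetch the posts of the users you follow.
        return list(self.posts)


class OpenMessagingClient:
    """The 'master account': fans posts out to all child services
    and merges their timelines into one unified stream."""

    def __init__(self, services: List[InMemoryService]):
        self.services = services

    def broadcast(self, body: str) -> None:
        for service in self.services:
            service.broadcast(body)

    def unified_stream(self) -> List[Post]:
        merged = [p for s in self.services for p in s.timeline()]
        return sorted(merged, key=lambda p: p.posted_at, reverse=True)
```

The point of the sketch is the shape of the thing: each real service would need only a thin adapter exposing `broadcast` and `timeline`, and the client neither knows nor cares which API sits behind each adapter.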

Frustrated with the TinyMCE plugin for Expression Engine, I decided to create a rich text editor plugin for Expression Engine using the YUI library’s simple editor.

Due to a magic combination of:

  1. the awesomeness of the YUI library
  2. the thoroughness of the YUI documentation
  3. the simplicity of creating extensions for Expression Engine

it was surprisingly straightforward.

If you’re using Expression Engine and are either sick of fighting with TinyMCE or aren’t using a rich text editor, you can download it from http://code.google.com/p/ee-yui/.

I’ve started building Serf – a personal project which I hope people will find useful. I’ve had plenty of ideas over the years but the idea behind Serf is something I find fascinating and feel compelled to pursue.

I’m going to keep it under wraps until it’s released (probably mid-October, if not sooner) but I can say it’ll involve hooking into some large APIs as well as some semantic web action. I’ll post notes as I can, as I’m sure I’ll need to solve some interesting problems along the way.


How do you get listed as having 97% uptime when your service was down for almost a full day in May?

97% sounds like a lot – pretty good in fact. Right?

The reality is that one day is around 3% of a calendar month. This means you could effectively take your service down for a full day each month and still claim 97% uptime (assuming no other outages during the month).
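The arithmetic is simple enough to check on the back of an envelope, assuming a 30-day month:

```python
# One full day of downtime in a 30-day month, measured in hours.
hours_in_month = 30 * 24   # 720
downtime_hours = 24

uptime = (hours_in_month - downtime_hours) / hours_in_month
print(f"{uptime:.1%}")     # 96.7%, which rounds up to the advertised '97%'
```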

Your customers, however, may not see it so positively.

I’ve worked on a project where I used WordPress as a CMS for a (non-blog) content-driven site and was impressed by how flexible it was. I did cheat, however, and skirted around having to create an entire theme by creating simple PHP pages and pulling the data out using the post ID, as I thought WordPress wouldn’t function as well as a content site when using a theme.

Way wrong. I’m looking forward to using WordPress again, but properly, using some of the information in the following links:

ReadWriteWeb wrote a piece a while ago about Firefox 3 adding offline support for web applications. Around the same time, the Google Gears announcement came out and kind of fizzled. (Anyone using Google Gears? Anyone?)

I put this aside, but recently discovered that companies are investing time and money into creating offline versions of online web apps and using web app APIs to maintain data consistency. Take this offline front end for Basecamp as an example.

Forgive me here, but isn’t connection to the internet becoming MORE pervasive every year? We now have not only desktop PCs accessing and modifying information on the internet, but also laptops, mobile phones, net-enabled devices and even ambient devices. The future is banking on increased access to ‘the cloud’ in order for everything to tie together.

If this were 10 years ago it would make sense, but offline applications simply make no sense to me. If you feel the need for offline apps, it seems you need to ask yourself why. If it’s a connection quality issue, then it’s your connection that needs dealing with. If it’s commuting, then surely there are far simpler tools you probably already have (a text editor, a word processor) which can perform the same basic functions until you’re online again.

If it’s about using the browser as a platform, that’s been happening with Mozilla and Firefox for years via XUL and the Mozilla development engine, which products like Komodo use to get cross-platform applications working much more easily.
The only reason I can see for this development (especially in Firefox 3) is so Google can add simple offline support to its office suite in order to remove one more enterprise excuse, but it still confounds me that time and energy are being sunk into offline support.

I’ll wait patiently to be educated.

The base documentation from Adobe is pretty good, but there are a couple of gotchas which aren’t immediately clear.

  1. You need to copy AIRAliases.js from
    <air install dir>/frameworks
    into your application directory (the same place you’ve placed your XML config file and sample HTML file).
    If you try to test the sample application without doing this, you’ll get an error.
  2. My Windows setup is weird about PATH variables for some reason. If you get the standard ‘adl is not recognized as an internal or external command’ error, you can simply prefix your command with the full path to the adl executable, like so:
    <air install dir>/bin/adl.exe HelloWorld-app.xml
  3. By default, when an application is packaged, the ADT process attempts to contact a time server to generate a timestamp. If you use a proxy server to connect to the internet, you’ll get a ‘connection refused’ error.
    You can get around this by adding ‘-tsa none’ before the file component of the command, like so:

    adt -package -storetype pkcs12 -keystore sampleCert.pfx -tsa none HelloWorld.air
    HelloWorld-app.xml HelloWorld.html AIRAliases.js


Archives