Just a quick thought I had:
I’m thinking of sites like Google, Facebook, Amazon, etc. Web 2.0/3.0, the semantic web.
These websites are created and developed by some extremely creative teams of people. They are built on the foundations of many previous versions, inspired by the sites that came before and grew up around them, and they are constantly working to improve the user experience.
One aspect of Facebook that I highly praise is its automatic updating whenever a new message, status update, friend request, etc. comes through.
No more hitting F5/refreshing the webpage to find out if someone has messaged you back yet.
Will this sort of ‘on the fly’ updating be expected of the majority of websites created from here on out, given our exposure to such high-quality sites and applications? The web is very diverse, so I doubt this sort of ‘push update’ will be expected on, say, a page of historical records that never change (static content).
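To make the contrast concrete, here is a minimal sketch (in Python, with entirely made-up names, not any real site’s API) of the difference between the old pull model, where a refresh asks the server for anything new, and the push model, where the server notifies the client the moment something arrives:

```python
class Inbox:
    """Toy server-side message store supporting both access patterns."""

    def __init__(self):
        self.messages = []
        self.subscribers = []

    def poll(self, since):
        # Pull model: the client asks "anything new since index `since`?"
        # This is what an F5-style refresh amounts to.
        return self.messages[since:]

    def subscribe(self, callback):
        # Push model: the client registers once, and the server
        # calls it back as messages arrive.
        self.subscribers.append(callback)

    def deliver(self, msg):
        self.messages.append(msg)
        for notify in self.subscribers:
            notify(msg)


inbox = Inbox()
received = []
inbox.subscribe(received.append)       # push: no refresh needed
inbox.deliver("new friend request")

print(received)       # the subscriber already has the message
print(inbox.poll(0))  # what a refresh would have fetched
```

In a real browser this push channel is typically carried over long polling, Server-Sent Events, or WebSockets rather than an in-process callback, but the shift in responsibility is the same: the server initiates delivery instead of waiting to be asked.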
I’m just thinking over the implications: new websites may have to come out ‘all guns blazing’ to match those ever-popular sites whose features we see so often that they have become the ‘norm’, if they wish to succeed in a user-driven market. It all depends on the users and how they expect websites to behave in the future.
This could be quite a challenge for a small development team, experienced or not, with a very large hill to climb to reach the same level. Technology, online and offline, is always developing, and each new change, revolution, discovery or update is interesting to observe as it grows.
One thing I want to see is new websites trying to match the quality of the sites we visit daily, often without thinking about just how much time, effort, research and manpower has been invested in getting them to that point. Some will succeed, some will fail, and some will improve on the model, going above and beyond expectations.