Archive for the ‘Uncategorized’ Category

The Promise of Honeycomb

January 12, 2011

Wow… So, as I noted in my last post, I watched the CES presentation of Android Honeycomb last night, and ultimately I was blown away.

During the presentation, I felt like Honeycomb had some nice improvements over the current Froyo/Gingerbread versions, but nothing really mind-bending.  It wasn’t until I went back to using my laptop that I realized the potential of the concepts shown in Honeycomb for computing and for its integration into my life and work.

Ultimately, what really impressed me was the interface to my data and how much more efficiently the new interface will let me see and manipulate it.  I’ve already experienced this shift in how I use my phone (a Nexus One with stock Android 2.2), compared to how limited the Windows interface feels when a keyboard and a mouse are the only way in.  From the improvements I saw during the presentation, I can see how these changes could be applied everywhere, on all my computing devices.  Google is moving leaps and bounds faster than Microsoft.

The way I’m looking at it, my “data” is like a sculpture.  Throughout the day I manipulate it to see various aspects of it, combining the pieces in different ways to get work done: my calendar here, an address there, a video here, some writing there, a picture there.  Ideally, my data could all be accessed as if it were in one place, one moldable thing I could shape into whatever I need.  Today my data is a bunch of blobs scattered here and there, to the point that I don’t even know where all the blobs are, and it takes a lot of work to combine them into what I need, whether that’s a document or directions to a job I have to go to.  Put another way, using a keyboard and a mouse to manipulate my data is like sculpting a pile of clay with a butter knife held in my fist: very awkward.

In the meantime, on my phone, I’m able to use my fingers to precisely and quickly do exactly what I want, and all my data is starting to come together as objects that know what each other are and can morph into other objects without my even telling them what to do.  Instead of having to find a program and then find and open a document inside it, widgets show my data right on the screen in front of me, and if I need more detail, I just touch the widget.

The promise of Android Honeycomb, as I see it, is that we’ll be able to use this same interface directly on our desktops.  Yes, pieces of Android’s functionality are already on our desktops today – I could have set up multiple desktops years ago in Windows or Mac OS, but the tools were so clunky that I never bothered to try.  Touchscreens and widgets make my data so much more directly accessible that these concepts actually make sense, and widgets allow so much flexibility in presenting and using my data.

Without going into too much detail, I’m imagining a desk with a touchscreen built into its top, and maybe a second, vertical touchscreen at the back of the desk.  I could drag each “desktop” into view, just like on my phone, maybe moving one up to the vertical screen, maybe turning on an on-screen keyboard.  Who needs a physical keyboard anymore?  I like the keyboard on my Nexus One just fine, which I wasn’t sure I would when I bought it.  And if my desktop screen is big enough, I could display multiple “desktops” at once and move them around as I need them like pieces of a picture puzzle, sharing data between them for the various widgets.

And as more and more of my data moves to the cloud, I don’t have to worry about what is stored where anymore.

I realize that I’m not putting forth anything new here, that this is all part of a vision that has been growing since time immemorial.  But I was so struck by the leap we are about to take that I just had to write about it to release my excitement.  All I know is I can’t wait to see what’s coming!

Some Etiquette for Google Wave

January 4, 2010

Another article with some excellent thoughts about Google Wave was recently posted by James Gurd – Will GW be the stand-out collaboration success of 2010? 

I especially liked his thoughts on the etiquette of waving:

New skills and practices will be needed to manage Waves

As with any new plaything, there is an adjustment required to get people using Wave effectively to communicate. Here are some of the things to look out for:

  • Keep your Wave clean. People can stray from purpose/goals and you need to be able to keep people on-topic and content relevant.
  • Guide the conversation, not control it. The personality behind the wave is important.
  • Learn how and when to edit the conversation. You can go back and change what you’ve written but do so sparingly to avoid disrupting the flow and retrieval.
  • Move towards greater collaboration, not individual domination.
  • Real time typing can be a threat. People worry about the impact of errors; there is a cultural shift in embracing this as real-time communication.
  • Participants need to be educated on the etiquette of waving. Understanding that conversations are as imperfect as conversations when speaking face-to-face.
  • Waving is not designed for one to one communication. This is for one-to-many and many-to-many.
  • Conversation flow can be personalised. Allow people to have a voice.
  • How do you make the right decision for the benefit of the wave? When do you leave? When do you edit? When do you need a summary? Who does this?

A lot of the initial waves I’ve read were pretty poor, basically a bunch of people talking about how cool Wave was.  The real problem is that the waves looked like conversations, which I think is an inherent problem with the way Wave works: too much “wasted talk” that is irrelevant to the subject being discussed, and many blips that will have to be removed if the wave is going to be succinct enough for somebody to want to wade through it later.

I really think that for Wave to be a useful tool, users are going to have to be careful about what they keep in a wave, and perhaps about what they say up front.

To that end, I hope they add some tools for organizing wave information – the ability to collapse threaded blips, for example, and even ways to annotate blips as you create them, say an “add conversation blip” choice that a robot could look for and clean out of the wave two days later or so.  This would allow users to have a discussion live, but have those unnecessary blips automatically cleaned out after a certain date – something like the sketch below.
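
To make that concrete, here’s a minimal sketch of the cleanup idea in plain Python.  This is not the real Google Wave Robots API – the Blip and Wave classes, the “conversation” annotation name, and the sweep_conversation_blips helper are all hypothetical stand-ins – it just shows blips tagged as throwaway conversation being swept out once they’re older than a cutoff.

```python
# Hypothetical model of the cleanup idea -- NOT the real Wave Robots API.
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List, Optional

@dataclass
class Blip:
    author: str
    text: str
    created: datetime
    annotations: List[str] = field(default_factory=list)  # e.g. ["conversation"]

@dataclass
class Wave:
    title: str
    blips: List[Blip] = field(default_factory=list)

def sweep_conversation_blips(wave: Wave,
                             max_age: timedelta = timedelta(days=2),
                             now: Optional[datetime] = None) -> int:
    """Drop blips annotated as throwaway conversation once they are older
    than max_age; keep everything else. Returns how many were removed."""
    now = now or datetime.utcnow()
    before = len(wave.blips)
    wave.blips = [
        b for b in wave.blips
        if "conversation" not in b.annotations or (now - b.created) <= max_age
    ]
    return before - len(wave.blips)

if __name__ == "__main__":
    old = datetime.utcnow() - timedelta(days=5)
    wave = Wave("Project kickoff", [
        Blip("alice", "Agenda and decisions", old),                        # substance: kept
        Blip("bob", "Wave is so cool!", old, ["conversation"]),            # stale chatter: removed
        Blip("carol", "Quick aside", datetime.utcnow(), ["conversation"]), # recent chatter: kept for now
    ])
    removed = sweep_conversation_blips(wave)
    print(f"Removed {removed} blip(s); {len(wave.blips)} remain.")
```

In a real wave the robot would run on a schedule and use whatever annotation mechanism Wave actually exposes; the point is just that a cheap marker added at creation time makes the later cleanup trivial.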