Always up for discussion!

Please keep discussions civil.  Drivebys/angry politicos/hateChat and other unhelpful comments will receive a tap with the banhammer.


We all Launch Together


New Languages and New Technologies

There’s an etiquette that needs to go with every new technology.  Google’s Glass Explorer experiment was an exercise in this.  Some might regard it as a failure, but I tend to look at it as another necessary step.  Without an established etiquette, a visual body language adopted by the users, a code of interaction that anyone *not* using or familiar with Glass could understand, conflicts arose.  In some cases those misunderstandings bordered on the violent.  It’s a lesson to all developers of interactive wearables, and I don’t think many of them have taken it to heart just yet.

This is not the first time technology has required social norms to evolve to suit.  As cellular phones entered the marketplace, then became smaller and smaller, users were called selfish and inconsiderate for answering their phones and speaking aloud in public spaces (to the point where some restaurants banned phones entirely).  When hands-free devices became commonplace, it got even worse, because you simply could not tell if the person was listening to you or to a voice on the other end of the line.  It got better over time: people using their Bluetooth headsets learned to turn away, avoid eye contact, and hold their conversations in their cars.  Other people learned to check whether the person was on their device, helped by the flashing blue light that drew your attention to even the subtlest earpiece on the market.

Within Magic Leap’s patent artistry (pictured at the top), we can see allowances for different styles of interactivity, many of which convey a clear body language to those looking in from the outside.  What remains to be seen is if they will do the experiment, if they will allow their product out into the wild so they can see how the human factor reacts, and what work they will need to do to smooth that transition into common usage.

Pocket Gamer Connects SF



I have to admit, Pocket Gamer Connects is one of my favorite app conferences.  I went to their Helsinki event last year, and I was invited to speak this year at the event here in San Francisco.  They always have some of the coolest speakers, not just the big marketing talks, or the monetization talks, which are interesting, sure, but they get a whole host of smaller developers.  They get talks on the indie experience, or they get different local takes on different aspects of development.  Couple that with a fairly creative eye with regards to what you might consider an “event space” and you get a great intimate event with a lot more networking potential than you might otherwise find at some of the larger venues.

Talk Data to Me

There’s a difference, a pretty large difference, between an AI and a chatbot. It’s perhaps hard to see if you’re on the receiving end, if you don’t know what to look for, but the way they act and react is different, and in the case of a chatbot, once you figure out how the logic behind it works, you can talk it in circles.  Which is a good way to kill an afternoon, if you’re bored on the intarwebz.

Not that I have ever done this.  Oh no, not me.

The point of a chatbot, usually, is to mimic conversation.  They are often not capable of *steering* a conversation themselves; they don’t, or can’t, ask leading questions unless the developer has planned ahead (and even then, you can tell when the canned questions come into play, because the segues are never terribly smooth).  What they can do reasonably well, however, is continue a conversation in much the same way that many humans do.  A chatbot deconstructs your sentence, pulls out the appropriate verbs and subjects, and constructs a question or response of its own.
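That deconstruct-and-reflect pattern is old enough to sketch in a few lines.  This is a minimal, hypothetical toy in the spirit of ELIZA, not the logic of any particular product: it just swaps pronouns and mirrors your statement back as a question, which is exactly why you can talk such a bot in circles.

```python
import re

# Hypothetical pronoun-swap table: "I am bored" -> "you are bored".
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your",
    "am": "are", "you": "I", "your": "my",
}

def reflect(sentence):
    # Deconstruct the sentence into words, swapping each one for its
    # reflected counterpart when the table has an entry for it.
    words = re.findall(r"[a-z']+", sentence.lower())
    return " ".join(REFLECTIONS.get(w, w) for w in words)

def respond(sentence):
    # No understanding happens here: the bot only re-frames the user's
    # own words as a follow-up question.
    return "Why do you say " + reflect(sentence) + "?"

print(respond("I am bored on the internet"))
# -> Why do you say you are bored on the internet?
```

Real chat systems layer keyword-ranked scripts and canned segues on top of this, but the mirroring trick is why a bot can keep a conversation going without ever steering it anywhere.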

If you’ve ever gotten a customer service call, or contacted customer service through one of those “live chat” services offered by banks and online retailers, you’ve likely encountered a few chatbots.  Depending on their sophistication, they are often used just to collect your basic information before passing you off to a real, live human, but you can hear the difference if you listen.



Little Nibbles

Progress is being made on all fronts with regards to microscopic medical devices.  It’s not taking the form we might have imagined once upon a time, no self-replicating nano-bots, no magic chemical elixirs to restore youth and vigor.  But you know, the future never does.  You can see the roots of many technologies in the kinds of things futurists and science fiction writers come up with, but the ultimate result, the practical application, is often very different.

Most of the time, IMHO, it’s somehow even cooler when these future technologies come to life.  When you read about them, when you try to envision them, your ideas are almost always incomplete.  You’re working with the gestalt, the overall concept, rather than the specifics.  But when you see those specifics operating IRL, when you see the bits turn and bend, when you see the shadows they cast and the way they hang in solution, they suddenly become the kind of real that your imagination can’t quite compare to.