
We are the Taggers.

How cool is this: a single human brain cell can recognise a person.

In studies, a single neuron fired in test subjects when they were asked to recognise famous people. Each time the subjects were shown various pictures of Halle Berry, the same neuron fired. It also fired when they viewed the sequence of letters 'H-A-L-L-E-B-E-R-R-Y', or drawings of the star.

It turns out that the brain uses far fewer neurons than was previously thought to store information - not like bits in an array, but more like individual computers in a network.

Now, I've done absolutely no study into this at all, and have no qualifications in the field (or any field, for that matter) but here's my half-arsed hypothesis.

Perhaps the brain's awesome power to store and recall things is tied directly to its ability to imagine and create. Maybe when I'm recalling a picture of Halle Berry in a Catwoman costume, I'm actually just triggering a switch that asks for the recall to occur, and then a few neurons get together to imagine and create the recall information based on some complex imagination theory. Throw in some ego and self-esteem for good measure, and I've got a pretty good recollection of what I saw.

I often feel that when I'm going about my day-to-day life, I'm really only seeing and analysing a very narrow field of view. My peripheral vision - all the things I think I can see - seems to be pre-rendered in my head based on some time in the past when I looked at them. Of course, as soon as I look at anything in my peripheral vision, I end up focusing my attention on it, and everything else becomes peripheral. It turns out that neurologists call this the Attention Window (I learned about it here), and that the image you see is generally built up over time, not rendered out all at once like in an FPS. That explains why sleepwalkers can go adventuring in their sleep without falling over stuff.

All this reminded me how lame computers are by comparison, and of the eventual doom of metadata. Someday, people are going to think that it's cute that we spent so much time telling computers how to describe their data.

Eventually, a computer will be able to derive as much information from something as we can. It will be able to look at a webcam image and then say "The weather looks crap today", probably by firing a few silicon 'neurons' and 'imagining' the result. The metadata is inferred, not explicit.

Meanwhile, we're all busy tagging everything so a computer can understand it. Nowadays, information management is all about explicit metadata, often for elements that aren't meta at all - they're just data:

<report>
  <datestamp>2000-09-01</datestamp>
  <station fullname="San Jose" abbrev="KSJC">
    <latitude>37.3618619</latitude>
    <longitude>-121.9290089</longitude>
  </station>
  <temperature>
    <min>-5</min>
    <max>10</max>
    <forecast-low>0</forecast-low>
    <forecast-high>11</forecast-high>
  </temperature>
  <wind>
    <speed>5</speed>
    <direction>NNW</direction>
  </wind>
</report>


# assuming forecast_high holds the number parsed from <forecast-high> above
if forecast_high < 18
  puts "The weather is crap today"
end
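
And even getting at that one number means spelling out every step. Here's a rough sketch of what that looks like in Ruby with the standard REXML library, assuming the report above has been saved to a (hypothetical) report.xml:

require 'rexml/document'

# Load the report - report.xml is a hypothetical file holding the <report> above
doc = REXML::Document.new(File.read('report.xml'))

# Every 'meta' element has to be addressed explicitly, by name
forecast_high = doc.elements['report/temperature/forecast-high'].text.to_i

puts "The weather is crap today" if forecast_high < 18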

Your brain doesn't need all that stuff tagged. If you just see the report, you'll figure out what's what. Surely someday a computer will do it reliably too.

Meantime, on with the tagging...

Comments

  1. Hi Sam,

    I agree that all that data is required - but then I don't think of those key things as meta - they're actually the data itself. We're always going to have to tell a computer what to compute somehow. The real thing my example lacked was a DTD or schema definition...

    With regards to the AI problem, I'd like to think that sophisticated algorithms and elegant thinking would win out over brute force (It makes us so much more interesting!) but your chess example certainly points the other way.

    As someone who spends far too much of their life poking around with metadata, I think I probably have a hidden agenda when it comes to its demise.

    Of course, it will all get really complicated when computers start having hidden agendas...


