
As digital marketers, we are tasked with creating online experiences for our users, be it through content, creative or technical solutions. One area the web community is beginning to wake up to is the notion that the sites we build need to work for machines as well as humans.

What sort of creature man’s next successor in the supremacy of the earth is likely to be. We have often heard this debated; but it appears to us that we are ourselves creating our own successors: the machine. We are daily adding to the beauty and delicacy of their physical organisation; we are daily giving them greater power and supplying by all sorts of ingenious contrivances that self-regulating, self-acting power which will be to them what intellect has been to the human race

–  Extract from Darwin among the Machines by Samuel Butler

This is an amazing insight into our current thought process when it comes to robotics, the semantic web and even wearable technology, but what is particularly astonishing is the amount of foresight Butler had. To put his vision into context, “Darwin among the Machines” was written in 1863: some 126 years before the invention of the WWW, around 100 years before the Internet and only 42 years after Michael Faraday first demonstrated the principle of the electric motor.

And yet today, in 2014, this concept couldn’t be closer to the truth. Every day ‘we are giving them greater power’: greater power to act on our behalf, to act autonomously. One way we do this is by empowering them with data and systems that allow them to access, retrieve, consume and aggregate content from the web.

“I thought this was a digital marketing blog?” I hear you scream. Well, yes. So what does this mean for marketers, designers and developers? Working recently on a couple of web projects, I realised that although using semantic web technology within search is now a well-established, trusted and accepted model, we still put surprisingly little forward planning into designing and developing web solutions that work for machines as well as humans.

Research has shown that brands that have jumped on the Schema.org bandwagon have seen increases in click-through rates, traffic and, ultimately, conversion. So why is it that, despite the obvious benefits of a more semantic experience for web crawlers, we’re still primarily designing websites, interactives and content for human consumption only?

We are missing a persona: the machine-based persona.

The Machine Persona

Personas emerged in the past few years as a means of putting a human face to the soulless, faceless stats of the demographic data scientist. The problem is that a demographic profile alone misses that human element, so personas were invented to help the new breed of marketer understand the human side of the data.

Broadly speaking, the goal of establishing personas comes from the world of user-centred design: to understand the important tasks in the UI and the user’s motivations. Like traditional personas, machine-based personas need a name, a picture, some goals, background information and usage scenarios describing how the persona would interact with the website or application. It is interesting to note, however, that machine-based personas are functionally led rather than emotionally led, which means their descriptions can be more methodical and factual.

Let’s take as an example what would, in many cases, be a key machine persona: Googlebot.

Machine Persona – Googlebot

Googlebot

  • Background – Googlebot is a web crawler: a computer program that browses the World Wide Web in a methodical, automated manner.
  • Key goals – as a web crawler, its main objective is to capture and index as much information about sites on the web as accurately and as quickly as possible.
  • Usage scenario – Googlebot crawls your site on a regular basis and identifies any changes since its previous visit. Your site publishes local events, and Googlebot can deliver those events straight onto the search engine’s results page, ultimately providing more exposure for each listing as well as for your website. For this feature to work, the crawler needs the content to be semantically structured (a rough markup sketch follows below).

This example shows that by adding a simple machine-based persona we are able to identify specific scenarios and opportunities that could otherwise be easily missed, bringing the semantic needs of crawlers front and centre when designing or redesigning websites, interactives and content.
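
To make the usage scenario concrete, here is a minimal sketch of what ‘semantically structured’ event content might look like: a Schema.org Event expressed as JSON-LD and embedded in the page (microdata or RDFa attributes woven into the HTML would carry the same information). The event name, venue, dates and URL are purely illustrative.

```html
<!-- A hypothetical local-event listing marked up with Schema.org's Event type.
     The event name, venue, dates and URL below are invented for illustration. -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Event",
  "name": "Summer Food Festival",
  "startDate": "2014-07-19T11:00",
  "endDate": "2014-07-19T18:00",
  "url": "http://www.example.com/events/summer-food-festival",
  "location": {
    "@type": "Place",
    "name": "Town Hall Square",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "1 Market Street",
      "addressLocality": "Manchester",
      "addressCountry": "GB"
    }
  }
}
</script>
```

Exactly which format crawlers prefer changes over time, so treat this as a pattern rather than a prescription; the point is that the event data is expressed in a vocabulary a machine persona like Googlebot can parse, not just prose a human can read.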

This may seem like a trivial step for an SEO to get their head around, but for UX designers and developers it helps to map the needs of crawlers into the design and build process, ensuring that semantic markup isn’t just delivered eventually but, as part of a user story, is given priority over non-essential development tasks.

Leave your thoughts...