Jeff Jarvis's reasoning on Buzz Machine about composition on the network is essential for thinking about the interface in a data-driven culture. I reproduce the excerpt in which he addresses the use of algorithms to organize the sea of information circulating on the web, the role of content aggregators, and the quality of the links they surface:
“(…) Algorithms — Google News or Daylife (where I’m a partner) — also meet the organizational challenge of abundant content and they tackle the challenge of timeliness. For search to infer content’s relevance, it must gather data from our use — that was Google’s key insight — but that won’t work for news, whose value is perishable. So algorithmic aggregators use other signals — source, content analysis, timing, location, association — to cluster and present coverage in a nest of relevance. These algorithms enable content to coalesce into stories and topics that search will find because it gains depth and attracts links and clicks. Algorithmic aggregators exposed a key conflict in old v. new media worldviews: The old-media view is that aggregators extract value from content by displaying it; the new-media view is that they add value by creating audiences and causing links — this is the essence of the misunderstanding of the link economy.
Thanks to new tools — Twitter, Facebook, Buzz — human links are exploding as a means of discovery, which gives lie to the old-media complaints of Rupert Murdoch et al that aggregators are stealing their content. When your own readers recommend and link to your content, is that stealing? Do you want to turn those people away and call them worthless? Facebook, according to Hitwise, is the fourth largest referrer of audience to media. Bit.ly alone causes two billion clicks a month, double Google News’ impact. Soon Buzz will be causing many links (teaching Google what’s hot and relevant, which is a key reason to start the service). And, of course, bloggers have shown the way as curators. Thanks to our newer, easier tools that enable links, humans are becoming a huge force in content discovery, reducing search’s and algorithms’ share and dominance.
Now we need to better understand the quality of those links and linkers. Clay Shirky craves algorithmic authority. Azeem Azhar is one of many entrepreneurs trying to systematize the anointing of more authoritative tweeters (read: linkers) at Viewsflow. On the latest This Week in Google, Google’s Matt Cutts talked about efforts to find more signals of quality so it can send us not to the crops of lowest-cost content farms but instead to original work. (The good news is that quality will out.) In the link economy, sending traffic to original work becomes an ethical imperative as links are the means to support that work. But it’s an old-media mistake — a leftover of the brand era — to think that authority can or should be one-for-all or that it’s the creator who establishes authority. Authority will vary by context and need as well as opinion (one man’s New York Times is another’s Fox News). Branded media was one-size-fits-all as was search and algorithmic aggregation. Now discovery will become personalized based on context (who you are, where you are, what you’re doing, what you’ve done, what you like…) as well as timing, taste, and quality.”
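Jarvis's point that aggregators combine perishable signals (timing) with human signals (links) rather than pure search relevance can be made concrete with a toy ranking function. This is a hypothetical sketch, not Daylife's or Google News's actual algorithm; the signal names, weights, and the six-hour half-life are invented for illustration.

```python
import math
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Article:
    title: str
    source_weight: float   # hypothetical editorial-trust signal, 0..1
    inbound_links: int     # human links: tweets, blog posts, shares
    published: datetime

def relevance(article: Article, now: datetime, half_life_hours: float = 6.0) -> float:
    """Toy relevance score: news value is perishable, so freshness
    decays exponentially; human links add value with diminishing returns."""
    age_hours = (now - article.published).total_seconds() / 3600.0
    freshness = 0.5 ** (age_hours / half_life_hours)
    link_signal = math.log1p(article.inbound_links)  # log dampens runaway hits
    return article.source_weight * freshness * (1.0 + link_signal)

def rank(articles: list[Article], now: datetime) -> list[Article]:
    """Order coverage so fresh, well-linked original work surfaces first."""
    return sorted(articles, key=lambda a: relevance(a, now), reverse=True)

if __name__ == "__main__":
    now = datetime(2010, 3, 1, 12, 0)
    items = [
        Article("fresh scoop", 0.8, 100, now - timedelta(hours=1)),
        Article("stale rehash", 0.9, 5, now - timedelta(hours=48)),
    ]
    for a in rank(items, now):
        print(a.title, round(relevance(a, now), 3))
```

The key design choice mirrors the quote: freshness decay captures that "value is perishable," while the link term lets readers' recommendations, not only search, drive discovery.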