My boss forwarded me a Nielsen link yesterday that talked about online socializers:
But with the increasing number of resources available, it’s difficult to know what you should believe or take at face value. Socializers – those who spend 10 percent or more of their online time on social media – feel this effect more than others do. When asked, 26 percent feel that there is too much information available on the Internet, compared to 18 percent of people who predominantly use portals and just 5 percent of people who primarily use search engines.
But why does too much information lead one to use social media as a navigation tool? The short answer: Socializers trust what their friends have to say and social media acts as an information filtration tool. This is key because Socializers gravitate towards and believe what is shared with friends and family. If your friend creates or links to the content, then you are more likely to believe it and like it. And this thought plays out in the data.
Increasingly, I’ve had to filter what I look at because my net is catching too much stuff. My blogroll is pretty massive and takes some time to get through; I’ve had to remove some of the spammier blogs like DCist, Engadget, etc.
The Nielsen article differentiates between searchers and socializers (searchers tending to be less active socially online, using search engines to find content). But what if we could combine searching with social trust?
Various obstacles have blocked an identity layer online, but none more so than people’s demands for privacy. Privacy is used haphazardly as a stand-in for trust: we protect ourselves in public by restricting access to ourselves to only family and friends. But this is not “trust” per se; it’s obfuscation. Internet trends such as collaborative wikis, Netflix ratings, and tagging show that open trust systems can provide far more information than small, closed networks. They open themselves up to abuse, but with just a few people and a few tools to manage that abuse, these systems can yield massive gains for public knowledge.
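The “few people and a few tools” point can be made concrete. Here’s a toy Python sketch of open-contribution moderation, where anyone can contribute but a handful of flags hides an entry; the data model and the threshold value are entirely invented for illustration:

```python
# Toy sketch of an open trust system: contributions are open to all,
# and a small number of flags from reviewers hides abusive entries.
# The threshold is an invented example, not a real system's value.

FLAG_THRESHOLD = 3

entries = {}  # title -> {"text": entry text, "flags": set of reviewer names}

def contribute(title, text):
    """Anyone may add or overwrite an entry; the system stays open."""
    entries[title] = {"text": text, "flags": set()}

def flag(title, reviewer):
    """A reviewer marks an entry as abusive; duplicate flags don't count twice."""
    entries[title]["flags"].add(reviewer)

def visible():
    """Entries flagged by fewer than FLAG_THRESHOLD distinct reviewers."""
    return {title: e["text"] for title, e in entries.items()
            if len(e["flags"]) < FLAG_THRESHOLD}
```

The point of the sketch is that openness and abuse management aren’t opposites: the contribution path stays wide open while a tiny amount of reviewer effort keeps the public view clean.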
[By the way, as a related aside: for my Yahoo!/ISD fellowship research, I wrote a paper on the meaning of “privacy” and how the US and the advancing BRIC countries (Brazil, Russia, India, China) are currently dealing with openness and closedness online.]
What we are heading towards is some brutal endgame with respect to personal data: Facebook has been building a fairly complex privacy infrastructure, but it is being lambasted both by security people for exposing too much data and by internet geeks who want portable identities and data that they can use across social networks.
Certainly underlying all this is fear of government monitoring. The Patriot Act under Bush (and probably under Obama too) has disgustingly blurred the line between lazy domestic surveillance and a strict burden of proof for court orders. Until the government reasserts that it must have substantial evidence and court approval (perhaps involving a watchdog representative too) before it starts spying on someone (not just American citizens), the prospect of freeing up personal data online must be tempered.
But imagine if we could sort out all these issues and build a proper trusted network online for reputation and identities, ensured by a public trust rather than by a for-profit company or by the government. What if we could ensure transparency not only for individuals but also for governments and companies? What I feel is missing in the debate about “big federal government” is that companies have become as powerful as, or in some cases more powerful than, governments. Unions and large public organizations as well. Transparency and accountability are not popular ideas across the board.
But I look forward to a day when I can do what should be mundane tasks. I went to a get-together with mainly girls once, and they were playing with jdate, the dating service for Jews. They were searching only for guys who had Master’s degrees or above. And they got the results and were disappointed with men who appeared to me to be absolute all-stars: doctors, good-looking, wealthy, fun guys. But the girls were practically yawning.
What if I could search across Amazon for only people who’ve read over 200 books? What if I could look for opinions on Afghanistan only from bloggers who have served a tour there in the Marines? What if I could find Digg articles from people who have had at least one child and who own a camera I’m looking at? What if I could filter my Twitter follow list so I only see tweets from people with at least 100 followers, who post at least 3 times a day, and who have had over 20 of their tweets voted up?
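All of these what-ifs reduce to the same operation: filtering one network’s stream by reputation attributes aggregated from other networks. A minimal Python sketch of that idea, where every profile field (books read, followers, and so on) is a hypothetical stand-in for data pulled from a different silo:

```python
# Hypothetical sketch of cross-network filtering: a profile carries
# attributes aggregated from different services, and a query keeps
# only profiles that clear every threshold. All fields are invented.

from dataclasses import dataclass

@dataclass
class Profile:
    name: str
    books_read: int = 0       # e.g. pulled from Amazon
    followers: int = 0        # e.g. pulled from Twitter
    posts_per_day: float = 0  # e.g. pulled from Twitter
    upvoted_tweets: int = 0   # e.g. pulled from a tweet-rating service

def filter_profiles(profiles, **minimums):
    """Keep profiles meeting every minimum, e.g. followers=100."""
    return [p for p in profiles
            if all(getattr(p, attr) >= val for attr, val in minimums.items())]

people = [
    Profile("alice", books_read=250, followers=340,
            posts_per_day=5, upvoted_tweets=42),
    Profile("bob", books_read=12, followers=80,
            posts_per_day=1, upvoted_tweets=3),
]

# The Twitter what-if from above, expressed as thresholds:
avid = filter_profiles(people, followers=100, posts_per_day=3,
                       upvoted_tweets=20)
```

The hard part, of course, isn’t the filter; it’s getting the siloed networks to expose those attributes under a trusted, opt-in identity in the first place.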
What of serendipity? Well, the random public lifestream will still be there. But I want to be able to filter across networks and across siloed databases.
And sure, not everyone will want to share all this information with the world. They should have the right not to. But what about those of us who want to opt in and start using all this data to make our lives better, and to use our reputation and others’ in order to make better decisions?