Does Google’s announcement that it will add a stream of real-time content from across the web into its search results validate the value of real-time search?
This is how Google put its announcement earlier this week:
Now, immediately after conducting a search, you can see live updates from people on popular sites like Twitter and FriendFeed, as well as headlines from news and blog posts published just seconds before. When they are relevant, we’ll rank these latest results to show the freshest information right on the search results page.
It makes a big difference to users, who will now get a raft of new information sources thrown into the mix that have thus far been largely the preserve of the tech and media communities. What everyone will make of this, I’m not sure.
One thing that is for sure is that this move by Google brings real-time search products, such as Twitter, right into the mainstream. And if Google thinks it should include them, then in many ways this does validate the value of real-time search to users.
Tobias Peggs, GM of OneRiot, a real-time search product, said that 40% of searches on the internet were looking for real-time information.
Speaking on a panel called “content versus conversation” at Le Web in Paris, Peggs said this means there is a 40% market share up for grabs for real-time search – a big chunk.
However, real time search has a lot of challenges to deal with if it is to cut into the traditional search market.
Peggs said traditional search has 10 years’ experience of monetising content through Search Engine Marketing and Search Engine Optimisation, but this breaks down when it comes to the real-time web.
Nick Halstead, CEO of tweetmeme.com, agrees that real-time search has to find its revenue stream, but says it also faces other challenges, such as the authority and relevancy of content in the context of traditional search.
I agree. When you search Twitter for content and it is drawn from your followers, whom you have hand-picked, you can make your own judgement about the authority of each source.
Take that element of context out of the mix, however, and the picture changes: if alongside your results for a search on “earthquake”, for example, are loads of tweets from people you don’t know, what is the value of that information?
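Without the hand-picked follower relationship, a real-time engine has to approximate authority some other way. Here is a minimal sketch of one such approach – the inputs (whether the searcher follows the author, and the author’s follower count) and the scoring function are purely hypothetical illustrations, not any real engine’s ranking:

```python
import math

def score_result(text, query, followed_by_searcher, author_followers):
    """Crude relevance score weighted by a proxy for source authority."""
    # Relevance: fraction of query terms that appear in the tweet text.
    terms = query.lower().split()
    relevance = sum(1 for t in terms if t in text.lower()) / max(len(terms), 1)
    # Authority: boost authors the searcher already follows, and damp raw
    # follower counts logarithmically so big accounts don't swamp everything.
    authority = (2.0 if followed_by_searcher else 1.0) * math.log10(10 + author_followers)
    return relevance * authority

# A tweet from someone you follow outranks the same tweet from a stranger.
known = score_result("Earthquake felt across the city", "earthquake", True, 150)
stranger = score_result("Earthquake felt across the city", "earthquake", False, 150)
```

Real systems fold in many more signals (retweets, account age, spam scores), but even this toy version captures the point: without some authority weighting, a flood of anonymous tweets is hard to value.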
Google’s move definitely validates the value of real-time search, but it also throws up plenty of other challenges for the real-time web, which is still a young, developing industry. Sometimes we can’t second-guess the way it will develop.
Companies are failing to keep up with the pace of change on the internet and need to follow three steps to keep up, according to Jeremiah Owyang, a partner at the Altimeter Group who spoke at Le Web conference in Paris today.
He said the real-time web is no longer quick enough, and that the next stage will involve web users talking about what they plan to do in the future. “Most companies can’t keep up with the slow web, let alone the future web,” he said. He gave several examples of companies whose adverts or marketing had been negatively discussed on Twitter and YouTube, such as Motrin, an American painkiller brand. Although the company responded within 24 hours when mothers tweeted that its adverts were “patronising”, the response wasn’t quick enough – the story escalated from the Friday and was on mainstream news by Monday.
He told companies to:
1. Provide “social personalisation”. Companies first need to listen to people on the social web, using Google Alerts, Technorati to monitor blogs, and Twitter search. There is also social monitoring software available, and Owyang advised searching for topics around your product – for example, “headache” and “back pain” if you’re Motrin.
The second part of the process is to build up “social personalisation”. This involves using the information you’ve found on customers via the social web and matching it with the data already held on them. Companies can look at a customer’s website or Facebook profile and recommend a new car for them, for example. “These systems aren’t very mature at the moment,” Owyang said, “but this is going to extend into the real world.”
2. Recruit an unpaid army that will monitor the web for you. “You can’t be in all places at all times on the web, so work with your customers to build that army.” People are not being paid, but they’re being promoted and their reputation is boosted. They get recognition rather than payment, and Owyang suggested allowing them access to special information and events. It’s also important to listen to their feedback and make changes based on it.
3. Invest in the management systems needed for this, which Owyang called social CRM. The systems are again in their infancy because most of them have not yet connected up with the data available on social networking profiles. Using the data from these profiles can help companies to respond quicker and provides a company with a “real time” database on what consumers are saying.
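The listening step in Owyang’s first point can be sketched as a simple keyword monitor. The posts, keywords and `monitor` function below are hypothetical illustrations – a real system would pull from Twitter search, blog feeds and dedicated monitoring software rather than a hard-coded list:

```python
def monitor(posts, keywords):
    """Return posts that mention any watched topic, for a human to review."""
    hits = []
    for post in posts:
        lowered = post.lower()
        if any(k.lower() in lowered for k in keywords):
            hits.append(post)
    return hits

posts = [
    "This new ad is so patronising to mothers",
    "Lovely weather in Paris today",
    "My headache is back again",
]
# A painkiller brand might watch symptom words as well as reactions to its ads.
flagged = monitor(posts, ["headache", "back pain", "patronising"])
```

The point of the exercise is triage: surface the handful of posts worth a human response before, as in the Motrin case, they escalate to mainstream news.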
Marissa Mayer, VP for search products and user experience at Google, spoke today at Le Web in Paris about the big problem facing publishers on the internet – how to make money in an age where consumers expect free content.
News Corp’s Rupert Murdoch has gone on the offensive recently, with rumours of him planning to remove all his company’s content from Google and place it on Bing in an effort to make money. Google’s response has been polite – Mayer today said, “We hope it doesn’t happen. Our goal is to be as comprehensive as possible. We have to respect the copyright owners, because if we don’t have the content we don’t have a search engine. I hope for the sake of the quality of the search engine that doesn’t happen.”
But they’ve also been gently insistent that they are doing lots to help the cause of ailing publishers. Mayer said in her interview with TechCrunch’s Michael Arrington, “We have developed a lot of programmes and products that help with the problem. We want to make it easy for people to monetise content, and we want to empower publishers. Plus, news search itself delivers billions of clicks on to these sites each month. We are quite motivated and compelled to produce solutions to this problem that really do help publishers.”
Mayer talked about a possible answer to the business model problem – the hyper-personalised news stream – which rather raises the question: instead of complaining, why don’t publishers harness some of the innovation inherent in successful web companies like Google?
Mayer didn’t say that (again, she was polite), instead talking about how papers like the New York Times and the Washington Post are willing to be progressive, despite criticism to the contrary. But Google’s idea of the personalised news stream sounds like the kind of thing one of the big publishing names should be coming up with, instead of starting playground fights over who gets to list what.
She said increasing engagement is crucial, adding, “When you read a newspaper there are lots of different columns and lots of different places for your eyes to go next. On most news websites, there’s nowhere to go. Look at Amazon – when you buy something, it says ‘Here’s what you might like to buy next’. The web has an endless stream, so why can’t I have a personalised stream of news that’s portable? You can read it on your desktop or mobile phone, it takes your preferences into account, the blogs you follow, your social circle, your location. Putting all that together, you can have a very compelling product which could really increase engagement.
“What form that takes is yet to be seen, but whatever solves the problem will probably have these characteristics.”
Google said it respects publishers’ copyright and does not want to lose any of them from its index.
Marissa Mayer, VP at Google, told delegates at the Le Web conference in Paris that the company respected publishers’ copyright.
“Google wants to be as comprehensive a search as possible, so have to respect the copyright… we have to do that. For the quality of the search engine we would hope to keep all publishers in Google,” she said.
But she warned publishers that they needed to do more to engage users in online news. She said that online search reduced the amount of content readers looked for.
“If you search for news you may just read one story – so that is the atomic unit. You land there and just look at the one article.”
Publishers must engage their readers by offering additional relevant content at the bottom of articles and by bringing their content alive.
“Why can we not have a personalised stream of news that includes our broad preferences, our social circle and that is relevant to our location?
“If we can bake all these things together we could have a compelling news product. I call it the hyper-personalised news stream.”
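Mayer’s recipe – broad preferences, social circle and location baked together – could be sketched as a simple ranking function. The weights, field names and sample articles below are illustrative assumptions for the sake of the sketch, not anything Google has described:

```python
def stream_score(article, prefs, friends_sharing, user_city):
    """Combine topic preferences, social-circle shares and location."""
    score = 0.0
    score += 2.0 * len(prefs & article["topics"])   # broad preferences
    score += 1.0 * friends_sharing                  # social circle
    if article.get("city") == user_city:            # location relevance
        score += 3.0
    return score

articles = [
    {"title": "Transit strike", "topics": {"transport"}, "city": "Paris"},
    {"title": "Tech IPO", "topics": {"technology", "business"}, "city": "New York"},
]
user_prefs = {"technology"}

# Rank the stream for a Paris-based reader with no friend shares.
ranked = sorted(articles,
                key=lambda a: stream_score(a, user_prefs, 0, "Paris"),
                reverse=True)
```

Even in this toy form, the design point is visible: a local story can outrank a preferred topic, and the same article list produces a different stream for every reader.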
So Google still sees value in news online, but believes it has to become more individually relevant, more personalised and more alive, in the sense that it continues to develop, if it is to engage readers.
It is true that news dates very quickly, particularly in the face of the real-time updates now available through the likes of Twitter.
Mayer and Google clearly want to keep Murdoch and his newspapers within the Google body of content, so I believe there is more to run in this story yet.
However, while Google will talk about working with publishers, it will also throw the gauntlet back to them to ensure they are doing the best by their users in the online medium – and that means making their content more alive and engaging, to stop users hitting the back button. That’s the challenge.
Just a quick post to flag up my colleague Faisal Alani’s (AKA Inspectagadget) latest gadget review, in which he pits the Palm Pre against the iPhone in his own witty way.
Take a look and let us know what you think.
So tomorrow (Wednesday 9 Dec) we head off to Le Web in Paris to cover the biggest real time web conference in Europe.
For a change we are departing mob-handed – well, three of us (that’s me, Faisal Alani and Rebecca Thomson) – to see whether we can cover the conference in true multimedia fashion, through posts on this blog and the Social Enterprise blog, video, pictures, tweets and news stories.
I hope that by covering the conference in this way we can give you a real flavour of all the goings-on at Le Web in Paris.
The IT industry’s bruises from fighting the downturn in 2009 are still pretty raw, but as the year approaches its end many will be applying the ointment with a bit more of a smile.
The mood of cautious optimism among IT directors was confirmed by Accenture research showing UK and Ireland businesses as the most likely in Europe to increase spending next year.
Not only that, but two of the top three items on their wish list were e-business and CRM systems, both technologies focused on growth, rather than retrenchment and cost cutting, which have been the watchwords of late.
But not all those economic bruises will heal so quickly. There is still a significant split in buyer attitudes between capital expenditure (capex) and operational expenditure (opex) spending.
Opex – covering areas such as subscriptions, software-as-a-service or outsourcing – has been the least hard-hit by the recession.
Capex – servers, PCs, big up-front software licences – has really suffered. The server and business desktop markets slumped more than 20% year-on-year at one stage – a decline on the scale that led the US government to spend billions of dollars bailing out the car industry.
Most capex items in IT are purchased through finance deals such as operating leases, turning them into opex for accounting purposes, but the technology leasing sector has been just as hard-hit, with access to funding severely restricted.
And with major cutbacks in the public sector looming after the general election next year, any suppliers relying on government IT will still be wary of a further beating.
All these factors combined start to create ideal conditions for IT managers. Your suppliers need you more than ever, but they know buyers will still be cautious. It is going to be a great time to negotiate a good deal – technology decision-makers should enjoy having the upper hand while it lasts.
It is looking like a very big week ahead for government IT.
A few words from chancellor Alistair Darling on the BBC yesterday have already thrown the future of the £12.7bn NHS National Programme for IT into confusion and doubt.
The Tories, meanwhile, as well as enjoying the chance to throw further barbs over the NHS project, have called for a moratorium on government IT projects to review whether or not such major spending is needed in the light of planned cuts.
Gordon Brown will today announce more details of further reductions in public spending as the General Election campaign clicks up a gear. Other IT projects will no doubt face the axe, although the prime minister is also expected to call for more public services to move online as a means of saving cash.
On Wednesday, Darling will present his pre-Budget report – Labour’s final financial review before the election – with overall public sector cuts bound to have an effect on local government IT spending.
On top of all that, the new Whitehall IT strategy, a draft of which was leaked recently, is due out at any time, possibly by the end of the week, confirming details of the G-cloud plan to introduce a cloud computing infrastructure.
Twelve years of Labour IT policy could be completely overhauled in the space of a week.
But all we have so far – and all we are likely to have for some time – is questions.
Darling told the BBC that details of cutbacks will not be confirmed until the first half of next year. In the meantime, NHS and government IT professionals are left in limbo, not knowing which projects to prioritise or even whether their work is going to be scrapped entirely.
Major suppliers will be wondering what lies in store too – and probably checking the contractual small print with their lawyers.
What, for example, will be the effect of a scaled back NHS IT programme on BT Global Services? The telco’s IT arm has already been through a massive restructuring caused in large part by its commitments to the NHS, with thousands of jobs lost and contracts renegotiated. What happens if the government decides to delay or cancel parts of the project that fall under BT’s remit? The financial implications could be significant – and certainly likely to bring legal discussions over the government’s contractual position.
Within the NHS, IT practitioners will be wondering how to move forward with much needed – and already much delayed – patient administration and electronic records systems. Will the centralised applications from BT and CSC be scrapped? Will NHS trusts be given more autonomy to purchase their own systems? And if so, how do we know if local purchasing would actually be any cheaper than the current centralised plan? After all, for years we have been told that central IT purchasing saves money.
At any time, such uncertainties would cause alarm around the public sector IT community. Given the impending election, there is already such confusion about the future that parts of government IT risk paralysis for months. Even the ambitious plans in the new strategy have to be questioned: a potential Tory government would undoubtedly seek a further review once in power, making it hard to see how the new plan can be pursued between now and next year’s national ballot.
The need to cut costs is driving every aspect of government policy now, regardless of who is in power, and that is a fact we cannot avoid. But technology is at the heart of government and public service delivery, essential to functioning policy – and to improving efficiency and productivity to deliver a large chunk of the billions of pounds of savings being targeted.
Such deep uncertainty around IT strategy and major projects threatens to undermine more than just the work of public sector IT professionals.
ComputerWeekly.com is celebrating the 17 days of Christmas by giving away a book a day for the next 17 working days before Christmas.
We’re stopping at 17 for several reasons: we only have 17 books to give away; we don’t think good old Royal Mail will get the book to you before Christmas after that date; and, finally and most importantly, we hope to be on holiday from 21 December onwards.
So how does the competition work?
Each day we will ask a question on Twitter – the only rule is that the question is topical. If you can answer it, email your response to: email@example.com.
A winner will be selected each day from those who answer the question correctly.
Day 17 – 22 December
How much has Twitter made from deals with Microsoft and Google?
Day 16 – 21 December
Which large website has been criticised today for tax avoidance?
Day 15 – 18 December
Who is the president of Oracle?
Day 14 – 17 December
What year did AOL buy Time Warner?
Day 13 – 16 December
Who is ComputerWeekly’s Editor-in-Chief?
Day 12 – 15 December
How long is this week’s ComputerGeekly video?
Day 11 – 14 December
What’s the name of the hack used by criminals to block COFEE?
Day 10 – 11 December
What accounted for 19% of computer security breaches in 2009?
Day 9 – 10 December
Which IT company recently narrowly averted strike action?
Day 8 – 9 December
How many security bulletins were released by Microsoft in 2006?
Day 6 – 8 December
Where is ASIACrypt being held this year?
Day 5 – 7 December
Which call centre software has Which? magazine recently announced a deal to move to?
Day 4 – 4 December
Which well known IT supplier faces strike action?
Day 3 – 3 December
Which well known UK institution plans to become a bank?
Day 2 – 2 December
How many users does Facebook have now?
Prize: Schneier on Security by Bruce Schneier
Day 1 – 1 December
Which UK personality was the most searched for on Google in the UK this year?
Prize: A signed copy of Claude Roeltgen’s IT’s hidden face.
“If you are a business manager who needs to work successfully with IT, or if you are an IT professional who needs to be able to explain why something can be installed in 10 minutes, but success demands many more steps before and after that, this book is for you.”
There has always been a marked imbalance of power between major IT suppliers and their customers. No matter how often IT managers are told about open standards, hear that “we are listening to you”, or are blitzed by marketing mantras such as “customer-centric”, hardened buyers know they are at the mercy of their suppliers’ whims.
These days, lock-in is less often a technical issue than a commercial reality – the cost of upheaval will usually outweigh the pain of the status quo.
SAP proved the point last year when it hiked support fees, apparently without consultation with customers, bringing much criticism from user groups in the UK and worldwide.
The software giant deserves some credit for listening to those complaints and responding with an innovative plan to link price increases to an independently-audited measure of the associated benefits. But surely a lot of conflict could have been avoided if SAP had been less heavy-handed in the first place.
The apparent resolution shows the potential of users working together – and there is a greater opportunity emerging to redress that imbalance.
The power of the crowd is already having a big effect on consumer-focused companies such as retailers and banks. The collaboration and information sharing enabled by social networking and the web is delivering a new wave of influence for users.
Try posting a few messages on Twitter about problems with your BT service and see how quickly the telecoms provider replies. Social media is becoming as much about advocacy and customer service as it is about Stephen Fry and telling the world what you had for breakfast.
IT managers would do well to consider the tools now at their disposal to put pressure on problem suppliers. Those suppliers, in turn, need to be ready to deal with newly empowered customers who are not afraid to share their experiences and pool their influence to shift the balance of power away from that unwanted tradition.