Last night GigaOM and The Guardian co-hosted a small event in London on the role and impact of technology on media and journalism. Katie Fehrenbacher (Senior Editor) and Paul Walborsky (CEO) of GigaOM joined Tanya Cordrey (Chief Digital Officer) of The Guardian to discuss how current trends are reshaping their operating models and what readers now demand of them.
Mobile first, paper last
Tanya opened the event, which led straight into a panel discussion with Paul, chaired by Katie. It was immediately apparent that the biggest trend hitting the media is mobile. Both Paul and Tanya acknowledged that media consumption habits have shifted towards mobile first: it was hardly surprising to learn that most Guardian readers check their mobile devices first thing in the morning for news and content, while paper is now given only a cursory thought as a format. Paul noted that this trend has brought different types of behaviour, including one group he termed 'grazers' of content: readers who are happy to skim through the week's news rather than read in depth.
Single source of truth
One of the interesting consequences of technology's impact on media and journalism is that people have access to more channels than ever before. Paul observed that whilst GigaOM and The Guardian are among those channels, neither will ever be considered a single source of truth, and there is no point waging a battle for that title. Consumers will inevitably read and enjoy a variety of sites and portals of information, and he was happy that GigaOM had a place in that mix.
You will never win a single source of truth battle; have a place in that truth.
Atomic Content and Competition
When asked about the future of how news and media are distributed, Paul answered that he believed content will become 'atomised' below the level of the article: people consume it via aggregation sites and portals like Pulse, and media must accept this. The burning question was how to monetize through aggregation sites, and Paul believed the old models are dying out.
A CPM rate that cost $80 in the 1990s is now worth only 50c
CPM is the price per thousand ad impressions, so at a million impressions that fall is the difference between $80,000 and $500 in revenue. GigaOM make their money instead through running events and their analytics model, GigaPro. By maintaining a high-quality focus on journalism and content, GigaOM have retained a premium CPM rate, Paul added.
Tanya went on to discuss how innovation plays a big part in continuing to evolve, mentioning the experimental work The Guardian did with Facebook. From that experiment they learnt that they have high engagement with 18-24 year olds, but also that this group's consumption habits differ from others'. Strangely, though, Tanya said that "Facebook is a product that has come and gone…", and given my own personal loathing of the site and its privacy ethics, she could be right. Media and journalists need to jump into bed with the right partners.
The long and the short
Paul and Tanya discussed the continuing need for long-form journalism, stating that it is here to stay and that analysis matters more than breaking news stories or regurgitating press releases.
Twitter breaks the news, GigaOM provides the analysis
When challenged on why some sites hide information behind paywalls, both Tanya and Paul defended their positions. GigaOM don't have one, other than the paid research and analysis they conduct under GigaPro, which is very much a premium subscription model due to the quality of the product and service. "There's a difference between paywalls and pay products," said Paul, and Tanya added that "paywalls are not for us, the numbers just don't add up."
Big Data and Predicting the News?
Of course, I wanted to know about the use of Big Data in journalism, and whether Paul and Tanya saw a time when the media would use data to predict events before they happened. Both answered 'No', but it made me wonder whether they really understood the implications. It's no secret that Microsoft and Technion have experimented with predicting future events like riots and political unrest based on previous years' trends and information, and there's certainly scope for the media to do exactly that.
By using data and predictive modelling, the media could in theory send a journalist to a hot spot on the strength of analytics suggesting something big will occur there. Obviously this is a little too 'Minority Report', as Tanya put it, but a few others at the event approached me about the question afterwards, and the feeling was that the potential is there.
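To make that concrete, here's a minimal sketch of what such a system might look like: a toy classifier that scores locations by the likelihood of a newsworthy event, trained on historical signals. Everything below, the signal names, the data, the cities, is invented for illustration; this is the general shape of the approach, not how Microsoft, Technion or any newsroom actually does it.

```python
# Hypothetical sketch: score locations by likelihood of a newsworthy event
# using a simple classifier trained on historical signals. All data and
# feature names are invented for illustration.
from sklearn.linear_model import LogisticRegression

# Toy historical data: for each (location, week) we record
# [unrest keyword mentions, food price index change, prior incidents]
# and whether a major event followed (1) or not (0).
X_train = [
    [120, 0.8, 3],   # heavy unrest chatter, sharp price rise, prior incidents
    [15, 0.1, 0],    # quiet week
    [90, 0.5, 2],
    [10, 0.0, 0],
    [200, 1.2, 5],
    [30, 0.2, 1],
]
y_train = [1, 0, 1, 0, 1, 0]

model = LogisticRegression()
model.fit(X_train, y_train)

# Score candidate locations for the coming week and rank them,
# so an editor could decide where to send a journalist first.
candidates = {
    "City A": [150, 0.9, 4],
    "City B": [20, 0.1, 0],
    "City C": [70, 0.4, 1],
}
scores = {
    city: model.predict_proba([features])[0][1]
    for city, features in candidates.items()
}
for city, p in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{city}: {p:.0%} chance of a newsworthy event")
```

Run weekly, the top of that ranking becomes a shortlist of places worth a closer look, rather than a crystal ball.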
Perhaps the media just aren't ready for it yet, but I'm sure someone has already predicted the outcome on that one…
But there's also a case to be made for using data to deliver news and content in a more targeted and relevant way, tailored to individual tastes. Yes, this can be done manually using tools like Pulse, but there the consumer does the work themselves. Media should be harnessing big data to deliver content automatically, based on the analytics performed, without the user having to lift a finger. Less scattergun, more sniper. Couple that with the mobile trend Paul and Tanya discussed and it's a bit of a no-brainer.
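As a sketch of what that could look like in practice, here's a minimal content-based targeting example: build a profile from a reader's click history and rank fresh articles against it using TF-IDF and cosine similarity. The article text, titles and reading history are all made up for illustration.

```python
# Hypothetical sketch of data-driven content targeting: rank fresh articles
# for a reader by similarity to what they have already read.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

new_articles = {
    "Mobile ad spend overtakes print": "mobile advertising revenue cpm rates as print declines",
    "Transfer window roundup": "football transfers premier league signings and rumours",
    "Social reader apps and young audiences": "facebook social reading engagement among 18-24 year olds",
}

# One pseudo-document summarising this (invented) reader's click history.
reader_history = (
    "mobile first news consumption smartphone morning habits "
    "cpm advertising rates falling paywalls monetisation"
)

# Build TF-IDF vectors over the articles plus the reader profile so they
# share a single vocabulary.
vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(list(new_articles.values()) + [reader_history])
article_vecs, profile_vec = matrix[:-1], matrix[-1]

# Rank articles by cosine similarity to the profile: the top matches are
# what you would push to the reader's phone, no manual curation required.
scores = cosine_similarity(profile_vec, article_vecs)[0]
for title, score in sorted(zip(new_articles, scores), key=lambda kv: -kv[1]):
    print(f"{score:.2f}  {title}")
```

A real system would fold in behavioural analytics, time of day, device, location, rather than text alone, which is exactly where the mobile trend and big data meet.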
Media is embracing technology, slowly, but combining real analytics with the delivery system creates a relevance that will really sort the journalistic wheat from the chaff.
And that’s how you’ll really monetize media.