Media Monitoring For Successful Sourcing and Distilling of News

Effective media monitoring requires a mix of ingredients, blending both efficient search technology and hands-on expert human review. As discussed in our last post, this is one part of the “secret sauce” that goes into producing relevant and actionable business intelligence. But what’s the complete recipe for success?

We use the term “search” to describe the overall activity of finding the exact news you need to power media monitoring programmes. Now let’s break it down into three essential steps: sourcing, filtering and analysis.

Generally, these steps could fit into any media monitoring programme. But reaching your business goals can get messy if you move ahead without understanding what each process and its underlying technology can – and cannot – do.

While these aren’t the only examples – and it’s important to realise that many of these systems operate in silos – you need to be certain that the option you choose delivers the business intelligence you need.

  1. Sourcing: Every media-monitoring campaign starts with the need to identify and gather the right results. You’re at the open end of a fat pipe of available media and data, and to manage it, it’s essential that you identify the right data sources to supply your programme.

The criteria? Relevancy. Accuracy. Trustworthiness. Timeliness.

Via automated crawling and indexing, general search engines such as Google, Bing and DuckDuckGo cast the widest net as they set out to find everything published on the internet. Other “internal” or “Deep Web” engines focus on internal servers and research databases that fence themselves off from the general search engines. More targeted applications may focus only on postings found in social media feeds such as Facebook, Twitter and LinkedIn to monitor conversational social chatter.
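
To make the sourcing step concrete, here is a minimal sketch of how a programme might pull candidate items from a handful of news feeds. The feed URLs are placeholders, and feedparser is just one convenient Python library for this kind of aggregation, not a specific recommendation.

```python
# Minimal sourcing sketch: pull candidate items from a configurable list of
# news feeds. The feed URLs below are hypothetical placeholders.
import feedparser

FEEDS = [
    "https://example.com/business/rss",   # hypothetical industry feed
    "https://example.org/markets/rss",    # hypothetical market-news feed
]

def source_items(feed_urls):
    """Collect basic fields (title, summary, link, date, source) from each feed."""
    items = []
    for url in feed_urls:
        parsed = feedparser.parse(url)
        for entry in parsed.entries:
            items.append({
                "title": entry.get("title", ""),
                "summary": entry.get("summary", ""),
                "link": entry.get("link", ""),
                "published": entry.get("published", ""),
                "source": url,
            })
    return items

if __name__ == "__main__":
    for item in source_items(FEEDS):
        print(item["source"], "->", item["title"])
```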

  2. Filtering: Narrowing the data stream to meet the targeted needs of your business intelligence requires a consistent, high level of precise filtering. Those filters need to be updated and maintained as both daily events and the flow of data change.

You should strive for results that are consistent from day to day, so you can follow the evolution of events, yet flexible enough to adjust to changes in the news cycle. You also need to block unwanted repetition – especially when a single outlet reposts the same story at multiple URLs – to save yourself from drowning in time-wasting material.
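
As a rough illustration of what that filtering can look like in practice, here is a minimal sketch that applies analyst-maintained include and exclude term lists to the items gathered in the sourcing example above. The term lists are hypothetical; a real programme would tune and maintain them as the news cycle changes.

```python
# Minimal filtering sketch, assuming items shaped like the sourcing example
# above. The include/exclude terms are illustrative only.
INCLUDE_TERMS = {"product recall", "acquisition", "earnings"}   # hypothetical
EXCLUDE_TERMS = {"horoscope", "celebrity"}                      # hypothetical

def passes_filter(item, include=INCLUDE_TERMS, exclude=EXCLUDE_TERMS):
    """Keep an item if it mentions at least one include term and no exclude term."""
    text = (item.get("title", "") + " " + item.get("summary", "")).lower()
    if any(term in text for term in exclude):
        return False
    return any(term in text for term in include)

def filter_items(items):
    """Apply the keyword filter to a whole batch of sourced items."""
    return [item for item in items if passes_filter(item)]
```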

Whether you block duplication depends on your goals. For instance, let’s say you need to track how widely a story about a product recall spread. In that case, you want to count every media outlet that ran an article about the event – even if it’s the exact same story – because you want to calculate overall views. Of course, a human curator can help determine whether two stories published on, say, the New York Times website are simply the same article, or whether one covers the recall announcement and the other is a journalist’s first-hand report of the situation.

However, if your media monitoring is designed to capture only the top news affecting your business or industry, then the filtering needs to remove all those redundant stories. Once you have the story from a verifiably reliable source, the rest just burn your time and resources.
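
Here is a minimal sketch of both approaches, assuming the item format from the earlier examples: a “reach” mode that counts how many distinct outlets ran each story, and a “briefing” mode that keeps only one copy of each. The normalised-headline key is a deliberately crude way of deciding that two items are the same story; production systems typically compare full text or use fuzzy matching.

```python
# Minimal de-duplication sketch, assuming the item dicts from the examples
# above. A normalised headline serves as a crude "same story" key.
import re
from urllib.parse import urlparse

def story_key(item):
    """Normalise the headline: lowercase, strip punctuation and edge whitespace."""
    return re.sub(r"[^a-z0-9 ]", "", item.get("title", "").lower()).strip()

def reach_by_story(items):
    """Reach mode: count how many distinct outlet domains ran each story."""
    outlets = {}
    for item in items:
        domain = urlparse(item.get("link", "")).netloc
        outlets.setdefault(story_key(item), set()).add(domain)
    return {key: len(domains) for key, domains in outlets.items()}

def top_news(items):
    """Briefing mode: keep only the first item seen for each story."""
    seen, unique = set(), []
    for item in items:
        key = story_key(item)
        if key not in seen:
            seen.add(key)
            unique.append(item)
    return unique
```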

  3. Analysis: Now that you have found and filtered the content, it’s time to analyse it. The key goal here is to generate actionable business intelligence.

You can do this with a system as familiar as Google’s indexing and page-ranking algorithms, or with one of many services, such as Cision’s media monitoring package, that focus on coverage of online media outlets and social media channels.

For now, general search engines, with their ability to rank the most important articles and filter out duplicates, offer the best technology available for monitoring key news developments and identifying the best stories. While you could draw similar analysis from Internal or Deep Web search, you need difficult-to-acquire access to secure servers and the ability to authenticate across firewall security.

If your goal is to gather analytics or measure sentiment, then social media analysis applications like Crimson Hexagon or Adobe Marketing Cloud are the current go-to tools.
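
For illustration only, here is a toy lexicon-based scorer that shows the kind of sentiment signal those platforms produce at much larger scale. This is not the Crimson Hexagon or Adobe API, and the word lists are placeholders.

```python
# Illustrative sentiment scoring only - a toy lexicon approach, not any
# vendor's API. The positive/negative word lists are hypothetical.
POSITIVE = {"growth", "record", "praised", "wins"}
NEGATIVE = {"recall", "lawsuit", "outage", "criticised"}

def sentiment_score(text):
    """Return a score in [-1, 1]: positive minus negative word share."""
    words = text.lower().split()
    if not words:
        return 0.0
    pos = sum(word in POSITIVE for word in words)
    neg = sum(word in NEGATIVE for word in words)
    return (pos - neg) / len(words)

print(sentiment_score("Regulator criticised the latest product recall"))  # negative score
```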

In the next post, we will look at how you can get the most out of this process by using cross-monitoring and analysis.

– post by Tim McGuinness