Recently I’ve seen people mention the difficulty of generating content that can garner massive attention and links. They suggest it may be better to focus on more modest content that earns only a few links apiece but does so more consistently and at higher volume.
In some cases, this can be good advice. But I’d like to argue that it is very possible to create content that can consistently generate high volumes of high-authority links. I’ve found in practice there is one truly scalable way to build high-authority links, and it’s predicated on two tactics coming together:
- Creating newsworthy content that’s of interest to major online publishers (newspapers, major blogs or large niche publishers).
- Pitching publishers in a way that breaks through the noise of their inbox so that they see your content.
How can you use new techniques to generate consistent and predictable content marketing wins?
The key is data.
Techniques for generating press with data-focused stories
It’s my strong opinion that there’s no shortcut to earning press mentions and that only truly new, newsworthy and interesting content can be successful. Hands down, the simplest way to predictably achieve this is through a data journalism approach.
One of the best ways you can create press-earning, data-focused content is by using existing data sets to tell a story.
There are tens of thousands, perhaps hundreds of thousands, of existing public datasets that anyone can use to tell new and impactful data-focused stories capable of attracting significant press coverage and high-authority links.
The last five years or so have seen major transparency initiatives from governments, NGOs and public companies that have made their data far more available and accessible.
Additionally, FOIA requests have become commonplace, freeing even more data for journalistic investigation and storytelling.
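To give a concrete sense of how low the barrier can be, here is a minimal sketch of pulling one of these datasets into a dataframe with Python and pandas. The URL and column names are hypothetical placeholders for whatever portal and schema you're actually working with:

```python
# Requires: pip install pandas
import pandas as pd

# Hypothetical open-data CSV endpoint; data.gov-style portals expose
# thousands of datasets this way.
URL = "https://data.example.gov/api/views/traffic-fatalities/rows.csv"

# pandas can read a CSV straight from a URL.
df = pd.read_csv(URL)

# A first pass at a story angle: which states account for the most
# fatalities? The column names here are assumptions about the schema.
summary = df.groupby("state")["fatalities"].sum().sort_values(ascending=False)
print(summary.head(10))
```

When the dataset is this clean, the time goes into finding the angle, not wrangling the file.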
Because this data usually comes from the government or another authoritative source, pitching these stories to publishers is often easier: you don't face the same hurdles in proving accuracy and authoritativeness.
Potential roadblocks
The accessibility of government data in particular can vary widely. There are few data standards in place, and each federal and local government office has different levels of resources to devote to making the data it does have easy for outside parties to consume.
The result is that each dataset often has its own issues and complexities. Some are straightforward, available as clean, well-documented CSVs or in other standard formats.
Unfortunately, others are difficult to decode, clean, validate or even download, trapped inside hard-to-parse PDFs, fragmented reports or antiquated query tools that spit out awkward tables.
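As an example of digging a table out of a PDF, here is a minimal sketch using the pdfplumber library. The file name is a hypothetical stand-in, and real reports usually need more hand-tuning than this:

```python
# Requires: pip install pdfplumber pandas
import pdfplumber
import pandas as pd

rows = []
# "report.pdf" is a hypothetical stand-in for any PDF-bound dataset.
with pdfplumber.open("report.pdf") as pdf:
    for page in pdf.pages:
        table = page.extract_table()  # list of rows, or None if no table found
        if table:
            rows.extend(table)

# Treat the first extracted row as the header. This assumes every page
# repeats the same column layout, which real reports frequently violate.
df = pd.DataFrame(rows[1:], columns=rows[0])
print(df.head())
```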
A working knowledge of web scraping and programmatic data cleaning and reformatting is often required to accurately acquire and use many of these datasets.
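To illustrate what that scraping-and-cleaning work looks like in practice, here is a minimal sketch in Python. The URL, the first-table assumption and the column handling are all placeholders you would adapt to the actual source:

```python
# Requires: pip install requests pandas lxml
from io import StringIO

import pandas as pd
import requests

# Placeholder URL; substitute the page that actually holds the table.
URL = "https://example.gov/reports/annual-statistics.html"

html = requests.get(URL, timeout=30).text

# read_html parses every <table> on the page; we assume the first one
# is the dataset we want.
df = pd.read_html(StringIO(html))[0]

# Typical cleaning steps: normalize headers, strip footnote markers,
# and coerce numeric columns that arrive as formatted strings.
df.columns = [str(c).strip().lower().replace(" ", "_") for c in df.columns]
df = df.replace(r"\[\d+\]", "", regex=True)  # drop footnote refs like "[1]"
for col in df.columns[1:]:
    df[col] = pd.to_numeric(
        df[col].astype(str).str.replace(",", ""), errors="coerce"
    )

print(df.dtypes)
print(df.head())
```

Even in a toy example like this, most of the code is cleanup rather than acquisition, which is typical of these projects.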