Content, Shares, and Links: Insights from Analyzing 1 Million Articles

Posted by Steve_Rayson

This summer BuzzSumo teamed up with Moz to analyze the shares and links of over 1m articles. We wanted to look at the correlation of shares and links, to understand the content that gets both shares and links, and to identify the formats that get relatively more shares or links.

What we found is that the majority of content published on the internet is simply ignored when it comes to shares and links. The data suggests most content is not worth sharing or linking to, and that people are poor at amplifying content. It may sound harsh, but it seems most people are wasting their time either producing poor content or failing to amplify it.

On a more positive note we also found some great examples of content that people love to both share and link to. It was not a surprise to find content gets far more shares than links. Shares are much easier to acquire. Everyone can share content easily and it is almost frictionless in some cases. Content has to work much harder to acquire links. Our research uncovered:

  • The sweet spot content that achieves both shares and links
  • The content that achieves higher than average referring domain links
  • The impact of content formats and content length on shares and links

Our summary findings are as follows:

  1. The majority of posts receive few shares and even fewer links. In a randomly selected sample of 100,000 posts, over 50% had 2 or fewer Facebook interactions (shares, likes or comments) and over 75% had zero external links. This suggests there is a lot of very poor content out there and also that people are very poor at amplifying their content.
  2. When we looked at a bigger sample of 750,000 well shared posts, we found over 50% of these posts still had zero external links. This suggests that while many posts acquire shares, and in some cases large numbers of shares, they find it far harder to acquire links.
  3. Shares and links are not normally distributed around an average. There are high performing outlier posts that get a lot of shares and links, but most content is grouped at the low end, with close to zero shares and links. For example, over 75% of articles from our random sample of 100,000 posts had zero external links and no more than one referring domain link.
  4. Across our total sample of 1m posts there was NO overall correlation of shares and links, implying people share and link for different reasons. The correlation of total shares and referring domain links across 750,000 articles was just 0.021.
  5. There are, however, specific content types that do have a strong positive correlation of shares and links. This includes research backed content and opinion forming journalism. We found these content formats achieve both higher shares and significantly more links.
  6. 85% of content published (excluding videos and quizzes) is less than 1,000 words long. However, long form content of over 1,000 words consistently receives more shares and links than shorter form content. Either people ignore the data or it is simply too hard for them to write quality long form content.
  7. Content formats matter. Formats such as entertainment videos and quizzes are far more likely to be shared than linked to. Some quizzes and videos get hundreds of thousands of shares but no links.
  8. List posts and videos achieve much higher shares on average than other content formats. However, in terms of achieving links, list posts and why posts achieve a higher number of referring domain links than other content formats on average. While we may love to hate them, list posts remain a powerful content format.

We have outlined the findings in more detail below. You can download the full 30 page research report from the BuzzSumo site:

Download the full 30-page research report

The majority of posts receive few shares and even fewer links

We pulled an initial sample of 757,000 posts from the BuzzSumo database. 100,000 of these posts were pulled at random and acted as a control group. As we wanted to investigate certain content formats, the other 657,000 were well shared videos, ‘how to’ posts, list posts, quizzes, infographics and why posts. The overall sample therefore had a specific bias to well shared posts and specific content formats. However, despite this bias towards well shared articles, 50% of our 757,000 articles still had 11 or fewer Twitter shares and 50% of the posts had zero external links.

By comparison, 50% of the 100,000 randomly selected posts had 2 or fewer Twitter shares, 2 or fewer Facebook interactions, 1 or no Google+ shares and zero LinkedIn shares. 75% of the posts had zero external links and no more than one referring domain link.

75% of randomly selected articles had zero external links

Shares and links are not normally distributed

Shares and links are not distributed normally around an average. Some posts go viral and get a very high number of shares and links. This distorts the average; the vast majority of posts receive very few shares or links and sit at the bottom of a very skewed distribution curve, as shown below.

This chart is cut off on the right at 1,000 shares; in fact, the long thin tail would extend a very long way, as a number of articles received over 1m shares and one received 5.7m shares.

This long tail distribution is the same for shares and links across all the domains we analyzed. The skewed nature of the distribution means that averages can be misleading due to the long tail of highly shared or linked content. In the example below we show the distribution of shares for a domain. In this example the average is the blue line but 50% of all posts lie to the left of the red line, the median.
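
To make the average-versus-median point concrete, here is a minimal Python sketch with invented share counts (not the study's data); a single viral outlier drags the mean far above what a typical post achieves.

```python
# Illustration only: why the mean misleads on a long-tail share
# distribution while the median reflects the typical post.
import statistics

# Hypothetical share counts for ten posts: most get almost nothing,
# one goes viral.
shares = [0, 1, 1, 2, 3, 4, 6, 9, 40, 25000]

print("mean:  ", statistics.mean(shares))    # 2506.6, pulled up by the outlier
print("median:", statistics.median(shares))  # 3.5, the middle of the pack
```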

There is NO correlation of shares and links

We used the Pearson correlation coefficient, a measure of the linear correlation between two variables. The result ranges from 1 (a total positive correlation) through 0 (no correlation) to −1 (a total negative correlation).
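
As a rough illustration of how that coefficient is computed (this is not the study's code, and the numbers are invented), here is a minimal Python sketch correlating per-article shares with referring domain links:

```python
# Illustrative sketch: Pearson correlation between two per-article
# metrics, e.g. total shares vs. referring domain links.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / sqrt(var_x * var_y)

# Invented data for eight articles, purely to show the calculation.
shares = [12, 350, 7, 14000, 90, 2, 640, 25]
links = [0, 1, 0, 2, 0, 0, 1, 3]
print(round(pearson(shares, links), 3))
```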

The overall correlations for our sample were:

Total shares and Referring Domain Links 0.021

Total shares and Sub-domain Links 0.020

Total shares and External Links 0.011

The results suggest that people share and link to content for different reasons.

We also looked at different social networks to see if there were more positive correlations for specific networks. We found no strong positive correlation of shares to referring domain links across the different networks as shown below.

  • Facebook total interactions 0.0221
  • Twitter 0.0281
  • Linkedin 0.0216
  • Pinterest 0.0065
  • Google plus 0.0058

Whilst there is no correlation by social network, there is some evidence that very highly shared posts have a higher correlation of shares and links. This can be seen below.

Content sample Average total shares Median shares Average referring domain links Median referring domain links Correlation total shares – referring domains
Full sample of posts (757,317) 4,393 202 3.77 1 0.021
Posts with over 10,000 total shares (69,114) 35,080 18,098 7.06 2 0.101

The increased correlation is relatively small; however, it does indicate that very popular posts, other things being equal, would have slightly higher correlations of shares and links.

Our finding that there is no overall correlation contradicts previous studies that have suggested there is a positive correlation of shares and links. We believe the previous findings may have been due to inadequate sampling as we will discuss below.

The content sweet spot: content with a positive correlation of shares and links

Our research found there are specific content types that have a high correlation of shares and links. This content attracts both shares and links, and as shares increase so do referring domain links. Thus whilst content is generally shared and linked to for different reasons, there appears to be an overlap where some content meets the criteria for both sharing and linking.


The content that falls into this overlap area, our sweet spot, includes content from popular domains such as major publishers. In our sample the content also included authoritative, research backed content, opinion forming journalism and major news sites.

In our sample of 757,000 well shared posts the following were examples of domains that had a high correlation of shares and links.

Site Number of articles in sample Referring domain links – total shares correlation
The Breast Cancer Site 17 0.90
New York Review of Books 11 0.95
Pew Research 25 0.86
The Economist 129 0.73

We were very cautious about drawing conclusions from this data as the individual sample sizes were very small. We therefore undertook a second, separate sampling exercise for domains with high correlations. This analysis is outlined in the next section below.

Our belief is that previous studies may have sampled content disproportionately from popular sites within the area of overlap. This would explain a positive correlation of shares and links. However, the data shows that the domains in the area of overlap are actually outliers when it comes to shares and links.

Sweet-spot content: opinion-forming journalism and research-backed content

In order to explore further the nature of content on sites with high correlations, we looked at a further 250,000 random articles from those domains.

For example, we looked at 49,952 articles from the New York Times and 46,128 from the Guardian. These larger samples had a lower correlation of links and shares, as we would expect due to the samples having a lower level of shares overall. The figures were as follows:

Domain Number articles in sample Average Total Shares Average Referring Domain Links Correlation of Total Shares to Domain Links
Nytimes.com 49,952 918 3.26 0.381
Theguardian.com 46,128 797 10.18 0.287

We then subsetted various content types to see if particular types of content had higher correlations. During this analysis we found that opinion content from these sites, such as editorials and columnists, had significantly higher average shares and links, and a higher correlation. For example:

Opinion content Number articles in sample Average Total Shares Average Referring Domain Links Correlation of Total Shares to Domain Links
Nytimes.com 4,143 3,990 9.2 0.498
Theguardian.com 19,606 1,777 12.54 0.433

The higher shares and links may be because opinion content tends to be focused on current trending areas of interest and because the authors take a particular slant or viewpoint that can be controversial and engaging.

We decided to look in more detail at opinion forming journalism. For example, we looked at over 20,000 articles from The Atlantic and New Republic. In both cases we saw a high correlation of shares and links combined with a high number of referring domain links as shown below.

Domain Number articles in sample Average Total Shares Average Referring Domain Links Correlation of Total Shares to Domain Links
TheAtlantic.com 16,734 2,786 18.82 0.586
NewRepublic.com 6,244 997 12.8 0.529

This data appears to support the hypothesis that authoritative, opinion shaping journalism sits within the content sweet spot. It particularly attracts more referring domain links.

The other content type that had a high correlation of shares and links in our original sample was research backed content. We therefore sampled more data from sites that publish a lot of well researched and evidenced content. We found content on these sites had a significantly higher number of referring domain links. The content also had a higher correlation of links and shares as shown below.

Domain Number articles in sample Average Total Shares Average Referring Domain Links Correlation of Total Shares to Domain Links
FiveThirtyEight.com 1,977 1,783 18.5 0.55
Priceonomics.com 541 1,797 11.49 0.629
PewResearch.com 892 751 25.7 0.4

Thus whilst overall there is no correlation of shares and links, there are specific types of content that do have a high correlation of shares and links. This content appears to sit close to the center of the overlap of shares and links, our content sweet spot.

The higher correlation appears to be caused by the content achieving a higher level of referring domain links. Shares are generally much easier to achieve than referring domain links. You have to work much harder to get such links and it appears that research backed content and authoritative opinion shaping journalism is better at achieving referring domain links.

Want shares and links? Create deep research or opinion-forming content

Our conclusion is that if you want to create content that achieves a high level of both shares and links, then you should concentrate on opinion forming, authoritative content on current topics or well researched and evidenced content. This post falls very clearly into the latter category, so we shall see if this proves to be the case here.

The impact of content format on shares and links

We specifically looked at the issue of content formats. Previous research had suggested that some content formats may have a higher correlation of shares and links. Below are the details of shares and links by content format from our sample of 757,317 posts.

Content Type Number in sample Average Total Shares Average Referring Domain Links Correlation of total shares & referring domain links
List post 99,935 10,734 6.19 0.092
Quiz 69,757 1,374 1.6 0.048
Why post 99,876 1,443 5.66 0.125
How to post 99,937 1,782 4.41 0.025
Infographic 98,912 268 3.67 0.017
Video 99,520 8,572 4.13 0.091

What stands out is the high level of shares for list posts and videos.

By contrast the average level of shares for infographics is very low. Whilst the top infographics did well (there were 343 infographics with more than 10,000 shares), the majority of infographics in our sample performed poorly. Over 50% of infographics (53,000 in our sample) had zero external links and 25% had fewer than 10 shares in total across all networks. This may reflect a recent trend to turn everything into an infographic, leading to many poor pieces of content.

What also stands out is the relatively low number of referring domain links for quizzes. People may love to share quizzes but they are less likely to link to them.

In terms of the correlation of total shares and referring domain links, why posts had the highest correlation of all content types at 0.125. List posts and videos also have a higher correlation than the overall sample correlation, which was 0.021.

List posts appear to perform consistently well as a content format in terms of both shares and links.

Some content types are more likely to be shared than linked to

Surprising, unexpected and entertaining images, quizzes and videos have the potential to go viral with high shares. However, this form of content is far less likely to achieve links.

Entertaining content such as Vine videos and quizzes often had zero links despite very high levels of shares. Here are some examples.

Content Total Shares External Links Referring Domain Links
Vine video (https://vine.co/v/O0VvMWL5F2d) 347,823 0 0
Vine video (https://vine.co/v/O071IWJYEUi) 253,041 0 1
Disney Dog Quiz (http://blogs.disney.com/oh-my-disney/2014/06/30/qu…) 259,000 0 1
Brainfall Quiz (http://www.brainfall.com/quizzes/how-bitchy-are-yo…) 282,058 0 0

Long form content consistently receives more shares and links than shorter-form content

We removed videos and quizzes from our initial sample to analyze the impact of content length. This gave us a sample of 489,128 text-based articles, which broke down by content length as follows:

Length (words) No in sample Percent
<1,000 418,167 85.5
1,000-2,000 58,642 12
2,000-3,000 8,172 1.7
3,000-10,000 3,909 0.8

Over 85% of articles had fewer than 1,000 words.

We looked at the impact of content length on total shares and domain links.

Length (words) Average total shares Average referring domain links
<1,000 2,823 3.47
1,000-2,000 3,456 6.92
2,000-3,000 4,254 8.81
3,000-10,000 5,883 11.07

We can see that long form content consistently gets higher average shares and significantly higher average links. This supports our previous research findings, although there are exceptions, particularly with regard to shares. One such exception we identified is IFL Science, which publishes short form content shared by its 21m Facebook fans. The site curates images and videos to explain scientific research and findings. This article examines how they create their short form viral content. However, IFL Science is very much an exception. On average long form content performs better, particularly when it comes to links.

When we looked at the impact of content length on the correlation of shares and links, we found that content of over 1,000 words had a higher correlation, but the correlation did not increase further beyond 2,000 words.

Length (words) Correlation shares/links
<1,000 0.024
1,000-2,000 0.113
2,000-3,000 0.094
3,000+ 0.072

The impact of combined factors

We have not undertaken any detailed linear regression modelling or built any predictive models but it does appear that a combination of factors can increase shares, links and the correlation. For example, when we subsetted List posts to look at those over 1,000 words in length, the average number of referring domain links increased from 6.19 to 9.53. Similarly in our original sample there were 1,332 articles from the New York Times. The average number of referring domain links for the sample was 7.2. When we subsetted out just the posts over 1,000 words the average number of referring domain links increased to 15.82. When we subsetted out just the List posts the average number of referring domain links increased further to 18.5.
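
For readers who want to reproduce this kind of subsetting on their own data, here is a hypothetical sketch using pandas; the column names and values are assumptions for illustration, not the study's dataset.

```python
# Hypothetical subsetting sketch with pandas. Column names and values
# are invented; the idea is simply: filter, then compare averages.
import pandas as pd

articles = pd.DataFrame({
    "format": ["List post", "List post", "Why post", "How to post"],
    "word_count": [1450, 600, 2100, 800],
    "referring_domain_links": [12, 2, 9, 1],
})

list_posts = articles[articles["format"] == "List post"]
long_list_posts = list_posts[list_posts["word_count"] > 1000]

# Average referring domain links for all list posts vs. long list posts.
print(list_posts["referring_domain_links"].mean())       # 7.0
print(long_list_posts["referring_domain_links"].mean())  # 12.0
```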

The combined impact of factors such as overall site popularity, content format, content type and content length is an area for further investigation. However, the initial findings do indicate that shares and/or links can be increased when some of these factors are combined.

You can download the full 30 page research report from the BuzzSumo site:

Download the full 30-page research report

Steve will be discussing the findings at a Mozinar on September 22, at 10.30am Pacific Time. You can register and save your place here https://attendee.gotowebinar.com/register/41189941…

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Source: moz

 

Re-Examining The Top 10 Paid Search Best Practices, Part 1


Best practices — by definition — are a set of highly recommended tips and tricks born of repeated and ongoing expertise in a specific subject matter. As professionals, we rely on these tried and tested procedures every day because we assume them to be correct and effective.

But are these best practices always the right course of action? If anything, they might just be, as many describe, “just a good starting point” that shouldn’t be relied on as the only way to manage paid search.

Susan Waldes’ recent post here at Search Engine Land on why search marketers should reconsider using broad match challenges the best practice on avoiding that match type when possible. I read this piece and was inspired to re-examine other best practices search practitioners take for granted as being hard and fast rules.

The goal here is not to try to debunk these best practices — they’re all highly effective tips — but rather to explore a counter view of each to see if there are any interesting insights we may uncover from this exercise.

The Top Ten Paid Search Best Practices Re-Examined

The following best practices are not ranked in any order. They are common tips that frequently appear on best practice lists published on the web and shared at industry conferences.

1. Avoid Broad Match

I’m starting with this one since I mentioned it above. Search marketers generally try to use as little broad match as possible in order to gain the most control of their accounts.

After all, on broad match, search engines tend to include plurals, misspellings and other close variants automatically, which can steal traffic from other ad groups that overlap. With potentially millions of keywords across thousands of ad groups, it may be hard for you to steward the budgets toward the strategies in the way that you want.

Overusing broad match can also submit your campaigns for auctions that aren’t necessarily for audiences you hope to reach. If you’re a chocolatier, you don’t want to show up for every search for candy, dessert, etc. Maybe it would be best to limit yourself to folks searching for chocolate only?

Insights from counter view? As Ms. Waldes explains in her article, broad match can be a valuable tool to reach consumers who may be interested in your products/services but don’t search on the terms within your account. She reminds us that on any given day, 10-20% of Google queries have never been seen before.

Broad match can effectively become the catch-all net to ensure that you do show up for everyone who may be interested in your business. Broad match also becomes a very strong research tool to mine queries that may not already be in your account.

2. Test. Analyze. Optimize. Repeat.

This is a best practice at the very heart of search marketing (and many other digital marketing disciplines). Because of the rapid-response nature of paid search, it is an ideal advertising channel in which to continuously try out new things, examine the results, and then build upon what is and isn’t working.

In essence, you can load up new creative or keywords on Monday, let them run for a few days, pull the results on Thursday, and upload improvements to your campaigns on Friday.

In some instances, you may be able to analyze thousands of clicks and millions of impressions over just a few days. The value of that data has been instrumental in making paid search the largest (and arguably the most valuable) digital marketing channel.

Insights from counter view? This is not just a search marketing best practice, it’s nearly a mantra for many practitioners, myself included. It is hard to argue against this approach. However, as I re-examine this process, I could argue that a lot of time is spent testing things that have already proven to be highly effective. Maybe we could do less testing (quantitative research) and more qualitative research.

The answers aren’t always in the numbers. Sometimes the best data comes from speaking to your customers via surveys or talking to them one-on-one. In the hours it takes to analyze the data from your search account, you may be able to speak to a half dozen consumers who provide you even stronger direction on how to best reach them with the paid search channel.

3. Don’t Solely Use Google

When I first started in paid search in 2002, there were a dozen or more viable search engines: Ask Jeeves, Excite, Alta Vista, Dogpile to name a few, as well as Yahoo, Google, and Microsoft. Because market share was spread across so many publishers, the only way to get scale was to tap into an aggregated stream of the entire search universe.

Over time, Google pulled out ahead and for some time now has represented about 65% of all US search traffic with Yahoo and Bing making up around 33%.

Let me say emphatically: you must unequivocally be on all three engines at this point if you want to be serious about your paid search efforts.

Insights from counter view? For the biggest search advertisers spending more than a couple hundred thousand dollars a month on paid search, it’s almost impossible to solely use Google. The available inventory from Bing/Yahoo (especially those incredibly valuable brand terms) completely dictates that you must run outside of Google to maintain a healthy search account.

However, for 99% of AdWords customers who are not the giant spenders  — those whose budgets are sub-$1000/month — it certainly makes sense to focus on Google for your SEM efforts.

Being the search marketing expert for all of my friends and family, I’m often asked to talk to their friends or colleagues about the paid search they’re doing for their small businesses or personal projects.

For those people who read that they should be using other engines, I often tell them that it’s okay that they’re just on Google. As long as they’re not capped out on branded terms in AdWords, small businesses who are completely stretched thin across so many tasks (even outside of marketing needs) don’t need to spend the time it takes to learn and manage accounts on other engines.

But, if you have a zillion things to do in a day and you’re wondering if you should carve out time for other engines, unless you’re an internet-only business, AdWords should be just fine for you. When you grow and start spending more, that’s when you can hire someone to help you expand to other publishers.

4. Utilize Negatives

Managing negatives effectively is a very strong technique to help control when the engines submit you for auctions, as well as which campaigns and ad groups are submitted for specific keyword queries. They’re one of the main levers search marketers use to pare back keyword groups in order to make them more streamlined.

I personally think advanced handling of negatives is one of the key differences between average and expert search marketers. Whenever I audit a paid search account for the first time, I always check out the account’s negatives in order to understand how advanced the approach has been to date.

There are some really amazing ways marketers are hacking the negatives field and there are plenty of great articles available online to help you get your negatives to the next level.

Insights from counter view? This best practice is super tough to counter because negatives are such a valuable parameter for search marketers. However, for the sake of this exercise, I imagine there are some marketers out there that haven’t truly mastered all the ins and outs of how negatives affect their campaigns. They may actually be limiting their effectiveness and scale by not managing them correctly.

Additionally, if there are multiple practitioners managing the same account (as there generally are with big advertisers), they could be utilizing different negative strategies that bring a level of chaos to the account without any of the practitioners realizing it.

I’m reaching a bit to counter the negatives best practice, but if you’re on a team of SEM pros managing the same account, you might want to make sure you are all on the same page about how negatives should be used and when, and ensure each of you is using them in the same way to avoid any performance issues.

5. Use Different Creative For Mobile And Desktop Ads

Mobile, once a traffic outlier at a single-digit percentage of clicks, has now become the dominant sub-channel of paid search for most advertisers. As mobile grew in importance and volume, many best practices for mobile search were written; the main one being that search marketers should utilize different ad copy for mobile and desktop ads.

This totally makes sense. We know that user search behavior on mobile devices often differs from desktop usage — even for the same query. Mobile users are on the go, have smaller screens, are looking for more localized content, etc.

Insights from counter view? I’ve gone back and forth in my career regarding the importance of ad copy. Of course it’s part of the triumvirate of paid search (keywords, ads, bidding), but I do believe practitioners sometimes overthink just how many variations of those 95 characters should be used. Do we really need to test 25 versions of the new promotion’s ad copy? Does changing one word here or there really make a difference?

Generally, I would emphatically say, “Yes! Of course it matters.” However, in the context of this article, could we learn something by arguing against using different ad creative for desktop and mobile campaigns?

The only reason I might take the counter view on this best practice is that it takes a lot of time to write, edit, load, manage, report on, analyze, and test twice as much creative as it does to just use the same ads for desktop and mobile. If you are unable to complete your daily search duties due to time constraints, you might try deprioritizing this tactic in favor of other best practices.

Maybe taking that time and applying it to advanced bidding analysis or to match type management would move the needle faster than focusing on device-specific ad copy?

Preview

In part two of this article, we will re-examine the next set of best practices and see if we can learn anything new from the counter view:

6. Big keyword lists

7. Conversion tracking

8. Specific landing pages

9. Watch your competitors

10. Query mining

The post Re-Examining The Top 10 Paid Search Best Practices, Part 1 appeared first on Search Engine Land.

Source: SEL

 

Five Tools that Completely Impacted the Way I Carry Out Business


This excellent post on business tools by Duct Tape Marketing explains the top five tools they use that have made a positive change to the way they do business.

The modern online business person has to adapt to changes in technology in order to keep pace with the online age. That is why there are so many different tools out there to help you stay focused, organize content, create workflows and communicate with your team.

Make things easy for yourself by taking a look at these great tools, and see just how much simpler they make your working life.

 

SearchCap: Google Panda Reversal, Waze Sued & Bing Ads Halloween

Below is what happened in search today, as reported on Search Engine Land and from other places across the Web.

From Search Engine Land:

Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:

Search News From Around The Web:

Industry

Local & Maps

Link Building

SEO

The post SearchCap: Google Panda Reversal, Waze Sued & Bing Ads Halloween appeared first on Search Engine Land.

Source: SEL

 

4 Social Media Reports That Will Boost Your Audience Engagement

In the early days of social media marketing, the number one goal for businesses was to amass huge followings, and for good reason. It was a time when there was no limit to organic post reach, and the companies with the largest social audiences benefited the most with totally free advertising.

Unfortunately for those early adopters, the networks have matured, the business models have grown and nowadays brands are lucky if even a fraction of their social following see their posts. That’s why marketers need to completely rethink the way they approach social as an organic channel.

Where social marketing was once all about building a giant audience, today it is about providing quality, relevant content to your audience so that you can engage with them as individuals and attract them as customers. But in order to really provide your social audience with engaging content, you need to figure out who they are. This can be done with some of the reports available online.

This post will cover how to use data to find your perfect social audience, and how to look at reports on past performance to find out which content resonates with your audience.

1. Demographics Reports

Who are the people that are interacting with you on social media? There’s a chance that you or your community managers can name some of the top influencers within your social following, but who is your average reader? One way to get a tremendous increase in social engagement is to answer this question, and create content that resonates with those individuals. Here are some of the reports available to you that can be used to figure out just who your average social engager is.

Facebook Page Insights

Let’s start with Facebook. Facebook has a plethora of information on your fans and followers. One way to access that information is through the Facebook Insights tab found on your brand’s homepage.


If this is your first time using this tool, take a look around at some of the great data at hand. Otherwise, navigate over to the People tab to start pulling data on your fans and followers.


The above screenshot is looking more specifically at the People Engaged section. It’s better to look at People Engaged since this shows the demographics of the people who are more inclined to actually engage with your social posts.

The first data you’ll see breaks down the various ages and genders of the people who are engaging with your Facebook posts. For example, the graph below shows that of all the people interacting with Sprout Social, 21% are women ages 25-34.


Various age groups and genders are likely to interact with content differently. That’s why it’s a key advantage to figure out your main audience and create posts catered to their interests.

Twitter’s Analytics Tool

Similar to Facebook, Twitter offers audience insights within its analytics dashboard. After you access your analytics page, navigate to the tab addressing your Followers.


Assuming you have enough followers, this tab has information on their gender, income, education and more, which will help you create content on Twitter that this audience is more likely to interact with. However, the tab doesn’t include age data. One option is to use Sprout’s Twitter Analytics.

Followerwonk

Followerwonk is a Twitter tool created by the software company Moz. If you have access to the tool, choose the Analyze tab from your dashboard.


From here you can enter any Twitter handle and scan their followers. The data provided is similar to Twitter’s, but the one we’re most interested in for creating demographic-driven content is the gender section.


Google Analytics

Most of the networks don’t provide data as robust as Facebook’s and Twitter’s, but never fear: Google always has your back. Access your account and navigate to Audience > Demographics > Overview.


After that, you need to add a segment (+Add Segment) that looks at the traffic source you’re interested in finding demographics for, like Pinterest.


2. Location and Language Reports

Geo-targeting is a very powerful tool for social media marketers. You can pull reports to find some of your most engaged cities, states or countries, and then write and distribute topical content to those locations for a better response. For instance, a post created for a specific city could call out the hometown sports team.

Facebook Page Insights

Back on the Facebook Page Insights report, just a little below the age and gender information, you’ll find data on where the people engaged with your page are located.


So the company in the example above could see some increased engagement if they targeted a post to their Chicago audience that called out The Cubs (or the White Sox if that’s what you’re into…).

Twitter’s Analytics Tool

You’ll find location information when you access your Twitter Analytics the same way as in the previous report, but this time you’ll need to navigate over to the second tab.


In the right column you’ll find information provided by Twitter on both the country and region your followers are in.

Followerwonk

Followerwonk preserves your reports for 60 days, so you should be able to find the same presentation you used when looking at the Demographics report. Followerwonk actually puts your followers’ location data in an interactive map.


Google Analytics

Similar to the Demographics Reports, if you can’t find location data for the other networks look to Google Analytics. Access your account and navigate through Audience > Geo > Location.


After you create the custom segment looking at the social source, Google provides data on users by country, city, continent and subcontinent.

3. Sent Message Reports

Now that we know who our audience is, it’s a good idea to think about the content that they like. One way to do that is to look at your past social media performance to see which of your posts received good engagement, then use that information to dictate your strategy moving forward. A few of the platforms that we’ve mentioned so far have the ability to analyze old posts for performance, but I want to focus on two that do a great job.

Followerwonk

Just like with the other reports, you’ll need to navigate to the Analyze tab of your dashboard. The difference this time is that instead of choosing to “analyze their followers,” you want to choose “analyze their tweets.”


Followerwonk uses Retweets as the main metric for deciding what makes the most important Tweet. In the example above I chose to look at Social Media Examiner. Followerwonk shows that this is their most important Tweet over the time period.

Social Media Examiner can look at this and decide that if they create more fun content that resonates with social media managers, they’ll see more engagement.

Sprout Social

Sprout Social is a social media management platform that also gives you analytics on your past social media performance.


Sprout has a number of key performance indicators (KPIs) that you can look at to find out which of your pieces of content perform well. You can then study those posts so that they can inform your content strategy moving forward.

4. Day and Hour Reports

Once you’ve figured out who to target and what to post, the next question on your mind should be “when to post.” And you wouldn’t be the only one, either. This Google Trends graph shows just how fascinated people are with finding the perfect time for posting.


However, the best time to post for one brand may not be the best time for another. That’s why you should look at these reports to find your unique perfect post time.

Facebook Page Insights

Access your Facebook Page Insights again, and go to the Posts tab. This is where you’ll find additional information on your fans, such as when they’re online.


Although this doesn’t say when you get the most engagement, it does have information on the days of the week and hours of the day when your fans are most likely to be online. Posting content at these peak days and hours should lead to an increase in your engagement.

Manual Reporting

Unfortunately, Facebook is one of the only networks out there that provides data on when your users are accessing the site. However, you can always pull data manually to look at which days are performing best. Or try talking to your community manager; most of them have a good idea of which days and times get a great response. If all else fails, try a tool like Sprout Social that automatically posts your social messages at the times that will yield the most audience engagement.

Every Audience is Unique

No two brands have an identical social media following, so it really doesn’t make much sense to use a one-size-fits-all content strategy. Taking the time to analyze your unique social media audience allows you to develop a content strategy that caters to them, which will set you leagues above the competition.

About the Author: Michael Patterson is a Digital Marketing Specialist at Sprout Social, a social media platform that helps brands manage their social media efforts. You can find him on Twitter @MPatterson22.

Source: KISS

 

Clean Your Site’s Cruft Before It Causes Rankings Problems – Whiteboard Friday

Posted by randfish

We all have it. The cruft. The low-quality, or even duplicate-content pages on our sites that we just haven’t had time to find and clean up. It may seem harmless, but that cruft might just be harming your entire site’s ranking potential. In today’s Whiteboard Friday, Rand gives you a bit of momentum, showing you how you can go about finding and taking care of the cruft on your site.

Cleaning the Cruft from Your Site Before it Causes Pain and Problems with your Rankings Whiteboard


Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about cleaning out the cruft from your website. By cruft what I mean is low quality, thin quality, duplicate content types of pages that can cause issues even if they don’t seem to be causing a problem today.

What is cruft?

If you were to, for example, launch a large number of low quality pages, pages that Google thought were of poor quality, that users didn’t interact with, you could find yourself in a seriously bad situation, and that’s for a number of reasons. So Google, yes, certainly they’re going to look at content on a page by page basis, but they’re also considering things domain wide.

So they might look at a domain and see lots of these green pages, high quality, high performing pages with unique content, exactly what you want. But then they’re going to see like these pink and orange blobs of content in there, thin content pages with low engagement metrics that don’t seem to perform well, duplicate content pages that don’t have proper canonicalization on them yet. This is really what I’m calling cruft, kind of these two things, and many variations of them can fit inside those.

But one issue with cruft for sure it can cause Panda issues. So Google’s Panda algorithm is designed to look at a site and say, “You know what? You’re tipping over the balance of what a high quality site looks like to us. We see too many low quality pages on the site, and therefore we’re not just going to hurt the ranking ability of the low quality pages, we’re going to hurt the whole site.” Very problematic, really, really challenging and many folks who’ve encountered Panda issues over time have seen this.

There are also other probably non-directly Panda kinds of related things, like site-wide analysis of things like algorithmic looks at engagement and quality. So, for example, there was a recent analysis of the Phantom II update that Google did, which hasn’t really been formalized very much and Google hasn’t said anything about it. But one of the things that they looked at in that Phantom update was the engagement of pages on the sites that got hurt versus the engagement of pages on the sites that benefited, and you saw a clear pattern. Engagement on sites that benefited tended to be higher. On those that were hurt, it tended to be lower. So again, it could be not just Panda but other things that will hurt you here.

It can waste crawl bandwidth, which sucks. Especially if you have a large site or complex site, if the engine has to go crawl a bunch of pages that are cruft, that is potentially less crawl bandwidth and less frequent updates for crawling to your good pages.

It can also hurt from a user perspective. User happiness may be lowered, and that could mean a hit to your brand perception. It could also drive down better converting pages. It’s not always the case that Google is perfect about this. They could see some of these duplicate content, some of these thin content pages, poorly performing pages and still rank them ahead of the page you wish ranked there, the high quality one that has good conversion, good engagement, and that sucks just for your conversion funnel.

So all sorts of problems here, which is why we want to try and proactively clean out the cruft. This is part of the SEO auditing process. If you look at a site audit document, if you look at site auditing software, or step-by-step how-to’s, like the one from Annie that we use here at Moz, you will see this problem addressed.

How do I identify what’s cruft on my site(s)?

So let’s talk about some ways to proactively identify cruft and then some tips for what we should do afterwards.

Filter that cruft away!

One of those ways for sure that a lot of folks use is Google Analytics or Omniture or Webtrends, whatever your analytics system is. What you’re trying to design there is a cruft filter. So I got my little filter. I keep all my good pages inside, and I filter out the low quality ones.

What I can use is one of two things. First, a threshold for bounce or bounce rate or time on site, or pages per visit, any kind of engagement metric that I like I can use that as a potential filter. I could also do some sort of a percentage, meaning in scenario one I basically say, “Hey the threshold is anything with a bounce rate higher than 90%, I want my cruft filter to show me what’s going on there.” I’d create that filter inside GA or inside Omniture. I’d look at all the pages that match that criteria, and then I’d try and see what was wrong with them and fix those up.

The second one is basically I say, “Hey, here’s the average time on site, here’s the median time on site, here’s the average bounce rate, median bounce rate, average pages per visit, median, great. Now take me 50% below that or one standard deviation below that. Now show me all that stuff, filters that out.”
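
As a rough sketch of that second approach, assuming you have exported per-page engagement data from your analytics tool (the field names, numbers and cutoffs below are invented):

```python
# Rough sketch of a "cruft filter": flag pages whose engagement sits
# well below the site average, or whose bounce rate crosses a threshold.
# Field names, numbers and cutoffs are invented examples.
import statistics

pages = [
    {"url": "/guide-to-widgets",   "time_on_page": 185, "bounce_rate": 0.42},
    {"url": "/tag/widgets/page/9", "time_on_page": 11,  "bounce_rate": 0.96},
    {"url": "/old-press-release",  "time_on_page": 8,   "bounce_rate": 0.93},
    {"url": "/pricing",            "time_on_page": 140, "bounce_rate": 0.55},
]

avg_time = statistics.mean(p["time_on_page"] for p in pages)
cutoff = avg_time * 0.5  # "50% below the average", per the approach above

cruft = [p for p in pages
         if p["time_on_page"] < cutoff or p["bounce_rate"] > 0.90]
for p in cruft:
    print(p["url"])
```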

This process is going to capture thin and low quality pages, the ones I’ve been showing you in pink. It’s not going to catch the orange ones. Duplicate content pages are likely to perform very similarly to the thing that they are a duplicate of. So this process is helpful for one of those, not so helpful for other ones.

Sort that cruft!

For that process, you might want to use something like Screaming Frog or OnPage.org, which is a great tool, or Moz Analytics, comes from some company I’ve heard of.

Basically, in this case, you’ve got a cruft sorter that is essentially looking at filtration, items that you can identify in things like the URL string or in title elements that match or content that matches, those kinds of things, and so you might use a duplicate content filter. Most of these pieces of software already have a default setting. In some of them you can change that. I think OnPage.org and Screaming Frog both let you change the duplicate content filter. Moz Analytics not so much, same thing with Google Webmaster Tools, now Search Console, which I’ll talk about in a sec.

So I might say like, “Hey, identify anything that’s more than 80% duplicate content.” Or if I know that I have a site with a lot of pages that have only a few images and a little bit of text, but a lot of navigation and HTML on them, well, maybe I’d turn that up to 90% or even 95% depending.

I can also use some rules to identify known duplicate content violators. So for example, if I’ve identified that everything that has a question mark refer equals bounce or something or partner. Well, okay, now I just need to filter for that particular URL string, or I could look for titles. So if I know that, for example, one of my pages has been heavily duplicated throughout the site or a certain type, I can look for all the titles containing those and then filter out the dupes.

I can also do this for content length. Many folks will look at content length and say, “Hey, if there’s a page with fewer than 50 unique words on it in my blog, show that to me. I want to figure out why that is, and then I might want to do some work on those pages.”
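
Here is a hypothetical scripted version of those crawl-based rules, for anyone who prefers to run them against a crawl export; the URL pattern, field names and the 50-word threshold are illustrative assumptions, not settings from any particular tool.

```python
# Hypothetical crawl-export filter: flag likely duplicates by URL pattern
# and thin pages by unique word count. Patterns and thresholds are examples.
import re

crawled_pages = [
    {"url": "/blog/post-1",             "word_count": 1200},
    {"url": "/blog/post-1?ref=partner", "word_count": 1200},
    {"url": "/blog/quick-note",         "word_count": 38},
]

dupe_param = re.compile(r"\?ref=")  # a known duplicate-generating parameter

for page in crawled_pages:
    if dupe_param.search(page["url"]):
        print(page["url"], "-> likely duplicate, consider canonicalizing")
    elif page["word_count"] < 50:
        print(page["url"], "-> thin content, review or improve")
```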

Ask the SERP providers (cautiously)

Then the last one that we can do for this identification process is Google and Bing Webmaster Tools/Search Console. They have existing filters and features that aren’t very malleable. We can’t do a whole lot with them, but they will show you potential site crawl issues, broken pages, sometimes dupe content. They’re not going to catch everything though. Part of this process is to proactively find things before Google finds them and Bing finds them and start considering them a problem on our site. So we may want to do some of this work before we go, “Oh, let’s just shove an XML sitemap to Google and let them crawl everything, and then they’ll tell us what’s broken.” A little risky.

Additional tips, tricks, and robots

A couple additional tips, analytics stats, like the ones from GA or Omniture or Webtrends, they can totally mislead you, especially for pages with very few visits, where you just don’t have enough of a sample set to know how they’re performing or ones that the engines haven’t indexed yet. So if something hasn’t been indexed or it just isn’t getting search traffic, it might show you misleading metrics about how users are engaging with it that could bias you in ways that you don’t want to be biased. So be aware of that. You can control for it generally by looking at other stats or by using these other methods.

When you’re doing this, the first thing you should do is any time you identify cruft, remove it from your XML sitemaps. That’s just good hygiene, good practice. Oftentimes it is enough to at least have some of the preventative measures from getting hurt here.

However, there’s no one-size-fits-all methodology beyond “don’t include it in your XML sitemap.” If it’s a duplicate, you want to canonicalize it. I don’t want to delete all these pages maybe. Maybe I want to delete some of them, but I need to be careful about that. Maybe they’re printer friendly pages. Maybe they’re pages that have a specific format. It’s a PDF version instead of an HTML version. Whatever it is, you want to identify those and probably canonicalize.

Is it useful to no one? Like literally, absolutely no one. You don’t want engines visiting. You don’t want people visiting it. There’s no channel that you care about that page getting traffic to. Well you have two options — 301 it. If it’s already ranking for something or it’s on the topic of something, send it to the page that will perform well that you wish that traffic was going to, or you can completely 404 it. Of course, if you’re having serious trouble or you need to remove it entirely from engines ASAP, you can use the 410 permanently delete. Just be careful with that.

Is it useful to some visitors, but not search engines? Like you don’t want searchers to find it in the engines, but if somebody goes and is paging through a bunch of pages and that kind of thing, okay, great, I can use “noindex, follow” for that in the meta robots tag of a page.

If there’s no reason bots should access it at all, like you don’t care about them following the links on it, this is a very rare use case, but there can be certain types of internal content that maybe you don’t want bots even trying to access, like a huge internal file system that particular kinds of your visitors might want to get access to but nobody else, you can use the robots.txt file to block crawlers from visiting it. Just be aware it can still get into the engines if it’s blocked in robots.txt. It just won’t show any description. They’ll say, “We are not showing a site description for this page because it’s blocked by robots.”

If the page is almost good, like it’s on the borderline between pink and green here, well just make it good. Fix it up. Make that page a winner, get it back in the engines, make sure it’s performing well, find all the pages like that have those problems, fix them up or consider recreating them and then 301’ing them over if you want to do that.
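
To keep the options in one place, here is a compressed sketch of the decision tree above; the category labels and suggested actions are a paraphrase of the transcript, not an official checklist.

```python
# A compressed paraphrase of the cruft decision tree described above.
REMEDIATIONS = {
    "duplicate of another page": "add rel=canonical pointing at the primary version",
    "useful to no one": "301 to a relevant page, or 404/410 it",
    "useful to visitors, not searchers": 'add meta robots "noindex, follow"',
    "bots should never fetch it": "disallow the path in robots.txt",
    "almost good": "improve the page and keep it indexed",
}

def recommend(category: str) -> str:
    """Suggested action for a cruft category; default: pull it from the XML sitemap and review."""
    return REMEDIATIONS.get(category, "remove from the XML sitemap and review")

print(recommend("duplicate of another page"))
```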

With this process, hopefully you can prevent yourself from getting hit by the potential penalties, or being algorithmically filtered, or just being identified as not that great a website. You want Google to consider your site as high quality as they possibly can. You want the same for your visitors, and this process can really help you do that.

Looking forward to the comments, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Source: moz

 

Extreme Leadership: The Need for Speed

In this era of extreme digital marketing, leading an organization isn’t too different from competitive mountain biking. Here are a few key tips to keep in mind to help you make it over the finish line successfully without getting bashed up on the way down.

Photo Credit: Maridav www.fotosearch.com

Start Out With the Right Equipment

You wouldn’t set out on a hardcore cross-country trail without a capable bike, strong helmet, tuned suspension, and of course, nice disc brakes – so why would you even attempt to run a web-based organization without being armed with the data and insights you need to achieve top performance? 

Set up the systems you need to clearly monitor and measure your business. Don’t just buy the top-shelf or bottom-shelf option to put off dealing with your uncertainties. Think about what complexities you’re facing in your business. What problems do you really need to solve? How is your website set up to serve your customers? We’ve seen several clients overpurchase, or opt for “free,” and not have the best solution in place to see and fix their business challenges.

Be Prepared to Go Fast and Win

With speed on your side, all you really need to do is steer with confidence and you’ll come through clean. But if your organization isn’t prepared to go fast with confidence, the bumps and roots are going to hit harder, slow you down and even take you off track.

In fact, you’ll probably find yourself in granny-gear trying to haul yourself uphill while others pass you by. Plus, you won’t have the momentum required to hit the banks and ramps, and you’re not likely to enjoy the ride to its fullest potential. Make sure that you have the information you need to quickly and easily monitor your business performance, presented in a format that won’t require you to take your entire organization offline in order to figure out how to make the raw data actionable. Business is faster than ever and you need to be ready to make quick decisions and adjustments, so you can enjoy the ride.

See Where You Want to Go

When you’re burning down the trail you need to see your path clearly, focus on where you want to go and be ready to react. You can’t possibly navigate successfully if you’re looking off trail or questioning your decisions. You’ll go down, hard.

The beauty of the web is that our customers leave digital data trails that help us understand their behavior and show us the right path. In order to lead effectively, C-suite executives need to start taking a more active role in aligning the entire business with the journey of their customers.

This data can illuminate the path to better performance across the entire company by providing an understanding of the customer journey from beginning to end by showing the actual flow of the audience and identifying potential bottlenecks along the way.

It’s All About Confidence

In this fast paced business climate, decisions need to be made sooner, rather than later, because the meter is running. You simply can’t afford to base decisions solely on instinct or “art”. Data is the key to everything that is happening within your prospect and customer base.

If you have the right data and the correct insight into the customer experience, you can make decisions with speed, precision and confidence. Executives who take an active role in building a data business have a more in-tune view of their businesses and are able to compete more effectively. Those who aren’t taking an active role in mapping and understanding the customer journey are at a huge disadvantage, and will soon be left in the dust.

To run a competitive organization, implementing the right systems and delivering the right data will enable you to meet the expectations of your customers and pull away from the competition. And Kissmetrics would love to help – request a personal demo, if you’d like to learn more.

About the Author: Brian Kelly is the CEO of Kissmetrics.

Source: KISS

 

SearchCap: Google Health, Google Maps Explore & Our App

Below is what happened in search today, as reported on Search Engine Land and from other places across the Web.

The roundup included headlines from Search Engine Land, recent headlines from Marketing Land (our sister site dedicated to internet marketing), and search news from around the web covering Industry, Local & Maps, Searching, SEO, and SEM / Paid Search.


Source: SEL

 

New Interactions Reporting In AdWords Follow On Video Campaigns Integration

Along with the introduction of video campaigns into the main AdWords interface, Google has added new reporting columns to help marketers analyze campaign performance across channels.

Here’s a rundown of the new columns:

  • Interactions – These are the main actions people take with the ad formats — clicks for text and shopping ads, views for video ads, and engagements for Lightbox ads and the new Gmail ads.
  • Interaction Rate – Shows how often people interact with an ad after it’s shown to them—such as clicks divided by impressions for text ads, or views divided by impressions for video ads.
  • Avg. Cost – The average amount paid per interaction: total cost divided by total interactions (for example, cost divided by clicks for text ads, or cost divided by views for video ads).
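
To make the arithmetic behind these columns explicit, here's a small sketch using made-up figures (this is not Google's code, just the ratios the definitions above describe):

```python
# Made-up performance figures for two campaign types.
campaigns = [
    # (campaign type, impressions, interactions, total cost)
    ("Search (text ads)", 120_000, 3_600, 2_880.00),   # interactions = clicks
    ("Video",             500_000, 15_000, 1_350.00),  # interactions = views
]

for ad_type, impressions, interactions, cost in campaigns:
    interaction_rate = interactions / impressions  # e.g. clicks / impressions
    avg_cost = cost / interactions                 # total cost / total interactions
    print(f"{ad_type}: interaction rate {interaction_rate:.2%}, avg. cost {avg_cost:.2f}")
```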

You’ll notice each metric is labeled depending on the type of ad format being measured.

Another change is that Total rows are now broken out by campaign type instead of by network, and they appear only for the campaign types present in the account.


Source: SEL

 

New Study: Data Reveals 67% of Consumers are Influenced by Online Reviews

Posted by Dhinckley

Google processed over 1 trillion search queries in 2014. As Google Search continues to integrate further into our daily activities, its results become increasingly important, especially when individuals are searching for information about a company or product.

To better understand just how much of an impact Google has on an individual’s purchasing decisions, we set up a research study with a group of 1,000 consumers through Google Consumer Surveys. The study investigates how individuals interact with Google and other major sites during the buying process.

Do searchers go beyond page 1 of Google?

We first sought to understand how deeply people went into the Google search results. We wanted to know if people tended to stop at page 1 of the search results, or if they dug deeper into page 2 and beyond. A better understanding of how many pages of search results are viewed provides insight into how many result pages we should monitor related to a brand or product.

When asked, 36% of respondents claimed to look through the first two pages of search results or more. But actual search data shows that in fewer than 2% of searches do people venture below the top five results on the first page. Clearly, actual consumer behavior differs from self-reported search activity.

Do Searchers Go Beyond Page 1 of Google?

Takeaway: People say they're willing to view as many as two pages of search results, but they rarely do so during normal search activity.

Are purchasing decisions affected by online reviews?

Google has integrated reviews into the Google+ Local initiative and often displays these reviews near the top of search results for businesses. Other review sites, such as Yelp and TripAdvisor, will also often rank near the top for search queries for a company or product. Because of the prevalence of review sites appearing in the search results for brands and products, we wanted a better understanding of how these reviews impacted consumers’ decision-making.

We asked participants, “When making a major purchase such as an appliance, a smart phone, or even a car, how important are online reviews in your decision-making?”

The results revealed that online reviews influence 67.7% of respondents' purchasing decisions. More than half of respondents (54.7%) said online reviews are fairly, very, or absolutely important to their decision-making process.

Purchasing Decisions and Online Reviews

Takeaway: Companies need to take reviews seriously. Restaurant review stories receive all the press, but most companies will eventually have pages from review sites ranking for their names. Building a strong base of positive reviews now will help protect against any negative reviews down the road.

When do negative reviews cost your business customers?

Our research also uncovered that businesses risk losing as many as 22% of customers when just one negative article is found by users considering buying their product. If three negative articles pop up in a search query, the potential for lost customers increases to 59.2%. Have four or more negative articles about your company or product appearing in Google search results? You’re likely to lose 70% of potential customers.

Negative Articles and Sales

Takeaway: It is critical to keep page 1 of your Google search results clean of any negative content or reviews. Having just one negative review could cost you nearly a quarter of all potential customers who began researching your brand (which means they were likely deep in the conversion funnel).
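
As a rough, back-of-the-envelope illustration, the sketch below applies the study's reported lost-customer rates to a hypothetical brand. The traffic, conversion rate, and order value are invented placeholders; only the percentages come from the study.

```python
# Lost-customer rates as reported in the study (key = number of negative
# articles in the search results; 4 stands for "4 or more").
lost_share = {1: 0.22, 3: 0.592, 4: 0.70}

# Hypothetical inputs -- replace with your own figures.
monthly_prospects = 10_000   # people researching the brand each month
conversion_rate = 0.03       # baseline share of prospects who would buy
avg_order_value = 120.00     # average order value

for negatives, share in sorted(lost_share.items()):
    lost_customers = monthly_prospects * conversion_rate * share
    lost_revenue = lost_customers * avg_order_value
    label = "4+" if negatives == 4 else str(negatives)
    print(f"{label} negative article(s): ~{lost_customers:.0f} lost customers, "
          f"~${lost_revenue:,.0f} in lost revenue per month")
```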

What sites do people visit before buying a product or service?

Google Search is just one of the sites that consumers can visit to research a brand or product. We thought it would be interesting to identify other popular consumer research sites.

Interestingly, most people didn't seem to remember visiting any of the popular review sites. Instead, the destination that got the most attention was Google+ Local reviews. Another noteworthy finding was that Amazon came in second, with half the selections that Google received. Finally, the stats show that more people look to Wikipedia for information about a company than to Yelp or TripAdvisor.

Prepurchase Research Sources

Takeaway: Brands should invest time and effort in building a strong community on Google+, which could lead to more positive reviews on the social platform.

Online reviews impact the bottom line

The results of the study show that online reviews have a significant influence on the decision-making process of consumers. The data supports the fact that Internet users are generally willing to look at the first and second page of Google search results when searching for details about a product or company.

We can also conclude that online review sites like Google+ Local are heavily visited by potential customers looking for information, and the more negative content they find there, the less likely they will be to purchase your products or visit your business.

All of this paints a clear picture: what appears in Google search results for a company or product name will inevitably affect that company's or product's profitability.

Internal marketing teams and public relations (PR) firms must consider the results Google displays when someone searches for their company name or products. Negative reviews, negative press, and other damaging feedback can have a lasting impact on a company's ability to sell its products or services.

How to protect your company in Google search results

A PR or marketing team must be proactive to effectively protect a company's online reputation. The following tactics can help protect a company from the damage caused by negative online reviews:

  • First, identify whether negative articles already exist on the first two pages of search results for a Google query of your company name or product (e.g., “Walmart”). This simple task should be conducted regularly: Google often shifts search results around, so a negative article, which typically attracts a higher click-through rate, is unfortunately likely to climb the rankings as individuals engage with the piece. (One way to automate this check is sketched after this list.)
  • Next, monitor and analyze the current sentiment of reviews on popular review sites like Google+ and Amazon. Other sites, like Yelp or TripAdvisor, should also be checked often, as they can quickly climb Google search results. Do not attempt to artificially alter the results; instead, look for best practices on how to improve Yelp reviews or other review sites and implement them. The goal is to naturally improve the general buzz around your business.
  • If negative articles exist, there are solutions for improvement. A company's marketing and public relations team may benefit from highlighting and/or generating positive press and reviews about the product or service through SEO and ORM efforts. By gaining control of the search results for your company or product, you will control the main message individuals see when looking for more information about your business. That's done by working to ensure prospects and customers enjoy a satisfying experience when interacting with your brand, whether online or offline.
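
As a starting point for the first item above, here's a minimal sketch of an automated check. It assumes you already export tracked search results to a CSV (the serp_results.csv file name, its columns, and the domain and keyword lists are all assumptions to adapt to your own rank-tracking setup):

```python
import csv
from urllib.parse import urlparse

# Assumed inputs: a CSV export from your rank-tracking tool with columns
# query, rank, url, title. File name, domains, and keywords are examples.
REVIEW_DOMAINS = {"yelp.com", "tripadvisor.com", "plus.google.com", "amazon.com"}
NEGATIVE_TERMS = {"scam", "complaint", "lawsuit", "recall", "worst"}

with open("serp_results.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        rank = int(row["rank"])
        if rank > 20:  # only look at pages 1-2 (10 results per page)
            continue
        domain = urlparse(row["url"]).netloc.lower()
        if domain.startswith("www."):
            domain = domain[4:]
        title = row["title"].lower()
        flags = []
        if domain in REVIEW_DOMAINS:
            flags.append("review site")
        if any(term in title for term in NEGATIVE_TERMS):
            flags.append("negative keyword")
        if flags:
            print(f"[{', '.join(flags)}] #{rank} {row['url']} - {row['title']}")
```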

Being proactive with a brand’s reputation, as viewed in the Google search results and on review sites, does have an impact on the bottom line.

As we see in the data, people are less likely to make a purchase as the number of negative reviews in Google search results increases.

By actively ensuring that honest, positive reviews appear, you can win over potential customers.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Source: moz