Facebook Algorithm

Facebook uses an algorithm to determine which posts appear on each individual user’s News Feed and in what order. The algorithm can be understood as a sum over actions, where each action is scored by three factors: affinity, weight, and time decay. An action is anything that happens on Facebook – literally anything. The algorithm ranks these actions by their importance to the user, and the items with the highest scores show up at the top of the News Feed. That sounds a little confusing, so let’s break it down.

  • Affinity: Measures the relationship between the viewing user and the creator of the post; the closer the relationship, the higher the score. Note that affinity is one-way: if you, user A, interact with user B’s posts, you will see B’s content more often, but B won’t necessarily see your content more frequently.
  • Weight: Different types of posts carry different weights. In order, they rank 1.) photos/videos, 2.) links, and 3.) plain text updates. Engagement is also a factor: the more engagement a post receives, the greater its weight and the more visible it becomes. Negative feedback has the opposite effect – the more complaints a post receives, the less likely you are to see it.
  • Decay: Posts continually lose value as they grow older. This way, the content on your News Feed stays fresh.

The higher a post scores on these three variables, the more important Facebook judges it to be to the user, and the higher it will appear in the News Feed. By growing your network and increasing how often it engages with your content, you simultaneously improve your Facebook algorithm score.
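The scoring described above can be sketched as a small, EdgeRank-style calculation. Everything numeric here is an illustrative assumption – the per-type weights, the engagement and complaint multipliers, and the 24-hour half-life are made up for the example and are not Facebook’s actual values.

```python
from dataclasses import dataclass

# Illustrative per-type weights (assumption: photos/videos > links > text).
TYPE_WEIGHT = {"photo_video": 3.0, "link": 2.0, "text": 1.0}

@dataclass
class Post:
    post_type: str      # "photo_video", "link", or "text"
    affinity: float     # closeness between viewer and author, 0..1
    engagements: int    # likes, comments, shares
    complaints: int     # hides/reports (negative feedback)
    age_hours: float    # time since the post was published

def score(post: Post, half_life_hours: float = 24.0) -> float:
    """Toy EdgeRank-style score: affinity x weight x time decay."""
    weight = TYPE_WEIGHT[post.post_type]
    # Engagement raises the weight; negative feedback lowers it
    # (the 0.1 and 0.2 multipliers are assumptions for the sketch).
    weight *= 1.0 + 0.1 * post.engagements - 0.2 * post.complaints
    # Exponential time decay: value halves every half_life_hours.
    decay = 0.5 ** (post.age_hours / half_life_hours)
    return post.affinity * max(weight, 0.0) * decay

# Rank a small feed, highest score first.
feed = sorted(
    [Post("photo_video", 0.8, 12, 0, 3.0),
     Post("text", 0.9, 2, 1, 1.0),
     Post("link", 0.4, 30, 0, 48.0)],
    key=score, reverse=True,
)
```

Even in this toy version, the qualitative behavior described above holds: a photo beats a plain text update all else being equal, engagement lifts a post, complaints and age drag it down.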

The variables Facebook uses to determine News Feed content change over time, but affinity, weight, and decay remain the three main principles of the algorithm. The most recent addition penalizes content that looks spammy or reads like click-bait. Make sure your post or headline gives readers enough information to decide whether they want to read the article; if it is vague, it will be harder for the post to be seen on the News Feed.

Why is it important?

It’s important to have a good Facebook algorithm score for two reasons: people are highly engaged with the News Feed, and the News Feed is a competitive space.

In 2012, 40% of all time spent on Facebook was in the News Feed. In the US, people spend more time on the Facebook News Feed than on ABC, MSNBC, Yahoo! News, CNN, the New York Times, and the Huffington Post combined. Since approximately 96% of fans don’t return to a brand’s Facebook page after their initial engagement, your post is 40-150 times more likely to reach your fans through their News Feeds than through your page.

You can check your current Facebook algorithm score to see where your page needs improvement. And don’t forget to call Robertson & Markowitz Advertising and Public Relations if you need help improving your content and increasing your Facebook algorithm score.

Can Search Engines Trust Social Media?

I recently wrote that search engines are increasingly ignoring the old metrics that were so easily exploited by savvy Search Engine Optimization specialists, reading instead the more trustworthy signals that come from social media sharing and posting to measure a website’s popularity and relevance. It was only a matter of time, however, before social media interaction metrics would be manipulated as well.

Google and other search engines attribute high relevance to social media pages. Because of the sheer volume of user interaction taking place on social media profiles, search engines treat those pages as results of high importance to someone searching for a particular personality. If you Google any celebrity, his or her Facebook and Twitter accounts will likely be in the top 5 results. This may seem par for the course for pop celebrities such as singers, actors, and talk show hosts, but even if you Google Barack Obama, his Facebook and Twitter accounts show up before all the major news outlets and even before whitehouse.gov.

But Twitter recently announced that of its 271 million active accounts, some 23 million – roughly 8.5% – are not human. These accounts are controlled by automated bots that tend to be used for spam and auto-responding to posts.

Beyond bots, however, some 70+% of Twitter’s 900+ million total accounts are classified as inactive because they have not logged in for over a year. One must wonder where these hundreds of millions of abandoned accounts came from. Those accounts, totaling more than twice the population of the United States, do not all belong to individuals who simply lost interest. It is well known that there are people for hire creating hundreds or thousands of accounts for the sole purpose of following a celebrity or politician to make them appear to have millions of loyal followers. In fact, such services are openly advertised and claimed to be an industry standard (http://fakefollowerstwitter.com/). Imagine how much a pop star or presidential candidate would like to tout being the most-followed personality on Twitter, and how much their sponsors might be willing to pay some shadowy techies to make it happen. The root of the problem is obvious.

Social media platforms themselves will inevitably have to address this issue. Social media websites rely on advertisements whose value is based on how many human eyes see and/or click on them. It must be a tough pitch for Twitter to convince a business to advertise on a seemingly popular, well-followed page now that the news is out that an unknown but presumably massive number of that page’s followers are fake. And among the accounts that actually are active, a significant portion are bots that may click through ads and run up the advertiser’s bill.

Search engines, likewise, do not want their algorithms to reward social media pages that use bots or falsified accounts to deceptively drive up traffic on their profiles. These tactics are cut from the same cloth as the techniques SEOs once exploited to boost a website’s authority and relevance in the eyes of search engines. Tools are already out there, such as StatusPeople’s Fake Follower Check, that let individuals see how many of their Twitter followers are fake. I predict that in the near future, search engines will integrate similar tools into their algorithms to downgrade social media pages that employ these spammy techniques, just as they have done for conventional websites over the past several years.

Further reading:

http://www.fastcompany.com/3034279/most-innovative-companies/twitter-reveals-how-many-of-its-active-users-arent-quite-human?partner=rss

http://www.forbes.com/sites/johngreathouse/2012/08/27/celebrities-with-the-most-allegedly-fake-twitter-followers/  

http://www.businessinsider.com/number-of-users-who-abandon-twitter-2014-2

As Google Improves, so too must SEO Strategy

Google is a lot of things to a lot of people nowadays, but at the core of its network of ventures is its status as the #1 search engine in the world. Today Google searches account for over 67% of all search engine traffic in the US. Google did not achieve this dominant market share overnight – it took years of toppling the popular search engine competitors that rose and fell in the early days, like AltaVista, AOL, and Ask Jeeves. And how has Google beaten its competition? By being the best at connecting online users with the content they want to find, and by getting better and better at it every day.

Because so many people trust Google to provide them the best search results, there will always be an inevitable game of tug-of-war between Google and Search Engine Optimization specialists. SEOs will always try to uncover the secret methods Google uses to determine which websites deserve those top search results, and Google will always have to stay one step ahead to prevent SEOs from exploiting tricks that push not-so-good websites high up in the rankings.

Target keywords in website titles and body content, links from other websites, and target keywords in those inbound links (anchor text) originally held great weight in Google’s measurement of a website’s quality. And in a nascent World Wide Web, those were decent indications of the quality, relevance, and popularity of a particular website for a given search term.

Over the years, however, SEOs figured out how to easily manipulate those particular factors. “Packing” high-traffic target keywords unnaturally into websites where they do not belong and “farming” hyperlinks through spammy directories or paid placements unfortunately became common lowball practices (aka “black-hat” techniques) for pushing websites higher up in Google’s rankings. Through its recent algorithm updates, named Panda, Penguin, and Hummingbird, Google has reduced the importance of those metrics to prevent junky sites from cluttering its trustworthy results.

No matter how Google changes its algorithm, however, the volume of traffic going to a website will always matter. Today, for SEO specialists, quality content created with the goal of being shared on social media platforms is key. Social media sharing now provides an important high-volume source of referral traffic, so Google naturally notices a website that a lot of people are sharing and recommending of their own volition.

Encouraging social media users to share webpages requires producing quality, shareable content. Website programming and optimization is therefore just one piece of the current SEO puzzle. SEO now also benefits from the input and experience of writers, social media specialists, public relations professionals, and content marketing experts, who develop and distribute share-worthy content in a way that ensures it will be viewed and shared by lots of social media users.

That is why Robmark Web, with its parent company Robertson & Markowitz Advertising & Public Relations, can offer an SEO package that no other Savannah SEO company can. With over 20 years of experience in public relations, advertising, SEO, and website design, we offer our clients in Savannah, Hilton Head, Jacksonville, and beyond the best possible strategies to help potential customers find their websites through Google.

For more information on Robmark Web’s Search Engine Optimization and Social Media services, please visit RobmarkWeb.com.

Further reading:

http://www.theguardian.com/media/2014/jul/28/google-seo-social-media-search-marketing-panda-penguin-hummingbird