One of the first SEO tutorials I wrote for CanadaOne was on targeting the Google index of the country you are marketing to. It's an important step along the path to high rankings in Google Local that is not clear to many people pursuing a single- or multi-country strategy. In July of 2012 I published The Google Local Search Primer, which was essentially a tutorial on how to rank in Google Local results. At that time I broke the primer down into three steps.
One of the main resources I use for Local SEO strategy is the Local SEO ranking factors survey compiled by David Mihm, which was the main reference for the original tutorial. After 2012 the next important changes came via the Pigeon update, which you can learn more about in pigeon advice from local SEOs, and the most recent changes are outlined in Local SEO ranking factors 2015. In this post I will write primarily about the latest changes to the ranking factors, with some anecdotes about other changes that occurred between the Local SEO Primer and now. I will address the changes and provide some tips on future-proofing your Local SEO.
Changes to Local Search Since the Last Local Search Primer

In preparing this post I reviewed a lot of material on the topic to determine what changes, if any, have occurred in Local search since the original Local SEO primer. There have been a number of changes to the role of citations and links, and most of all to the role of your GMB (Google My Business) page, which was formerly the merged Google+ page and Places page, and before that plain Google Places. The Places page was the first to appear in the SERP, but after the merge of your Places and Google+ brand pages the algorithm became more focused on the explicit info and signals in the composition of the Google+ Local page.
Location and Distance in the Local Search Algorithm

The Pigeon update began the move towards the website listed in GMB contributing more to the Local search ranking algorithm. This is a trend that has seen Google go farther and farther down the implicit-signal path, adding several signals from the website, to the point where having your NAP on the site and matching what is in your GMB page is a big factor. Pigeon was also the beginning of change to the location and distance ranking algorithm, which has shifted significantly since. The location algorithm has changed from measuring the address's distance from the city/region centre to a user-centric measurement. So now the algo returns results based on distance from the user, whereas in the past distance was measured from the core of the geographic region.
Note that although this holds when the user can be identified as located in the geo-targeted city, I have to believe that if the user is not in the target city Google reverts to the original "distance from the core" formula formerly used for geo-targeted queries. It also means that if you search for, say, hockey, Google will return results with local professional and amateur teams in your area.
This is the first time that I've really seen the distance/location algo discussed. Although all the experts seem to be saying this is new, I am not convinced; IMO this is an ever-evolving algo that will become more and more user-centric. For many things like restaurants I've felt for quite some time that results are based on the user's location, and this is amplified in a mobile environment where Google may know exactly where the user is. I would not base any strategy on the location/distance algo because it is changing too fast. For instance, one way to capture more traffic in this new scenario is to have multiple locations in different parts of the city, but this will not be as useful in smaller regions where all the addresses fall within the algo's distance calculation.
Local SEO SERP Display Changes

One of the biggest changes from the original primer is that the display of your Google My Business info in the SERP is again going through a transformation. Phone numbers and other data points seem to be moving in and out of the SERPs, with phone numbers and links to the GMB on Google+ being removed almost at random: one day they are there and the next they are gone. This is pretty much standard operating procedure, as this is how Google tests searcher preferences and changes in the actions users take. IMO, they are removing these items to see how they affect click-throughs on the new home service ads, because if there are phone numbers and Google+ links in local packs people are liable to phone or click the G+ link instead of clicking the ads.
There has also been a significant reduction in the number of listings in a local pack from 7 (in most cases) to 3. A lot of sites are affected by this and the outcry from local SEO practitioners has been loud. I don't really understand the panic because this is not the first time this change has taken place. In the past the algorithm for the number and position of the local pack listings has changed, returned, and changed again, so I wouldn't panic; but I would definitely make the effort to get into the top 3 listings so that when these algorithm changes happen they do not affect you negatively. I am looking into it further, but my first impression is that since Google took the focus for local rankings off the GMB page and put it on the business website, links, and especially "local links", have become more important.
Your Google My Business Page is Less Important to Rankings

In addition to the changes in the display of GMB data in the SERP, there seems to have been a dilution of the GMB page as a major component of the Local Search ranking algorithm, especially compared to the algo at the time of the original Local SEO primer. Although there is correlation between Google+ activity and ranking, it is increasingly apparent that Google+ Local and GMB are like other social signals correlated with ranking: they do not seem to be the cause of higher ranks. Whether that notion was ever true is up for debate, and I should admit that I am becoming even more sceptical of social signals as ranking factors. A recent Whiteboard Friday on Moz came to a similar conclusion. For me, "social signals" are to SEO ranking what pyrite is to gold: easily mistaken for the real thing by fools not willing to look closer! When this nonsense started I immediately pointed out that many of the signals cited as ranking metrics are blocked by robots.txt on any social network not willing to be robbed of its data by search engines hungry for social data.
Before going on to say that the on-page factors of the website linked from GMB are now an important ranking factor, I will add that the pendulum swings both ways where your GMB page is concerned. It has been the focus at times, and at other times Google has gone with what is on the website. IMO, the website signals have become more important as Google begins to trust/include more implicit signals in local search rankings. As it turns out, the explicit data submitted to Google for the GMB page was quite likely less dependable (accuracy!) than the implicit data on the site, which is what users actually rely on to get to the location!
It is no surprise to me that home service ads were introduced, because **in general** the industries targeted as purchasers of the ads were also the biggest abusers of the explicit data provided through their Google+ page. You could argue that their willingness to go to great lengths to manipulate/over-optimize the data, and take on the accompanying risk, makes them top prospects to pay for the digital real estate they were/are squatting on.
Structured Data

Google's focus has moved further off the GMB page for business info and onto the website page listed in the GMB. Structured data seems to have become the strongest signal you can send Google to disambiguate the NAP and other business info and establish your brand/business as an entity. I have begun to make a point of adding Organization schema using the name, URL and logo itemprops on almost all pages of a website. There was recently a discussion in the SEO Dojo group where a member shared a site that had 5 NAPs in the footer of the home page. The member was under the impression this was not a good practice, and in most cases that is true, because multiple NAPs make choosing the right NAP hard for Google.
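As a sketch of the site-wide Organization markup described above, using the name, URL and logo itemprops (the business name, URL and logo path here are placeholder values, not from the original post):

```html
<!-- Sketch of site-wide Organization markup using schema.org microdata.
     All values (name, URL, logo path) are placeholders. -->
<div itemscope itemtype="http://schema.org/Organization">
  <span itemprop="name">Example Plumbing Co.</span>
  <link itemprop="url" href="http://www.example.com/">
  <img itemprop="logo" src="http://www.example.com/logo.png" alt="Example Plumbing Co. logo">
</div>
```

The same information can also be expressed as JSON-LD; the microdata form shown here matches the "itemprops" wording used above.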
The site was able to use schema structured data to disambiguate the addresses. Furthermore, by removing the schema markup from the NAPs in the footer of their location pages, they were able to disambiguate the NAP on each location page itself. IMO, if you have real locations and aren't faking them, using structured data on these "doorway type" pages adds value and higher trust to pages optimized for locations. Implemented correctly, I strongly believe that structured data is treated as a value-add on lower quality pages. On Bill Slawski's blog he wrote How Google May Use Schema Vocabulary to Reduce Duplicate Content in Search Results, which shows a few ways Google can/could use structured data to handle duplicate content.
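A minimal sketch of what disambiguating a NAP with LocalBusiness markup might look like on a location page (all business details below are placeholders):

```html
<!-- Sketch of a NAP marked up with schema.org LocalBusiness microdata
     on a location page. Name, address and phone are placeholder values. -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Plumbing Co.</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Example St.</span>,
    <span itemprop="addressLocality">Toronto</span>,
    <span itemprop="addressRegion">ON</span>
    <span itemprop="postalCode">M5V 0A0</span>
  </div>
  <span itemprop="telephone">(416) 555-0123</span>
</div>
```

Marking up only the NAP that belongs to the page, and leaving any other addresses unmarked, is what lets the markup do the disambiguating.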
Links and Citations

Another topic often cited by local SEOs as gaining importance is building links and structured and unstructured citations. I would say that in the current environment any mention of your site on local websites (linked or unlinked) is a huge factor in local rankings. Not to brag, but after Pigeon I had already moved my link prospecting to local/country and topic-specific targets. It was evident to me that this was where the algorithm was headed where links and citations were concerned. IMO, links and citations are more about location and topics than PageRank. PageRank is still a factor in the link equity passed, but if you are a local-centric business the location of the blog is just as important as the PR, since most have little if any PR. After the latest update I saw a huge move in overall visibility because I had been targeting local/topic-specific sites for over a year!
Future Proofing Local Search Rankings

Future proofing is to a large extent about understanding the capabilities of search engines and having a general understanding of IR (Information Retrieval), in particular the characteristics of, and trust placed in, the explicit and implicit signals in Google search. So when I am future-proofing my SEO I want to understand what kind of signal I'm manipulating, because each has characteristics that affect trust and therefore raise or lower the risk of manual or algorithmic penalties.
I have assumed that when Google says they do not use Google Analytics for ranking, it's true! Another unknown is what user behaviour data is collected from the Chrome browser. Again, I'm not sure what behaviour data is stored, if any; IMO, storing user behaviour data would produce a lot of data that is more likely used in machine learning for tuning the Panda algo than as a factor in ranking websites. What looks like a ranking factor is more likely about the site quality score than about bounce rate or dwell time.
These are examples of extremely noisy signals, some of which, I assume, depend on a Google application to send the data outside of Google Analytics (GA), since Google has said on a few occasions that GA is not used in rankings. Explicit signals come directly from the source, so in an age before structured data and machine learning these explicit signals would "seem" to be superior signals for the NAPs and business information the GMB provides. The noise around implicit signals is well documented in IR circles. Therefore, when I read in the deep dive on the Local SEO Ranking Factors how these implicit signals (bounce rates, dwell time and scrolling in particular) are becoming a bigger factor, I have to say I am very, very sceptical, mainly because of the type of signal they are. Those kinds of implicit data are hard to collect and easily abused!
I do believe that click analysis has grown considerably in importance over time. I have believed for many years that click analysis is a major factor in determining which search verticals appear in Universal Search. Before that, I had suspected for years that click analysis was the reason seasonal products drop in the rankings in the off season. For the purposes of future proofing I have worked on the principle that it makes sense for Google to lower the rankings of transactional sites that are out of season and raise the rankings of sites that are more informational in nature, as that better reflects user search intent. This is often the case on general search terms where the SERP is a hybrid of transactional and informational sites. Strategies built on this hypothesis therefore have a better chance of success. For instance, all other things being equal, a sale on bikinis in June is going to have a better chance of success than a sale in January.
Structured Data

One of the takeaways from the Local ranking factors 2015 post on Moz was that structured data and citations are must-haves for a successful Local SEO campaign. IMO, this is partly because structured data provides pseudo-explicit data on the website. I would try to add as much of the explicit data from your GMB (store hours etc.) in schema to your footer; at the very least I want those details in the home page footer. IMO, the recent local updates have moved further away from using GMB explicit signals in Local Search rankings, choosing to rely more and more on the website's verifiable implicit signals.
That said, I'd add the caveat that the information in your GMB should be included in the footer and marked up with structured data. If Local SEO were a poker game, I'd say that an optimized GMB page and a NAP marked up with structured data are the ante; Organization schema markup and an optimized GMB are the first call bet; and website authority is your "all in" bet!
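For the store-hours example above, a sketch of what that footer markup could look like (the hours, name and phone number are placeholder values):

```html
<!-- Sketch of a site footer carrying GMB details (hours, phone) as
     schema.org LocalBusiness microdata. All values are placeholders. -->
<footer itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Plumbing Co.</span>
  <!-- openingHours uses two-letter day codes and 24-hour times -->
  <time itemprop="openingHours" datetime="Mo-Fr 09:00-17:00">Monday to Friday, 9am to 5pm</time>
  <span itemprop="telephone">(416) 555-0123</span>
</footer>
```

Keeping this footer data identical to what is submitted in GMB is the point: matching explicit and on-site data is what earns the trust.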
Links and Citations

I have felt for a very long time that the metrics many SEOs use for link prospecting are misguided or downright bad! I don't see the point of a local plumber prospecting outside the service area, unless all the prospects in the region are exhausted, because the service area matches where the business comes from and where the targeted SERPs/keywords will be searched from. Basically, assuming the user-centric location/distance algo does not change much, if your business is regional in nature I would keep my prospecting to topic-relevant sites within the region/country my business is in. For my local Toronto clients I target Toronto, Ontario (the province) and Canada (country) sites relevant to the topic.
I especially don't see the point in gathering unstructured citations from anywhere else. I find Link Prospector to be by far the best tool for local link prospecting because you can target blogs and other Google verticals, country indexes, and presets for techniques like guest blogging, news, and other specific types of sites such as associations and resources. I'd also point out that some link prospecting techniques like guest blogging and press releases may be more suitable to local SEO than to ranking in general, especially if local SEO is a strong motivation for trying to get a link from a site.
Conclusion to Local Search Primer Changes

If someone asked me for the elevator pitch version of this post I'd say: "Google now uses more implicit signals from your website and links as the most important source for ranking your website, and puts less emphasis on the explicit data in structured citations and your GMB page!" That said, I would not stop all activity on your Google+ page; however, I would decrease the activity substantially, preferring to spend those resources on techniques that garner local links. For instance, offline events that provide local content and promotion would result in links and citations that are local.
Source: Mastering Local Search Engine Optimization (SEO): A 2015 Primer