Search engines "cut off" titles that are too long. This can have the unfortunate consequence that the most important information in your titles and descriptions is not visible in organic search results, leading to fewer clicks.
It used to be that Google would show a fixed number of characters. That was because the search engine result pages used a fixed-width font, where every character takes up the same amount of space.
But now Google uses a variable-width font, where the letter i takes up less width than the letter w. Consequently, the point at which a title is cut off depends not just on how many characters it has but on which characters.
These days, a report on title length should look at the width of the title in pixels: does it fit within the 512 pixels available on Google's desktop SERP?
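If you want a rough programmatic version of the same check, you can render-measure a string with the Pillow library. This is a minimal sketch, not an official Google method: the font file and size are assumptions (Google's desktop SERP has used an Arial-like font at roughly this size), and the 512px limit is the figure cited above.

```python
# Minimal sketch of a title pixel-width check using Pillow.
# FONT_PATH and FONT_SIZE are assumptions approximating Google's rendering.
from PIL import ImageFont

SERP_WIDTH_LIMIT = 512        # pixels available for a desktop title (figure cited above)
FONT_PATH = "arial.ttf"       # assumed; point this at a local Arial-like font file
FONT_SIZE = 18                # assumed approximation of Google's title font size

font = ImageFont.truetype(FONT_PATH, FONT_SIZE)

def title_fits(title: str) -> bool:
    """Return True if the rendered title fits within the SERP width limit."""
    return font.getlength(title) <= SERP_WIDTH_LIMIT

print(title_fits("Short title"))
print(title_fits("A very long title that goes on and on and will surely be truncated in results"))
```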
You can easily preview a single title in a tool like Portent's SERP Preview Tool or Moz's Title Tag Preview Tool. But what if you want to go back and see which of your existing titles need to be optimized? Do you really want to copy and paste hundreds or thousands of titles?
Enter A1 Website Analyzer
A1 Website Analyzer is a site crawler for serious technical SEOs.
With it we can crawl a whole web site and see how long its titles (and meta descriptions!) are in characters and in pixels.
To deal with any future changes, all font options for titles and meta descriptions can be customized under "Options - Program options - Font Pixel Calculations", if needed.
How To Calculate Site-Wide Title Tag Width
Start A1 Website Analyzer and enter your domain name. The default settings already collect the required information, so all you need to do is wait for the scan to complete.
Once the scan has finished, we want to pick which data columns will be visible (tip: the visible data columns are the same ones that will show in your report if you export the results).
You can do so using either the columns toolbar button or by navigating the top menu to "View - Data columns - Extracted content" and making sure that the desired columns are checked.
For easier viewing of the results, you can check "Only show URLs that are pages" under "View - Data filter options" in the top menu, followed by checking "View - Data filter active".
The end result, after the scan has finished and the filters are adjusted, shows each page with its title and meta description lengths. Note that you can view all pages in both "list" (flat) and "tree" (indented) mode.
However, if you have a big website, this may be a little too much to look through, especially if you need to generate a CSV export for Excel or similar. That is why A1 Website Analyzer has built-in reports, one of which lets you quickly see whether your titles are within what Google will show in organic search results.
If you pick the one called "Show only pages with titles over 70 characters - and relevant data columns", you get a filtered view of just the problem pages.
It is important to realize that you can manually change the applied filters and activate them, so you can limit the visible results to pages with descriptions or titles shorter or longer than whatever threshold you want, calculated as either pixels or characters.
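If you export the results to CSV, the same kind of threshold filter is easy to reproduce outside the tool. Here is a minimal pandas sketch; the column names are hypothetical placeholders, not A1 Website Analyzer's actual export schema, so adjust them to match your file.

```python
# Hypothetical sketch: filter an exported crawl CSV by title length.
# Column names ("URL", "Title", "Title pixels") are assumptions.
import pandas as pd

df = pd.read_csv("crawl-export.csv")

# Pages whose titles exceed Google's ~512px desktop limit
too_wide = df[df["Title pixels"] > 512]

# Or filter by character count instead
too_long = df[df["Title"].str.len() > 70]

too_wide[["URL", "Title", "Title pixels"]].to_csv("titles-to-fix.csv", index=False)
```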
Title images adapted from avrene, Sean MacEntee, wwarby.
A couple of months ago, Google Inc. (NASDAQ:GOOGL) released yet another algorithm update, one meant to affect the SERPs of web pages based on their mobile-friendliness. After the algorithm was officially announced, Moovweb, a cloud-based platform that helps optimize mobile experiences, discovered that 83% of the top results for 1,000 eCommerce keywords were tagged mobile-friendly by Google.
Earlier, in November 2014, Google introduced the "Mobile-Friendly" tag into its mobile search results to help people searching on mobile devices identify mobile-friendly pages among the results, in order to offer a better user experience. At that time, Google confirmed that it was considering whether a website's mobile-friendliness should be a ranking signal. Later, in February 2015, Google announced that starting April 21, the search engine would use mobile-friendliness as a ranking signal.
Following this announcement, Moovweb decided to study the changes in the SERPs of over 1,000 eCommerce keywords from different industries over the six weeks after the formal launch of the new ranking signal.
Key Findings of the Analysis by Moovweb
Given that mobile searches have already surpassed desktop searches on Google, it is quite clear that the mobile platform is the future. This might be the core reason why Google introduced the new algorithm update, popularly dubbed "Mobilegeddon", in the first place. Although the mobile platform is embraced by users worldwide, the mobile web and mobile SEO have not been adopted evenly across industries.
There were significant differences between industries in the percentage of mobile-friendly sites among their top 10 keyword positions. Keywords from the Retail industry registered the greatest percentage of mobile-friendly pages within their top 10 results, while the Transportation industry remained at the bottom of the pool. The Healthcare, Insurance, and Travel/Hospitality niches, however, weren't far behind Retail in adopting mobile-friendly pages.
Mobile is the Future!
According to Internet Live Stats, there are a little over 963 million websites worldwide, while the number of mobile devices reached over 7.7 billion by the end of 2014. Organisations and individuals are betting big on the apparently promising future of the mobile Internet. In the US alone, people spend more time on smartphones and tablets than on TV or any other device (excluding desktop), according to mobile analytics company Flurry. On average, a user spends nearly 3 hours and 45 minutes each day on mobile devices, a 63 percent jump compared with a year earlier.
Many webmasters have already begun developing responsive websites to offer a better user experience across all devices. Among the roughly 200 factors that influence search engine rankings on Google, user-friendliness is one of the key ones, and most webmasters, as well as search engine gurus, have long warned about the risks of not having a responsive website. With the announcement of "Mobilegeddon", Google has made such warnings official by introducing a website's mobile-friendliness as one of the ranking signals that helps determine top SERPs.
To motivate webmasters around the globe, Google initiated the process by introducing the "Mobile-friendly" tag in mobile search results, and it announced "Mobilegeddon" two months before the rollout. Such an early announcement gave webmasters enough time to make their websites cleanly mobile-friendly.
If you're still unsure what Google means by a mobile-friendly website, you should read the instructions and criteria Google follows to determine whether your website is mobile-friendly or not.
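As an aside, Google has since exposed the same check programmatically through its Search Console URL Testing Tools API (which postdates this article, so availability and quotas are Google's to change). A sketch of how you might test a URL, assuming you have a Google API key:

```python
# Sketch of a programmatic mobile-friendliness check via Google's
# URL Testing Tools API. The API key value is a placeholder.
import requests

API_KEY = "YOUR_GOOGLE_API_KEY"  # placeholder; create one in Google Cloud Console
ENDPOINT = ("https://searchconsole.googleapis.com/v1/"
            "urlTestingTools/mobileFriendlyTest:run")

def check_mobile_friendly(url: str) -> str:
    resp = requests.post(ENDPOINT, params={"key": API_KEY},
                         json={"url": url}, timeout=60)
    resp.raise_for_status()
    # mobileFriendliness is e.g. "MOBILE_FRIENDLY" or "NOT_MOBILE_FRIENDLY"
    return resp.json().get("mobileFriendliness", "UNKNOWN")

print(check_mobile_friendly("https://example.com/"))
```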
Beyond improving your search engine traffic, the mobile-friendliness of your website also improves the user experience for visitors arriving from mobile devices. With the rising number of mobile users worldwide, it would be unwise not to optimize your website for smartphones and tablets.
Although it's hard to keep up with the growing number of SEO tools launched in the last few years (along with the new functionality of existing tools), it's necessary to test them in order to identify how their features can support and advance our SEO activities more efficiently.
This is especially true when it comes to tasks that are critical or highly beneficial to the SEO process but are complex and/or time-consuming to execute.
That's why I want to share with you seven such SEO tasks that can now be partially or completely automated with the support of some tools.
1. Assessing Your Industry Traffic Potential
One of the first activities when launching a new website or SEO campaign is to assess traffic potential (ideally per channel) and identify the potential competitors in the market. Estimating this can be challenging, especially when starting to work in a new market that you don't know anything about.
Nonetheless, SimilarWeb's "Industry Analysis" reports can greatly help by allowing you to easily obtain the most important traffic data for any industry in many countries; they also show traffic per source, the most popular sites per channel, and trends.
However, remember to take these numbers as references, not absolutes; and whenever you can, validate with other data sources.
2. Identifying Keyword Opportunities For Your Sites
Finding new keyword opportunities is important when focusing your SEO process and establishing profitable yet feasible goals.
In the past, doing this type of analysis was time-consuming, but now it can be completely automated with Sistrix's "Opportunities" feature. You can include up to three competitors, and it will show which keywords these competitors already rank for that you're still not targeting, along with the level of traffic opportunity and competition.
3. Identifying Related Relevant Terms To Use In Your Content By Doing A TF-IDF Analysis Of The Top Ranked Pages For Any Query
TF-IDF stands for "term frequency" and "inverse document frequency." According to the OnPageWiki:
With the TF*IDF formula, you can identify in which proportion certain words within a text document or website are weighted compared to all potentially possible documents. Apart from the keyword density, this formula can be used for OnPage optimisation in order to increase a website's relevance in search engines.
Although it's known that TF-IDF has been used to index pages, there hasn't been a popular tool that uses it to identify the relevant term variants of our topics that we should be using. This information can be used to improve our site's relevance for other terms our audience uses.
OnPage.org includes a handy TF-IDF tool in its on-page analysis and monitoring platform, which can be used to identify term variants or combinations that our competitors are already using but we aren't yet (by analyzing both the top 15 page results and our own page that we want to rank). By focusing on terms related to our main keywords, we can increase our site content's relevance for the desired topic.
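To make the idea concrete, here is a minimal sketch of the same kind of analysis with scikit-learn: score terms across the top-ranking pages plus your own, then surface terms the competitors weight heavily that your page doesn't use at all. The documents below are placeholders; in practice you would fetch and strip the text of the top 15 results.

```python
# Minimal TF-IDF gap analysis sketch with scikit-learn.
# The documents are placeholder strings, not real crawled pages.
from sklearn.feature_extraction.text import TfidfVectorizer

competitor_docs = [
    "running shoes cushioning trail running marathon training",
    "trail running shoes grip cushioning waterproof",
]
my_doc = "buy cheap shoes online free shipping"

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(competitor_docs + [my_doc])
terms = vectorizer.get_feature_names_out()

# Average competitor weight per term vs. our own page's weight
competitor_avg = matrix[:-1].mean(axis=0).A1
mine = matrix[-1].toarray()[0]

gaps = sorted(
    ((t, c) for t, c, m in zip(terms, competitor_avg, mine) if c > 0 and m == 0),
    key=lambda x: -x[1],
)
print(gaps[:10])  # terms competitors weight that our page never uses
```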
4. Visualizing Your Site's Internal Linking
I have written in the past about visualizing a site's pages and links as a graph to facilitate the analysis of a website's internal linking, which was doable but took a lot of effort. The process required exporting the crawled URLs, then processing them with visualization tools.
This has now been made easy by the "Visualizer" functionality of OnPage.org. It not only automatically generates the internal link graph of any site, but also provides options to browse, filter by number of links, show link relationships, and show only the nodes (pages) that follow a certain pattern.
This can be extremely helpful for understanding how a site is internally linked, the cardinality of its links, and whether there are any "orphan pages" or areas of the site that are not connected to the rest.
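If you prefer to script it, the same analysis can be approximated with networkx from any crawler's link export. A sketch, assuming a hypothetical CSV of one source,target pair per row:

```python
# Sketch: build an internal link graph from a crawl's edge list and
# flag orphan pages. The CSV format (source,target per row) is an
# assumption; adapt it to your crawler's export.
import csv
import networkx as nx

G = nx.DiGraph()
with open("internal-links.csv", newline="") as f:
    for source, target in csv.reader(f):
        G.add_edge(source, target)

# Pages with no incoming internal links ("orphans", once you exclude the homepage)
orphans = [page for page, indeg in G.in_degree() if indeg == 0]
print(f"{len(orphans)} pages have no incoming internal links")

# Disconnected clusters: areas not linked to the rest of the site
components = list(nx.weakly_connected_components(G))
print(f"{len(components)} weakly connected component(s)")
```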
5. Getting All Key Optimization, Link Popularity, Social & Organic Traffic Data For Your Top Site Pages In A Single Place
Gathering the data for an SEO audit can be time-consuming. This data includes a website's technical optimization, content, link popularity, current organic search traffic, and search engine rankings, which we used to obtain from different, non-connected data sources that were a challenge to integrate later.
This data gathering can now be largely automated thanks to URLProfiler, which directly retrieves much of the required data while combining data from many other tools. For example, to get all the key SEO metrics for your site's highest-visibility pages, you can download the "top pages" CSV from the Search Console Search Analytics report, import it into the Screaming Frog SEO crawler in "list mode," and crawl the URLs.
Once crawled, you can import them directly into URLProfiler with the "Import from Screaming Frog SEO Spider" option. Then select the additional metrics you want for these pages: Mozscape link popularity and social share metrics, Google Analytics organic search traffic data (you'll be able to select the segment you want), and Google PageSpeed and mobile validation (these require free API keys from Moz and Google).
Now you can run URLProfiler and get the results in a few minutes in one spreadsheet: all the data from Screaming Frog, Google Analytics, Mozscape links and social shares, and Google PageSpeed and mobile validation for your pages with the highest visibility in Google's Search Console. I can't imagine the time I would have needed to put this all together manually.
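The same consolidation can be sketched in pandas if you prefer working from raw exports. The file and column names below are hypothetical placeholders, not the tools' real export schemas:

```python
# Hypothetical sketch of merging per-URL exports into one audit table.
# File names and column names are placeholders; adjust to your exports.
import pandas as pd

crawl = pd.read_csv("screaming-frog-crawl.csv")   # assumed to have a "URL" column
links = pd.read_csv("mozscape-metrics.csv")       # assumed: "URL", "Page Authority"
traffic = pd.read_csv("analytics-organic.csv")    # assumed: "URL", "Sessions"

merged = (crawl
          .merge(links, on="URL", how="left")
          .merge(traffic, on="URL", how="left"))

merged.to_csv("seo-audit-top-pages.csv", index=False)
print(merged.head())
```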
There's no excuse not to put together a quick SEO audit for your most important pages, taking all the key metrics into consideration.
6. Getting Relevant Link Prospects With The Desired Requirements And Direct Contact Information
Obtaining a list of sites that are highly relevant to your business might not be that difficult; doing so while looking only for highly authoritative sites, from a specific country, with visible contact information (among other criteria) is a bit more complex.
All this can be easily done now with the LinkRisk Peek tool, which provides many advanced filters to only get the sites that will be relevant and feasible to use for outreach.
7. Tracking Daily Rankings Of Full SERPs For Your Relevant Keywords
There was a time when we tracked the rankings of our most important keywords only for our own sites and our top competitors. Due to ongoing ranking fluctuations, new competitors sometimes appear that we were not tracking, which makes it hard to identify the correlation of gains and losses against them.
Additionally, once we got the ranking information, we had to analyze the pages to identify the potential reasons for the ranking shifts. We did this using tools to obtain the domain/page link popularity, among other factors.
This is now easier to do with tools like SERPWoo. Rather than tracking specified URLs (yours and your competitors'), SERPWoo tracks the top 20 results for your keywords by default. It also includes useful metrics such as page and domain link popularity, social shares, etc., to help marketers more easily analyze the potential causes of a rankings fluctuation.
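The underlying idea, snapshotting the full top 20 daily and diffing it, is simple to sketch. Here fetch_serp() is a hypothetical helper (a rank-tracking API or scraper of your choice) that returns the ranked URLs for a keyword:

```python
# Sketch of day-over-day SERP diffing. fetch_serp() is a hypothetical
# helper returning the top-20 URLs, in order, for a keyword.
import json
from pathlib import Path

def diff_serp(keyword: str, todays_results: list[str]) -> None:
    store = Path(f"serp-{keyword}.json")
    previous = json.loads(store.read_text()) if store.exists() else []

    new_entrants = set(todays_results) - set(previous)
    dropped = set(previous) - set(todays_results)
    print(f"{keyword}: {len(new_entrants)} new, {len(dropped)} dropped")

    store.write_text(json.dumps(todays_results))

# diff_serp("running shoes", fetch_serp("running shoes"))  # hypothetical helper
```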
I hope that these functionalities help you as much as they have helped me! Which other SEO activities are you now automating that used to take you a lot of time? Please, feel free to share in the comments!
Image used under Creative Commons from Flickr
Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land. Staff authors are listed here.
About The Author: Aleyda Solis is an International SEO Consultant & founder of Orainti, a digital marketing consultancy focused on helping businesses -- from startups to multinational companies in Europe, Asia, North & South America -- to grow their multi-device organic search visibility, traffic and ROI in multi-lingual & multi-country environments. She is also a frequent blogger, a teacher at the Online Marketing Master of Pompeu Fabra University and Kschool, and an international speaker at online marketing conferences, having spoken in more than 12 countries and at 60 conferences. (Some images used under license from Shutterstock.com.)
'Search intelligence' is becoming an increasingly popular phrase within the SEO industry. It refers to the realisation that we should be using the 'big data' readily available to us in search to make business decisions. For example, by analysing the words displayed in Google's search engine results pages (SERPs), we can gain a better understanding of demand, behaviour and the marketplace in general. Through these findings we can then gain deeper insight into brand engagement levels and adapt marketing activity accordingly.
To put this into context, I want to reflect on traditional advertising and its evolution. Advertising was built on the need to shape people's thoughts and feelings about a brand. Millions of billboards, newspapers and commercials were born of this idea and have shaped the industry into what it is today. But what tools and data were available to help advertisers measure engagement and brand sentiment in the past? Post-campaign surveys and sales figures were all advertisers had to depend on, and they provided very little detail. With the changing face of advertising and the increasing influence of search marketing, the way campaigns are measured has completely shifted, and the need for so-called 'search intelligence' has never been greater.
Think of the landscape of modern search. Compared with traditional advertising, the level of insight available is phenomenal. For any brand you can get a wealth of information from the SERPs, including the brand website, Wikipedia information, news articles and retailers selling the brand's products, as well as information about competitors and User Generated Content (UGC) such as reviews of the brand and its products or services. All this information can help paint a much more detailed picture of brand engagement and sentiment, helping you understand how your brand is perceived in the public domain.
So what exactly is 'Search Intelligence' and how is it going to change how we measure and analyse brand engagement online?
To answer this, it's important first to understand how the user engages with the SERP. Recently at iProspect we conducted a Click Through Rate (CTR) study covering 250,000 keywords across 120 brands; this internal research allowed us to develop a bespoke model for traffic estimation. What we discovered was that position one for branded terms tended to get 51.4% of all clicks; as illustrated below, Apple's top result in the SERP would get 51.4% of all traffic.
Illustration of page one SERP with calculated CTRs for the search term 'Apple':
From this research we were able to determine that 48.6% of all brand searches do not go directly to the advertiser's site. These users are still engaging with the brand online, but not with the owned assets the advertiser creates and controls; rather, they engage with the other assets presented to them in the SERPs, such as news articles, reviews, other retailers selling the brand's products, or even competitors.
To expand on the example in the image above, the Telegraph article in sixth organic position would get 3.5% of all branded searches. This would amount to 35,000 visits per month of branded Apple traffic to the Telegraph site from people searching the term 'Apple'. Those searchers would then be faced with the Telegraph's content on Apple and its positive or negative sentiment. It could therefore have a considerable impact on brand impression and engagement online, despite not having been produced by the brand itself.
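The arithmetic behind such estimates is just search volume times position CTR. A toy sketch using the CTR figures cited above; note the one-million monthly search volume is inferred from the article's own numbers (35,000 visits at a 3.5% CTR), not a reported figure:

```python
# Toy traffic-estimation sketch using the CTR model described above.
# MONTHLY_SEARCHES is an assumption inferred from the article's numbers:
# 35,000 visits / 0.035 CTR implies ~1,000,000 monthly searches.
MONTHLY_SEARCHES = 1_000_000

ctr_by_position = {1: 0.514, 6: 0.035}  # figures from the iProspect study above

for position, ctr in ctr_by_position.items():
    visits = MONTHLY_SEARCHES * ctr
    print(f"Position {position}: ~{visits:,.0f} estimated monthly visits")
```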
So what do these insights mean for brand management? Understanding who is engaging with your branded traffic outside of your owned assets is vital to building on your brand presence and SEO rankings. At the moment, SEO ranking tools focus only on a brand's domain when reporting SEO positions, which can give an incomplete picture of your brand online.
However, with access to the right sort of data, it is possible to gain a deeper understanding of who the key operators are in the branded space from both a paid (PPC & affiliate) and an organic perspective. These insights can then inform online marketing activities such as online PR, relationship and retailer management. By digesting the wealth of branded information available online and breaking it into the key areas of Bought, Owned and Earned opportunities, marketers can prioritise their efforts to achieve maximum online visibility for their brands and, in turn, increase brand sentiment and engagement by addressing areas of concern.
So next time you want to understand how your brand and products are perceived by the online community, have a look at the SERPs and think beyond just your own brand assets.
Jeremy McDonald, Contributor
Head of SEO Strategy at iProspect UK
On June 17, Google's algorithm seemed to get a jolt from a mysterious update that Google deemed a routine, non-major change. Google specifically said the update was not related to Panda, Penguin, or HTTPS, and wouldn't comment further.
They just said, "We're always making improvements to our search algorithms and the web is constantly evolving. We're going to continue to work on improvements across the board."
According to MozCast, which measures the "temperature" of changes in the Google algorithm, the update came in at 101.8°F. For comparison, the first Penguin update registered only 93.1°F.
Since initial reports on the update, Searchmetrics has analyzed it a bit and found that news sites are benefiting most, and that it appears to be related to trending keywords and real-time hot topics. Top winners, according to the company, were WSJ.com, USAToday.com, Dailymail.co.uk, BusinessInsider.com, Time.com, NBCNews.com, LATimes.com, NYPost.com, TechCrunch.com, FoxNews.com, Steampowered.com, BizJournals.com, TheVerge.com, Fortune.com, Gizmodo.com, Dict.cc, HollywoodLife.com, TechTarget.com, and WindowsPhone.com.
Most of these sites publish "fresh and newsworthy" content on a regular basis, as Searchmetrics notes.
The update coincides with a major refresh of Google Trends, which now provides data in real time, and takes into account trends on YouTube and Google News. Google hasn't confirmed the connection here, but it seems like the most likely explanation at this point.
As Searchmetrics notes, Google also has the Twitter fire hose now, so that's more real-time data it can use; it's unclear whether that's connected. The only announced use of this data is tweets appearing in Google's mobile results, but it's probably safe to say that Google can tap into it for other purposes with bigger implications than that specific feature.
When the update was spotted by Moz, they dubbed it the Colossus update. Searchmetrics is calling it the seemingly more fitting "News-wave" update. I don't know if either name will stick, but it does appear that freshness is once again a major priority of the Google algorithm. This has been taken too far by Google in the past, in my opinion, so we'll see how it goes this time.
Interestingly, Searchmetrics reports that Wikipedia has seen a bit of a drop in SEO visibility as a result of the update. There was some speculation initially that the shakeup in tools like MozCast was related to Wikipedia switching to HTTPS; since Wikipedia is usually the top result for many queries, any changes to the site could significantly change SERPs in general. According to Searchmetrics, however, Wikipedia's placement has dropped a little because news sites now rank for some terms.
Tom Williams looks at the latest search engine optimisation news, including Google's latest core ranking change and Bing's move to TLS protocol with HTTPS.
Google Implements Core Ranking Change
Google rolled out another update on June 17, one that was not Panda, Penguin or HTTPS-related. While Google remained evasive on the update, Search Engine Land confirmed it was a core ranking change, something Google does regularly throughout the year.
Google told Search Engine Land: "This is not a Panda update. As you know, we're always making improvements to our search algorithms and the web is constantly evolving. We're going to continue to work on improvements across the board."
Many automated tracking tools reportedly saw huge spikes in terms of changes happening to Google search results.
In addition, Search Engine Land reported that Google had said a core search algorithm update was on the cards for the future as the search engine continues to work on increasing search quality.
Bing Prepares for Move to HTTPS This Summer
Bing has announced its support of the industry's move to TLS with the news that it will shift to TLS protocols this summer. As part of the search engine's move to expand encryption across its network, traffic from Bing will begin to move from http://www.bing.com to https://www.bing.com in the coming weeks.
Marketers and webmasters will still be able to identify traffic as coming from Bing, but the search engine will not include the used query terms in its referrer string. This move mirrors that of Google, which took similar action three years ago.
Bing said that some limited query term data will still be available through its webmaster and advertiser tools, including its Search Query Terms Report in Bing Ads UI or through API; by adopting Universal Event Tracking to see performance metrics such as bounce rates, duration per visit and pages visited; and by signing up for Bing Webmaster Tools to see keyword and ranking data.
Duane Forrester, senior product manager at Bing, said: "While this change may impact marketers and webmasters, we believe that providing a more secure search experience for our users is important. With this change, you will still be able to see Bing as the origin (referrer) of the encrypted traffic, though analytics tools you are using to analyze your traffic generally have their own, proprietary way of including this information in their search reports."
Google Adds New Quotes Answer Box
Google's Direct Answers have a new addition: quotes from the rich and famous. The search engine is now showing quotes by famous people at the top of SERPs on desktop and mobile.
The quote box will appear if you search for a famous person and add "quotes" to the end of your query.
Google does not link to the source nor does it attribute the quotes, which is likely to hit the CTR of websites that offer this type of content.
EU: Right To Be Forgotten Appeals Work Just Fine
Twelve months have passed since the landmark case of Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González, in which the EU's Court of Justice handed down a judgement enabling data subjects to request that search engines, under certain conditions, delist links appearing in search results for a person's name. Six months after European DPAs adopted common guidelines on implementing the CJEU's judgement, a survey has found that the Right To Be Forgotten appeals system is working efficiently.
WP29, the working group charged with overseeing the appeals process, launched the survey to evaluate its practice regarding delisting requests. Some 2,000 people were surveyed, and the results showed consistency of decisions; in the great majority of cases, a search engine's refusal to consent to a delisting request was justified by the fact that the information was directly related to the individual's professional activity or to current events.
Not surprisingly, the majority of complaints concerned Google, which is the largest search engine in Europe.
Read last week's SEO news roundup: Mobile Innovations as Siri Suggests and Google Now Drops Formal Search