
Google Has Axed Authorship in SERPs

John Mueller announced on Google+ that Google has stopped showing authorship in search results. Here is a quote from his post explaining why the experiment was ended:

If you’re curious — in our tests, removing authorship generally does not seem to reduce traffic to sites. Nor does it increase clicks on ads. We make these kinds of changes to improve our users’ experience.



Google’s Penguin 3 Update Coming Soon

Brace yourselves: Penguin 3 is coming soon. Google has been quiet for a while since Matt Cutts went on leave, but John Mueller has hinted that the update is on its way.

Something interesting about these Penguin updates is how they open up opportunities for negative-SEO extortion. This is definitely not the first case of it happening, and George Zlatin gave a good quote in the post that stood out to me:

“This is a prime example of why I don’t agree with Google’s policy about penalizing webmasters for spammy links.”

Google I/O Keynote Speech 2014

The main focus of this keynote was Android, Chrome, and the investments Google is making to further improve the user experience.


Matt Cutts on Ranking Content With Few Links

Matt Cutts posted a new video on 6/2 explaining how content can rank when few links point to a page. He compared it to Google before links, when ranking relied on the text of the page itself. The words on the page are important, and how often they appear matters, but overdoing it raises suspicion of over-optimization. Google also considers the domain’s authority and whether it is reputable.
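
The idea that on-page word frequency helps up to a point, and then stops helping, can be illustrated with a toy scorer. This is purely an illustration of diminishing returns, not Google’s actual ranking formula:

```python
import math
import re

def toy_term_score(text: str, term: str, cap: int = 5) -> float:
    """Toy relevance score: each extra occurrence of `term` counts for
    less (log damping), and occurrences beyond `cap` add nothing at all,
    mimicking the idea that keyword stuffing stops helping."""
    words = re.findall(r"[a-z']+", text.lower())
    count = sum(1 for w in words if w == term.lower())
    return math.log1p(min(count, cap))

page = "SEO tips: good seo is about content, not repeating seo seo seo."
print(toy_term_score(page, "seo"))       # damped score for 5 occurrences
print(toy_term_score(page * 10, "seo"))  # 50 occurrences, capped: same score
```

Repeating the term ten times over buys nothing here, which is the gist of the over-optimization warning.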


Matt Cutts on Two Identical Links on One Page

What happens when two identical links are placed on one page? This question was recently asked in the Google Webmaster forum, and Google’s Matt Cutts explains that it might not be that important. He suggests that one should look at the “higher mountain top of it,” which presumably means focusing on the big picture instead of worrying about the possible effect of identical links on PageRank.
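
If you are curious how often identical links actually occur on a page, counting them takes only a few lines with Python’s standard html.parser (a throwaway sketch, unrelated to anything Google provides):

```python
from collections import Counter
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects every href found in <a> tags, duplicates included."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

html = """
<p><a href="/pricing">Pricing</a> is simple.
See <a href="/pricing">our pricing page</a> or <a href="/about">about us</a>.</p>
"""
collector = LinkCollector()
collector.feed(html)
duplicates = [href for href, n in Counter(collector.hrefs).items() if n > 1]
print(duplicates)  # ['/pricing']
```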



See What Googlebot Sees With “Fetch as Google”

Google has updated its powerful Webmaster Tools platform with a feature that lets users see what Googlebot sees when their websites are crawled. “Fetch as Google” shows server headers and HTML in the results, helping webmasters spot coding problems and other technical issues. Google renders a specified web page, attempting to access its linked CSS and JavaScript files, and returns the rendered page along with diagnostic data in the results.

So how do you use this feature? Go to the Crawl section of Webmaster Tools and click “Fetch as Google.” Paste the URL of the page you want to render and submit it. The process may take a few moments while the page is rendered.
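
“Fetch as Google” itself lives in the Webmaster Tools UI, but you can approximate the server-header part of the check from your own code by requesting a page with a Googlebot User-Agent and inspecting the response. A minimal Python sketch follows; this is not Google’s renderer, it will not execute JavaScript, and the tiny local server exists only so the example runs offline:

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

# Googlebot's published desktop user-agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_headers(url: str):
    """Request `url` with a Googlebot User-Agent; return (status, headers)."""
    req = Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    with urlopen(req) as resp:
        return resp.status, dict(resp.headers)

# Tiny stand-in server so the sketch runs offline.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>hello</body></html>")

    def log_message(self, *args):  # keep the demo output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

status, headers = fetch_headers(f"http://127.0.0.1:{server.server_port}/")
print(status, headers.get("Content-Type"))  # 200 text/html
server.shutdown()
```

Against a real site, point `fetch_headers` at your own URL and compare the status code and headers with what Fetch as Google reports.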

Aside from helping webmasters diagnose technical problems, the feature is also useful for discovering pages that are accidentally not being crawled. Pages you have restricted on purpose via robots.txt will not be rendered, and you can opt out of rendering social media buttons and other elements that do not contribute directly to layout and visuals.
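
The robots.txt side of this can be checked programmatically with Python’s standard urllib.robotparser, to confirm whether a given URL is even crawlable by Googlebot before you try to render it (illustrative only; Fetch as Google performs its own check):

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt: one directory blocked for all crawlers.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/about"))      # True
print(parser.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```

A URL that fails `can_fetch` here is one Fetch as Google would refuse to render.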

Use this feature, along with the myriad others in Google’s Webmaster Tools kit, to diagnose and optimize your websites with ease.


Google Rolls out Major Panda Update 4.0

Google rolled out a major update to its Panda algorithm this week. The new version, dubbed Panda 4.0, was announced by Google engineer Matt Cutts on Twitter. In the past the search giant did not announce Panda changes because they were monthly rolling updates, so the announcement itself signals that this update is bigger than those before it.

