On April 24, 2012, Google updated its algorithm under the not-so-ominous codename “Penguin.” Penguin targets websites that violate Google’s Webmaster Guidelines. But what does that actually mean? As an SEO Manager, here are some ways I’ve seen Penguin alter the search engine optimization landscape:
- Title Tags and Meta Descriptions
It has become increasingly important to make title tags and meta descriptions succinct and natural-sounding. Websites with overly long title tags and/or meta descriptions can be downranked by Penguin, and repeating keywords in the metadata can also drop your rankings.
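As a rough sketch, a Penguin-friendly head section might look like the following. The business name and copy are hypothetical, and the character limits (roughly 60 for titles, 160 for descriptions) are common industry guidelines rather than official Google numbers:

```html
<head>
  <!-- Hypothetical example: a succinct, natural-sounding title tag (~60 characters or fewer) -->
  <title>Custom Surfboards in San Diego | Carver Boards</title>
  <!-- A natural meta description (~160 characters or fewer) with no keyword repetition -->
  <meta name="description" content="Carver Boards hand-shapes custom surfboards in San Diego. Browse our longboards and shortboards, or order a custom shape.">
</head>
```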
- Backlinks
Websites that link back to your site should be relevant to your business or area of focus. More importantly, the backlinks should be high-quality and non-spammy. Black Hat SEO practices like link farming are a surefire way to kill rankings.
- Content
Webpage content should be high-quality and relevant to your keywords, and each webpage should have unique content. Ixnay on keyword stuffing and duplicate content.
- Robots.txt and Sitemap.xml
Google has search bots that crawl your site. If you imagine your website as a city, the Sitemap.xml is the map the Googlebots use to get around, and the Robots.txt file is the set of signs telling them where and where not to go. If Google can’t access your site, how can it determine whether you are relevant for certain keywords?
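A minimal Robots.txt illustrating the idea might look like this. The directives are standard robots.txt syntax; the domain and blocked paths are hypothetical examples, not recommendations for any specific site:

```text
# Hypothetical robots.txt for briancarverwebsite.com
User-agent: *
Disallow: /admin/        # keep bots out of private areas
Disallow: /cart/
Allow: /

# Point crawlers at the sitemap
Sitemap: http://briancarverwebsite.com/sitemap.xml
```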
- Duplicate Content
This goes back to having unique content. A common problem I’ve seen: clicking a site’s home button takes you to a page that looks identical to the homepage but lives at a URL like briancarverwebsite.com/index.php rather than briancarverwebsite.com. This is a no-no. Google views these as two separate pages with duplicate content, and your rankings may suffer accordingly.
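One common way to resolve this, sketched here with the example domain above, is a canonical tag that tells Google which URL is the “real” page (a 301 redirect from /index.php to the root URL accomplishes the same thing):

```html
<!-- Placed in the <head> of briancarverwebsite.com/index.php,
     this points Google at the preferred version of the page -->
<link rel="canonical" href="http://briancarverwebsite.com/">
```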
- Quality Results for Users
This is the biggest change I’ve noticed: the overall quality of search results. Google wants you to find what you’re looking for, and fast. Gone are the days of clicking through spammy links until you stumble on something relevant to your search.
Play by Penguin’s rules and prosper. Break the rules and pay the penalty!
Brian Carver is one of the Account Managers at SEOhaus. If you would like to stay up-to-date on all of the latest SEO industry news and tips, you can subscribe to our blog here. Thanks for reading the SEOhaus blog!
Whether you’re clicking, surfing, or browsing the depths of the web or scrolling across any given website, you need a certain amount of mobility as you travel from one page to the next. As user experience comes to the forefront of the online community, navigation becomes increasingly critical for users going from page to page, or just trying to find their way back home. But how do you make sure that users know where they’re going?
Like any journey, it’s always best to travel with a map. Though clear onsite navigation should be a priority in site design, like well-placed street signs, sitemaps act as a safety net, offering a bird’s eye view of a site’s architecture. Online, sitemaps provide an overview of all of the pages of your site, how each page relates to the others, and how important each URL is to the structure of the site.
There are two types of sitemaps that every website should use, and each caters to a very different type of web traveler. The first is the HTML sitemap. This sitemap is visible on your site and offers a guide that acts as an alternative to your primary navigation or menu. It should provide clarity and additional context about the pages on your site, ultimately leading to easier navigation. This is especially important if your site uses Flash, AJAX, or other feature-rich dynamic content that may not render for every visitor as they go from page to page.
The second is the XML sitemap, which caters to non-human visitors: Google’s robots and other search-engine crawlers that need to index all of the pages and information on your site. The XML sitemap is critical for SEO. It uses metadata to tell search engines which pages on your site are most important and how each page relates to the others. These sitemaps can be submitted directly to Google through Webmaster Tools, helping search engines recognize the salient features of each page.
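To make the metadata concrete, here is a minimal XML sitemap following the sitemaps.org protocol. The URLs, dates, and priority values are hypothetical placeholders reusing the example domain from earlier:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- The homepage: highest priority, updated often -->
  <url>
    <loc>http://briancarverwebsite.com/</loc>
    <lastmod>2012-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- A secondary page: lower priority, rarely changes -->
  <url>
    <loc>http://briancarverwebsite.com/about/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>yearly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

The `priority` and `changefreq` values are hints to crawlers about relative importance and update frequency, not guarantees of crawl behavior.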
Sitemaps are an on-page optimization component of SEO, and they can never hurt as sites work to become more optimized and user-friendly. Though Google does not guarantee it will crawl every URL, a sitemap ensures that search engines are more aware of your site’s structure and, in turn, know how best to direct traffic.
If you ever log into your Google Webmaster Tools account and notice that Google has experienced crawl errors, it’s a good idea to fix them ASAP. When Google sends its spider bots to your website and they find errors, it not only sends the bots into dead ends, it prevents them from continuously crawling your website. No one really knows how much time the bots spend on your site, but if they run into errors, it’s almost certainly less time than if your site’s navigation flowed smoothly. To top it off, it makes them mad; so mad they report it to Google.