SEO in 2013 encapsulates much more than it once did. Rather than dousing a site’s pages in keywords and purchasing thousands of crummy links, brands are now receptive to the idea of doing more than just ranking on Google’s first page. That’s partly due to updates in the search engines’ algorithms, but also due to the fact that we as searchers have raised our expectations for what is acceptable on a website. This post, however, focuses on fixing a website’s errors so that its pages can be crawled and indexed more easily. There’s undoubtedly value in recording content ideas in your handy sketchbook, but before pursuing those creative sparks, it’s best to pump the brakes and make sure your site is in shipshape.
Heed Webmaster Tools’ Warnings
If you’re not checking into Webmaster Tools on a regular basis, you need to start doing so. Google is a greedy bastard sometimes, but one thing we appreciate is its ability to communicate when its crawling minions stumble upon issues on a website that raise red flags.

404 Errors

Pages that link to non-existent URLs can cause a kink in a visitor’s flow through your site. We recently switched the URL structure of a few pages and experienced some consequent 404 issues. Fortunately, Webmaster Tools clarifies which URLs don’t exist and which pages are linking to them, so while it may be time-consuming, it’s certainly easy to either remove or update the broken links.

Low-Quality Links

Google’s Penguin update pummeled websites that purchased links in order to manipulate their PageRank. To stay on Google’s good side, it’s important to seek out only quality links. Obviously, “natural” links are ideal, but we think it’s just as valuable to reach out to websites that would appreciate sharing your content with their audiences. Regardless, buying links is no longer a feasible tactic, so if you’ve received a letter from Google stating that your site has suspicious links, we recommend requesting their removal (another tedious job, but you don’t really have a choice).

Important Pages Blocked by robots.txt

As mentioned in last week’s post, robots.txt is a file that lists which URLs a webmaster has forbidden bots from accessing (and thus, usually, indexing). Normally the pages added to this file aren’t meant to be reached via search engines, but if a seemingly important page, perhaps one with many inbound links, lands in the file, Webmaster Tools will alert administrators.

Pages Indexed

Webmaster Tools makes it pretty easy to figure out why a site isn’t being indexed properly, especially if you have submitted a sitemap.
If the number of pages submitted via sitemap is significantly higher than the number of those pages that are indexed, a large portion of the URLs on the sitemap may be failing to return a 200 OK response code.
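As a rough sketch of how you might audit that yourself, the snippet below pulls the URLs out of a sitemap file (the sitemap XML and example.com domain here are made up for illustration) so you can compare the submitted count against the indexed count, and then check each URL’s response code:

```python
# A minimal sketch: extract the <loc> URLs from a sitemap so the submitted
# count can be compared against what Google reports as indexed.
# The sitemap below is a hypothetical example.
import xml.etree.ElementTree as ET

SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/about</loc></url>
  <url><loc>http://www.example.com/old-page</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

urls = sitemap_urls(SITEMAP_XML)
print(len(urls))  # number of submitted URLs to compare against the indexed count

# To hunt for non-200 responses, you could then request each URL, e.g.:
# import urllib.request
# urllib.request.urlopen(url)  # raises HTTPError for a 404/500 response
```

This only covers the standard sitemap format; sitemap index files (nested sitemaps) would need one more level of parsing.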
Additionally, if more pages are indexed than are included in your sitemap, your website is likely suffering from duplicate content, which is very common and very easy to fix. For example, many websites have multiple versions of the homepage:

- www.homepage.com
- www.homepage.com/index.html
- homepage.com
- homepage.com/

In order to convey to Google which URL should be the one to rank, assign a rel=”canonical” link in the head sections of the non-canonical (non-preferred) pages and implement 301 redirects to keep website traffic consistently on the preferred version.
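As a sketch of what that looks like in practice (using homepage.com, the placeholder domain from the example above): each duplicate version of the page gets a canonical link in its head pointing at the preferred URL:

```html
<!-- In the <head> of every non-preferred version of the homepage -->
<link rel="canonical" href="http://www.homepage.com/" />
```

And, assuming an Apache server with mod_rewrite enabled, a 301 can forward visitors from the bare domain to the www version:

```apache
# Hypothetical .htaccess rule: permanently redirect homepage.com to www
RewriteEngine On
RewriteCond %{HTTP_HOST} ^homepage\.com$ [NC]
RewriteRule ^(.*)$ http://www.homepage.com/$1 [R=301,L]
```

Other servers (nginx, IIS) have their own redirect syntax, but the principle is the same: one preferred URL, everything else forwarding to it.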
Advice From Your Friendly Neighborhood SEOs
There are some tweaks Google won’t tell you to make because they’re more strategic, but fortunately you have us to steer you in the right direction.

URL Structure

Google has conveyed that it prefers the words in a page’s address to be connected with hyphens rather than underscores, and it has become common practice to do so. It’s also recommended that URLs include keywords that pertain to the page. There’s even a video: http://www.youtube.com/watch?v=AQcSFsQyct8

Title Tags

It’s really important that every page of your site has a unique, descriptive title tag. Google uses this as a major indication of what the page is about. Webmaster Tools does report which pages have duplicate title tags, but we have a few suggestions for optimizing them for both the search engines and humans:
- Use keywords that are relevant and specific to each page’s content
- Separate keyword phrases with dashes (-) and/or pipes (|)
- Include branded terms to tie keywords to the brand
- Do not exceed 70 characters
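To make those guidelines concrete, here’s a hypothetical title tag that follows all four (the article name and brand are invented):

```html
<!-- Relevant keywords, a pipe separator, the brand, and under 70 characters -->
<title>44 Content Marketing Tips for Small Businesses | BrandName</title>
```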
A title tag that leads with a number works because it tells readers exactly how many tips they’ll find in the article; 44 is quite a lot, so many searchers will probably want to check out the list. However, I’d recommend adding the brand if it can fit.

Meta Descriptions

Similar to title tags, meta descriptions should be unique to each page, but unlike title tags, they don’t help a page rank for a term. Rather, they help influence a searcher’s decision to click through or continue scanning the results. Here are some tips for writing meta descriptions that will help your pages reap some clicks:
- Copy: enticing, relevant, and written like ad copy.
- Length: 2-3 sentences, no longer than 150-160 characters.
- Keywords: use 1-2 relevant keywords, if possible to do so naturally.
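Putting those tips together, a meta description might look something like this (the copy and brand are invented for illustration):

```html
<!-- Ad-style copy, two sentences, ~140 characters, a couple of keywords -->
<meta name="description" content="Learn 44 practical content marketing tips
you can put to work today, from blog ideas to link-worthy guides. Advice from
the BrandName team." />
```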
Image Names

Search engine spiders are fast and thorough, but they can’t see images. For that reason, it’s critical to give image files and alt attributes keyword-rich (where possible) descriptions. This also makes it easier for photos or graphics on your site to display in image search results.
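In markup, that simply means a descriptive file name and alt text instead of a camera default (the recipe photo here is a hypothetical example):

```html
<!-- Descriptive file name and alt text, rather than IMG_0423.jpg with no alt -->
<img src="/images/chocolate-chip-cookie-recipe.jpg"
     alt="Stack of homemade chocolate chip cookies" />
```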
One of the greatest tools also happens to be grossly underrated: simply searching for your site in Google can alert you to all sorts of issues, whether it’s failing to rank locally for relevant queries, finding pages that shouldn’t be indexed, or gauging your brand’s reputation (either via instant results or ratings on review sites).
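A few sample queries illustrate the kinds of checks a quick search enables (example.com and the brand name are placeholders):

```text
site:example.com                  every page Google has indexed on the domain
site:example.com/blog             indexed pages within a single section
site:example.com "out of stock"   indexed pages containing a given phrase
"example brand" reviews           a read on your reputation in the results
```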
So now what?
There’s been a lot of talk lately about SEO being extinct or irrelevant. Long live content. Hail the king. Well, until we stop seeing sites bounce back after implementing the tweaks we mentioned, it’s safe to say SEO is alive and well; it’s just that ranking alone isn’t enough. Housing content that provides answers, sells, or entertains is critical, but it’s important to first make sure your site can perform at its potential. The on-site analysis portion of our SEO Audit covers all of these issues; we outline specific steps a brand can take in order to step it up.