Are your link building efforts outdated?

Last week we mentioned how to use keyword research organically. While it may seem elementary to some, the reality is Google’s algorithm has changed in the past couple years, forcing some brands to think twice about the way they are using (and sometimes abusing) SEO elements like keywords. (If this is you, be sure to check out the post so you know where you can safely use keywords on your site.) Today we’re going to dig into link building, or outreach, as it’s often referred to in 2013. Even if you haven’t purchased links for your site, there’s a pretty good chance your old methods are quickly fading into oblivion. So let’s look at a brief overview of which tactics are losing their relevancy and what you can do to get back into the game.

1. Put away the credit card.

This is probably obvious, but if you aren’t completely convinced, it’s only a matter of time before The Goog catches you red-handed and makes you suffer the consequences. Don’t buy links. Don’t let anyone convince you it’s a good idea.

2. Stop posting links in comments.

Most blogs (especially those that have high authority) automatically assign “nofollow” attributes to any links in the comments section. So go ahead, spam the comment sections of blogs that attract the audience you wish you could have. Not only will you annoy everyone reading the post, but you’ll also waste your time on a totally futile venture.
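If you want to confirm this for yourself, a quick look at a comment link’s markup will usually reveal the attribute. Below is a rough Python sketch; the function name and regex are our own illustration, and a real crawler would use a proper HTML parser rather than a regex:

```python
import re

def is_nofollow(link_html):
    """Rough check for a rel attribute containing "nofollow" on an
    anchor tag. This regex is just enough to illustrate the check;
    it is not a substitute for real HTML parsing."""
    return re.search(r'rel\s*=\s*["\'][^"\']*nofollow',
                     link_html, re.IGNORECASE) is not None

print(is_nofollow('<a href="https://example.com/" rel="nofollow">me</a>'))  # -> True
print(is_nofollow('<a href="https://example.com/">me</a>'))                 # -> False
```

A link that fails this check still passes no authority if the blog adds the attribute at render time, which is exactly what most commenting platforms do.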

3. Back off the directories.

There was a time when it was really beneficial to be included in directories because it made the crawling/indexing process easier for search engines. Nowadays, some directories still hold value, but there are plenty that will end up hurting your reputation more than helping it. We recommend pursuing more relevant measures.

4. Bring something to the table.

If your link building strategy has always consisted of contacting webmasters for a link, just because, well, you’ve probably noticed that it’s not working anymore. And that’s because while it used to be flattering to be asked for a link, it is now aggravating for webmasters. Why should they share the authority they’ve worked hard to build? Unless you have something to give them in exchange (i.e., content their audience will appreciate), don’t waste your time. Or theirs.

This brings us to the core of outreach strategy: content! Yes, we know you’re sick of hearing about it, but try to put on your grown-up pants and realize that you’re going to have to do things the hard way from here on out. Content generation is not easy, but it starts with keywords. This is because keywords are essentially real queries from real customers, so you can understand what kinds of questions you should be answering. (That’s the aha moment.)

We’ve recently written several posts about content, so if you’re stuck in a rut (and let’s face it, this happens to everyone), you can hopefully gain some inspiration. First figure out what purpose your content should serve; there are eight purposes we came up with. Keep in mind that while content can sell for you, it doesn’t always have to be overly promotional.

Outreach is still a very difficult, time-consuming process, so we took the time to outline our method. It’s literally a step-by-step instructional guide for conducting outreach. We’ve actually assembled a couple of versions, one that is geared toward tech-savvy folks who have access to the kind of software we use at Evolve Digital Labs. The other is for non-nerds, so it’s a bit easier to digest. Whether you’re a fellow SEO, marketing director, or student, we invite you to download these outreach documents and get started.

Using Keyword Research for Organic Search

We’re big fans of Keyword Research. Huge. We include it in our SEO Audit because we know how a brand can benefit from targeting the right audience with its site. It’s pretty obvious how a client can use keyword research for Paid Search; bidding on suggested terms will prompt the display of a relevant ad to the searcher. If implemented strategically, the campaign will direct paid clicks to keyword-specific landing pages.

How do I safely use keywords on my site?

What many clients don’t understand, however, is how to use keywords for organic search. Recent updates in Google’s algorithm penalize websites that try to manipulate the system. One example is the creation of thin content; webmasters will build out thousands of pages containing one or two paragraphs. These pages provide very little value to the visitor; rather, they exist only to house keywords and trick Google into thinking the site is authoritative on a subject.

If your site has experienced a sudden drop in traffic, it could be the pages are in violation of Google’s policies. Before you lose all hope, let’s look at some best practices for organic keyword usage. We can fix this.


URLs

The web address of a page is a critical element; it conveys to Google a summary of the content on a page. We definitely recommend using a keyword in a URL, preferably close to the domain. The URL doesn’t necessarily have to match the title of a page or post verbatim, which is helpful for blog posts especially.

Title Tags

Much like the URL, the title tag is a very important element; it reinforces the subject of the page. Every page should have a unique title tag that contains a keyword close to the front. Because title tags show in SERPs and typically play a significant role in helping searchers decide whether to click through or not, we also recommend adding a branded phrase, such as a company name, to the title tag. Make sure to keep everything under 70 characters.
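These title-tag guidelines are easy to check automatically. Here is a minimal Python sketch; the function name, thresholds, and example tag are our own illustration, not part of any official tool:

```python
def check_title_tag(title, keyword, brand=None, max_len=70):
    """Flag common title-tag problems: too long, keyword missing or
    buried, brand absent. Returns a list of issue strings (an empty
    list means the tag passes these checks)."""
    issues = []
    if len(title) > max_len:
        issues.append("over %d characters" % max_len)
    lowered = title.lower()
    if keyword.lower() not in lowered:
        issues.append("keyword missing")
    elif lowered.index(keyword.lower()) > len(title) // 2:
        issues.append("keyword appears late in the tag")
    if brand and brand.lower() not in lowered:
        issues.append("brand name missing")
    return issues

# A 44-character tag with the keyword up front and a brand at the end:
print(check_title_tag("Tax Services for Small Businesses | Acme CPA",
                      "tax services", brand="Acme CPA"))  # -> []
```

Running this across a crawl of your site is a quick way to spot pages with missing or duplicate-looking titles before Google does.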

Meta Descriptions

These elements are also found in SERPs, but search engines don’t take the contents into consideration when ranking a page. Still, people searching for a solution rely on meta descriptions to deliver a sneak peek of a page, so it’s a good idea to use keywords when relevant.

Body copy

It’s fairly easy to build pages that aren’t overstuffed with keywords; just make sure you are writing for humans and not search engines. Paragraphs should sound natural when read aloud. Additionally, ask yourself if any new content you’ve added to your site is serving a real purpose. These litmus tests may seem elementary, but they will help you develop a new mindset of what it means to create quality content on the web. Finally (and this should be a given), do not duplicate content in order to switch out keywords. This is an outdated practice that doesn’t fool anyone, including Google. The only time repeated content is acceptable is when A/B testing multiple landing pages (but in this case, the variant pages should be assigned “noindex” attributes).


Anchor Text

Here’s where you need to be careful. Penguin, Google’s most recent major algorithm update, punished an extraordinary number of sites that engaged in unethical linking tactics, such as link buying, cloaking, and abuse of keywords in anchor text. The first two are quite obviously sketchy practices, but the last one can be more difficult to catch. When a site has too many links that are tagged with keyword-rich anchor text, Google will perceive this activity as intentionally manipulative.

For example, if your site is trying to rank for tax services, it would be a huge red flag for the pages to be swarming with link after link called “tax services,” “tax services,” “tax services.” First of all, a link like that doesn’t help visitors understand what the page will cover (aside from the broad topic of tax services). Keywords are completely fine to use; just try to vary the words used in the full anchor text. So instead of “tax services,” use “tax refunds for small businesses” (assuming that’s relevant to the page, of course).
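One way to spot this pattern on your own site is to measure how concentrated your inbound anchor text is. A small Python sketch (the function name and example anchors are hypothetical, and any threshold you act on is a judgment call, not a published rule):

```python
from collections import Counter

def top_anchor_share(anchors):
    """Fraction of inbound links whose anchor text matches the single
    most common phrase (case-insensitive). Values near 1.0 reflect
    the red flag described above: link after link carrying identical
    keyword-rich anchors."""
    counts = Counter(a.strip().lower() for a in anchors)
    return counts.most_common(1)[0][1] / len(anchors)

anchors = ["tax services", "tax services", "Tax Services",
           "tax refunds for small businesses"]
print(round(top_anchor_share(anchors), 2))  # -> 0.75
```

A profile where three out of four links say the same thing, as above, is the kind of distribution worth diluting with more varied, descriptive anchors.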

Instant Search

It’s really challenging to produce valuable content, especially for brands in less-than-exciting industries that struggle to deliver solutions in new ways. One underrated use of a keyword list is as a search tool for specific queries. The keyword “grass seed,” for example, would be typed into Google, followed by “a,” then “b,” and so on. Google Instant Search automatically fills in a query as you type, revealing common long-tail searches that relate to a keyword. It’s not always a home run, but it certainly can inspire new content ideas that will answer specific questions for segments of your audience.

This method is fantastic because it’s showing real questions that real potential customers are asking online. It provides your brand with a chance to create something from it, whether a detailed blog post, landing page for a product, or even a visually dynamic infographic.
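The typing half of this routine is easy to script. A small Python sketch that builds the “keyword plus letter” queries described above (actually fetching Google’s suggestions is left out, since that endpoint and its terms are beyond this post):

```python
import string

def suggest_seed_queries(keyword):
    """Build the 'keyword + letter' queries you would type into Google
    one by one to surface Instant Search autocomplete suggestions.
    This only automates the typing pattern; the suggestions themselves
    still come from watching the autocomplete dropdown."""
    return ["%s %s" % (keyword, letter) for letter in string.ascii_lowercase]

queries = suggest_seed_queries("grass seed")
print(queries[:3])  # -> ['grass seed a', 'grass seed b', 'grass seed c']
```

Working through all 26 seed queries for each core keyword is tedious, but it reliably turns up question-style searches you would never brainstorm on your own.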


Keyword research is still a critical element of attracting customers to a site. Why wouldn’t we want to know what industry-relevant terms are frequently typed into Google? The problem arises when brands greedily try to rank for unrelated (or barely related) phrases, or when they try to rank for relevant terms in unethical ways. Hopefully, this post will help you remember to focus on customers’ needs when generating new content pieces and optimizing your pages. If you’re feeling left out because you don’t even have a list of keywords to refer to, send us a note. We’d be happy to do the research for you. We’re really good at it.

Put content on hold; first, fix what’s broken.

SEO in 2013 encapsulates much more than it once did. Rather than dousing a site’s pages in keywords and purchasing thousands of crummy links, brands are now receptive to the idea of doing more than just ranking on Google’s first page. It’s partly due to the updates in search engines’ algorithms, but also due to the fact that we as searchers have raised our expectations for what is acceptable on a website. This post, however, acknowledges the significance of fixing a website’s errors in order to make the pages more easily crawled and indexed. There’s undoubtedly value in recording content ideas in your handy sketchbook, but before pursuing those creative sparks, it’s best to pump the brakes and ensure your site is shipshape.

Heed Webmaster Tools’ Warnings

If you’re not checking into Webmaster Tools on a regular basis, you need to start doing so. Google is a greedy bastard sometimes, but one thing we appreciate is its ability to communicate when its crawling minions stumble upon issues on a website that raise red flags.

404 Errors

Pages that link to non-existing URLs can cause a kink in a visitor’s flow through your site. We recently switched the URL structure of a few pages and experienced some consequent 404 issues. Fortunately, Webmaster Tools clarifies which URLs don’t exist and which pages are linking to them, so while it may be time-consuming, it’s certainly easy to either remove or update the broken links.

Low-Quality Links

Google’s Penguin update pummeled websites that purchased links in order to manipulate their PageRank. To stay on Google’s good side, it’s important to seek out only quality links. Obviously, “natural” links are ideal, but we think it’s just as valuable to reach out to a website that would appreciate sharing your content with its audience. Regardless, buying links is no longer a feasible tactic, so if you’ve received a letter from Google stating that your site has suspicious links, we recommend requesting their removal (another tedious job, but you don’t really have a choice).

Important Pages Blocked by robots.txt

As mentioned in last week’s post, robots.txt is a file that lists which URLs a webmaster has forbidden bots from accessing (and thus, usually, indexing). Usually, the pages added to this file aren’t meant to be accessed via search engines, but if a seemingly important page, perhaps one with many inbound links, is added to the file, Webmaster Tools will alert administrators.

Pages Indexed

Webmaster Tools makes it pretty easy to figure out why a site isn’t being indexed properly, especially if you have submitted a sitemap. If the number of pages submitted via sitemap is significantly higher than the number of those pages that are indexed, a large portion of the URLs on the sitemap may be failing to return a 200 OK response code.
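You can audit this yourself by requesting each sitemap URL and checking its status code. Here is a minimal Python sketch; the HTTP fetch is injected as a callable so the example stays offline, and all the names and URLs are illustrative:

```python
def audit_sitemap(urls, fetch_status):
    """Split sitemap URLs by whether they return 200 OK.

    `fetch_status` is any callable mapping a URL to an HTTP status
    code (in practice, a thin wrapper around urllib or similar);
    injecting it keeps this sketch testable without a network."""
    ok, broken = [], []
    for url in urls:
        (ok if fetch_status(url) == 200 else broken).append(url)
    return ok, broken

# Simulated responses standing in for real HTTP requests:
statuses = {"/": 200, "/services": 200, "/old-page": 404, "/moved": 301}
ok, broken = audit_sitemap(statuses, statuses.get)
print(len(ok), "ok,", len(broken), "broken")  # -> 2 ok, 2 broken
```

Anything in the broken bucket (404s, stray 301 chains) is a URL that Webmaster Tools will count as submitted but never index.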

Additionally, if more pages are indexed than are included in your sitemap, your website is likely suffering from duplicate content, which is very common and very easy to fix. For example, many websites have multiple versions of the homepage (such as one with the “www” prefix and one without). To convey to Google which URL should be the one to rank, you must assign a rel=”canonical” attribute in the head sections of the non-canonical (non-preferred) pages and implement 301 redirects to keep website traffic consistently on one of the many versions.
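For reference, the canonical hint is just a one-line element placed in each duplicate page’s head. A tiny Python sketch that builds it (the URL is hypothetical; the 301 redirect is configured server-side and is separate from this tag):

```python
def canonical_tag(preferred_url):
    """Return the <link> element to place in the <head> of each
    non-canonical duplicate, pointing search engines at the one URL
    that should rank."""
    return '<link rel="canonical" href="%s">' % preferred_url

print(canonical_tag("https://www.example.com/"))
```

The canonical tag consolidates ranking signals, while the 301 redirect consolidates visitors; sites with duplicate homepages generally need both.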

Advice from Your Friendly Neighborhood SEOs

There are some tweaks Google won’t tell you to make because they’re more strategic, but fortunately you have us to steer you in the right direction.

URL Structure

Google has conveyed that it prefers words in a page’s address to be connected with hyphens instead of underscores, and it has become common practice to do so. It’s also recommended that URLs include keywords that pertain to the page.

Title Tags

It’s really important that every page of your site has a unique, descriptive title tag. Google uses this as a major indication of what the page is about. Webmaster Tools does report which pages have duplicate title tags, but we have a few suggestions for optimizing them for both the search engines and humans:

  • Use keywords that are relevant and specific to each page’s content
  • Separate keyword phrases with dashes (-) and/or lines (|)
  • Include branded terms to tie keywords to the brand
  • Do not exceed 70 characters

(Screenshot of an example title tag, a list post promising 44 tips, omitted.)

This title tag works because it explains to readers how many tips they can read in the article; 44 is quite a lot, so many searchers will probably want to check out the list. However, I’d recommend adding the brand if it can fit.

Meta Descriptions

Similar to title tags, meta descriptions should be unique to each page, but unlike title tags, they don’t help a page rank for a term. Rather, they help influence a searcher’s decision to click through or continue scanning the results. Here are some tips for writing meta descriptions that will help your pages reap some clicks:

  • Copy: should be enticing and relevant and written like ad copy.
  • Length: 2-3 sentences, no longer than 150-160 characters.
  • Keywords: use 1-2 relevant keywords, if possible to do so naturally.
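As with title tags, these guidelines are easy to check programmatically. A small Python sketch (the function name, thresholds, and example copy are our own; the 150-160 character ceiling reflects what SERPs displayed at the time of writing, a guideline rather than a hard limit):

```python
def check_meta_description(text, keywords=(), max_len=160):
    """Apply the length and keyword tips above; returns a list of
    issue strings, empty when the description passes."""
    issues = []
    if len(text) > max_len:
        issues.append("too long (%d chars)" % len(text))
    missing = [k for k in keywords if k.lower() not in text.lower()]
    if missing:
        issues.append("missing keywords: %s" % ", ".join(missing))
    return issues

desc = ("Need help filing? Our small-business tax services take the "
        "stress out of April. Get a free consultation today.")
print(check_meta_description(desc, keywords=["tax services"]))  # -> []
```

What no script can check is whether the copy actually entices a click, so treat this as a floor, not a substitute for writing it like ad copy.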

Image Names

Search engine spiders are fast and thorough, but they can’t see images. Because of that, it’s critical to give image files and alt attributes keyword-rich (when possible) descriptions. This also makes it easier for photos or graphics on your site to appear in image search results.

Brand Search

One of the greatest tools also happens to be grossly underrated. Simply searching for your site in Google can alert you to all sorts of issues: failing to rank locally for relevant queries, pages that shouldn’t be indexed, and your brand’s reputation (either via instant results or ratings on review sites).

So now what?

There’s been a lot of talk lately about SEO being extinct or irrelevant. Long live content. Hail the king. Well, until we stop seeing sites bounce back after implementing the tweaks we mentioned, it’s safe to say SEO is alive and well; it’s just that ranking alone isn’t enough. Housing content that provides answers, sells, or entertains is critical, but it’s important to first make sure your site can perform at its potential. The on-site analysis in our SEO Audit covers all of these issues; we outline specific steps that a brand can take in order to step it up.

Robots.txt vs. noindex: What’s the difference and when to use which?

Earlier this week, I discovered that several of our uploaded images were being assigned a unique URL and indexed in Google. While it’s not likely that someone would stumble upon them with a site search, it is still alarming to see pages I didn’t know existed floating around the SERPs.

There are a couple of ways to prevent search bots from showing a website’s pages in the results. One is adding the URL(s) to the robots.txt file. Another way, the more effective way in many cases, is using meta tags to instruct robots what you want them to do. This post explains each route and provides some scenarios so you can better understand which way to go. It can be confusing.
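As a preview of the difference, here is a small Python sketch that emits both mechanisms side by side; the path is hypothetical, and the comments summarize the trade-off each route carries:

```python
def robots_txt_rule(path):
    """robots.txt route: blocks CRAWLING of `path` for all bots.
    Caveat: a blocked page can still appear in results if other
    sites link to it, because the bot never fetches the page and
    therefore never sees any noindex instruction on it."""
    return "User-agent: *\nDisallow: %s" % path

def noindex_tag():
    """Meta tag route: placed in a page's <head>. The bot must be
    ALLOWED to crawl the page so it can read this tag, after which
    the page is dropped from the index."""
    return '<meta name="robots" content="noindex">'

# The path below is hypothetical:
print(robots_txt_rule("/uploaded-images/"))
print(noindex_tag())
```

This is why combining the two on the same URL backfires: robots.txt keeps the bot from ever seeing the noindex tag.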