Friday, December 6, 2013

2014 SEO Playbook: On-Page Factors

Welcome to part 2 of my annual SEO playbook. (Click here for part 1.) I have to thank Danny Sullivan and the Search Engine Land team for giving me the perfect outline for the 2014 playbook, the Periodic Table of SEO Success Factors. Part 2 will cover on-page factors, including content, HTML and architecture. You’ll find more than enough food for thought and some very actionable steps. This is not a step-by-step SEO guide, and it’s pretty informal. Before embarking on a search engine optimization campaign, do your research or consult with an expert.
Periodic Table of SEO

Content: Quality

Quality was a big discussion item during 2013, especially topics like word count and deep content.
After Panda, you’d think we would be well past the age of producing short “fluff” articles. However, too many websites, especially business sites that struggle to post fresh content, continue the practice. Recently, I saw a corporate blog post listing 10 must-read books on a topic — the article consisted of 10 thumbnail images and the names of the books, linked to an online bookstore. You can’t afford to keep putting out cut-rate articles like that; in bulk, they are perfect Panda-penalty bait.
On the opposite end is deep content — pages or articles of around 1,500 words or more. Websites have seen success with this content, so it may make sense to take the time spent creating lots of short, “fluffy” posts and use it instead to produce a few longer, more meaningful articles. Whatever you do, make sure content is well written, with attention to grammar and spelling. Don’t just say something; back it up with thoughtful opinion or researched facts. Put some meat on the bones. Personally, when it comes to article content, if I cannot easily pass 450 words, I will combine it with other content or deem it not worth writing about.
As for e-commerce descriptions, I used to deem 250 words as the sweet spot. Nowadays, I am less concerned about word count and more focused on creating a great list, matching features with benefits.

Content: Keywords

Keyword research is not going anywhere and is still the foundation of all on-site SEO. The difference is, after the Hummingbird update, we are discussing the role of entities, where topics take the place of keywords in the result pages. Google has made great strides in synonym identification and concept grouping — some have even called it the death of the long-tail keyword. (But, as with all the supposed death knells in our field, this, too, is probably an exaggeration.)
My advice is to make sure each page stands on its own as a topic. Do not create multiple pages about the same exact thing in order to optimize for different keywords. Instead, stick to single, well-written, citation-worthy, topic pages and optimize them for multiple keywords. This can be another good reason to use long-form content.

Content: Engagement

Engagement is about whether visitors spend time reading your content or bounce quickly away. Once again, meaningful content is key. It’s amazing how it all comes back to quality. Are you publishing something your audience or target personas will want to read, or are you just filling holes in an editorial calendar — or perhaps publishing out of guilt because you have not published anything recently?
Engagement isn’t just limited to text content, either; Web page design is equally important. Words don’t just have to read well to be engaging; they have to look good. Readability includes everything from page layout to font selection to letter and line spacing. Additionally, pay attention to navigation and the presentation of links to other content, as these elements can have a huge impact on bounce rates and other visitor engagement metrics such as time on page and time on site.

Content: Ads

Another part of layout is the placement of ads. Search engines will not ding you for having advertisements. That would be hypocritical. What they will penalize is too many ads or inappropriate ad placements.
I do not foresee big changes in this area beyond the enhancement of current search engine policies. In addition to display ads, be especially wary of text link ads. Make certain they are content-appropriate or matching, and that you nofollow them. If you still use automated phrase link advertising inside your content, I strongly suggest you consider removing this. If you use interstitial or pop-up advertising, make sure it doesn’t interfere with the ability of search engines to crawl your pages.
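If you are unsure what a properly disclosed text link ad looks like, here is a minimal sketch (the URL and anchor text are hypothetical placeholders):

```html
<!-- A paid text link: rel="nofollow" tells search engines not to pass PageRank -->
<p class="sponsored">
  Sponsored: <a href="http://example.com/widgets" rel="nofollow">Acme Widgets</a>
</p>
```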

Content: Freshness

I am a big proponent of fresh content — this includes not just posting about hot topics, but also ensuring that you are publishing new content on a regular or frequent basis. Not only is new content important to attract readership, it also improves crawl frequency and depth. Earlier, I wrote that you should not create content just to check off your editorial calendar. Not to backtrack, but if you do not have an editorial calendar in place, you probably should create one and get to work creating content.
Think of your content as a tool to generate awareness and trust. This means you must get beyond writing about just your company and its products or services. Go broader and become a resource — a real, viable, honest-to-goodness resource — for your target market and the people or companies that your target market serves.
Taking this broad approach will give you more to write about, allowing you to focus on topics of interest to your target market. This is the kind of content you can build an audience with. In my opinion, if you are not trying to build an audience at the top of the marketing funnel, you are probably doing it wrong. Obviously, there are exceptions to this, though I think far more companies fail here than can safely ignore it.

HTML: Titles & Headers

Title tags are interesting right now. The usual rules for writing optimized title tags and headers have not changed. I do foresee search engines (Google especially) rewriting more title tags algorithmically. If you see Google rewriting your title tags, test changing your HTML to the same text Google presents in the SERPs. By test, I mean change a judicious few, then observe what happens to performance indicators. If you see improvement, a broader title tag optimization program could prove worthwhile.
Going back to entity search and optimizing for multiple keywords… when you are doing topic optimization, you must be cognizant of which keywords you use in the title and H1 tags. I wish I could give you a surefire formula, but one does not exist. As you look at synonyms, pay attention to which words or phrases received the most exact match searches and trust your intuition when it comes to popular language use.

HTML: Description

I don’t see anything changing with meta description tag optimization. Write unique descriptions for every page. They will not change your rankings, but well-written descriptions can increase click-through rates.
I always pay attention to length, aiming for around 150 characters. In reality, the actual cutoff depends on the combined pixel width of the characters, but from a practical standpoint, just make sure your descriptions are not getting truncated when they appear in the results.
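For reference, the tag itself is a single line in the page head; the wording below is a hypothetical example, written to be unique and self-contained:

```html
<!-- Roughly 150 characters, unique to this page, written to earn the click -->
<meta name="description" content="Our 2014 on-page SEO checklist: content quality, keyword topics, structured data and crawl tips you can apply to your own site today.">
```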
For pages that appear in site links, be sure that the portion of the description that appears beneath each link forms a coherent thought. This is a place where many enterprise sites and brands can improve.

HTML: Structured Data Markup

Structured data markup seems to be a big topic every year.
First is the question of whether or not you should use it for organic search engine optimization. Some long-time experts do not like structured markup or machine-readable language because they do not want to help the search engines present information in a format that does not generate visits.
For example, if you type in the name of your favorite NFL team, Google will show you information about that team, including their next scheduled game, right on the SERP. Here’s an example I fondly remember: someone once asked, if you ran a zoo website, would you want Google to show your business hours at the top of the search results, or do you want people to visit the website, where they will learn more about current exhibits and events? This is a fair question — to which I think the fair answer is, whatever will get the most bodies through the door.
Google, Bing and Yahoo are going to show the data they want and in the format they desire regardless of how you or I feel. Personally, I’d much rather be a trusted source, even if it means my website information is made available in the SERPs. For this reason, I am a huge proponent of structured data markup like schema.org and RDFa.
Other forms of structured markup, like the author and publisher tags, are not controversial and have entered the realm of best practices. Use them.
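For illustration, here is a minimal schema.org sketch using microdata, together with the authorship and publisher annotations mentioned above. All names, dates, and URLs are placeholders:

```html
<!-- schema.org Article markup (microdata); values are placeholders -->
<article itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">2014 SEO Playbook: On-Page Factors</h1>
  <span itemprop="author">Jane Doe</span>
  <time itemprop="datePublished" datetime="2013-12-06">December 6, 2013</time>
</article>

<!-- Authorship and publisher annotations, typically placed in the <head> -->
<link rel="author" href="https://plus.google.com/your-profile-id">
<link rel="publisher" href="https://plus.google.com/your-page-id">
```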

HTML: Keyword Stuffing & Hidden Elements

Negative ranking factors like keyword stuffing and hidden text are so old that many of us practitioners brush them off as search engine optimization 101. Unfortunately, nothing is ever so easy.
Stuffing is definitely a factor in e-commerce shopping cart optimization. It can be tricky not to use the same word or phrase over and over again when they are used as categories or descriptions for products. Different shopping carts have different levels of control. Some are more easily optimized than others. On category pages, it may be as simple as limiting the number of products you display on each page. Without going into an entire lesson on shopping cart optimization, what I will tell you is: if you have not done a shopping cart review in the last two years, it is time. Make certain your e-commerce platform is keeping up.
It still surprises me how often I see unintentional cloaking. Usually, it’s a result of the template writer getting around a quirk of the content management system. But I have also seen static links in a template that are cloaked using display: none on some pages while they appear on others, depending on something such as the category. The bottom line is this: if it appears on the page, it should be in the HTML. If it does not appear on the page, it should not appear in the HTML.
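To make that rule concrete, here is a simplified, hypothetical version of the template pattern to avoid:

```html
<!-- Risky: the links exist in the HTML but are hidden on some pages via CSS -->
<ul class="category-links hide-on-accessories">
  <li><a href="/widgets">Widgets</a></li>
</ul>
<style>
  /* If a category page hides these links, remove them from its HTML instead */
  .hide-on-accessories { display: none; }
</style>
```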

Architecture: Crawl

Not enough search engine optimizers pay attention to crawl. I realize this is a pretty broad statement, but too many of us get so caught up in everything else that this becomes one of the first things we ignore unless there are red, flashing error messages. Obviously, you want to make sure that search engines can crawl your website and all your pages (at least the ones you want crawled). Keep in mind that if you do not want to botch the flow of PageRank through your site, use the meta noindex, follow tag to exclude pages, not robots.txt.
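In practice, that means excluding a page with a robots meta tag rather than a robots.txt Disallow rule, so the page can still be crawled and pass link equity. A minimal example:

```html
<!-- In the <head> of the page you want kept out of the index -->
<!-- "noindex" keeps the page out of results; "follow" lets PageRank flow on -->
<meta name="robots" content="noindex, follow">
```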
The other concern you should have is whether or not search engines crawl and capture updates to existing pages in a timely manner. If not, it could be an overall domain authority issue or that PageRank is not flowing deep enough in sufficient quantities.
There are tricks to resolve this, such as linking to updated pages from your homepage or a level-one page until the updated deep page gets reached. The more wholesome approach is to make sure that the content which gets updated is naturally close to content or sections of content with higher authority, or to build legitimate internal links from related content that has its own off-site PageRank.
I am not telling you all your content should be crawled all the time. Search engines budget crawl frequency and depth for good reasons. What I am saying is manage your website crawl budget and use it well; don’t just leave everything up to chance.

Architecture: Duplicate Content

Earlier this year, Matt Cutts stunned the search engine optimization community by telling us not to worry about duplicate content. He assured us that Google will recognize this duplicate content, combine the dispersed authority, and present one URL in the SERPs.
This is really not a big surprise, as Google has been working toward this for quite some time. Webmaster Tools has offered automated parameter identification, and Google spokespeople have discussed duplicate content consolidation for some time.
To repeat what I have written before, Google is not the only search engine out there and reality does not always work the way Google says it does. The bottom line is: keep managing your duplicate content by preventing or eliminating as much as possible, and as for the rest, put your canonical tags in place.
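For reference, a canonical tag is a single line in the head of each duplicate, pointing at the preferred URL (the URLs below are placeholders):

```html
<!-- On http://example.com/shoes?sort=price and any other duplicate variants -->
<link rel="canonical" href="http://example.com/shoes">
```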
Speaking of canonical tags, I know a popular hack has been to use one canonical URL, improperly, on all the pages of multipage articles. There are other canonical hacks out there, as well. I’d be wary of these. If you’re using canonical tags, machine-readable content or advanced meta-tags, you’re basically waving a big red flag telling search engines that your website is technically savvy and using search engine optimization. In other words, you’re begging for additional scrutiny.
It would not surprise me if Google becomes more fierce in penalizing websites for this type of technical misdirection. Search engines tend to use a soft touch on levying penalties algorithmically for fear they will burn innocent websites. But as we have seen with Panda and Penguin, as they become more confident, they also become more aggressive. If you are optimizing for an employer, keep it clean.

Architecture: Speed

Most websites are not going to see an SEO benefit from increasing the speed of their website. Google has always said only a small fraction of sites are affected by this part of the ranking algorithm. This view seems to be borne out by correlation studies. Honestly, the best test of speed is to take your laptop to the local café and surf around your website. If you are not waiting for pages to load up, then you are probably okay.
The exceptions (sites that should be concerned about speed) are large enterprise and e-commerce websites. If you optimize for one of these, shaving a few milliseconds from load time may lower bounce rates and increase conversions or sales.

Architecture: URLs

The current best practices for URLs should hold true throughout 2014. Simple and easily readable URLs are not just about search engine optimization. With today’s multi-tabbed browsers, people are more likely to see your URLs than they are your title tags.
I will also add that, when seen in the search engine results pages, readable URLs are more likely to get clicked on than nonsensical ones. If your content management system cannot create readable URLs based on your title tags or will not let you customize URLs, it is probably time for a CMS review. This is now a basic search engine optimization feature, so if your CMS cannot handle it, I wonder about the rest of your CMS’s SEO efficacy.

Architecture: Mobile

2013 was an interesting year for mobile SEO. Google and Bing agree that the ideal configuration is for websites to have a single set of URLs for all devices and to use responsive Web design to present them accordingly. In reality, not all content management systems can handle this, and Web designers have presented case studies of situations where the search engine standard is neither practical nor desirable.
If you can execute what Google and Bing recommend, do so. However, if you cannot or have a good reason not to, be sure to use canonical tags that point to the most complete version of each page (probably your desktop version) and employ redirects based on device type or screen size.
You will not risk a penalty from the search engines as long as your website treats all visitors equally and doesn’t make exceptions for search engine spiders. Basically, this is similar to automatically redirecting visitors based on their geographic location or language preference.
That about wraps it up for on-page SEO factors in 2014. Be on the lookout for Part 3 of my 2014 SEO Playbook, which will cover off-page SEO factors relating to link building, local search and social media.

Monday, July 15, 2013

7 Reasons to Remove "Link Building" from Our Vocabulary

1. Link building isn't a process or goal

Our goal is almost always direct or indirect profitability. Where organic search marketing is concerned, profitability comes from qualified traffic, and qualified traffic comes largely from favorable search engine positions. Favorable search results are achieved to a significant extent by acquiring links from diverse high-authority domains.
Nothing above looks too controversial yet, but why then should we not focus on links? If links lead to higher rankings and eventually to profitability, we should build links, right? This makes sense until we expand the diagram.
Direct link building is a process that only a spammer or link buyer can do. I prefer "link earning" — a phrase I’ve borrowed from Danny Sullivan’s legendary rant and Rand Fishkin’s Whiteboard Friday — but I see no reason why our efforts and successes should be constrained by links. Some online marketing tactics may also contribute directly to rankings, and some definitely contribute directly to traffic.
Even those who neither spam nor buy links have become so focused on link acquisition that many de-emphasize or even ignore what comes before or after. We heard some amazing forward-thinking talks at Mozcon, almost all about real, legitimate, and sustainable marketing. Even then, we heard far more about the number of links obtained than we did about rankings, traffic, or profitability.
I am not suggesting that we stop caring about links. Link data can be used for many valuable tasks including the following:
  • Find external pages that appear to have generated awareness and increased visibility. We can, for example, use Open Site Explorer to understand industry challenges and past successes.
  • Provide valuable insights into campaigns that are still in progress.
  • Find potential marketing targets (e.g. those who shared a similar piece of content).
  • Explain current rankings.
There are plenty of additional reasons why link data is fantastic. I am merely suggesting we stop leading people to death by Penguin.
When we focus on links as a process and a goal, we're working towards the measurement rather than the goal the measurement was intended to measure. Profitability is the goal — events, guest posts, or content pieces are the methods and tactics to get there. If we achieve the goal through a combination of organic traffic, cross-coverage, and direct traffic, I doubt anyone will complain. We might even be more effective as marketers by considering more pieces in the puzzle.

2. Google wants to kill "link building" as a process

This isn't about being a "white hat" anything. I, for one, cringe when referenced as a "white hat" marketer — it stings like a label for someone adhering to dogma set forth by infallible Google. I’m with Dr. Pete on this hat nonsense. If I thought buying links was a smart risk-free way to make money, I would suggest we all buy links. I am simply a believer in sustainable marketing tactics.
"The philosophy that we've always had is if you make something that's compelling then it would be much easier to get people to write about it and to link to it. And so a lot of people approach it from a direction that’s backwards. They try to get the links first and then they want to be grandfathered in or think they will be a successful website as a result."
-Matt Cutts in an interview with Eric Enge
Matt says link building isn't inherently evil, but only when we get it mixed up. We run afoul of search engines only when we look at links with tunnel vision, as in the first diagram above, as an activity rather than an outcome.
We should care what Google wants, if only because it’s dangerous and difficult to fight against them in the long run. I once warned about what would eventually be called "Penguin" in March of 2012 — just one month before the first Penguin update — and met some strong resistance claiming Google would never penalize for links, but only devalue them.
It’s a mistake to underestimate what Google can and will do. Counter-spam might move slower than spam most of the time, but I suspect Penguin won't be our last reality check for artificial links.

3. Modern Google is not a link-counting machine

Regardless of what Google will do in the future, we should also consider what Google can already do today. What were links meant to measure in the first place? Why did Google use them, and how did they help? We know that links help to filter out the garbage on the web, and they are still heavily used because link data helps to measure the popularity and authority of a site and page.
We know Google understands more than followed links and anchor text. Embeds have been called "links for videos." Citations are "links for local." Google uses URL text for discovery, even if the text isn't an explicit link. The search engine has long understood which words are related to one another, and which brands relate to which words — as anyone who has used the Google keyword tool can attest. We just heard a presentation from Dr. Matt showing a correlation between social shares and brand mentions with rankings.
We don’t know everything about Google and the algorithm. Perhaps Google is using co-occurrence as a ranking factor, but can we really doubt the search engine looks at good-old-fashioned occurrence as a measuring stick for site authority and popularity? It’s not unlikely that Google is using a combination of data sources — mentions, links, offline brand metrics, etc. — to measure or confirm popularity.
We also need to back up and consider the degree to which popularity, awesome products, useful content and great web pages drive all popularity signals, and to what extent they are used by Google. Facebook likes correlate with site traffic whether Google ever looked at them or not. Even with great statistics and a few tests, we can’t be totally sure about how much Google uses which signal or under what circumstances. Why focus on simply building links when Google uses more than links? Why obsess over a small HTML element when we have the ability and skills to improve multiple metrics and build visibility with or without Google?

4. Qualifying "good links" doesn't stick

Perhaps "building links" isn't a bad idea as long as the links are good. Even though I agree, we still need to stop talking about building links. Even if we could list every possible quality that defines a "good link," we find that we have an overly-technical and roundabout way of saying "market to your audience."
No matter how many caveats we add, or how precisely and carefully we define "high-quality links," people still seem to come away with their own version of what a good link is. The value of a link is far less intuitive than the value of coverage and visibility.
Adria Saracino wrote an enormous post last year about nothing other than qualifying link prospects. More could have been written, but I’m not sure more could have been retained or recalled. To keep clients focused on the real goals rather than links, Adria has begun pushing internally and externally for a stronger focus on revenue rather than links alone.
Rankings and links are benchmarks, not processes — a way to track progress on the way towards our real goals of qualified traffic and sales.

5. Link obsession can hurt relationships

Asking people to add links, change the post, or edit their existing links can appear selfish and demanding. I believe most people who do so are not selfish people, but rather people whose success is measured in terms of links above all else.
... And now I want to remove any mention of the source. This is a ridiculous example, but illustrative of where "link building" has led us.
Sometimes building awareness with an audience is better than link building. Coverage and relationships with publishers leads to more coverage, awareness, and — yes — even more links. But once again, links are not the goal; they are merely one outcome and benefit of marketing with the goal of profitability.
If you do want to risk additional requests from those who have already been kind enough to cover your topic, take Phil Nottingham’s advice and offer something of value (in his example, HD quality video) in the process.

6. Focusing on links leads to missed opportunities

I was recently reminded of a short-term consulting project I worked on where a large client had dedicated as much as $20,000 per month and a full-time employee's time to buying and renting links. They hadn't been caught yet, and their rankings were relatively solid, but improvement was minimal. The total traffic from paid links — mostly footer links — was in the low thousands.
The company was so risk averse (a pet peeve of mine) that they were unwilling to stop because their competitors were also buying links. To my knowledge, we never convinced the client to spend half as much producing content or seeking real visibility.
Even giving the money away would be more effective marketing. What blogger wouldn't participate in a contest for a chance to win a free car? You could literally drop $20k in cash from your rooftop in a press event and generate more publicity, and probably from more and better sources if done well.
It’s true that the case above is the second or third most extreme example of link-centric myopia I’m aware of, but one need not look far to see less dramatic examples.
For instance, the cost of a typical unbranded guest blog post will also far exceed its value. From first contact to actual posting, the submission will easily take a few hours. More importantly, marketers focused on corresponding with blog owners for links are not focusing on building better businesses, products, content, or websites. The opportunity cost for tactics where link building is the only goal can be enormous — and why, when even guest posting could bring both links and awareness?

7. Marketers should differentiate their services from spammers

The results of emphasizing link building are predictable: marketers new to the industry hear so much about link building that they become desperate for links and turn to spam and paid links. Similarly, clients hear regularly that they need links, and set link goals for their employee or agency. Then, Penguin unleashes its wrath almost exclusively on those who focus on link-building as a process. And, we wonder why it’s so hard to change the perception of SEO in the industry.
Consider this: if you are "link building," you’re either spamming links or doing online marketing. Those who are practicing sustainable marketing tactics may do well to distance themselves and their activities from spammers using the same terms.

So, what should we do instead?

"Link building" is a phrase used by most industry experts, many of whom I respect deeply. Unfortunately, their use of such terms grants a sort of license, shelter, and reassurance to people doing a very different kind of link building. The ambiguity can take new marketers some time to figure out, and our industry and personal reputations suffer at the hands of ineffective marketing.
Among those who agree with the philosophy presented above, the required change is simple: it's just a matter of using new words. Many others may find adjustment more difficult; I hope and believe they will also find it more rewarding.

Use better words, track better metrics

When we talk about obtaining visibility, awareness, traffic, or coverage, we immediately ensure we and our clients are talking about similar goals. For processes, we can talk about the actual tactic, whether it's outreaching for an infographic or hosting a webinar.
If, for some reason, we need to refer to these processes in aggregate, terms like "inbound marketing," "online marketing," and "content marketing" might be right, depending on the breadth and focus of services offered.
Changing our choice of words admittedly has less impact than changing what we do, but even altering the use of words can have a surprising effect. "What can we do to get links?" sets an unnecessary and artificial constraint on marketing activities, thereby limiting our marketing to a few activities and making the goal of links explicit.
For the last several months, I've been trying to ask better questions. "What can we do to increase visibility and generate awareness? What can we do to drive more qualified traffic? What can we do to increase profit per qualified visitor?" Followed links may be a facet of the resulting strategy, but they are unlikely to be its entire purpose.
We ensure that we're building better businesses when we track the results of our efforts and report on their impacts. To ensure we are working effectively it's wise to continue tracking the places we have requested and received online coverage, but we're more interested in revenue first, traffic second, and rankings third. Coverage (sometimes as standard links) and rankings matter, but only to show progress while working towards traffic and revenue.

Do better marketing

I'm truly excited that we have the skills and knowledge to do something better in a way that other marketers cannot. We have tools that other marketers don't use in their research, giving us insights into what works before we start building or emailing anyone. We understand the Internet, search engines, and traffic generation. The future looks bright if we can get our priorities straight.

Awareness over links

It is easier to slip links into posts about diverse topics than it is to write a post about a product, service, or company. We all know the kind of guest post I'm talking about: guest posts on mommy blogs that suddenly include suspiciously-targeted anchor text. Posts on pet blogs somehow slipping in a link to a web hosting company. They look like this:

How many people out of a thousand would click on that link? One, maybe two? Compare that with a guest post on the Wall Street Journal by the CEO of a relatively small company.

For more information, visit: http://moz.com/blog/7-reasons-to-remove-link-building-from-vocabulary

Saturday, July 6, 2013

Major Google algo update coming – a few weeks left to get your site ready for Penguin 2.0.

Analyze and clean up your link profile

First and foremost, the Penguin update combats spammy and low-quality backlinks.
The new Penguin version is expected to use even more sophisticated techniques to spot spammy links in your backlink profile, so even if your backlinks seem fine to Google right now, you'd better run another thorough audit before Penguin 2.0 arrives.
So the first thing to do now is to analyze your site's backlinks, identifying any potentially spammy links and getting rid of them before the update.
The basic rule here is to avoid backlinks that:
  • Come from sites built exclusively for the purpose of SEO.
  • Use overly-optimized anchor text.
  • Come from adult or other "bad neighborhood" websites.
  • Come from sites that are irrelevant to your own.


Now here's how to find and identify suspicious links pointing to your website. This may take some time, but recovering from the update would surely take much longer.

http://www.link-assistant.com/news/new-google-penguine-update.html?icf=email