This is a challenging question for many in the SEO and web marketing fields. There are hundreds of “best practices” lists for where to place keywords and how to do “on-page optimization,” but as search engines have evolved and as other sources of traffic — social networks, referring links, email, blogs, etc. — have become more important and interconnected, the very nature of what’s “optimal” is up for debate.
My perspective is certainly not gospel, but it’s informed by years of experience, testing, failure, and learning alongside a lot of metrics from Moz’s phenomenal data science team. I don’t think there’s one absolute right way to optimize a page, but I do think I can share a lot about the architecture of how to target content and increase the likelihood that it will:
- A) Have the best opportunity to rank highly in Google and Bing
- B) Earn traffic from social networks like Twitter, Facebook, LinkedIn, Google+, etc.
- C) Be worthy of links and shares from across the web
- D) Build your brand’s perception, trust, and potential to convert visitors
With the help of some graphics from CreativeMarket (which I highly recommend), I created a number of visualizations to explain how I think about modern on-page optimization and keyword targeting. Let’s start with a graphical overview of what makes a page optimized:
In the old days of SEO, “on-page optimization” referred merely to keyword placement. Search engines liked to see keywords in certain locations of the HTML code to help indicate a page’s relevance for that query. But today, this simple approach won’t cut it for two key reasons:
- The relevancy and keyword-based algorithms that Google and Bing use to evaluate and rank pages are massively more complex.
- Gaining a slight benefit from a keyword placement-based algorithmic element may harm overall rankings if it degrades people's experience with your site (and thus their propensity to stay on your pages, link to you, or share your content socially, all of which are directly or indirectly considered in ranking algorithms).
Below is a pie-chart breakdown of how the 128 SEO professionals surveyed for Moz’s annual ranking factors project rated broad algorithmic elements’ impact in Google:
If <15% of the rankings equation is wrapped up in keyword targeting, no wonder smart SEOs in the modern era have evolved to think more holistically. Personally, I’m happy to sacrifice “perfect” keyword placement in the title element or a URL for better user experience, a higher chance of having my content shared on social networks, or a better click-through rate in the search results.
But, for the purposes of this post, let’s put some of those caveats aside and dive into the best practices for each element of a page. It may be unwise to optimize all of these purely towards search engine-based best practices, but we can temper the advice with notes on usability and user experience for visitors, too. Below, I’ve attempted to go tag by tag, and element by element through the keyword targeting and on-page optimization canon to expand on the more basic advice in the “Elements of an Optimized Page” graphic above.
An optimized page doesn’t just provide unique content, but unique value. What’s the difference?
- Unique content simply means that those words, in that order, don’t appear anywhere else on the web.
- Unique value refers to the usefulness and takeaways derived by visitors to the page. Many pages can be “valuable,” but few provide a truly unique kind of value — one that can’t be discovered on other pages targeting that keyword phrase.
Whenever I advise marketers on crafting pages, I ask them to put themselves in the minds of their potential visitors, and imagine a page that provides something so different and functional that it rises above everything else in its field. Here are a few of my favorite examples:
- The Baby Name Wizard — a terrific page that provides clear value above and beyond its competition for searches around baby names.
- How Much Does a Website Cost — Folyo surveyed their designers to create a distribution of prices that is accurate, credible, and massively valuable to those seeking data on pricing.
- Scale of the Universe — this interactive feature will take you from the tiniest parts of an atom all the way to universe-scale. No wonder it ranks for such abstract queries as “the size of things.”
- The Best Instant Noodles of All Time — The Ramen Rater has tried literally thousands of packets of instant noodles and determined these ten to be the outstanding few. I’m actually excited to try them 🙂
- Top Social Networks by Users — Craig Smith puts together an update to this list every month or two, and has compiled this invaluable resource to help those of us wondering just how big all the networks are these days. I’ve personally used this for numerous posts and presentations — it’s an excellent example of creating unique value by aggregating data from varied sources (and it, deservedly, outranks stalwarts like Nielsen as a result).
Unique value is much more than unique content, and when you have a page that rises to the level that these do, social shares, links, and all the other positive associations, branding, and ranking signals are apt to follow.
Provides phenomenal UX
A user’s experience is made up of a vast array of elements, not unlike the search engines’ ranking algorithms. Satisfying all of these perfectly may not be possible, but reaching for a high level will not only provide value in rankings, but through second-order impacts like shares, links, and word-of-mouth.
At the most basic level, a great UX means the page/site is:
- Easy to understand
- Intuitive to navigate, with content that's easy to consume
- Fast to load, even on slower connections (like mobile)
- Rendered properly at any browser size and on any device
- Visually attractive/pleasing/compelling
Smashing Magazine has my favorite article on the subject: What is User Experience Design? Overview, Tools, and Resources.
Search engines still crawl the web using automated bots, and probably will for at least the next decade. While these crawlers have grown far more sophisticated, the best practice is not to take chances: follow some important guidelines when building pages you want engines to crawl, index, and rank reliably:
- Make sure the page is the only URL on which the content appears; if it's not, make sure all other URLs canonicalize back to the original (using redirects or the rel=canonical protocol)
- URLs should follow best practices around length, being static vs. dynamic, and being included in any appropriate RSS feeds or XML Sitemaps files
- Don’t block bots! Robots.txt and meta robots can be used to intelligently limit what engines see, but be cautious not to make errors that prevent them from crawling and indexing your content.
- If the page is temporarily down, return a 503 status code (not a 404). If you're redirecting a page to a new location, avoid chaining multiple redirects where possible, and use 301s (permanent redirects) rather than other 30x status codes.
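The status-code and redirect rules in that last bullet are easy to check programmatically. Below is a minimal Python sketch, with hypothetical helper and variable names of my own, that audits a redirect chain (a list of status-code/URL hops, however you collect them) against those guidelines:

```python
def audit_chain(chain):
    """Check a redirect chain, given as [(status_code, url), ...] hops,
    against the guidelines above: single-hop 301s for moves, and a 503
    (never a 404) for pages that are only temporarily down."""
    warnings = []
    redirects = [(s, u) for s, u in chain if 300 <= s < 400]
    if len(redirects) > 1:
        warnings.append("redirect chain detected; consolidate into one hop")
    for status, url in redirects:
        if status != 301:
            warnings.append(
                f"{url} redirects with a {status}; use a 301 for permanent moves")
    if chain and chain[-1][0] == 404:
        warnings.append(
            "final URL returns 404; serve a 503 instead if the outage is temporary")
    return warnings

# Example: a 302 hop followed by a 301 hop, ending at a live page.
chain = [(302, "http://example.com/old"),
         (301, "http://example.com/interim"),
         (200, "http://example.com/new")]
for warning in audit_chain(chain):
    print(warning)
```

The example chain above triggers two warnings: one for the multi-hop chain, and one for the 302 that should be a 301.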
Geoff Kenyon’s Technical Site Audit Checklist is still one of the best resources for those seeking more in-depth information about crawler-based accessibility.
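On the "don't block bots" caution above: Python's standard library includes a robots.txt parser, which makes it easy to sanity-check your rules before deploying them. A small sketch (the Disallow rules shown are illustrative, not a recommendation):

```python
from urllib.robotparser import RobotFileParser

# Parse a candidate robots.txt and confirm it blocks only
# what you intend to block before it goes live.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
    "Disallow: /search",
])

print(rp.can_fetch("*", "https://example.com/blog/on-page-seo"))  # True
print(rp.can_fetch("*", "https://example.com/admin/settings"))    # False
```

Running this kind of check against every important URL pattern on your site is a cheap way to catch a robots.txt typo before it de-indexes your content.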
As I mentioned in the opening of this post, it may be the case that perfectly optimized keyword targeting conflicts with goals around usability, user experience, or the natural flow of how you write. That’s OK, and frequently, I’d suggest leaning in those more user-centric directions. However, when it’s p