There comes a time in every SEO’s career when it’s time to move beyond the basics of the trade and tackle the big questions. Everyone knows that you need solid on-page optimization, authoritative backlinks, and high-quality unique content. But let’s take a look at some advanced SEO questions that industry professionals should be able to answer.
1. What makes the Penguin 4.0 update different from the previous ones?
The Penguin 4.0 update continues Google’s effort to reward quality content and penalize dishonest backlinking.
Penguin 4.0 refreshes in real time
In previous versions, Google refreshed Penguin’s data only periodically, while Penguin 4.0 refreshes in real time. Websites with spammy links will still see their SERP positions drop drastically – but these sites can now fix their issues faster than before. Because of the real-time detection, an offending website can have its ranking salvaged within a day of making repairs.
Page-specific instead of site-wide
In the past, spam on one landing page could sink your entire domain in the SERPs, but with Penguin 4.0 only the offending page will lose ranking – leaving the rest of your domain free to keep ranking.
2. How Do I Use the Hreflang Tag Correctly?
The hreflang attribute in HTML lets search engines know which language your content is in. You can generate the hreflang code for your website very simply using the generator at aleydasolis.com. Once you’ve got that done, there are three common methods for implementing it correctly.
Method 1: Placing the tags in the <head> of the website
For example: <link rel="alternate" href="http://www.shiftwebsolutions.com" hreflang="en-us" />.
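Note that hreflang annotations should be reciprocal: each language version of a page lists every alternate, including itself, and an x-default entry can mark the fallback for unmatched locales. A fuller <head> block might look like this (the Spanish subdomain here is illustrative, not from the original example):

```html
<!-- Each language version carries the full set of annotations, including itself -->
<link rel="alternate" href="http://www.shiftwebsolutions.com/" hreflang="en-us" />
<link rel="alternate" href="http://es.shiftwebsolutions.com/" hreflang="es" />
<!-- x-default is served when no other hreflang value matches the searcher -->
<link rel="alternate" href="http://www.shiftwebsolutions.com/" hreflang="x-default" />
```

If the annotations are not mirrored on every alternate version, search engines may ignore them.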
Method 2: HTTP headers for non-HTML content such as PDFs.
Link: <http://es.shiftwebsolutions.com/document.pdf>; rel="alternate"; hreflang="es",
<http://en.shiftwebsolutions.com/document.pdf>; rel="alternate"; hreflang="en",
<http://de.shiftwebsolutions.com/document.pdf>; rel="alternate"; hreflang="de"
Method 3: Sitemap markup using the xhtml:link attribute to annotate each URL.
The downside of HTTP headers and <head> tags is the overhead they add to page loads. Using a sitemap works the same as the <head> method with fewer database calls, but does require a bit more setup. The following example demonstrates annotating one URL with two other language versions:
<xhtml:link rel="alternate" hreflang="en" href="http://www.shiftwebsolutions.com/" />
<xhtml:link rel="alternate" hreflang="en-au" href="http://www.shiftwebsolutions.com/au/" />
<xhtml:link rel="alternate" hreflang="en-gb" href="http://www.shiftwebsolutions.com/uk/" />
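In the sitemap itself, those annotations sit inside a <url> entry, and the xhtml namespace must be declared on the <urlset> element. A minimal sketch for the example above:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.shiftwebsolutions.com/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://www.shiftwebsolutions.com/" />
    <xhtml:link rel="alternate" hreflang="en-au" href="http://www.shiftwebsolutions.com/au/" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="http://www.shiftwebsolutions.com/uk/" />
  </url>
  <!-- Repeat a <url> entry, with the same three annotations, for /au/ and /uk/ -->
</urlset>
```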
3. Link equity is passed if and only if the link is placed in…?
- Relevant content preferably in the page body
- An authoritative website with a trusted background
- Crawlable pages with followed links that are not blocked by robots.txt
- Pages that resolve with an HTTP 200 status or through permanent 301 redirects
- Among an optimum number of other links. There is no hard-and-fast rule, but do not use too many
4. What are the best practices for generating optimized meta title and description?
- 70 characters is the maximum length displayed on search engine results. This allows the user to glance at the result and glean the page topic.
- Include your exact target keyword in the meta title.
- Use unique, specific, and descriptive titles for every page. Refer to Google Search Console (formerly Webmaster Tools) – get suggestions by generating an HTML improvement report.
- Avoid keyword spamming by having a distinct theme for each page.
- Limit meta descriptions to 150-160 characters in length – allowing the entire tag to fit in the Search Engine Results Page.
- Write short descriptions rich with keywords, making sure to include your exact target keyword in the text. Think of these as quick sales pitches.
- As with titles, avoid keyword spamming by keeping each page specific to one unique topic.
In both cases we reference unique and descriptive titles. The old adage is still true – that Google ranks pages, not websites. If your website theme revolves around ‘pet sitting’, for example, the user may be more interested in your ‘overnight prices.’ As such, you need to flesh out each page with relevant and specific meta tags.
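Putting the rules above together, a page from the hypothetical pet-sitting example might carry tags like these (the business name and copy are illustrative):

```html
<head>
  <!-- Title under ~70 characters, leading with the exact target keyword -->
  <title>Overnight Pet Sitting Prices | Happy Paws Pet Sitting</title>
  <!-- Description under 160 characters, written as a quick sales pitch -->
  <meta name="description" content="See our overnight pet sitting prices. Insured sitters stay the night so your pets keep their routine. Get a free quote today." />
</head>
```

Every other page on the site would get its own distinct title and description scoped to that page’s single topic.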
5. What is keyword competitiveness and how is it measured?
Keyword competitiveness is the difficulty of outranking your competition on the front page of Google for the exact same keyword. There is major competition for high-value keywords – and unless you are willing to budget time and money, it is unlikely you can compete. To assess your competition, use https://kwfinder.com.
Some factors for measuring keyword competitiveness include:
- Page title – The headline and main topic of the page, displayed in blue on the results page. If a page ranks on the front page without the keyword in its title, competition for that keyword is likely low.
- On-page tags – Including the keyword in headers and image alt tags is necessary to compete, and sites without them likely do not have much of an SEO budget.
- Content – If the content is high quality and authoritative, it has a better chance of displacing the current top 10 on the front page.
- Backlinks – The most important competitive factor. Earn links from multiple high-ranking and frequently trafficked websites.
6. What does a disavow file do?
A disavow file is a list of domains that a publisher asks Google to ignore when measuring backlinks. It is more useful now than ever – since Penguin 4.0 operates in real time. If your backlinks have been purchased (strictly against Google’s rules) or include spam or unscrupulous content, a webmaster can disassociate, or disavow, the site from the toxic links with a disavow file submitted through Google Search Console.
A disavow file can also help when links point to your page from irrelevant content – though note that some such links are already harmless. Wikipedia, for example, cites all of an article’s references at the bottom of the page with nofollow links. The Wiki page about the village of Thackley references a page about its resident Christine Alvin and her poetry – a topically irrelevant link, but one that passes no equity and so needs no disavowal.
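The file itself is plain text with one entry per line: a `domain:` prefix disavows every link from that domain, a bare URL disavows a single page, and lines starting with `#` are comments. A minimal sketch (the domains and URLs are illustrative):

```text
# Disavow every link from an entire domain:
domain:spammy-link-network.example
# Disavow a single page:
http://lowquality-directory.example/listings/page47.html
```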
7. Is it a good practice to use a disavow file with Penguin 4.0?
Yes, it is good practice; however, a disavow file may not always be necessary. The first step is downloading a backlink profile – Majestic is a fine tool for this. The following red flags indicate that a backlinking page is not a quality source:
- If the site is not indexed in Google search results
- The content is irrelevant to your business
- It contains malware or looks spammy
- Forum spam stuffed with links, likely an attempt to build site authority
- Pages with a low domain authority
8. Google’s mobile first index focuses on…?
Google’s long-awaited mobile-first index has some important implications for SEO. Going forward, the smartphone or tablet version of a page’s content will be used first for ranking and indexing. Websites with responsive layouts or dynamic serving will be indexed mobile-first without issue. Fast-loading mobile content will help dramatically under mobile-first indexing, and having mobile-friendly content helps you rank higher in mobile search results.
The goal of mobile-first indexing is to surface accurate, mobile-friendly content higher for searches made on smartphones.
9. What is cloaking? Is it a good practice?
Cloaking is the practice of presenting one version of a page’s content to search engines while displaying something different to visitors. This is a violation of Google’s Webmaster Guidelines – the displayed results differ from what the user expected.
This method of disguising a page’s content is used commonly by hackers. By displaying friendly information relevant to common search queries, hackers can lure users to sites full of malware or phishing schemes.
Because it is against Google’s Webmaster Guidelines and has a disingenuous connotation, cloaking should not be practiced.
10. How is the trust flow from a domain measured? Who introduced this metric?
Trust flow is a metric introduced by Majestic SEO to measure how trustworthy a page is based on the quality of the links around it. A site about the Brandywine Zoo that contains a backlink to BBQ recipes would be given a poor trust flow score, since the content is irrelevant.
Strong internal and external links are required for a domain to score well. A strong home page is the first step – flow metrics travel through internal links, so make sure your entire site is well connected and has related, relevant content. As always, strong external linking is the key to a trustworthy site.
11. Is it a good practice to use LSI keywords for content optimization?
Latent Semantic Indexing helps Google understand the context of a search query. LSI helps ensure, for example, that someone searching for Gravity, the movie starring Sandra Bullock, is not given the equations to determine the rate of acceleration towards the planet.
LSI keywords are helpful because Google no longer looks at raw keyword density, instead examining the context of the content. When you have used your target keywords to their limit, LSI keywords can be used to develop topic-specific content.
While they are not synonyms, LSI keywords are variations on your keyword, and it is good practice to use them for content optimization. Here are a few examples of LSI keywords found with an LSI keywords tool:
- Mashed Potatoes – Garlic, sour cream, crock pot
- Rye Whiskey – Absinthe, Bulleit, most expensive, brands of bourbon
- Harry Potter – About the characters, prisoner, azkaban, goblet, cursed child
12. What’s the difference between a do-follow and no-follow link?
Do-follow links add to the flow of ‘link juice’ throughout a site and its neighboring sites. Whenever a popular blog references a smaller one, they both get a boost in ranking. Relevant backlinking with neighboring sites is key to placing in the SERPs, and all hyperlinks count as do-follow links unless they are marked otherwise.
No-follow links carry the rel="nofollow" attribute and do just that – they do not count towards page ranking. One would use a no-follow link for irrelevant content that might otherwise hurt SERP placement or page ranking. No-follow links are common among forum posts, blog comments, and social bookmarks. While they do not pass link juice, they are useful for establishing a diverse link profile.
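In markup, the difference between the two comes down to a single attribute (the URLs here are illustrative):

```html
<!-- Do-follow: passes link equity, the default behavior of every hyperlink -->
<a href="http://www.shiftwebsolutions.com/">Shift Web Solutions</a>

<!-- No-follow: the rel attribute tells search engines not to pass equity -->
<a href="http://forum-comment.example/thread/42" rel="nofollow">a comment link</a>
```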
13. Is the crawl-delay function usable with Googlebot?
Googlebot does not recognize the crawl-delay directive in robots.txt. Users can limit Googlebot’s crawl rate in the Search Console – however, a higher limit will not automatically increase crawling.
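For reference, the directive looks like this in robots.txt; some other crawlers, such as Bingbot, honor it, but Googlebot simply ignores the line:

```text
# Ask crawlers to wait 10 seconds between requests
# (honored by e.g. Bingbot; ignored by Googlebot)
User-agent: *
Crawl-delay: 10
```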
14. What happens when your domain’s trust flow exceeds your domain’s citation flow?
Trust Ratio = Trust Flow / Citation Flow
A citation flow higher than your trust flow means you are collecting plenty of links, but they are untrustworthy and could count against a site’s content relevancy.
When the domain’s trust flow exceeds the citation flow, Google will determine the site has relevant and trustworthy links and more of the coveted ‘link juice’ will flow. A useful tool in determining a site’s trust ratio is the Majestic SEO Chrome plugin.
We hope this has been a helpful look at some of the more advanced SEO questions to help boost your rank in Google. We’ll see you in the SERPs (ranking high, we hope).