A few times a month, Google Webmaster Hangouts give SEOs and webmasters a chance to get their questions about ranking in Google Search answered by John Mueller and others on the search team. All hangouts are recorded publicly and available on YouTube, so you can either participate live or watch them afterward.

Below, you’ll find the notes from the November 14th, 2017 English Google Webmaster Central Office Hours. Subscribe to the YouTube channel so you don’t miss a recording.

Have there been any changes with regards to the Disavow File?

Nothing has changed with the disavow file for quite some time, according to John Mueller. If you have a manual action on your site and you cannot get a webmaster to nofollow or remove the offending links, Google recommends using the disavow file to clean up your link profile.

Once you’ve added new domains or URLs to your disavow file, you can submit a Reconsideration Request to have the webspam team review your request to remove the manual action on your website.

For sites with no manual action, Google tries to automatically take those links out of the equation when ranking a website. If you’re unsure whether Google is doing that for your spammy links, the disavow file is the surest way to request that Google not take those links into account.

According to John Mueller, you don’t need to go through your links and enumerate them one by one, but if you see a larger link scheme that you don’t want to be associated with, then you can go ahead and use your disavow file.
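For reference, here’s a minimal sketch of what a disavow file looks like (the domain and URL are placeholders): lines starting with domain: disavow every link from that site, plain URLs disavow a single linking page, and lines starting with # are comments.

```
# Placeholder example of a disavow file
# Disavow every link from this site:
domain:spammy-directory.example.com
# Disavow one specific linking page:
https://forum.example.net/viewtopic.php?t=12345
```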

Time: 2:50 – 6:50

My client’s link profile includes links from forums, where new forum members start seemingly healthy discussions and drop in-context links to my client’s site as bare URLs. Should I be worried about these links, given that they come from new members with no reputation and hardly any posts that don’t contain links?

John Mueller spent a few seconds throwing shade, because let’s be honest, look at this question! The more shade the better here. He recommends backing off on forum comment spam — because this is why we (as marketers) can’t have nice things.

Since forum links are predominantly nofollowed, it doesn’t seem like this would have a meaningful impact on rankings, and as JM says, the forums don’t appreciate it either.

Nobody likes forum spam comments.
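For context, a nofollowed forum link is just a regular link with a rel attribute on it, roughly like this (the URL is a placeholder):

```html
<!-- Most forum software adds rel="nofollow" to user-posted links,
     so they pass no ranking signal. -->
<a href="https://example.com/" rel="nofollow">example.com</a>
```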

Time: 6:50 – 10:03

About 3 months ago our site suffered a drop in rankings. We’ve taken steps to rectify this by adding better content and improving the user experience; as a result, our user engagement metrics have improved and our domain name is trending on Google Trends. How long does it typically take for improvements in the quality of a site to be reflected in Google rankings?

According to Google, there is no fixed timeframe for when the changes you make on your website are reflected in Google Search. Site improvements can have both primary and secondary benefits, so it can take time to see the full impact of your efforts.

Aside from the primary improvements to user experience, people might also share your site more, and you could see the added benefit of more links over time. JM suggests it could take up to a year for your site to be crawled and indexed and for the signals to be recalculated.

Time: 10:03 – 12:34

Page speed has an enormous impact on our rankings at the moment. We’re seeing an enormous loss with every performance issue we’ve had in the past couple of weeks. Would you say Google is using page speed data from analytics as a ranking factor, or user data, like an increase in bounces due to bad performance?

No, Google doesn’t use analytics data at all. With regard to speed, Google differentiates between sites that are really, really slow and others that are normally fast. If you’re seeing page speed differences on the order of a couple hundred milliseconds, that would not be reflected in search; it’s not something the algorithms take into account.

JM said he could see speed playing a bigger role at some point in the future, but at the moment it’s not taken into account that much, and it’s implemented pretty much the same across mobile and desktop. Some of the factors differ between desktop and mobile because Google is working with different data on each platform, so the rankings wouldn’t be identical in general. He could imagine Google changing this with mobile-first indexing or something like that.

He also noted that the raw HTML request to the server is what can impact the overall crawl rate of a website. If Google finds that its requests for the HTML are very slow, it will back off its crawl rate.

Google won’t request as many pages per day from that website as before, because the team wants to make sure that it’s not the crawling that is slowing the website down.

If it’s really complicated for Google to render a page, it’s possible that Google will not be able to render it completely. If producing the rendered view requires fetching and processing too many JavaScript files, Google will have a harder time crawling and rendering the page.

The crawl stats in Search Console cover only the fetches, which include everything from images to JavaScript files.

Time: 12:34 – 17:36

What % of auto-translatable pages is acceptable to send for indexation? Does Google Translate provide high-enough-quality translations for indexation?

0 and no. From Google’s point of view, automatically translated pages are automatically generated content. It’s not something that Google recommends making available for indexing.

JM recommends using Google Translate as an aid for manually translating pages, not as a source of 1:1 content to publish on your website.

If you want to provide translated content, then make sure that it’s actually translated content and not something that’s automatically generated.

Time: 18:52 – 20:16

My site rarely goes up in search, and when it does it just goes back down. Today it dropped nine places, though the two sites that were above it dropped the same number of places as well. In September, I took out a request for links, and in October I improved the titles and headings of 1,200 forum posts. Should I just noindex my forum completely? What’s up with that?

According to Google, it’s not normal for a site to stay in the same ranking forever. In general, if you’re seeing your site fluctuate wildly that’s more a sign that you’re kind of on the edge with regards to the algorithms.

They’re sometimes thinking, “Oh, this is actually pretty good.” And other times, “Oh, this is not as good as I thought.” That’s kind of a hint that it doesn’t take a lot more to push your site in one direction or another.

Time: 20:18 – 22:42

Have there been any indexing issues recently? Some webpages have dropped out of the rankings for their main keywords, and less related pages are appearing instead.

There are always changes in search, thousands of them a year, so it’s normal to see movement. If you’re looking at things very granularly, you may see changes that appear large to you but are considered small by Google.

Time: 22:44 – 23:30

My website’s homepage uses an H1 that appears letter by letter, as if someone is typing the phrase. Is that a problem? Is it good practice or bad practice?

If the H1 contains all of the text in the phrase, then Google would probably be able to pick it up and use the full H1 tag as the heading for the page.

The really simple way to check is to search for that text explicitly in Google. If the page comes up, Google is picking up the heading. If a site is using JavaScript to swap out the characters individually, then Google may only see individual characters come and go and only index the subset of characters it sees when it renders the page.
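If you want to keep the typing effect, one approach (a sketch, not something JM spelled out, and the heading text is made up) is to leave the full phrase in the markup and animate only the visual reveal with CSS, so the complete H1 is always present in the rendered HTML:

```html
<h1 class="typewriter">Hand-roasted coffee, delivered weekly</h1>
<style>
  /* The full text stays in the H1; only the visual reveal is animated. */
  .typewriter {
    display: inline-block;
    overflow: hidden;
    white-space: nowrap;
    width: 0;
    animation: typing 2.5s steps(38, end) forwards;
  }
  @keyframes typing {
    from { width: 0; }
    to   { width: 38ch; }
  }
</style>
```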

Time: 22:32 – 25:26

Why would my homepage rank above a dedicated page for a specific service?

Google says that this can happen when the content on the homepage and the service page is similar. It commonly happens when the whole text of a blog post is displayed on the homepage of the blog. It can become unclear to Google which page to actually rank in search.

Making it easier for users to find the detail page you’d like them to visit may actually help that page rank in search. It won’t have an immediate impact on your rankings, but over time, if users go to your detail pages directly and recommend them, Google may pick that up as an indirect signal.

Time: 25:28 – 27:29

Is it OK for a site not to use H2s and to organize its content using only a main H1, with every subsection using an H3?

You don’t need to use headings. You can use headings to make it clear which parts belong together, but Google is not picky about heading structure. Some sites have multiple H1s on a page, while others have a much more defined structure.

What if a website doesn’t have any H1s?

It’s a bit harder for Google to understand the content if it can’t understand how the pieces fit together, but you can also use bold text or CSS styles to signal what is important on the page. Headings help Google understand the content, which flows into ranking, especially when Google is trying to work out which chunks belong together.

If there is text below an image, Google is able to pull the words out of that text and apply them to the image for ranking in Google Image Search.
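As a rough illustration (the page content here is made up), a structure like this skips H2s entirely and still gives Google enough to see which chunks belong together, with caption text under the image that can be used for Image Search:

```html
<h1>Wedding photography in Lisbon</h1>

<h3>Packages and pricing</h3>
<p>Details about the packages…</p>

<h3>Recent work</h3>
<img src="/img/alfama-ceremony.jpg" alt="Ceremony in the Alfama district">
<p>Sunset ceremony photographed in the Alfama district of Lisbon.</p>
```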

Time: 29:47 – 31:50

We see a lot of URLs blocked by robots.txt but no impact on indexation. Is this something to be concerned about? Our guess is that Search Console is just telling us here where Googlebot is disallowed. Would it be OK or is there a way to check which URLs are blocked?

Yes, in Search Console you have the blocked URLs report (i.e. Blocked Resources) where you can see which URLs have blocked content on them. The idea with this report is not that the landing pages themselves are blocked, but that there is some content embedded on those pages which is blocked by robots.txt.
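To make that concrete, a hypothetical robots.txt like the one below (the paths are placeholders) would produce exactly this picture: the landing pages themselves stay crawlable and indexable, but the scripts and stylesheets embedded on them show up as blocked resources.

```
User-agent: *
# The pages themselves are not disallowed, but these embedded
# resources are, so they appear in the Blocked Resources report.
Disallow: /assets/js/
Disallow: /assets/css/
```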

Time: 31:53 – 35:28

My blog traffic dropped in half over the past two months, and now even the Google index status report shows it dropped in half. I checked everything; backlinks don’t seem to be the problem. What might be the issue?

If index status is going down then this sounds like a technical problem, not a quality problem. It’s not the content or the links, but it sounds like Google is not able to crawl the pages in a way that it can use them for indexing.

Use Fetch as Google in Search Console to make sure that Google can crawl the pages and Fetch and Render to make sure Google can see the full content of those pages. If you’re seeing the index status report go down, then it’s a really strong sign that there is something technical askew.

Time: 36:10 – 37:50

Using ReactJS, we create links that only become available on user interaction in order to filter products on product pages, for example by price or delivery range. Bots can only see the root URL in the source code, without the URL parameters. Is this considered cloaking or not?

In general, cloaking is a big problem when it applies to webspam issues (i.e. Google sees completely different content when it crawls a page). Usually this would be less of an issue, but it makes debugging things difficult because you don’t know exactly what Googlebot is seeing.

JM’s recommendation is to make sure the developer uses all the normal techniques for faceted navigation, even when it’s built with JavaScript.
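As a sketch of what that might look like (the URLs and markup are hypothetical), the React components can still render plain anchor tags for each filter, so bots see real, crawlable URLs in the rendered HTML even though clicks are handled with JavaScript:

```html
<nav class="filters">
  <!-- Real hrefs for each facet; JavaScript can intercept the click,
       but crawlers still discover the filtered URLs. -->
  <a href="/shoes?price=0-50">Under $50</a>
  <a href="/shoes?price=50-100">$50 – $100</a>
  <a href="/shoes?delivery=next-day">Next-day delivery</a>
</nav>
```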

Time: 37:53 – 39:30

Is it possible that Google shows a different Answer Box for a search depending on the place where the search is made, even within the same city? So, if I search for Starbucks Madrid then it shows me the nearest Starbucks phone number?

This isn’t an answer box; Google calls these One Boxes. The user is actually seeing local search result info from the map listings. For just moving around within a city, JM doesn’t believe Google would show different rankings or featured snippets.

JM says you would likely see differences based on personalization, not because you’re one street further down the road.

Time: 39:40 – 41:00

For the mobile version, I see almost 100% AMP results ranking for all queries. What if I had a whole website in AMP instead of just having AMP versions of article pages? Would it improve the user experience?

Google would love for you to make your whole website in AMP. AMP is shown on a per-URL basis: on mobile, if Google has an AMP equivalent of a specific URL it will try to show that; if not, it will show the desktop or existing URL, whatever is available.

There are whole sites developed on the AMP platform, including the AMP Project’s own site, among others.
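For reference, the per-URL pairing is done with link tags, roughly like this (the URLs are placeholders): the canonical page points at its AMP equivalent, and the AMP page points back at the canonical.

```html
<!-- On the canonical (non-AMP) article page -->
<link rel="amphtml" href="https://www.example.com/article/amp/">

<!-- On the AMP version of the same article -->
<link rel="canonical" href="https://www.example.com/article/">
```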

Time: 41:01 – 42:14

If you redirect Googlebot between mobile and desktop pages based on user agent, do you still need rel=alternate tags on the desktop pages and canonical tags on the mobile pages?

Google says, yes, yes you do. The clearer the connection between pages, the better.
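For reference, the standard annotations for separate mobile URLs look roughly like this (the URLs are placeholders):

```html
<!-- On the desktop page, e.g. https://www.example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page -->
<link rel="canonical" href="https://www.example.com/page">
```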

Time: 41:02 – 42:50

Does it make sense to redirect old AMP pages to new AMP pages?

Yes. The AMP cache will try to update the AMP pages as well, and if the old AMP pages return a 404 they’ll be dropped out of the AMP cache instead of Google understanding that they’ve moved to a new URL. Images need to be redirected as well, or they’ll be dropped from Image Search results, which (according to Google) take much longer to be refreshed.
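A minimal sketch, assuming an Apache server and made-up paths, of 301-redirecting old AMP pages and their images to the new URLs:

```
# Old AMP article and its hero image moved to new URLs
Redirect 301 /amp/old-article/ https://www.example.com/amp/new-article/
Redirect 301 /images/old-hero.jpg https://www.example.com/images/new-hero.jpg
```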

Time: 42:51 – 43:56

I’d like to know if I need to add JSON-LD markup on all pages of my website or just the homepage. For example, the organization or web page markup?

JM believes that organization markup is just something that you can put on the homepage because that is where it is picked up from. Other types of markup, like recipes, need to be implemented on individual pages.

It ultimately depends on the type of markup you’re implementing.
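As an example, homepage-only Organization markup might look like this (the name, URL, and logo are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co.",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png"
}
</script>
```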

Time: 43:57 – 44:33

One month ago, I submitted a sitemap and all 17 pages were crawled. For the sitemap submitted yesterday, only one page was chosen for indexing. What might be the problem when not all pages are being indexed?

A sitemap file does not guarantee indexing. The sitemap index count is based on an exact match for each URL you specify, so if your sitemap file lists a URL that is subtly different from the one Google uses for indexing, Google does not count that URL as indexed.

Double-check your sitemap file and compare it to what you have linked within your website to make sure the URLs match.
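To illustrate the exact-match point (the URLs are made up): if the site links to and indexes https://www.example.com/pricing/ but the sitemap lists http://example.com/pricing, that URL won’t be counted as indexed. The sitemap entry should match the indexed URL character for character:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Must match the URL Google indexes exactly, including
         protocol, www, and trailing slash. -->
    <loc>https://www.example.com/pricing/</loc>
  </url>
</urlset>
```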

Time: 44:34 – 46:08

If someone wanted to start an online store, would you recommend they use their brand name as the domain name, or would you recommend they choose a domain that includes some keywords?

If you’re planning on being online in the long run: be yourself. Make room for product expansion with a broad name.

Time: 46:10 – 47:43

Do RSS feeds help with SEO?

Google uses RSS feeds as a way of finding new content, in the same way it uses sitemap files. If you have an RSS feed but no sitemap file, the feed can still help Google pick up your content.

Simply having an RSS feed does not improve rankings; it’s more a matter of content discovery.
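If you do have a feed, one small thing worth checking (a general tip, not something JM called out here) is that it’s discoverable from your pages via an autodiscovery link in the head (the feed URL is a placeholder):

```html
<link rel="alternate" type="application/rss+xml"
      title="Example Blog feed" href="https://www.example.com/feed.xml">
```

You can also submit an RSS or Atom feed in Search Console the same way you would submit a sitemap file.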

Time: 47:45 – 48:20

We are rebranding from A to B. Is it advisable to redirect b.com visitors to a.com before rebranding? We will probably launch b.com next month.

Just do it all at once, says Google. There is no additional value in redirecting people from the new domain to the old one and then redirecting them to the new one at some point.

Time: 48:21 – 49:09

I’d like to know if Google treats the WordPress tags sitemap as important for SEO. I see photographers using tags excessively.

Google treats tag pages on a site the same as other pages that link within the site. Tag pages help Google crawl through the rest of the website, but they are not critical for ranking the content within the website.

Time: 49:10 – 49:50