Google Updates Remove Outdated Content Tool


Google Search Console is updating the remove outdated content tool, which allows site owners to request removal of URLs they don’t own.

Not to be confused with Search Console’s remove content tool, the remove outdated content tool is designed for entirely different use cases.

The remove content tool is designed for site owners to quickly remove their own pages from search results.

Whereas the remove outdated content tool is used to request deindexing of pages on other websites.

Anyone can send a request for outdated content to be removed from search results if, for example, the SERP shows content which is no longer present on a page.

That’s called an outdated cache removal. There are also outdated page removals, which are used in cases where the content no longer exists at all.

All requests are subject to Google’s approval. Google will deny a request if there are no reasonable grounds for removing the content from search results.

Now that we’re up to speed on the difference between the two similar tools, let’s take a look at what’s new.

Remove Outdated Content Tool: What Has Changed?

The updated remove outdated content tool has a refreshed interface which makes existing capabilities more intuitive to use.

Capabilities of the remove outdated content tool have not changed, which means SEOs and site owners are not losing any functionality.

Here’s an example of the old interface compared to the new interface.

As you can see, Google updated the look and feel of the tool to be more consistent with the new Search Console interface.

It’s also better optimized for mobile devices, should you find yourself wanting to remove outdated content while on the go.

The old version of the tool is still accessible, and a notice at the top of the screen indicates it will not be available after January 19, 2023.

For more information, see the official Google support page.


Is Google Neglecting Your New Content?

This is a sponsored post written by Botify. The opinions expressed in this article are the sponsor’s own.

One of the greatest challenges of SEO is to make sure search engines have visibility over your newest content. Otherwise, they may only display older content in search result pages.

You need to be aware that search engines probably don’t explore your entire website, because of crawl budget limitations.

Also, if your website publishes new content regularly (like e-commerce marketplaces, news publishers, classifieds websites, or forums), your most important content may not be discovered fast enough, may not be discovered in full, or both.

Here are some important questions you should ask yourself to make sure you are not missing key traffic opportunities on your new pages:

How Much of Your New Content Does Google Explore?

To get an answer, we need to do two things:

Get an inventory of new pages, using a crawler set to explore the website periodically (weekly or daily, for instance) to track changes,

Find out which of these new pages are explored by search engines. Web server log files are the only source for this information: they record every single hit on the website, whether it comes from a user or a search engine robot.

A report that combines both aspects sheds light on Google’s behavior on your new pages.
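Botify automates this kind of report, but the underlying idea can be sketched in a few lines of Python. The sketch below assumes a plain-text file of newly discovered URL paths from the crawler (new_pages.txt) and a combined-format web server access log (access.log); the file names and the log format are assumptions for illustration only.

```python
import re

# Hypothetical inputs: new_pages.txt lists one new URL path per line (this week's
# crawl minus last week's); access.log is a combined-format web server log.
GOOGLEBOT = re.compile(r"Googlebot", re.IGNORECASE)
# Combined log format: ip - - [date] "METHOD /path HTTP/1.x" status size "referer" "user-agent"
LOG_LINE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[^"]+" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"')

with open("new_pages.txt") as f:
    new_pages = {line.strip() for line in f if line.strip()}

crawled_by_google = set()
with open("access.log") as f:
    for line in f:
        m = LOG_LINE.search(line)
        if m and GOOGLEBOT.search(m.group("ua")) and m.group("path") in new_pages:
            # In a real analysis you would also verify the hit is genuine Googlebot
            # (e.g., via reverse DNS), since the user-agent string can be spoofed.
            crawled_by_google.add(m.group("path"))

print(f"New pages: {len(new_pages)}")
print(f"Explored by Googlebot: {len(crawled_by_google)}")
print(f"Not yet explored: {len(new_pages - crawled_by_google)}")
```

Breaking the two sets down further by page type (product lists, product pages, and so on) gives the segmented view described below.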

Let’s take a look at the example of an e-commerce website.

The first chart details the website as discovered by the website crawler, showing only new pages (those not found in the previous report, one week earlier). Only those in blue were explored by Google. The search engine’s behavior will vary depending on the type of page (product lists, product pages, etc.), so each type of page is shown in a separate bar.

Does your website aim to generate significant traffic from mobile devices? Then, you may also be interested in finding out which bot explored your new pages (Google’s generic bot used for desktop search traffic or its mobile bot), as shown in the bottom chart.

Do New Pages Start Generating Traffic Right Away?

Among new pages explored by Google, some start generating traffic while others don’t. Which are bringing in traffic? Those that do – a subset of pages crawled by Google – are called active pages in the chart below (in green).

If the website crawler has visibility over the date the page was added to the website, for instance by extracting the publication date from article pages on a publishing website, then you can get even more insight by looking at the delay between each step in the organic traffic conversion funnel: publication date vs. date of first crawl by Google vs. date of first visit generated from Google results.
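If those three dates are available, the delay calculation itself is simple. Here is a minimal sketch, assuming per-page dates have already been collected from the crawler, the server logs, and your analytics; the URLs and dates are illustrative.

```python
from datetime import date

# Hypothetical per-page dates: publication (from the crawler), first Googlebot hit
# (from server logs), and first visit referred by Google (from analytics or logs).
pages = {
    "/article/new-camera-review": {
        "published": date(2023, 6, 1),
        "first_googlebot_crawl": date(2023, 6, 3),
        "first_organic_visit": date(2023, 6, 6),
    },
}

for url, d in pages.items():
    crawl_delay = (d["first_googlebot_crawl"] - d["published"]).days
    traffic_delay = (d["first_organic_visit"] - d["first_googlebot_crawl"]).days
    print(f"{url}: {crawl_delay} day(s) to first crawl, "
          f"{traffic_delay} more day(s) to first organic visit")
```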

Does Your Website Structure Promote Your New Pages the Way They Deserve?

Some of your new pages weren’t even crawled by Google. Why is that?

Here is, for our previous example, the distribution of incoming links to new pages. The blue line indicates the number of links by percentile for the whole website (pages in the top percentile receive on average close to 10,000 links each, while those in the lower percentiles receive a single link on average). The distribution of new pages in the different percentiles is shown in yellow.

If we look only at pages that were crawled by Google (top chart), and only at pages that were not crawled (bottom chart), the impact of internal linking is pretty obvious: most pages that receive a significant number of links are crawled, most of those with few incoming links are ignored.
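The same comparison can be approximated outside a dedicated tool once you have an inlink count per new page from the crawler and the set of pages Googlebot has hit from the logs. A minimal sketch, with illustrative data:

```python
import statistics

# Hypothetical inputs: internal link counts per new page (from the crawler) and
# the set of new pages Googlebot has already hit (from the log analysis above).
inlinks = {"/p/widget-1": 850, "/p/widget-2": 3, "/p/widget-3": 1, "/p/widget-4": 120}
crawled = {"/p/widget-1", "/p/widget-4"}

crawled_counts = [n for url, n in inlinks.items() if url in crawled]
uncrawled_counts = [n for url, n in inlinks.items() if url not in crawled]

print("Median inlinks, crawled new pages:  ", statistics.median(crawled_counts))
print("Median inlinks, uncrawled new pages:", statistics.median(uncrawled_counts))
```

A large gap between the two medians is a sign that internal linking, rather than content, is what is holding the uncrawled pages back.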

What About Content?

Internal linking is about page accessibility to crawlers. Another potential reason for pages not being crawled may be poor content quality. If a certain type of page tends to have thin content or little unique content – which Google will know, based on pages of the same type crawled previously – Google has little incentive to try to crawl more of these pages at every opportunity.
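One rough, do-it-yourself proxy for this is the amount of main content per page template. The sketch below assumes the crawler already reports a page type and a word count for each URL; both the data and the threshold are illustrative.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical crawler output: URL -> (page type, word count of main content).
pages = {
    "/p/widget-1": ("product", 420),
    "/p/widget-2": ("product", 35),
    "/list/widgets": ("product_list", 60),
    "/blog/widget-guide": ("article", 1500),
}

by_type = defaultdict(list)
for url, (page_type, words) in pages.items():
    by_type[page_type].append(words)

THIN_THRESHOLD = 100  # illustrative cut-off for flagging a template as thin
for page_type, counts in by_type.items():
    avg = mean(counts)
    flag = " <- potentially thin" if avg < THIN_THRESHOLD else ""
    print(f"{page_type}: average {avg:.0f} words over {len(counts)} page(s){flag}")
```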

Poor content quality may also be one of the reasons why some new pages are crawled by Google, but don’t generate traffic: they are not good enough to rank.

Botify provides full visibility on your new pages, as well as the rest of your website. It is the only SEO tool in the industry that combines a website crawler and web server log analysis along with segmented views of your website and tons of other key metrics, offering unparalleled insights into SEO priorities. With Botify, you get everything you need to make the right decisions and get results.

Try it for free for 30 days!


Google Updates Its Guide On Preventing Spam And Abuse

In a welcome update for website owners, Google made clear changes to its spam and abuse resource center on Google Search Central.

The biggest updates include more robust suggestions to prevent abuse and identify spam accounts, instead of focusing on how to monitor for it.

Preventing Spam and Abuse

Prior to this update, the first section discussed “Web hosting services that are available without payment.” It has now been replaced with “Prevent abuse on your site and platform.”

The previous section seemed to focus on websites that used hosting services without payment. However, spam and abuse happen on even the most secure sites.

The new and improved language gives all website owners actionable, preventive steps they can take to protect their websites.

Suggestions from Google to help prevent site and platform abuse include:

Publish a clear abuse policy during the sign-up process

Identify spam accounts by reviewing certain interaction patterns

Use manual approval for suspicious user interactions

Use blocklists to prevent repetitive spam attempts

Block automated account creation

Monitor site and platform for abuse

Additional new sections on the page cover combining manual and automated approvals for suspected spam.

Google lists suggestions for identifying and blocking certain IP addresses, as well as plugins that can help automate the process.
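Google’s page describes these measures at a conceptual level rather than prescribing code, but the first line of defense can be sketched simply. The example below combines a static IP blocklist with a rolling per-IP rate limit on sign-up attempts; all names, thresholds, and addresses are illustrative, and a real deployment would hook this into your framework’s middleware and a shared store such as Redis.

```python
import time
from collections import defaultdict, deque
from typing import Optional

BLOCKLIST = {"203.0.113.7", "198.51.100.22"}   # illustrative known-abusive IPs
MAX_SIGNUPS_PER_HOUR = 5                        # illustrative per-IP threshold

_signup_times = defaultdict(deque)              # ip -> timestamps of recent attempts

def allow_signup(ip: str, now: Optional[float] = None) -> bool:
    """Return True if a sign-up attempt from this IP may proceed,
    False if it should be blocked or routed to manual review."""
    now = time.time() if now is None else now
    if ip in BLOCKLIST:
        return False
    attempts = _signup_times[ip]
    # Drop attempts older than one hour, then check the rolling count.
    while attempts and now - attempts[0] > 3600:
        attempts.popleft()
    if len(attempts) >= MAX_SIGNUPS_PER_HOUR:
        return False
    attempts.append(now)
    return True
```

Requests that fail the check can be rejected outright or queued for the manual approval Google suggests for suspicious interactions.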

Why The Change?

By some estimates, almost 85% of all email is spam. Now imagine the number of account creations you get on your website each day.

Website owners have had to be more reactive to this type of spam. As soon as one type of spam pattern is blocked, another one appears almost immediately.

This type of manual monitoring costs companies time, money, and efficiency.

Google has updated its developer guide to take a proactive approach instead of a reactive one. By starting with how to prevent abuse in the first place, companies can save significant time and money in the future.

Additionally, Google subtly changed its language to target all website owners, not just low-cost or low-budget websites. It’s a more inclusive approach to preventing spam and abuse for everyone.

Summary

Whether you’ve been stuck in a reactive state of spam and abuse management or you’re a brand-new website owner, read up on Google’s updated guide.

It provides actionable content and resources to help prevent additional abuse and spam from the start, giving you more security from the get-go.

Will Google Penalize Duplicate Content In Reviews?

Editor’s note: “Ask an SEO” is a weekly column by technical SEO expert Jenny Halasz. Come up with your hardest SEO question and fill out our form. You might see your answer in the next #AskanSEO post! 

Welcome to another installment of Ask an SEO! Today’s question comes from Guy in Wales, who asks:

I run a photography business and have been gathering good reviews for my business over a few years. After I finish a job I send the client a list of about 4 links to review sites and ask them to complete a review for me. These may be Google, Yell, Scoot, and Freeindex, for instance.

Clients are happy to do this and usually copy and paste the same review to each site. I also take a copy and pop it on my ‘Reviews’ page. I might also put it on another relevant page like one dedicated to the venue.

My question is: Will Google penalize me for duplicate content for the review on my Google listing appearing on other review sites and for copying the Google review and adding it to my own site once or twice? Can I add it to my site and state that it came from Google originally? If so, how do I go about this?

Google will not penalize you for this under their current guidelines.

You can’t control what your reviewers do, and if they choose to copy and paste the same review to multiple locations, that’s not anything you can be expected to be responsible for.

How Do Reviews Impact Google’s Ranking?

Google isn’t transparent about how they consider reviews in ranking.

We know that the Quality Rater Guidelines indicate that Trust (and Authority and Expertise) are important. We can guess that reviews are one way they determine that trust.

In my opinion, Google must feel like they can trust you.

So if you don’t have any unique reviews on the platforms – if every single review is copied to every review site – then that’s probably a strong indicator of mistrust.

However, that’s just an opinion, and there’s nothing that a Googler has said or written that I can point you to.

With regard to copying these reviews onto your website (as well as them being on other sites), I don’t see a problem with this because the content did originate with your customer.

It’s also unlikely that you’re going to see that significantly impact your ranking since the content isn’t original.

Ultimately, review type keywords (like “best” or “top”) reward aggregator sites rather than individual sites. It’s a lot different, for example, if you say you have 5-star reviews as opposed to Google My Business saying you have 5-star reviews.

Protect Yourself Legally

One thing to keep in mind with reviews is that they are becoming an increasingly litigious area.

We’ve seen several cases in the legal system so far of fake reviews or reviews that were paid for, and I expect that we will see more.

So whatever you decide to do, talk it over with your legal team to make sure it aligns with FTC (Federal Trade Commission) guidelines, Truth in Advertising laws, and any other areas your legal team feels are prudent.

What’s Your Goal With Reviews?

A final consideration for these kinds of keywords is that Google often limits results algorithmically to sites that have an aggregate rating of 4.0 and above.

See this screenshot for a search for “top cardiologist” as an example:

Ultimately, you need to decide what your goal is with reviews.

If it’s to show prospective customers that you offer great service, that’s ideal.

If your goal with reviews is to help you rank better in Google, then Google My Business is currently the way to go.

Have a question about SEO for Jenny? Fill out this form or use #AskAnSEO on social media.


Image Credits

Screenshot taken by author, October 2023

Google Explains How To Remove Pages For SEO

On an office hours hangout, Google’s John Mueller discussed a common mistake publishers make when removing content for SEO.

He then focused on the best way to remove content for more traffic.

Can it Help SEO to Remove Pages?

The publisher asking the question wanted to know if removing non-performing pages helped SEO.

The idea is that removing “dead weight” will help Google focus on the pages that matter.

Question:

Can it help SEO by reducing web pages by marking our product pages noindex, which have almost zero impressions in the last 16 months?

Currently 10 to 15% pages are like this and they’re just dead weight on our site.

I was wondering that after noindexing such pages we will submit fewer pages to Google in the sitemap and Google could focus on the rest of our site better.

It Depends on Why a Page Doesn’t Perform

Mueller answered that this is not a yes or no question.

He said that removing pages does not automatically cause the remaining pages to perform better.

John’s answer:

It’s something that I know some sites do.

I think it is not a totally unreasonable approach to say that the pages that nobody cares about I essentially removed from my website.

But it’s something where I wouldn’t just blindly do this.

So if you’re just blindly focusing on the number of impressions that you have for individual products and you drop them from search then it’s very easy to drop things that are actually useful but they’re just not that common.

Ask: Why is a Page Not Performing?

Mueller’s answer means that whether removing a page helps depends on the reason the page is not performing well.

Mueller explains that maybe not that many people are searching for that particular keyword phrase.

So the metric of impressions isn’t necessarily the best one to use for identifying pages to noindex and block Google from crawling.

John continued with his answer:

It might be that maybe it’s an archived version of a product or page where people after a certain period of time they need to go back there to find instructions for repairing this product or they want to look up historical information about this item.

And that’s not something that happens every day. So if you just purely look at the number of impressions and it’s easy to accidentally include a lot of things that are actually still useful for the web, they’re just not that commonly used.

On the other hand looking at the number of impressions and the types of pages that you have on your website, that can give you a little bit of a better understanding of which types of pages are more important for users.

And that can either guide you to saying, well this type of page is something that maybe I don’t want to provide anymore or perhaps it can guide you into saying, well this type of page is currently not seen as being that useful.

Ask: Can this Page be Improved?

Mueller now touches on the important analysis of whether a page simply needs updating.

For example, if a product has been replaced by a newer and better one, ranking for the old product can be an opportunity to announce that the product is no longer available (good information!) and that a better product is available.

If the new product costs about the same then note that fact to encourage a user to consider the newer product.

This is John Mueller’s recommendation on removing content:

For informational content it might be useful to see if new techniques, technology, or jargon has changed. Not all informational content is evergreen. If it can be improved then that’s definitely a good thing to consider doing.

Maybe if I significantly improved it, it would be different. And that’s also something where you don’t just go and… blindly look at the number of impressions but rather you have to make a judgment call and look at that and see does it make sense to… remove this?

Does it make sense to improve it?

And a lot of times it does make sense to improve things on the web.

Fewer Pages Doesn’t Cause Higher Rankings

There is an idea that removing non-performing content will provide a lift to the rest of the webpages.

Mueller affirms that this isn’t an automatic outcome.

It may be helpful for some sites and not so helpful for others.

John’s remarks on removing pages for SEO:

With regards to just having fewer pages and those fewer pages then ranking higher, I don’t see that happening so much. It can help for a very large website to reduce the number of pages that they provide just purely from a technical point of view and that if we can like crawl 1/10th of the pages on a website and it’s a lot easier for us to pick up those 1/10th of those pages a lot faster.

That can in turn help us to figure out well maybe these are the pages that are really important for the website. But if you’re just dropping a handful of pages here and there, I don’t think it changes anything for crawling and probably not much for the website in search overall.

The Right Way to Remove Content for SEO

The important consideration when removing content is that it’s not a one-size-fits-all solution for improving rankings.

Removing pages does not automatically help all sites rank better.

Furthermore it’s important to assess whether a page simply needs improvement.

Removing content is an old solution for traffic issues.

There was a related SEO rationale for removing content, based on a concept called Content Cannibalization.

Roughly 10 years later, the technique was rediscovered and renamed Keyword Cannibalization.

In this version of the strategy, it was hypothesized that content that was too similar would eat into the rankings of the other pages.

The solution was to remove pages.

The thing is, regardless of whether one is considering removing similar content or low-performing, outdated content, it might be useful to consider updating the pages to make them unique or to serve a different user intent.

Removing content doesn’t solve all problems.

So take your time, analyze why a page is not performing well, and then decide whether the page can be improved or can serve as an upgrade path for potential clients who need a newer product.
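One practical way to apply that advice is to treat low-impression pages as candidates for human review rather than automatic noindex targets. Here is a minimal sketch, assuming a Search Console performance export saved as pages.csv with “Page”, “Clicks”, and “Impressions” columns; the file name and column names depend on how you export the data.

```python
import csv

# Flag pages with no impressions or clicks for manual review - per Mueller,
# the decision to improve, keep, or noindex each one is a judgment call.
candidates = []
with open("pages.csv", newline="") as f:
    for row in csv.DictReader(f):
        impressions = int(row["Impressions"].replace(",", ""))
        clicks = int(row["Clicks"].replace(",", ""))
        if impressions == 0 and clicks == 0:
            candidates.append(row["Page"])

print(f"{len(candidates)} page(s) with no impressions or clicks over the export period.")
for url in candidates:
    print(" -", url)
```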

Watch the Google Office Hours Hangout:

Related: How & Why You Must Improve or Remove Your Old Content

Does Content Or Links Improve Trust With Google?

Google’s John Mueller answered a question in a Google office-hours hangout about improving trust with Google. Trustworthiness, along with expertise and authoritativeness, is a hot topic. Mueller addresses the topic of trust factors in his answer.

Trustworthiness and E-A-T

Trustworthiness has become a big deal because Google’s Quality Raters Guidelines describe Expertise, Authoritativeness, and Trustworthiness as important qualities for the search quality raters to look for when evaluating search results for specific kinds of queries. This is particularly true for what Google calls Your Money or Your Life categories, such as finance and medical search queries.

So it’s natural that an SEO would want to know how to improve their trust with Google.

Google Search Results About Trust Factors

If you search Google for:

What are google trust factors?

Google responds with multiple sites making a variety of claims:

The number one site:

“What is the Trust Factor? Google’s trust factor is a combination of many factors that they use to apply a value of how trustful a site is. The more trustful a site is seen the more likely its articles will be ranked higher on specific Google searches.”

The number two search result:

“Google TrustRank helps Google and other search engines combat web spam. Specifically, TrustRank measures so-called “trust signals”.”

The number three search result:

“In reality, whether or not Google trusts your site depends on several factors. Security is a leading factor.”

The number four search result:

“Google uses trust signals to evaluate the genuineness of other ranking factors.”

One site published a periodic table of SEO factors:

“Here we dive into the Trust elements of the Periodic Table of SEO Factors.”

There’s an irony in Google’s search results, though, because the content about a trust factor is incorrect.

John Mueller’s answer seems to contradict what all of those answers in Google’s own search results claim.

How to Improve Trust

The person asking the question wanted to know what the best way was to improve trust with Google, presumably to achieve better rankings.

Here is the question:

“Does a website which includes great content improve in trust with Google or is that only determined through links?”

Google Says There are No Trust Factors

Google’s John Mueller answered:

“I don’t think we have like a trust factor that we can look at and say, oh trust is at (I don’t know) nine out of twelve or whatever number you would have there.

So that’s kind of (I don’t know) …it’s almost like a philosophical question at that point.

It’s like, does improving the quality of your content overall make a website more trustworthy with regards to Google?

And like well… I don’t know. There are no metrics specifically for that.”

No Metrics for Measuring Trust

A metric is a way to measure and help evaluate something. John Mueller is clear that Google does not have a metric specifically for measuring trustworthiness.

Trustworthiness is important in the search quality raters guidelines because that is what Google wants the third-party raters to look for.

But that does not mean it’s part of Google’s algorithms or that there’s an algorithm at Google that is rating sites for trustworthiness.

Improving Content is a Good Approach

Mueller next validated the practice of improving content.

He continued his answer:

“I think improving the quality of your content is always a good idea.

But it’s uh …lots of things are involved there.

And when it comes to trust it’s definitely not a matter of just links that are pointing at a website.”

No Metrics for Trust and Not Just Links

It’s always a bad idea to take a part of what is written in a patent or what John Mueller says and then make it mean something outside of the context of the overall patent or statement.

John Mueller’s answer that there is no metric for trustworthiness was said in the context of answering the question of whether “great content” or “links” help improve trust with Google.

Rather than focus on whether links or content influences Google to trust a website, Mueller discourages the person from thinking in terms of affecting a non-existent trust factor.

He encourages the person to focus on improving the content. Content is one of the few things that publishers have total control of, something that can’t be said about legitimate links.

Citation: Google Says There Are No Trust Factors or Metrics for Measuring Trust

Watch Mueller pop the bubble on the idea of a trust metric at the 29:20 minute mark.
