Google: Improving 3rd Party Metrics Won't Boost Rankings


In an episode of Google's Search Off the Record podcast, Googlers discussing spam digressed into third-party metrics and their impact on Google search rankings. They observed that improving the scores of third-party metrics did not result in better search rankings, and suggested focusing on a wider range of factors instead.

This part of the discussion involved John Mueller, Senior Webmaster Trends Analyst at Google, and Duy Nguyen of Google's Search Quality Team, which focuses on catching spammy sites.

Third Party Domain Authority Metrics

Many tool and data companies provide metrics that help publishers compare their websites to other websites, assigning a convenient proprietary "authority" or "ranking" score to each site.

Some of these metrics use links and traffic (among other factors) to calculate an authority or ranking score.

The purpose of these metrics is to help publishers and SEOs do competitive analysis.

Many publishers, however, use these metrics as proof of site quality and will seek links from other sites with high third-party authority scores in order to improve their own.

What Google makes clear is that they've never seen a correlation between improving those third-party metrics and a positive impact on search rankings.

Third Party Metrics and Search Rankings

Google’s John Mueller began this section of the podcast by remarking that many publishers focus on factors that have zero impact on search rankings.

[00:24:35] John Mueller:

“I think it’s also, like you said, one of those things where you don’t even know if it will actually help your site.

And potentially, it’ll just harm your site and then you’re just digging a bigger hole for yourself rather than working on something positive for your website to improve things for the long run.”

[00:24:56] Duy Nguyen:

“Yeah, an example of that that we observed was web masters or else spammers tend to focus on improving one or two particular metrics that are external, that we absolutely do not use.

They, for some reason, think that if they put on a lot of time and money in improving such scores, it would perform really well on Google Search. I’ve never seen a case where that actually work well.

And I find it pretty sad… because if all that time and money were spent on building up the websites with better user experience, more functionality, writing better quality content, producing high quality images, they’d probably do a lot better on Search and obviously a lot more sustainable for the site, itself.”

[00:25:43] John Mueller:

"But it feels like… sometimes I see people in the forums just saying like, 'I want to improve this metric.'

They don't really want to focus on the site overall. They're just like, 'I just want to change this number from 7 to 25.'

And I'm like, 'Why?' It doesn't change much."

Don’t Focus on Just One Factor

Duy Nguyen next discusses the futility of focusing on one thing (like a third-party metric) at the expense of the hundreds of actual ranking factors that have an impact on search performance.

Duy explains the benefits of focusing on a wide range of performance metrics.

Duy Nguyen: [00:26:33]

“Yeah, I love data, myself. I think the more data you have, the better you would be at your role, whatever that may be. As a site owner or an online marketer, I think it’s really great to have a bunch of metrics that you monitor and measure and try to improve as long as you don’t focus on one thing.

…improve more.

Or for some reason I find that nobody really discover our Contact or Support pages. Why is that? Do we have a problem there?

If people need to contact us, maybe we should just put it somewhere else, rewrite better content.

…the board and would rank your site better.”

Search Rankings Depend on Hundreds of Signals

In order to improve search rankings it’s important to focus on a wide range of relevance and popularity signals.

Some publishers might focus on links but their content might lack authoritativeness.

Other publishers might focus on authoritative content but neglect to make their web pages user friendly.

Another publisher might do nearly everything right but neglect to adequately promote each article in order to encourage links.

Publishers might focus on building links but neglect to build an audience, relationships with readers, or relationships with influencers.

Sites that rank well for a limited time and then bounce between the first and second pages of the search results are sometimes neglecting some part of site promotion, user experience, content quality, or content relevance.

Sites that seem to be locked into the first position as if they owned it tend to be sites that address all aspects of user experience, popularity signals, and topical focus.

Citation

Tackling Web Spam, Search Quality, and More!


A Week with NOOKcolor: 3rd Party Apps and Final Wrap-Up

Welcome to the final installment in our “A Week with” review session with NOOKcolor. This particular installment will ask the big questions and seek the big answers. This installment will show you what lies beyond the reading, what’s on the mind of those who would seek developers for apps, and will deliver a final sentence on whether you, the everyday average dude or lady, should invest in this device. Behold! NOOKcolor’s fancy dressing.

If you’ve been following along with this epic journey we’ve taken with NOOKcolor, you know that it’s a lovely looking device, certainly feels nice to hold and to handle, and without a doubt is a unique device that is a centerpiece for Barnes and Noble’s electronics department. Along with the main reason it exists, that being that whole “reading” thing, there’s a tiny pack of apps, a simple web browser, and a files section in your NOOKcolor library.

First let’s chat about this files section. It seems to be rather secondary, in that you’re going to have to know exactly what you’re doing as far as file formats go if you plan on using any files you didn’t download from B&N directly. After an hour of trying to find a video file format that worked on the device (even after checking the manual hosted on NOOKcolor), I was unable to get a file running. I’m no magician, but if I’m unable to find a file format that works, you bet your biffy Grandma Edna won’t be able to do it either.

The web browser is nice and big. Whenever you’ve got a screen this large and a web browser that functions half-nice (this one functions quite well, actually), that’s a big plus. A very forgiving keyboard and memory for addresses typed in the past make this browser a functioning piece of your day if you, for example, are the sort of person who doesn’t own a smartphone but carries this device around in a backpack and wants to check web-based email (Gmail, for example) or social networking sites.


YouTube works but plays relatively low-quality video for what you know this device is capable of (you see a magical array of video when you first turn the device on); Netflix does not function. Above you’ll see an example of how well the browser handles the first TRON: Legacy sneak preview trailer. Notice a couple of things: the reflection of the camera I’m holding up to film the screen (this device has somewhat lowered glare, but you can still definitely see yourself in it when the screen is black), and the sound, which is a bit muffled since the speaker is on the back of the device.

In your “Extras” tab, aka your apps folder, you’ll notice instantly that these were never meant to be the main attraction. The name of the folder is “Extras” – get it? This bin of everyday device mainstays like chess, Sudoku, and crosswords reinforces the fact that this device is aimed at those who read and enjoy games that live inside the reading universe. You’ll also find a Contacts app containing people who also own NOOKcolors and with whom you can interact elsewhere (this app acts as a sort of flip-file of all your people, where you can add or subtract them and their info). There’s another app for LendMe (accessible also through your Library, as we spoke about in a previous post) where you can ask to borrow or lend books from and to your contacts.

There’s a super simple Music app where you can play music files you’ve added through your “My Files” – THIS is the simplest and best implementation of the My Files feature, just drop your music files (more than likely MP3s) into your Music folder upon finding it once you’ve plugged your NOOKcolor into your computer, and poof! They’re in your Music app. Simple. This app is in place so that you might plug in your headphones and listen to music while reading. Of course, you could also use this function as a giant music player if you’re the sort of person who plays music in your car through a headphone jack, but seeing as how MP3 players are basically given away these days, this might be a bit unreasonable.

Gallery works in a very similar way: drop a bunch of photos into your Pictures folder while NOOKcolor is plugged into your computer, and they show up here instantly. A great way to carry along a simple portfolio for all of us creative types? Perhaps.

Then there’s Pandora. Clearly the folks at Pandora saw this as a giant opportunity to get their service out to the world in waves, as this app is developed above and beyond all the others. Works great, sounds great, connects easily.

Now the people at Barnes and Noble aim to make this device a place where developers make apps that are reader-based. Apps that work with readers, around readers, and to improve and enhance the whole reading experience. Will it work? That’s the million dollar question. Take a look at their strategy to make this happen and consider the following:

1. The profit share for developers selling apps has not yet been released.

2. They plan on offering help in the form of personal support through NOOKcolor developer.

3. They plan on offering deals and promotions that would take place inside the B&N stores – what this consists of is still unclear.

4. These apps must circle around and stay within the boundaries of what Barnes and Noble considers reading-based. Again, what these limits are and will be remains unclear.

Do these limits and prospects for developers spell a certain fate for the future of NOOKcolor? We’re in sort of an age where it’s being decided how open our tablet-based devices are going to be; devices will survive or die based on the applications offered on them as well as the functionality they present. If there aren’t enough apps, will NOOKcolor fizzle? Or perhaps the question should be: if there aren’t enough publications available for reading on NOOKcolor, will it fizzle?

Link Graphs And Google Rankings

Search engines create maps of the Internet called link graphs, and these maps help search engines determine whether a site is relevant or low quality and how the site fits into the Internet. Link graphs are part of ranking, and for that reason it’s important to understand what they are and how your strategies fit this way of mapping the Internet.

What are Link Graphs?

Search engines map the Internet by the link connections between each website. These maps of the Internet are called Link Graphs. Link graphs reveal multiple qualities about websites on the Internet.

Link graphs show how sites are connected to each other.

Link graphs can be used to identify what topics a website is about.

Link graphs can be used to identify spammy sites.

Sites Link to Other Sites Related to their Topic

Sites about software and technology link to other sites about software and technology. Sites about cooking tend to link to other sites related to cooking.

The important takeaway about link graphs is that they can help tell search engines what a site is relevant for.

The link graph can also reveal networks of spam sites. While spam sites link to normal non-spam sites, normal sites do not tend to link to spam sites.

This has the effect of isolating spam sites into their own corners of the link graph.
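To make that isolation effect concrete, here is a minimal Python sketch with an invented link graph (all domain names are hypothetical). Inverting the graph shows where each site's inbound links come from; the spam sites end up receiving links only from each other:

```python
# Toy link graph: each site maps to the sites it links out to.
# All domains here are made up for illustration.
links = {
    "cooking-blog.example": ["recipe-hub.example"],
    "recipe-hub.example": ["cooking-blog.example", "kitchen-tools.example"],
    "kitchen-tools.example": ["recipe-hub.example"],
    "spam-1.example": ["spam-2.example", "cooking-blog.example"],
    "spam-2.example": ["spam-1.example"],
}

def inbound_sources(graph):
    """Invert the graph: for each site, list the sites linking to it."""
    sources = {site: [] for site in graph}
    for source, targets in graph.items():
        for target in targets:
            sources.setdefault(target, []).append(source)
    return sources

for site, srcs in inbound_sources(links).items():
    print(site, "<-", srcs)
# The spam sites receive inbound links only from other spam sites,
# leaving them in their own corner of the link graph.
```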

I promise that any jargon will be explained and what seems complicated will be simplified.

Link Distance Ranking Algorithms

There are algorithms that rank links. Whether or not Google uses these kinds of algorithms is not known for certain. We just know they exist and that they perform very well at discovering which sites are spam, which sites are normal, and what topic a site covers.

Read: Link Distance Ranking Algorithms

The way this works is that a map of the web is created with multiple starting points. Each starting point is called a seed site, and each seed site represents a site that’s expert, authoritative, and trustworthy in its topic.

Sites that the seed site directly links to are also considered trustworthy and expert. What was discovered with this kind of algorithm is that the further away a link is from the original seed site, the less trustworthy, expert, and authoritative that site tends to be.

For the purposes of illustrating the link relationship:

If a seed site links to a site, let’s call it a child site.

If that child site links to a site, let’s call it a grandchild site (of the seed site).

Sites that are in between, we can call them etcetera.

The seed site-based link graph might look something like this:
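As a rough stand-in, here is a minimal Python sketch of that structure, using the child/grandchild naming above (every domain name is made up; real seed lists are not public):

```python
# Hypothetical seed-based link graph: each site maps to the sites it links to.
link_graph = {
    "seed.example": ["child-a.example", "child-b.example"],
    "child-a.example": ["grandchild-1.example"],
    "child-b.example": ["grandchild-2.example"],
    "grandchild-1.example": ["etcetera-1.example"],
    "grandchild-2.example": [],
    "etcetera-1.example": [],
}

def print_tree(site, depth=0):
    """Print the graph as an indented tree: one indent level per link hop
    away from the seed site."""
    print("  " * depth + site)
    for target in link_graph[site]:
        print_tree(target, depth + 1)

print_tree("seed.example")
```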

Outbound Links And Relevance

Outbound links from a website (together with inbound links) can influence whether or not a site ranks at all.

When one site links to another site, they are connected within the link graph. All of those connections form groups, sometimes called neighborhoods.

Solar System Analogy

Stay with me, because now I’m going to use analogies for how sites link together to form link graphs, beginning with how a single site is linked together within itself.

For example, the Solar System could be thought of as a website.

The website home page could be thought of as the Sun. Earth, Mars, Saturn, etc. can be considered analogous to pages from that website.

So the whole Solar System can be thought of as a website, as your website.

A Website is Analogous to the Solar System

Milky Way Galaxy

The Solar System exists within the Milky Way galaxy. The Milky Way galaxy consists of other suns and planets.

In our analogy, the Milky Way galaxy represents all the other websites that are like your website and that are also about your same topic.

So if your site is an ecommerce site selling auto parts, all those other auto parts ecommerce sites are interconnected with your auto parts site by links from forums, blogs, product sites, manufacturer sites, review sites, etc.

The Milky Way galaxy, in our analogy example, represents all the websites on the Internet that are specifically about auto parts ecommerce. But it can also be whatever your own website topic is.

This is something to think about:

Outbound links from one site to another site create a map of the Internet by topic.

So your website and all the other websites in your niche look like this to Google:

Analogy of Interconnected Sites on Same Topic

But… Your Niche Exists in the Greater Internet

The example site of an auto parts ecommerce store (solar system) exists within the overall topic of all the auto parts ecommerce stores on the Internet, in this example represented by the Milky Way galaxy.

The auto parts ecommerce store topic exists within a greater entity, which is the larger and more general topic of ecommerce.

The Milky Way exists as part of a cluster of other galaxies. This cluster is called the Virgo Cluster.

The Virgo Cluster is an analogy of all the sites about ecommerce.

Analogy of All Sites About Ecommerce

The Internet Link Map Reveals Topic Clusters

When search engines map the interconnections between websites, all the different topics tend to form clusters, similar to how suns and planets form galaxies, including some overlap as we’ll see in a moment.

Sites about any given topic tend to be interconnected because similar sites tend to link to sites about those topics.

For example, human resources-related sites tend to link to the same group of human resources related software sites and recruiting-related sites.

The Milky Way exists within the Virgo Galaxy Cluster. The Virgo Galaxy Cluster can be said to represent all the sites that are about ecommerce.

So in that Virgo Galaxy Cluster example there are groups of interconnected sites about sports ecommerce, fishing ecommerce, toy ecommerce, makeup ecommerce, and so on across all the topics that ecommerce covers.
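As a rough sketch of how those clusters could be recovered from links alone, the plain-Python example below (hypothetical domains, and a deliberately tiny edge list) groups sites into connected components; each component corresponds to one topical "galaxy":

```python
# Undirected link edges between hypothetical sites in two niches.
edges = [
    ("fishing-shop.example", "fishing-blog.example"),
    ("fishing-blog.example", "fishing-forum.example"),
    ("toy-store.example", "toy-reviews.example"),
]

def topic_clusters(edge_list):
    """Group sites into connected components via depth-first search.
    Sites that end up in the same component tend to share a topic."""
    neighbors = {}
    for a, b in edge_list:
        neighbors.setdefault(a, set()).add(b)
        neighbors.setdefault(b, set()).add(a)
    seen, clusters = set(), []
    for start in neighbors:
        if start in seen:
            continue
        stack, component = [start], set()
        while stack:
            site = stack.pop()
            if site in component:
                continue
            component.add(site)
            stack.extend(neighbors[site] - component)
        seen |= component
        clusters.append(sorted(component))
    return clusters

print(topic_clusters(edges))
# Two clusters: the fishing sites together, the toy sites together.
```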

Cluster of Super Clusters

But the Internet is bigger than ecommerce. The Internet includes the topics of politics, social media, ecommerce, travel, handbag sales, toy ecommerce, legal, entertainment, news, everything.

Staying within our analogy of the Internet as the cosmos, a supercluster of galaxy clusters where the red dots in the image below are clusters of galaxies, this is what the Internet may look like to Google as a link-based map of the entire Internet:

Supercluster of Galaxy Clusters

All of the websites of the entire Internet arrange themselves by links into structures that can be said to resemble galaxies, each representing sites on the same topic.

Those galaxies can be said to exist within clusters of other topics that are related, like all the sites about ecommerce, all the sites about news, all the sites about travel, etc.

And the entire Internet can be visualized as a giant supercluster of clusters.

The above illustration is an analogy of how the Internet organizes itself by topic into a gigantic link-interconnected map.

Six Degrees of Website Separation

There is an idea that all people are six friends away from one another. A friend of a friend of a friend of a friend of a friend of a friend will ultimately lead to a connection with virtually anyone.

Whether that’s true or not is beside the point right now. What matters is that a similar thing happens with links.

The difference with links is that there is an end point: the further away you get from a starting point, the more difference there is between the starting website, where you began following links, and the ending website further away.

What scientific researchers discovered is that if you begin at a starting point you might call expert and authoritative, the further away you get from it, the likelier it is that a site is spam.

The sites that are linked closer to the starting point tend to be more expert and authoritative and trustworthy.

That is the idea behind a type of ranking analysis called Link Distance Ranking.

Multiple scientific researchers (inside and outside of Google) discovered that when you create a seed set of sites as starting points, it becomes even easier to weed out spam sites, and mapping the Internet according to topic becomes more accurate.

Link distance ranking algorithms provide a more granular level of categorization by topic to the Internet beyond the natural ordering that links provide.
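A minimal sketch of that idea in Python: a breadth-first search from the seed set assigns every reachable site its link distance, and smaller distances stand in for more trust. The graph and seed choice below are invented for illustration:

```python
from collections import deque

def link_distances(graph, seeds):
    """Minimum number of link hops from any seed site to each site.
    Smaller distance = closer to the trusted seeds."""
    distances = {seed: 0 for seed in seeds}
    queue = deque(seeds)
    while queue:
        site = queue.popleft()
        for target in graph.get(site, []):
            if target not in distances:
                distances[target] = distances[site] + 1
                queue.append(target)
    return distances

graph = {
    "seed.example": ["child.example"],
    "child.example": ["grandchild.example"],
    "grandchild.example": ["far-away.example"],
}
print(link_distances(graph, {"seed.example"}))
# {'seed.example': 0, 'child.example': 1,
#  'grandchild.example': 2, 'far-away.example': 3}
```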

Link Graphs Reveal Legit Links and Spam Links

Spam sites exist in their own cluster because that’s how the Internet naturally arranges itself by links, especially when you overlay a link mapping algorithm over the link graph.

In the early 2000s, search engines used statistical analysis to discover which linking patterns were unnatural. Sites with unnatural linking patterns were called the “statistical outliers,” and those outliers were spam sites.
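A crude version of that kind of statistical analysis can be sketched in a few lines of Python. The inbound-link counts below are invented, and a simple z-score cutoff stands in for whatever the search engines actually used:

```python
from statistics import mean, pstdev

# Hypothetical inbound-link counts for eleven sites; one is a link farm.
inbound = [104, 98, 120, 87, 110, 95, 130, 92, 101, 115, 9500]

mu, sigma = mean(inbound), pstdev(inbound)
outliers = [count for count in inbound if abs(count - mu) > 2 * sigma]
print(outliers)  # [9500] -- the unnatural linking pattern stands out
```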

Later on researchers published link distance ranking algorithm research papers.

Today Google uses machine learning and AI to catch spam at the moment it is discovered during crawling, and again at the point where Google places sites within the index. The exact processes involved in Google’s spam AI are not known; we only know that artificial intelligence is used.

Internet SEO scammers claim that they can trick Google by using mind-numbingly simple tricks. But once you understand how the Internet is ordered within a link graph, those child-level strategies are seen for what they are: implausible and sadly laughable.

Links Graphs and Ranking

There is very little you can do to control who links to you. Because of how link graphs work, the task of identifying spam is easier.

Knowing how link graphs and associated link graph mining technologies work helps to make sense of why Googlers are so confident about their ability to catch link spam and stop it from working.

While there is little you can do to control the creation of links to your site, there is a lot that you can do to control what your site links to.

For that reason, in my opinion, it’s a good idea to be careful to link to pages that are useful to users in the context of your content.

In the interest of user experience it’s also a good idea to scan your outbound links to be absolutely certain that there are no outbound links to insecure HTTP web pages.

There is little one can do to control who links to a site. But the sites you link to are entirely under your control. Poor outbound linking practices may send a negative signal that reflects poorly on the site and may contribute to a negative ranking influence.

Citations

Link Ranking Algorithms

What is Google’s Penguin Algorithm?

Google Fights Spam with AI

15 Google Chrome Tricks To Boost Your Productivity

Google Chrome is unquestionably the most popular web browser out there, despite there being a number of solid alternatives. And that shouldn’t be surprising, given the robust set of features it possesses, such as deep integration with other Google services, a minimal UI, and much more. Now, I’m sure you all are pretty familiar with (most of) Google Chrome’s features. But is that all?

1. Use the tab right-click menu

New Tab: Spawn a new browser tab, immediately to the right of the active one.

Reload: Refresh the content of the currently active tab.

Duplicate: Create a new tab to the right, with the same content as that of the active tab.

Pin Tab: Permanently pin a tab to the browser. Pinned tabs are moved to the left of all the other tabs, and stay pinned even when the browser is closed.

Mute Tab: Mute the audio playing in the tab. Obviously, this works only if the tab is playing some sound (indicated by a little speaker icon on the tab).

Close Tab, Close Other Tabs, Close Tabs To The Right: Self explanatory.

Reopen Closed Tab: Open the most recently closed tab, and load up the webpage loaded in it at the time of closing.

2. Perform calculations, unit conversions etc., directly from the Omnibox

The Omnibox, which is the combined search and address bar of Google Chrome, is a pretty versatile tool in itself. You can use it to quickly perform everything from basic arithmetic calculations (addition, subtraction, multiplications, etc) to unit conversions (Currency conversions, Miles to Kilometres, Pound to Kilogram, etc) of nearly every unit type. It can even be used to quickly compose email (using the configured default email client), and add events to your Google calendar.

To perform an arithmetic calculation, simply type it in the Omnibox, followed by the equal to (=) sign, and the result will be displayed in the drop down box, in real-time. Cool, isn’t it?

As of the latest tested version of Google Chrome (version 47), pinned websites act as desktop apps, and open up in their own windows, without any obtrusive browser UI elements, such as address bar and navigation buttons. How cool is that?

5. Essential Chrome keyboard shortcuts that you must use

Almost all desktop and web applications support keyboard shortcuts, given how handy they are, and Google Chrome is no different. It includes support for dozens of keyboard shortcuts, and these can be used to do everything from managing/navigating tabs, to clearing browser history. Here are some of the most useful ones:

Ctrl+Shift+N : Open a new window in Incognito mode.

Ctrl+Shift+T : Reopen the last closed tab. Up to 10 recently closed tabs are remembered.

Shift+Esc : Open Google Chrome’s task manager.

Alt+Enter : After typing a URL: Open URL in a new tab.

Ctrl+F5 or Shift+F5 : Reload the current page, ignoring cached content.

6. Use extensions to make Chrome do much more than just browse the web

If you use Google Chrome, it’s pretty much a given that you use a couple of extensions. The Chrome Web Store includes hundreds of thousands of such useful extensions, and these let you use Google Chrome for a diverse range of purposes. Some extremely handy extensions include the goo.gl URL Shortener, which makes it easy to create goo.gl short links for sharing, and LastPass, which lets you securely save and manage all your web account passwords. Want even more? Check out the list of 35 best Google Chrome extensions we’ve compiled.

7. Run extensions in Incognito mode

The Incognito Mode in Google Chrome lets you browse the web privately, without having to worry about your browsing history (and other data) being saved by the browser. However, by default, installed extensions don’t work in Incognito Mode. To change that, open the chrome://extensions page and enable the “Allow in incognito” option for the extensions you want.

8. Customize New Tab page to show useful information

By default, the New Tab page of Google Chrome is rather plain and dull looking. However, you can use numerous tab customization extensions available in the Chrome Web Store to not only make it more attractive, but also a lot more useful. These extensions make it possible to add information such as clock, current temperature, upcoming calendar events etc. to the New Tab Page. Take a look at the companion screenshot above, illustrating the New Tab page, as customized by Momentum, one such extension.

9. Block certain websites in Chrome

10. Quickly search any image using Google reverse image lookup

11. Remotely access other computers from Chrome

Remote Desktop Access is a pretty convenient method of securely controlling/sharing a remote computer over the network. There are many applications available for it, such as Teamviewer (the most widely known), or its many alternatives, but if you want, you can even use Google Chrome to view/control a remote computer over the network.

12. Enable hidden browser features using Chrome Flags

Want to unlock the hidden potential of Google Chrome? Check out Flags. Essentially, flags are experimental Google Chrome features (hidden under the chrome://flags page) that are not yet ready for prime time. Consequently they can change, or even stop working altogether with subsequent browser updates. However, they sure can impart lots of cool added functionalities to the browser, such as automatic password generation, multilingual spell-checking tools, and more. There are lots of Google Chrome flags available for you to play with, if you wanna live on the bleeding edge of the browser’s tech. Just make sure to use them carefully, or you may end up with a non-functional browser.

13. View listing of cached webpages in Chrome

Like any web browser, Google Chrome also saves local copies of webpages that you access, to speed up future page load times. And if you want to view the listing of all those cached webpages, all you have to do is go to the “chrome://cache” (without quotes) system URL. Take a look at the screenshot below:

14. Use Chrome as quick notepad

One well-known way to do this: type data:text/html, <html contenteditable> into the Omnibox and press Enter. The blank page that loads accepts typed text, turning the tab into a quick scratch pad.

15. Manage running tabs & extensions with Chrome’s Task Manager

Press Shift+Esc (the shortcut mentioned above) to open Chrome’s Task Manager, where you can see the memory and CPU usage of every tab and extension, and end any that misbehave.

SEE ALSO: 20 Cool Microsoft Edge Tips And Tricks


Why Google Gets Gamed (And Why That Won’t Stop)

The Birth of Link-Based Algorithms

In 1993, the Mosaic browser was released and the world was given access to a graphical web. It wasn’t fast, it wasn’t real pretty, but it was visual – and that meant the start of a public internet. Sites were originally categorized and put into independently-owned directories, and major directories were the primary way typical users navigated the chaos that is the web. Then innovative search engines took the field, with key players at the time including Yahoo!, Lycos, and AltaVista (and feel free to give those names a moment of silence).

These original search engines looked through actual web pages, indexing the content found inside. They actually read the text and matched it up to your query. The theory was simple: if a site uses the words you’re searching for, it’s probably on the topic you’re searching for.

Quickly, however, the search engine result pages plummeted into chaos. The reason was the first wave of search engine optimization, which recognized how the sites were being ranked and tried to put in the exact right number of keyword repetitions (that’s where the “keyword saturation” idea, and the obsession some SEOs still have with it, comes from).

The gaming got bad fast. Beyond simply optimizing their sites for the given keywords, webmasters and first-generation SEOs were creating duplicate pages that had the exact same content but were located at different addresses. An entire page of the search results could be from the exact same publisher.

That’s where Google came in. Larry Page and Sergey Brin were certain they could make a better search. This time they wouldn’t be using keywords found within a site as the mode of ranking (although it would, and still does, play a role). Rather, they were on a hunt for a method by which site popularity could be automatically tracked. What they came to was the idea of link popularity, and in the early phases, that was largely based on the site’s PageRank.

PageRank is, in the broadest (and admittedly least thorough) summary possible, a simple way of indicating how many links point to a site. Links are more powerful if they originate from sites that have a high PageRank themselves. So, theoretically, as people went around sharing the link – be it on their own site, on a blog, or on any other medium that could be indexed – those links would serve as evidence for the high quality the site being linked to.
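As a concrete illustration, here is a minimal power-iteration PageRank in Python. This is the textbook formulation of the original idea, not Google's production algorithm, and the three-page web is invented:

```python
def pagerank(graph, damping=0.85, iterations=50):
    """Textbook PageRank: each page's rank is split among the pages it
    links to, damped toward a uniform baseline."""
    n = len(graph)
    ranks = {page: 1.0 / n for page in graph}
    for _ in range(iterations):
        new_ranks = {page: (1.0 - damping) / n for page in graph}
        for page, outlinks in graph.items():
            if outlinks:
                share = damping * ranks[page] / len(outlinks)
                for target in outlinks:
                    new_ranks[target] += share
            else:  # a dangling page spreads its rank evenly
                for other in new_ranks:
                    new_ranks[other] += damping * ranks[page] / n
        ranks = new_ranks
    return ranks

# Hypothetical three-page web: "a" receives the most link weight.
web = {"a": ["b"], "b": ["a", "c"], "c": ["a"]}
print(pagerank(web))  # "a" ends up with the highest rank
```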

Immediately, Google’s algorithm gave results that were substantially better than those on Yahoo! and other major search sites (which were basically just spam). However, it didn’t take shady SEOs long to catch up.

The Next Generation of Spam

Search engine optimization is, overall, a good thing. It’s a way by which webmasters can communicate with the search engines, telling them what their site is about, what they want to be discovered for, and so on and so forth. The problem is that there are certain optimizers who use a set of tactics that sabotage the aim of the search engines (which is providing high-quality content for searchers). In the Google era this spamming has happened in the form of link generation.

More inbound links bolster your search engine ranking, so it’s not hard to see the solution to increasing rank: get more links! Rather than getting these links by word-of-mouth marketing, providing guest content, or otherwise earning them, however, most spam groups these days rely on networks of sites that sell links, self-publishing article sites, and other mediums that let you push links by yourself.

The Capital Power Conundrum

So the quick answer to why Google provides spam results is that, since they have an automated system, people are always trying to go through and game the SERP. It’s impossible to afford a manual management system and it’s impossible to spam-proof an automated one. But there’s one other question that deserves attention: Why is it just spam sites doing this?

Why Social Web, Curation, Etc., Won’t Stop Gaming

There have been numerous propositions for how this situation can be improved or fixed. Social sharing as a ranking factor (including Twitter shares, Facebook likes, and Google +1s) sounds great. Human curation and contributions (such as we see on Blekko) are fantastic conceptually. Here’s the problem: The moment one of these factors becomes significant enough, it absolutely, definitely will be gamed.

The social web is on the rise, and once it’s hit the mainstream, you can bet that there will be groups who create hundreds, even thousands, of social accounts for the simple purpose of selling you their “likes.” In the same way, if Blekko became popular, then SEO groups would hire third parties or develop networks to game the human contributions.

For every step the search sites take toward “fixing” the problem and establishing countermeasures, the SEOs have come up with one more way to get around the countermeasure. The only long-term fixes would involve either an idealistic (probably impossible) way to detect those who are gaming the search sites or full, single-group human curation (which is far too expensive to pull off). Sadly, there isn’t a quick fix. The search sites will continue to create well-rounded algorithms that are more likely to surface good results, but it’s impossible to “kill spam” completely – at least in today’s world.

The good news is that webmasters who don’t invest in gaming will still see the best long-term results. Focusing on quality, basic promotion through guest blogging or social sites, and honestly providing value will get organic results over time – and won’t be tossed to the sidelines with algorithm updates.

Metrics For Success With RPA

The collection of RPA success metrics listed below should be considered from both a technical and a business project management standpoint.

Should all of them be recognizable to you?

Yes. You should familiarize yourself with all of the metrics given below in order to better grasp what you can measure in connection with RPA activities, and how.

Are all of these intended for use?

No. Over-measuring is far worse than performing no measurements at all. When you track too many indicators, you become overloaded with information and lose sight of what is really important. Choosing a modest number of metrics and sticking with them over time is the best course of action.

Enterprise Metrics

Following are the enterprise metrics for measuring the success of an RPA program −

Time Gains’ Value (TGV) − Compares the costs of processes carried out by humans with those carried out by RPA bots. It may be calculated with this equation:

TGV = (EC − AC) / AC × 100%

Where,

EC is the cost of processes performed by employees, including wages (covering leave and vacation), taxes, and office and workplace expenses (rent, utilities, etc.).

AC stands for the expenses associated with the services provided by bots, including license fees, development costs, virtual machine/hosting costs, and maintenance costs.

Your baseline metric for determining RPA ROI is TGV.
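As a sketch, the TGV formula translates directly into Python; the cost figures in the example are invented:

```python
def time_gains_value(employee_costs, automation_costs):
    """TGV = (EC - AC) / AC * 100%, per the formula above."""
    return (employee_costs - automation_costs) / automation_costs * 100.0

# Hypothetical process: $120,000/year done manually vs. $30,000/year by bots.
print(time_gains_value(120_000, 30_000))  # 300.0 -> TGV of 300%
```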

RPA Return on Investment (ROI) − A key business metric that should demonstrate how much money your organisation was able to make (or save) as a result of using RPA.

To determine the RPA ROI, start from the Time Gains’ Value above. Two more factors must be considered in order to fully understand RPA ROI −

A − Value of Process Acceleration − Reveals how much actual money was earned by using automation. This may be shown, for instance, by how expediting the customer onboarding process affects the number of customers handled before and after automation, multiplied by the average benefit per client.

B − Value of Error Reduction − Shows how much money can be saved by eliminating mistakes. It should correspond to the priorities of your company.

It could, for instance, represent the total cost of business deals lost as a result of errors or protracted processing, the cost of manually correcting errors (represented by payroll costs, such as 30% of 1 FTE), the rise in the proportion of cases that meet SLAs, the improvement in data quality, or anything else that is essential for a specific process or organization.

Once the Value of Process Acceleration (VoPA), the Value of Error Reduction (VoER), and TGV have been calculated, the RPA ROI can be computed as follows:

RPA ROI = [(TGV × AC + VoPA + VoER) − AC] / AC × 100%
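Taken at face value, the ROI formula can be sketched in Python as follows. One assumption is worth flagging: for TGV × AC to carry money units, TGV is treated here as a fraction (3.0 for 300%) rather than a percentage, and all the figures are invented:

```python
def rpa_roi(tgv_fraction, automation_costs, value_process_acceleration,
            value_error_reduction):
    """RPA ROI = [(TGV * AC + VoPA + VoER) - AC] / AC * 100%."""
    gains = (tgv_fraction * automation_costs
             + value_process_acceleration
             + value_error_reduction)
    return (gains - automation_costs) / automation_costs * 100.0

# Hypothetical figures: TGV of 300% (3.0), $30k automation costs,
# $10k earned from faster processing, $5k saved from fewer errors.
print(rpa_roi(3.0, 30_000, 10_000, 5_000))  # 250.0 -> ROI of 250%
```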

Expected Business Value − An RPA indicator that compiles all additional business-related KPIs connected to RPA. It is calculated by multiplying the overall cost savings (achieved through greater performance, better resource usage, and fewer errors) by the average FTE costs during a certain time frame.

Rise in productivity − The “gained productivity” metric is one of those most often used by enterprises to assess the performance of RPA. It shows how many FTEs were gained as a result of automating a certain operation. For instance, if automation replaced the manual labour of 4 people who each spent 2 hours per day on the activity, the productivity gain would be 1 FTE/month.

Cost per Error − The cost of an error that occurred during a specific automation is disclosed by this measure. It should include the cost of the labour hours needed to fix the problem as well as the total value lost due to downtime.

Utilization of License − Gauges how fully the business uses its RPA provider license. If your UiPath-based bots are using the license to the greatest degree possible but are unavailable 10% of the time over a given period due to maintenance or downtime, your license utilization is 90%.
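Two of these enterprise metrics reduce to one-line calculations. The sketch below reproduces the article's own examples (4 people at 2 hours/day, and 10% bot downtime), with an assumed 8-hour working day:

```python
def gained_productivity_fte(people, hours_each_per_day, fte_hours_per_day=8.0):
    """FTEs freed up by automating a task: 4 people x 2 hours/day = 1 FTE."""
    return people * hours_each_per_day / fte_hours_per_day

def license_utilization(unavailable_fraction):
    """Share of the license actually usable, given downtime/maintenance."""
    return (1.0 - unavailable_fraction) * 100.0

print(gained_productivity_fte(4, 2))  # 1.0 FTE
print(license_utilization(0.10))      # 90.0 (percent)
```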

Technical Metrics

Following are the technical metrics for measuring the success of an RPA program −

Speed of Process Automation − With an automated process, you can look at the process velocity to determine the typical time needed to finish one case or project.

Average Automation Development Time − Shows how long it typically takes to create and set up one RPA bot, measured in the number of hours (or FTEs) an RPA developer spends on it.

Bot Success Rate − Bot Success Rate (or Bot Accuracy Rate) shows the proportion of cases handled successfully by the bot for a certain operation. RPA success rates of 80% or more are considered ideal, and an RPA bot with a 99% accuracy rate is definitely possible to build.

Scalable automation − Shows how easy it is to scale a certain automation. The time and cost to install a new machine or reproduce the bot should be compared with the time and cost to hire and onboard a human processor to carry out the same tasks, including payroll costs (for instance, 30% of 1 FTE).

In general, if the RPA bots are developed following all other best practices, they should grow pretty quickly.

Scalability in development − Shows the portion of existing automation code that can be applied to a new automation. Since new bots can be created by reworking existing ones, adding a bot typically gets easier and takes less time as the amount of automation produced for one business increases.

Failure Rate (Downtime Rate) − The proportion of time that your bot could be completing a task but isn’t, due to a problem or maintenance.

Person Hours for Break-Fix − Captures the amount of time needed to resolve a specific problem. It should specify the number of FTEs or man-hours required to correct the automation and bring it back on track.
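Several of the technical metrics are simple ratios over a bot's run history. The sketch below shows one way to compute them from invented counts; the figures and names are assumptions for illustration, not any RPA vendor's API:

```python
def bot_success_rate(successful_cases, total_cases):
    """Proportion of cases the bot handled successfully, as a percentage."""
    return successful_cases / total_cases * 100.0

def downtime_rate(downtime_hours, scheduled_hours):
    """Failure/downtime rate: share of scheduled time the bot was down."""
    return downtime_hours / scheduled_hours * 100.0

# Hypothetical month: 970 of 1,000 cases succeeded; the bot was down
# 12 of 720 scheduled hours; break-fix work took 6 person-hours.
print(bot_success_rate(970, 1_000))  # 97.0 -> above the 80% benchmark
print(downtime_rate(12, 720))        # ~1.7 percent downtime
break_fix_person_hours = 6           # tracked directly, per incident
```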
