Somewhere way back when, I wrote a few posts about how bad our measurements of internet authority & trust are. They are based on numerical counts of links, good or bad, and I proposed that authority should instead be measured by real trust in real facts. E.g. right now, if you Google “FEMA” or “Vaccination” you will find several clearly crazy kookspiracy posts within the first couple of pages of results.
This is because gaining bad notoriety on the Web is sometimes better for traffic than earning a good, trustworthy reputation. Being right gains you little but trust from folks who care about such things as truth, while being wrong can get you an avalanche of eyes and ad revenue. Go viral with your bad take, and hey, there’s this month’s beer money.
In a case of bad notoriety, the kooks link to the perpetrator to back up their claims, while the non-kooks link to them to point out how [awful, wrong, bad, crazy, evil] they are. The more outrageous the grenade tossed, the more links the kooks are likely to get. This is how internet bottom feeders like Jim Hoft, Pamela Geller, and Chuck C. Johnson survive, and it’s a business model that works for them.
It’s good to see that Google might finally be getting around to improving that, but I won’t for a moment pretend they are going to actually fix anything here, because they depend on those kooks and internet grenades for revenue as well.
UPDATE at 3/1/15 3:19:32 pm by Thanos
OK, so after stewing on this for a while, I’ve come back to ask: what could Google do to improve things?
A number of possible factors come into play. You must also keep in mind that the kook still has the right to find his favorite kookspiracy site without tons of labor.
A better search algorithm might factor in a few big-data, hard-fact knowledge bases like this experiment. It would also look for links from known, world-trusted reference sites (e.g. the OED, etc.) and grant them more weight. It would still crowdsource popularity by total link count, weighted against net negative or positive social-network up and down votes, but it would also factor in time spent on a page, the number of return visits by the same person, and whether more trusted sites link to it than less trusted ones do. The system would likely still need some artificial artificial intelligence: when hordes of the most trusted or least trusted sites link at once, the post gets flagged, and perhaps a team of human moderators with an arbitration process takes a look. What would you suggest? What am I missing here?
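Just to make the idea concrete, here is a minimal sketch of what blending those signals into one score might look like. Every weight, threshold, and field name below is invented for illustration; this is a back-of-the-envelope toy, not anything Google actually runs.

```python
# Hypothetical composite trust score. All weights and signal names are
# made up for illustration, not taken from any real ranking system.

from dataclasses import dataclass

@dataclass
class PageSignals:
    inbound_links: int          # raw link count (today's popularity proxy)
    trusted_inbound: int        # links from vetted reference sites (OED, etc.)
    untrusted_inbound: int      # links from known low-trust sites
    net_votes: int              # social-network up-votes minus down-votes
    avg_seconds_on_page: float  # time spent on the page
    return_visits: int          # repeat trips by the same visitor
    kb_contradictions: int      # claims contradicted by a fact knowledge base

def trust_score(p: PageSignals) -> float:
    """Blend popularity with trust and engagement; penalize contradicted facts."""
    popularity = p.inbound_links + 0.5 * p.net_votes
    trust = 3.0 * p.trusted_inbound - 2.0 * p.untrusted_inbound
    engagement = 0.01 * p.avg_seconds_on_page + 0.1 * p.return_visits
    fact_penalty = 5.0 * p.kb_contradictions
    return popularity + trust + engagement - fact_penalty

def needs_human_review(p: PageSignals, spike_threshold: int = 100) -> bool:
    """Flag a page for the moderation/arbitration queue when a burst of links
    arrives all at once from the most-trusted or least-trusted tiers."""
    return (p.trusted_inbound > spike_threshold
            or p.untrusted_inbound > spike_threshold)
```

The contentious part, of course, is the weights: how much a contradiction from a knowledge base should cost versus how much a trusted inbound link should earn is exactly the kind of question the human arbitration step would have to referee.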
The trustworthiness of a web page might help it rise up Google’s rankings if the search giant starts to measure quality by facts, not just links
THE internet is stuffed with garbage. Anti-vaccination websites make the front page of Google, and fact-free “news” stories spread like wildfire. Google has devised a fix - rank websites according to their truthfulness.
Google’s search engine currently uses the number of incoming links to a web page as a proxy for quality, determining where it appears in search results. So pages that many other sites link to are ranked higher. This system has brought us the search engine as we know it today, but the downside is that websites full of misinformation can rise up the rankings, if enough people link to them.
A Google research team is adapting that model to measure the trustworthiness of a page, rather than its reputation across the web. Instead of counting incoming links, the system - which is not yet live - counts the number of incorrect facts within a page. “A source that has few false facts is considered to be trustworthy,” says the team (arxiv.org). The score they compute for each page is its Knowledge-Based Trust score.
More: Google Wants to Rank Websites Based on Facts Not Links - 28 February 2015
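For what it’s worth, the core intuition in that excerpt, that a source with few false facts is trustworthy, can be illustrated with a crude sketch: score a page by the fraction of its checkable claims that agree with a reference knowledge base. The paper linked above describes a much more careful probabilistic model over extracted fact triples; the mini knowledge base and triple format below are purely illustrative.

```python
# Crude illustration of the Knowledge-Based Trust intuition: score a page by
# how many of its extracted (subject, predicate, object) claims agree with a
# reference knowledge base. The entries below are illustrative stand-ins.

KNOWLEDGE_BASE = {
    ("Barack Obama", "born_in"): "United States",
    ("MMR vaccine", "causes_autism"): "false",
}

def kbt_score(extracted_facts):
    """Fraction of a page's checkable claims that match the knowledge base."""
    checkable = [(s, p, o) for (s, p, o) in extracted_facts
                 if (s, p) in KNOWLEDGE_BASE]
    if not checkable:
        return None  # nothing to verify; fall back to link-based signals
    correct = sum(1 for (s, p, o) in checkable if KNOWLEDGE_BASE[(s, p)] == o)
    return correct / len(checkable)

# A page claiming Obama was born in Kenya scores 0.0 on its one checkable
# claim; an accurate page scores 1.0.
print(kbt_score([("Barack Obama", "born_in", "Kenya")]))        # 0.0
print(kbt_score([("MMR vaccine", "causes_autism", "false")]))   # 1.0
```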