NIST research effort to measure bias in results we get from search engines: ‘Fair Ranking’

NIST wants to correct a bias problem in info retrieval that can have ‘real-world consequences’

The National Institute of Standards and Technology (NIST) launches a research effort to measure bias in search engine results.

“The perfect search engine would be like the mind of God” — Sergey Brin

Search engines built on artificial intelligence algorithms often inherit bias from previous searches, which NIST says can have ‘real-world consequences’.

“It’s now recognized that systems aren’t unbiased. They can actually amplify existing bias because of the historical data the systems train on,” said Ellen Voorhees, a NIST computer scientist.

“The systems are going to learn that bias and recommend you take an action that reflects it.”

As part of its long-running Text Retrieval Conference (TREC), taking place this week at NIST’s Gaithersburg, Maryland, campus, NIST has launched the Fair Ranking track this year: an incubator for a new area of study aimed at bringing fairness into information retrieval research.

The track was proposed and organized by researchers from Microsoft, Boise State University and NIST, who hope to find strategies for removing bias by first devising sound ways to measure how much bias exists in data and search techniques.

“We would like to develop systems that serve all of their users, as opposed to benefiting a certain group of people,” said Asia Biega, a postdoctoral researcher at Microsoft Research Montreal and one of the track’s co-organizers.

“We are trying to avoid developing systems that amplify existing inequality.”

Another problem the effort aims to address is the appearance of the same answers at the top of the list every time a particular search term is run.

Search Neutrality

NIST’s efforts lean toward supporting the principle of ‘search neutrality’, a concept that came into popular use in the context of the Internet around 2009.

The New York Times defined it as:

“A principle that search engines should have no editorial policies other than that their results be comprehensive, impartial and based solely on relevance.”

By the research track’s measure, a ‘fair’ algorithm wouldn’t generate the same list in the same order in response to a query, but would instead give other relevant articles a chance to appear.

This means the same query would surface both renowned and less renowned items in the list. As NIST says, “It would contain answers relevant to the searcher’s needs, but it would vary in ways that would be quantifiable.”
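One way to picture a ranking that varies in quantifiable ways is to sample the result order from relevance scores instead of sorting deterministically, so equally relevant items trade places across repeated queries. A minimal sketch in Python (an illustrative Plackett-Luce-style sampler, not NIST’s actual method):

```python
import random

def stochastic_rank(items, scores, seed=None):
    """Sample a ranking: higher-scored items tend to appear first,
    but equally scored items get rotated exposure across samples."""
    rng = random.Random(seed)
    weights = dict(zip(items, scores))
    remaining = list(items)
    ranking = []
    while remaining:
        # Draw the next result with probability proportional to its score.
        pick = rng.choices(remaining,
                           weights=[weights[i] for i in remaining])[0]
        ranking.append(pick)
        remaining.remove(pick)
    return ranking
```

Run this twice with different seeds over equally relevant items and the order changes, whereas a score-sorted list never would; the degree of that variation is what a fairness metric could then quantify.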

Voorhees admits, though, that this can’t be the only criterion for determining fairness, and that a single research project isn’t enough to solve such a broad societal problem. However, she says, it’s a start.

“It’s important for us to be able to measure the amount of bias in a system effectively enough that we can do research on it,” she said. “We need to measure it if we want to try.”

What Skeptics Say

There has been some skepticism about search neutrality. Skeptics argue that making search engines behave ‘neutrally’ will not produce the desired goal of neutral search results.

After all, what counts as the best search results are often not just the most prestigious and renowned, but also the most useful. In other words, the inherited bias in a search often works in most users’ favor.

For example, when looking for the best dentists in town, we want to know which ones are the most renowned and most searched for, and which are geographically near us. A person with a toothache cannot be expected to desire a neutral search.

James Grimmelmann, the author of “Internet Law: Cases and Problems,” wrote in an essay:

“Search is inherently subjective: it always involves guessing the diverse and unknown intentions of users. Most of the common arguments for search neutrality either duck the issue or impose on search users a standard of ‘right’ and ‘wrong’ search results they wouldn’t have chosen for themselves.”

“Search engines help users avoid the websites they don’t want to see; search neutrality would turn that relationship on its head. As currently proposed, search neutrality is likely to make search results spammier, more confusing, and less diverse,” he adds.

Only the Best Answers at the Top

However, NIST argues that ranking only the ‘best’ answers at the top causes a problem when there are too many worthwhile answers. Very few searchers look beyond the first page, which obscures the rest of the results that could have been worth a look.

“The results on that first page influence people’s economic livelihood in the real world,” Biega said. “Search engines have the power to amplify exposure. Whoever is on the first page gets more.”

So, going back to the dentist example: if there are a hundred equally good dentists in town, every searcher will see only five of them on the first page, so the rest will never get much business, even though they offer the same quality.
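That imbalance can be made concrete with a standard position-discount model of exposure, in which attention falls off sharply by rank. A rough sketch (the log-discount model and numbers here are illustrative, not NIST’s metric):

```python
import math

def position_exposure(rank):
    # Log-discounted attention: position 1 gets the most, the tail gets little.
    return 1.0 / math.log2(rank + 1)

def top_k_share(n_items, k):
    """Fraction of total exposure captured by a fixed top-k ranking
    when all n_items are equally relevant."""
    total = sum(position_exposure(r) for r in range(1, n_items + 1))
    return sum(position_exposure(r) for r in range(1, k + 1)) / total
```

Under this model, a fixed first page of five out of 100 equally good results captures close to three times the even 5% share of exposure; rotating the ranking across repeated queries would amortize each result’s exposure back toward that even share.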

NIST will make the official call for the Fair Ranking track in December for participation in 2020 TREC, which will take place from November 18-20, 2020, in Gaithersburg, Maryland.

Navanwita Sachdev

An English literature graduate, Navanwita is a passionate writer of fiction and non-fiction as well as being a published author. She hopes her desire to be a nosy journalist will be satisfied at The Sociable.
