Algorithm bias under discussion at Davos
By Gao Yun, Guo Meiping
["china"]
At this year's World Economic Forum in Davos, the importance of data and the problem of algorithm bias were discussed during a panel titled "Governing Data in Our Daily Lives" on Wednesday.

Data harnessed for the public benefit

In areas such as tourism, agriculture and emergency response, data plays an important role in helping people.
Ctrip, China's largest online travel agency, has 200 million active users and handles 50 TB of data every day, according to Sun Jie, the agency's CEO. "Data is very much used to help our customers to have a pleasant trip."
For example, the agency will provide alternative options for travelers if any changes are made to their flights, as the system "will automatically put a lot of emphasis on making sure your trip is guaranteed." 
It also customizes services for different groups of customers based on their budgetary concerns.
Data help to optimize the use of not only personal budgets, but government budgets too.
A government's open attitude toward its data empowers citizens to know where budget and funding have been spent, and citizens, in turn, can submit ideas about how to use the money more efficiently and democratically, said Marietje Schaake, a member of the European Parliament.
Data save time, energy and even lives when crises happen.
"We use data a lot in emergency responses, where we can target resources most effectively," said Lauren Woodman, CEO of NetHope. "It allows us to make decisions about where we could take precious resources, especially in the times of crisis, and direct most effectively."
And for Ctrip, data help it set up complete emergency responses, from identifying customers' locations to finding ways to send them back to their home countries.

How to prevent algorithm bias

Algorithm bias was also discussed during the panel, with participants noting that there are currently no effective ways to prevent it.
Algorithmic bias occurs when a computer system reflects the implicit values of the humans who are involved in coding, collecting, selecting, or using data to train the algorithm.
It appears across platforms, from search engine results to social media feeds, and can have impacts ranging from inadvertent privacy violations to reinforcing social biases of race, gender, sexuality and ethnicity.
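The panelists did not go into technical detail, but one common way practitioners audit for this kind of bias is to compare a model's outcomes across demographic groups. Below is a minimal, hypothetical Python sketch of such a "demographic parity" check; the predictions, group labels and numbers are invented for illustration and do not come from any system mentioned here.

from collections import defaultdict

def positive_rate_by_group(predictions, groups):
    # Fraction of positive (1) predictions within each demographic group.
    totals = defaultdict(int)
    positives = defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += pred
    return {g: positives[g] / totals[g] for g in totals}

# Hypothetical model outputs (1 = favorable decision) and group labels.
predictions = [1, 0, 1, 1, 0, 0, 1, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]

rates = positive_rate_by_group(predictions, groups)
gap = max(rates.values()) - min(rates.values())
print(rates)                      # {'A': 0.75, 'B': 0.25}
print("parity gap: %.2f" % gap)   # 0.50 -- a large gap may warrant an audit

A large gap between groups does not by itself prove an algorithm is unfair, but it flags where the kind of oversight Schaake describes could focus.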
Schaake believes that without oversight, there is a risk of governments and companies abusing power with the help of technology.
"Think about Alphabet and all the services they have, from Google to YouTube," said Schaake, adding that the oversight model for powerful companies with billions of daily users has not been clearly articulated.
“We are all biased,” said Schaake. “Before we can even say that an algorithm is biased, we need oversight. Once there is oversight mechanism, we can assess how we want to measure what the impact of the algorithm is.”
In terms of how to prevent bias, Woodman said that people should learn to analyze data before deciding whether or not to trust it. "The risks are very high if we don't."
(Top image via VCG)