
More information, not less, would help eliminate bias in sharing economy


The booming sharing economy hasn’t been without its share of controversy, including accounts of discrimination by hosts against guests.

Jun Li, assistant professor of technology and operations at the University of Michigan Ross School of Business, took recent research in the field a step further, identifying the root cause of the discrimination and a simple remedy.
“Hosts, when lacking perfect information, infer the quality of a guest by race and make rental decisions based on the average quality of each racial group,” Li said. “When enough information is shared, hosts do not need to infer guests’ quality from their race, and discrimination is eliminated.”
In other words, more information is better, according to the researchers from U-M, Washington University in St. Louis and Indiana University. That finding conflicts with a recent solution by rental-home platform Airbnb, which, in part, would reduce the size of guest photographs in an effort to prevent host bias.
Dennis Zhang, assistant professor at Washington University, said the research shows that more information about guests, as opposed to less, is important in eliminating potential sharing economy bias.
In a working paper, Li, Zhang and co-author Ruomeng Cui, assistant professor at Indiana University, conducted two randomized field experiments among more than 1,200 Airbnb hosts in Boston, Chicago and Seattle. The researchers created fictitious guest accounts and used them to send accommodation requests to the hosts.
They found requests from guests with African-American names—based on name frequency data published by the U.S. Census Bureau—were 19 percent less likely to be accepted than those with Caucasian names.
However, when the researchers posted a single host review for each fictitious guest, the tables turned: acceptance rates for both sets of guests evened out. This provides strong evidence of a phenomenon called statistical discrimination on Airbnb. More information eliminated the bias.
It wasn’t just positive reviews that swayed the hosts. The second portion of the experiment involved a negative review of the fictitious guests. Here, too, the acceptance rates for the two sets of names were statistically even: 58.2 percent for guests with Caucasian names and 57.4 percent for those with African-American names.
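For readers wondering what "statistically even" means in practice, the sketch below shows a standard two-proportion z-test comparing two acceptance rates. The request counts are invented purely for illustration (the article does not report the study's raw counts), and the test itself is a common textbook method, not necessarily the one used in the working paper.

```python
# Hypothetical illustration: a two-proportion z-test checking whether two
# acceptance rates differ. The counts below are made up for demonstration
# and are NOT the study's actual data.
from math import sqrt, erf

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Return (z statistic, two-sided p-value) for the difference in proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)          # pooled proportion under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # standard error under H0
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided normal tail
    return z, p_value

# Roughly 58.2% vs. 57.4% acceptance out of ~300 hypothetical requests per group.
z, p = two_proportion_ztest(success_a=175, n_a=301, success_b=172, n_b=300)
print(f"z = {z:.2f}, p = {p:.3f}")  # a large p-value means the rates are statistically even
```

With numbers of this magnitude, the test returns a p-value far above conventional significance thresholds, which is what "statistically even" conveys.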
“We thought a negative review might create distortion for the hosts,” Cui said. “However, based on our experiments, any and all information about a guest is important to fight discrimination.”
Li suggests Airbnb incentivize hosts to write reviews of new users and provide a more structured way for guests to signal their credibility. Based on the research findings, she said, at least two of the changes Airbnb made to its nondiscrimination policy last September could be counterproductive.
“Hiding user information, as some other studies have suggested, or making profile pictures less prominent doesn’t solve the problem, and may make it even more severe. Airbnb really has to think about how to provide more information instead of cutting it from guest profiles,” Li said.
