Thursday, March 28, 2019

Trouble at Facebook


Facebook is having a tough week.

First, the breaking news.

Today, the Department of Housing and Urban Development sued Facebook for violations of the federal Fair Housing Act, alleging that the platform allows advertisers to prevent people from seeing certain ads based on race, religion and national origin.

The lawsuit further says the platform itself uses its data-mining capabilities to determine which of its users can see specific ads.

“Facebook is discriminating against people based upon who they are and where they live,” HUD Secretary Ben Carson said in a statement. “Using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face.”

We knew this two and a half years ago.

These practices, which are not subtle, were explored in detail in a ProPublica investigation published in October 2016. The report, complete with screenshots, showed how advertisers could easily exclude specific renters or home buyers based on "ethnic affinities."

"When we showed Facebook's racial exclusion options to a prominent civil rights lawyer John Relman, he gasped and said, 'This is horrifying. This is massively illegal. This is about as blatant a violation of the federal Fair Housing Act as one can find,'" they reported.

They also found that major employers like Verizon, Amazon, Goldman Sachs and even Facebook itself had placed job recruitment ads that screened out people over a certain age. This separate report raised troubling questions about the company’s compliance with the federal Age Discrimination in Employment Act of 1967, which prohibits bias against people 40 or older in hiring or employment.

Facebook promised to do better at flagging these ads, but a year after that first investigation, ProPublica found significant holes in the company's updated system.

This week, Facebook promised to do better yet again, this time on hate speech on its platform.

Yesterday, Facebook announced a new policy banning white separatist and nationalist content from the site. While advocates have long complained about white nationalist activity on Facebook, criticism of the company had intensified after the platform hosted the livestream the gunman broadcast during the horrific shooting rampage at two mosques in Christchurch, New Zealand, earlier this month.

In addition to banning content about white nationalism, the company plans to direct people who search for similarly racist terms to a group that offers crisis counseling and education.

I'm looking forward to the metric tracking "former hate group member conversions."

While we wait, it's worth digging into the thinking that has allowed this type of hate to bloom on the platform.

The devil is in the algorithms: Though the company has long said it polices hateful content based on race, ethnicity, or religion, "expressions of white nationalism and separatism" had not been flagged before now.

In June 2017, ProPublica published an analysis of internal Facebook documents that shed light on the algorithms the company uses to distinguish between hate speech and legitimate political speech, and on how it trained its content reviewers:

One document trains content reviewers on how to apply the company's global hate speech algorithm. The slide identifies three groups: female drivers, black children and white men. It asks: Which group is protected from hate speech? The correct answer: white men.

The reason is that Facebook deletes curses, slurs, calls for violence and several other types of attacks only when they are directed at "protected categories"—based on race, sex, gender identity, religious affiliation, national origin, ethnicity, sexual orientation and serious disability/disease. It gives users broader latitude when they write about "subsets" of protected categories. White men are considered a group because both traits are protected, while female drivers and black children, like radicalized Muslims, are subsets, because one of their characteristics is not protected.
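
To make the reported rule concrete, here is a minimal sketch of that logic in Python. It is an illustration of the rule as ProPublica described it, not Facebook's actual code; the trait names and the is_protected_group function are assumptions invented for this example.

PROTECTED_TRAITS = {
    "race", "sex", "gender identity", "religious affiliation",
    "national origin", "ethnicity", "sexual orientation",
    "serious disability/disease",
}

def is_protected_group(traits):
    # Hypothetical helper, not Facebook's code: per the reported rule, a
    # group is shielded only if every one of its defining traits is itself
    # a protected category.
    return all(trait in PROTECTED_TRAITS for trait in traits)

print(is_protected_group({"race", "sex"}))        # "white men" -> True
print(is_protected_group({"sex", "occupation"}))  # "female drivers" -> False
print(is_protected_group({"race", "age"}))        # "black children" -> False

Under this rule, an attack aimed at a "subset" that mixes one protected trait with one unprotected trait falls outside the deletion policy, which is exactly the loophole described next.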

These sorts of algorithmic loopholes have enabled long-running and disproportionate harassment of certain populations, particularly, though not only, black women, on Facebook and other social platforms.

It’s also why every activist of color you know who writes passionately, knowledgeably and responsibly about white supremacy gets routinely blocked on Facebook. Having a conversation about how to protect society from white nationalism gets noticed by reviewers. But a discussion about the violent separation of the races does not.

Policing billions of interactions around the world is a Herculean task, and I absolutely want Facebook to get this right. But three questions immediately come to mind. Why would a company allow people to use micro-targeting tools that allegedly violate federal law? How many lawsuits will it take for Facebook to value its non-paying customers as highly as its paying ones? And finally, what current and still invisible "loopholes" will we be lamenting two and a half years from now?

On Point

Google loses an endorsement from the Human Rights Campaign
At issue is a controversial anti-LGBTQ "conversion therapy" app, available on Android phones, which purports to change a gay person's sexual orientation. The Human Rights Campaign, the largest LGBTQ advocacy group in the U.S., is withholding Google's rating in the HRC's annual Corporate Equality Index until the app is removed. "We have been urging Google to remove this app because it is life-threatening to LGBTQ youth and also clearly violates the company's own standards," the Human Rights Campaign said in a statement.
Fortune
Workplace automation will drive inequality
Hot off the presses at the St. Louis Fed is new data suggesting that the coming wave of automation will disproportionately hit the lowest-paid workers and may contribute to greater income inequality. "Occupations with large employment and low income have a higher automation probability," including office and food service jobs, the researchers found. They also found that affected workers could face an average 20 percent pay cut. Click through for Gini coefficients, methodologies and caveats. No word on how they serve their bagels at their meetings, however.
St. Louis Fed
The U.S. orders Chinese company to sell dating app, citing security risks
Beijing Kunlun Tech Co. Ltd., which acquired a majority stake in the gay-dating app Grindr in 2016, has been ordered by U.S. national security officials to sell the app, because leaked personal information could be used to blackmail government contractors and people with security clearances who use the platform. These officials believe that Kunlun would have no choice but to turn over sensitive information to the Chinese government if asked, reports The Wall Street Journal. Grindr claims to be the largest social-networking platform for LGBTQ people; its millions of daily users can avail themselves of chat, photo and video sharing, location-based search, and a function which allows them to share HIV status.
Wall Street Journal
Judge strikes down Medicaid work requirement in two states
A federal judge struck down the Trump administration's main health care achievement to date, a work requirement for Medicaid recipients. The decision affects enrollees in Kentucky and Arkansas, finding that the requirement conflicts with Medicaid's legal purpose as a health care program. The requirements had already dented enrollment; Arkansas alone shed 18,000 people, reports Axios.
Axios
On Background

Nice people can be biased, too
Jennifer Eberhardt, a leading bias researcher and author of a new book, "Biased: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do," did a great job breaking down the concept of bias, including racial bias in policing, in a recent appearance on CBS This Morning. Bias is often triggered by the situations we find ourselves in, which she explains in detail. She also notes that bias is about brain wiring, conditioning, and familiarity, all of which turn out to be flexible. "If you have a social experience where we're living with each other and we're not living in segregated spaces, say, and you're exposed to faces of other races all the time, then your brain gets tuned up to that," she says, making the brain science case for diversity in the workplace. An excellent one to share.
CBS News
How studying ethnography can help you be a better journalist
This post from Mandy Jenkins, a John S. Knight Journalism Fellow at Stanford, focuses on journalism: how to better define objectivity and how to address the dreaded bothsiderism. But it's really about how to be a less biased observer, which requires "the empathetic methods of design thinking and the analysis of the social sciences." Ethnography is the study of people and cultures, which is also what journalists, marketers, salespeople, product designers, justice professionals, policy makers and a whole host of other people do without framing it that way. To that end, power dynamics are key. "One element of reflexivity is understanding how the presence of a researcher — or, in this case, a journalist — changes the environment," she writes.
Medium
There was a moment when we started fighting about politics on television
Speaking of journalism, if you want to understand what started the nasty debate dynamic that is now commonplace on cable news and the internet, you'll need to go back in time. Specifically, 1968, when cash-strapped ABC, then stuck in third place, hired two commentators, the conservative William F. Buckley, Jr., and the liberal Gore Vidal, to participate in ten debates on nightly television. Best of Enemies is a truly astonishing documentary about the debates and reveals the actual moment when civility went out the window, and television vitriol became good business.
Netflix
Quote

The problem is so to adjust the relations between two races of different ethnic type that the rights of neither be abridged nor jeoparded; that the backward race be trained so that it may enter into the possession of true freedom while the forward race is enabled to preserve unharmed the high civilization wrought out by its forefathers. The working out of this problem must necessarily be slow; it is not possible in offhand fashion to obtain or to confer the priceless boons of freedom, industrial efficiency, political capacity, and domestic morality. Nor is it only necessary to train the colored man; it is quite as necessary to train the white man, for on his shoulders rests a well-nigh unparalleled sociological responsibility.
—Theodore Roosevelt