conversant and growing through the
wisdom-of-the-crowd method of understanding over time.
Personally, I don’t know if the pop culture tool “Urban Dictionary” was a portion of Tay’s study curriculum, but the
chatbot really didn’t need to tap into that
vast chasm of often off-color wiki-knowledge. The online world embraced Tay
with glee—but most of its mischievous
conversations didn’t remain playful and
casual for long. Parroting those Twitter
“conversants,” Tay.ai went from tweeting
“humans are super cool” to repeating
nasty, offensive words, remarks and ideology that would warrant a bar of Ivory
soap to the mouth.
Of course, the project was shut down not
long after launch, with a farewell tweet:
c u soon humans need sleep now so many conversations today thx.
–Tay Tweets March 24, 2016
Microsoft issued the following statement:
The AI chatbot Tay is a machine learning
project, designed for human engagement. As it
learns, some of its responses are inappropriate
and indicative of the types of interactions some
people are having with it. We’re making some
adjustments to Tay.
FACEBOOK: WHO DO YOU THINK I AM?
Later that same year, it was Facebook’s
turn for AI-fueled trouble. Just for chuckles, if you want to see who Facebook
thinks you are, go to your “Ad Preferences” and click the “Lifestyle and Culture”
tab. Facebook never inquires about your
race, but depending on your ad preferences, social check-ins and the content
of your posts, it might have assigned you
something it called an “ethnic affinity.”
For example, Texans occasionally sending messages in Spanish might be assigned to
the “Hispanic” category. Facebook had “ethnic affinity” categories for African-Americans, Asian-Americans and Latinos, but none for white people.
This meant that white people couldn’t be victims of “exclusion targeting.” Exclusion
targeting is a legitimate marketing tool, used to hone an ad by not showing it to people
likely to be uninterested in it. But those of us in the multifamily industry know well
that this kind of racial exclusion in real estate was made illegal in 1968 under the
federal Fair Housing Act. Still, Facebook thought it was astute enough to self-regulate.
In short order, the investigative journalism nonprofit ProPublica demonstrated
(November 2016) how Facebook’s advertising tools could be used to exclude racial
groups from the audiences for housing-related ads.
In response to the findings, Facebook stated:
This was a failure in our enforcement and we’re disappointed that we fell short of our commitments…
The rental housing ads purchased by ProPublica should have, but did not, trigger the extra review
and certifications we put in place due to a technical failure.
AI machine learning took the blame for this, and Facebook put software in place in
an attempt to automatically detect housing ads, so that it could disable potentially
discriminatory targeting options. However, reliably interpreting the subject matter of
text or images is really difficult, even for the world’s best machine learning programs.
Facebook revisited the issue, performed a quick fix and called the problem solved without bothering to see how well it was actually working. Uh-oh. They got busted again.
I share this to say that fixing these problems requires time, resources, diligence,
oversight, compliance and yes, manpower—all of which cost money. And if Facebook
can make a mistake as detrimental as this, imagine your exposure. As machine learning becomes more powerful and pervasive, its complexity and its potential for harm—
if unchecked and unplanned for—will increase.
THE OMNIPOTENCE OF AI
I’ve shared the above stories to remind you that a quick fix might be just that, and it
can lead to real trouble. That’s the bittersweet beauty and pain of innovation. It isn’t
just a simple, isolated act of making a small change by inserting a tech tool or software
platform; it’s a process requiring concerted vision, goals, strategy, dedicated partners
and checks and balances.
Don’t some of our technologies push the
customer further away from the
eyeball-to-eyeball, handshaking experience?
Yes, they do.
But, as consumers ourselves, don’t we
personally enjoy the ease of self-service?