August 9, 2022

FIR #276: Should I Tay Or Should I Go?

Remember Microsoft Tay? The 2016 Twitter chatbot was the target of a concerted effort to teach it all manner of vile things, prompting Microsoft to withdraw the service and apologize. That experience is top of mind for pretty much everyone talking about Meta’s new chatbot, BlenderBot 3, now available for public testing in the U.S. Its developers insist they had the Tay experience in mind when they designed the process for teaching the new chatbot to behave. In fact, the beta lets people report when the bot says something inappropriate that it may have learned from another user, and that feedback is used to teach it what is not appropriate. Will it work?


The next monthly, long-form episode of FIR will drop on Monday, August 22. FIR “shorts” — episodes under 15 minutes — will appear once or twice weekly.

We host a Communicators Zoom Chat each Thursday at 1 p.m. ET. For credentials needed to participate, contact Shel or Neville directly, request the credentials in our Facebook group, or email fircomments@gmail.com.

Special thanks to Jay Moonah for the opening and closing music.

You can find the stories from which Shel’s FIR content is selected at Shel’s Link Blog. Shel has started a metaverse-focused Flipboard magazine. Neville’s “asides” blog, Outbox, is also available.

Links from this episode:
