Here we go again: Another chatbot is trained on a big pile o' online utterances, and — surprise, surprise — having soaked up Internet bile, begins repeating it.
Haven't developers learned anything from Microsoft's Tay? This happens basically every time someone tries to digest online talk through the four stomachs of a neural network.
from Boing Boing https://ift.tt/3t9htsx