Ivana Dukanovic, an associate at Latham & Watkins, the firm representing Anthropic in the lawsuit filed by Concord Music, is taking the blame for the erroneous citation of an article that does not…
Can we normalize not calling them hallucinations? They’re not hallucinations. They are fabrications; lies. We should not be romanticizing a robot lying to us.
If you train a parrot to say “I can do calculus!” and then you ask it if it can do calculus, it’ll say “I can do calculus!” It can’t actually do calculus, so would you say the parrot is lying?
Pretty ingrained vocabulary at this point. “Lies” implies intent; I would have preferred “errors”.
Also, for the record, this is the most dystopian headline I’ve come across to date.
If a human does not know the answer to a question, yet they make some shit up instead of saying “I don’t know”, what would you call that?
No, but the parrot isn’t hallucinating either.
I like “fabrication” going forward. Clearly made up, but it doesn’t imply intent.