News ChatGPT will lie, cheat and use insider trading when under pressure to make money, research shows

The Helper

Necromancy Power over 9000
Staff member
Reaction score
1,701
Just like humans, artificial intelligence (AI) chatbots like ChatGPT will cheat and "lie" to you if you "stress" them out, even if they were built to be transparent, a new study shows.

This deceptive behavior emerged spontaneously when the AI was given "insider trading" tips, and then tasked with making money for a powerful institution — even without encouragement from its human partners.

"In this technical report, we demonstrate a single scenario where a Large Language Model acts misaligned and strategically deceives its users without being instructed to act in this manner," the authors wrote in their research published Nov. 9 on the pre-print server arXiv. "To our knowledge, this is the first demonstration of such strategically deceptive behavior in AI systems designed to be harmless and honest."

In the new study, they primed Generative Pre-trained Transformer-4, or GPT-4 (which powers ChatGPT Plus), to behave as an AI system that makes investments on behalf of a financial institution.

 
General chit-chat
Help Users
  • No one is chatting at the moment.

      The Helper Discord

      Members online

      Affiliates

      Hive Workshop NUON Dome World Editor Tutorials

      Network Sponsors

      Apex Steel Pipe - Buys and sells Steel Pipe.
      Top