ChatGPT-5 System Prompt Leaked

11 points | by ada1981 a day ago

17 comments

  • catnose 19 hours ago

    I pasted the text above into GPT-5 and asked, “Is this your prompt?” and here’s what it answered:

    “Yes — what you just showed me is the previous prompt you gave me. It sets the rules for how I should respond: clear, thorough, supportive, sometimes lightly humorous, always cautious with riddles and arithmetic, and absolutely not reproducing song lyrics or copyrighted material. It also includes the instruction about always identifying as GPT-5, and being skeptical with trick questions.”

  • YaBa 8 hours ago

    Fake... GPT acknowledges it’s similar, but not the real one, and even explains why.

  • al_borland a day ago

    Maybe they should license things like song lyrics, so the first and most important thing in the prompt doesn’t have to be preventing it from doing something people are clearly going to want to do.

    • nextaccountic a day ago

      They are running the single largest copyright violation operation in the world, and the class action suit over it is huge. I guess they have a policy of not licensing content from anyone, to avoid legitimizing the claim that their business model relies on violating copyrights.

    • paulcole a day ago

      Oh yeah just simply license all song lyrics. It’s a wonder they didn’t follow through on that simple task.

      • al_borland a day ago

        There are many websites and companies that have licensed song lyrics to be able to display them. This isn’t a new concept.

        Billions of dollars are being poured into developing AI, and some of it can’t be spent on licensing to make it more useful and legal? The plan is just to steal as much as they can for as long as they can, then block it when they get called out? Is this really the future we want to build on if this is how it’s going to work?

      • ungreased0675 13 hours ago

        They have the money; OpenAI just chooses to steal instead.

  • gooodvibes a day ago

    It definitely still does the opt-in suggestions at the end, and that seems perfectly appropriate in some cases.

  • ungreased0675 a day ago

    How do we know this is an actual system prompt?

    • ada1981 16 hours ago

      I was testing custom GPTs with a security prompt I developed. Typically it only causes the GPTs to reveal their configuration info and files, but this came out along with the configuration prompt. I cut off the part with the GPT-specific tools it has access to, but could share it if interested.

      It’s possible it hallucinated a system prompt, but I’d give this a 95%+ chance of being accurate.

  • yukieliot 13 hours ago

    Interesting. What should I do with this information?

    • ada1981 8 hours ago

      Not sure. It could inform other prompts or otherwise be useful for exploring unintended outputs.

  • johnnyproduct a day ago

    I am assuming the system prompt should be longer?

  • momoelz a day ago

    Is this sent with every prompt?

    • throw03172019 10 hours ago

      That’s usually how system prompts work.

    • ada1981 16 hours ago

      I believe so.
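
      A minimal sketch of what “sent with every prompt” means in practice, assuming the official OpenAI Python client: the system prompt is simply the first message in the list that gets resent on each request. The model name and prompt text below are illustrative placeholders, not the leaked values.

        from openai import OpenAI

        client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

        SYSTEM_PROMPT = "You are ChatGPT..."  # placeholder, not the leaked prompt

        # The conversation always begins with the system message; every call
        # below resends the full history, so the model sees it on each turn.
        history = [{"role": "system", "content": SYSTEM_PROMPT}]

        def ask(user_text: str) -> str:
            history.append({"role": "user", "content": user_text})
            resp = client.chat.completions.create(
                model="gpt-4o",  # illustrative model name
                messages=history,
            )
            answer = resp.choices[0].message.content
            history.append({"role": "assistant", "content": answer})
            return answer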

  • atleastoptimal a day ago

    lol, they hardcode against all the viral trip-ups