OpenAI's custom instructions feature, which rolled out to ChatGPT Plus subscribers in July, is now available to all users.
The addition of custom instructions puts a new setting in your ChatGPT profile on desktop or iOS, and applies that setting across all of your conversations. Instead of requiring you to repeat the same instructions at the start of every new conversation, ChatGPT automatically calibrates its responses based on the descriptions you add in settings. Just fill out the fields in ChatGPT settings and see how it adapts.
ChatGPT really wants to know more about you. Credit: OpenAI

The custom instructions section of settings contains two text-entry fields. One asks what you'd like ChatGPT to know about you and helpfully (or creepily?) provides suggestions like "Where are you based?" and "What do you do for work?" The other asks how you'd like ChatGPT to respond, with questions like "How formal or casual should ChatGPT be?" and "Should ChatGPT have opinions on topics or remain neutral?"
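For developers, the same idea maps roughly onto prepending a system message to every conversation sent through OpenAI's Chat Completions API. The sketch below illustrates that mapping only; the helper name, field names, and example text are illustrative assumptions, not OpenAI's actual settings schema.

```python
# Sketch: ChatGPT's custom instructions behave roughly like a system
# message that is automatically prepended to every new conversation.
# `about_you` and `response_style` stand in for the two settings fields
# described above; they are illustrative names, not OpenAI's schema.

def build_messages(about_you: str, response_style: str, user_prompt: str) -> list[dict]:
    """Prepend custom-instruction-style context as a system message."""
    system_text = (
        "The user has provided the following profile information:\n"
        f"{about_you}\n"
        "Respond according to these preferences:\n"
        f"{response_style}"
    )
    return [
        {"role": "system", "content": system_text},
        {"role": "user", "content": user_prompt},
    ]

messages = build_messages(
    about_you="I'm a teacher based in Lisbon.",
    response_style="Keep answers casual and under 100 words.",
    user_prompt="Suggest a lesson plan on photosynthesis.",
)
# The same system message would be reused for every new conversation,
# which is what the custom instructions setting automates in the app.
```

In practice you would pass `messages` to a chat-completion request; the point here is only that the two settings fields collapse into one piece of standing context.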
It's an open-ended feature with a vast range of potential effects. As such, some applications for custom instructions are bizarre, and some are practical. Here's how people are making use of this new option:

One user posted a ChatGPT text output on X that they said seemed to "'tour' mildly disallowed behaviors [like] unprompted self-harm, nudity, [and] bio details of a (confabulated) non-celebrity," when they changed the setting to 1,500 repetitions of the letter "a" and an incomplete sentence (a known token repetition hack).
Others are using custom instructions that genuinely make ChatGPT more accurate and effective.
And then there's this custom instruction, which is quite the literary challenge.
OpenAI also says it won't act on instructions that violate its usage policies. And if you're wondering what OpenAI does with settings that might inadvertently reveal personal information about you: yep, OpenAI says it might use that data unless you opt out of saving your chat history.
A collection of more generic yes-or-no presets, like "orient answers toward kids" or "assume the user has AI expertise," might have felt a little less intrusive than asking users to write out personal details and requests. But alas, this is how OpenAI has decided to roll out customization, so protect your data, and prepare yourself for ChatGPT's custom instruction era.
Topics: Artificial Intelligence