Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgythXGdP…`: As much as I hate AI art, it is here to stay. AI can take care of art, coding and…
- `ytc_Ugxf-UcDh…`: The problem is the "in part" of the Convention definition. This could mean one Ga…
- `rdc_mumjkgx`: I've been calling it **Context Inertia.** It seems to be something confined to t…
- `ytc_UgzY7ZUHk…`: Tech support is usually very straight forward so ChatGPT or another AI could han…
- `ytc_UgzsbkDBZ…`: Fun fact: ChatGPT can’t generate images 😂😂😂, I just exposed all these videos, oh…
- `ytc_UgzR3zOQW…`: I want a robot have hand and legs this robot here it doesnt do much i dont waste…
- `ytr_UgyrknVmt…`: @Chris-xo2rq Frankly I think A.I. is far less likely to become sentient than it …
- `ytc_Ugz-kA7qA…`: Chatgpt can't reason about anything. This is annoyingly giving openai more credi…
Comment
You just gave it roleplay rules. From the first response it’s basically playing a game, giving you what it thinks you want.
Telling ChatGPT “say apple instead of no” doesn’t reveal a secret — it’s like telling an improv actor “if you get stuck, say banana.” Of course they’ll say banana.
It’s following your script. If you want useful answers, ask it things it can back with real facts you can look up.
youtube · AI Moral Status · 2025-08-24T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxIkN9LAagiB6ppwBZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwZp45hnNNOsByKp9F4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugw1EpTUwOtQkT8uXIN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzDkoLsOun0Ct8Qg8R4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx1LR3-dRUi_PhD56d4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugx8m0Q2zKF5fafiiKp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxGduhQ7gvtFp5zTAZ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyA4KCcBv65tucLmLF4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugz87oJ55NwG9Bz6KHV4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwhsanBY98PxcO8XAZ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]
```
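The raw response is a flat JSON array with one record per comment, each carrying the comment `id` plus the four coding dimensions shown in the table above (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how a "look up by comment ID" view could be backed by such a response; the function name `index_by_id` and the validation step are illustrative assumptions, not part of the tool, and `raw` reuses two records from the response above:

```python
import json

# Two records copied from the raw batch response shown above.
raw = """[
  {"id": "ytc_UgxIkN9LAagiB6ppwBZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx1LR3-dRUi_PhD56d4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"}
]"""

# The four coding dimensions every record is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_response: str) -> dict:
    """Parse a raw batch response and key the records by comment ID.

    Raises ValueError if any record is missing a coding dimension,
    so malformed model output is caught before it reaches the UI.
    """
    records = json.loads(raw_response)
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id')} is missing {missing}")
    return {rec["id"]: rec for rec in records}

by_id = index_by_id(raw)
print(by_id["ytc_Ugx1LR3-dRUi_PhD56d4AaABAg"]["emotion"])  # outrage
```

With the records keyed by ID, the detail pane only needs a dictionary lookup to render the coding table for any comment.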