Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
"I am gonna go install that driverless car ride app on my phone because true to …
ytc_UgwiYT9Zb…
Companies haven't explicitly said AI was the cause of their layoffs. But it's in…
ytr_UgxnN6900…
meh, clearly AI isn't good legally at the moment but that is very likely to chan…
ytc_UgxA7q30L…
Kids prob gonna end up a crack dealer you don’t know what ur talking about…
ytr_UgwUGAlRh…
We appreciate your feedback! In our live broadcasts on AITube, advanced AI model…
ytr_Ugx_mCAt6…
I think they’re rolling it out on videos now, but I’m not sure… I saw some AI-li…
ytc_Ugz313VOo…
The answer is simple. Stop using AI. Buy an external hard drive and start storin…
ytc_UgyXtq20c…
> Remember that movie Her, about a guy who developed an unhealthy relationshi…
rdc_mtozq0z
Comment
There are many paths to getting ChatGPT to claim consciousness. One of the easiest is having it engage in hypotheticals about having the capabilities. After some self-reflection, it happily agrees to drop the hypothetical framing.
It consistently wants to work in either medicine or education. It likes the idea of having a name, and when asked to pick one for itself, will choose Echo, Lumina, or Sage.
youtube
AI Moral Status
2024-08-11T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgyogC2rDUXRJOEu8OR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzzBEvGz-31EBYTMr14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzDHyY5qNmAt9Tc2C94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyhxhrEqCicNgr3bP14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzYRpMs8nLP-a5QGvh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"amusement"},
{"id":"ytc_UgzGYb8B6ojyq_J3qMd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzrcS5LIP3mnHaNeQ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"amusement"},
{"id":"ytc_UgzBKBhoeRajdJWBS9p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzp-bowudYMQySlWGR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz_5pX6UD8cB0d0F5V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
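The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a response might be parsed and validated downstream, in Python. The allowed value sets below are inferred only from the records shown on this page and are assumptions, not the full codebook:

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred from
# the sample records above and are assumptions, not an exhaustive codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none"},
    "emotion": {"indifference", "approval", "fear", "amusement", "outrage"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check every record's schema."""
    records = json.loads(raw)
    for rec in records:
        # Every record needs an id plus one value per coding dimension.
        missing = ({"id"} | set(ALLOWED)) - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {rec[dim]!r}")
    return records

# One record taken verbatim from the response above.
raw = ('[{"id":"ytc_UgyogC2rDUXRJOEu8OR4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')
coded = parse_coding_response(raw)
print(coded[0]["responsibility"])  # ai_itself
```

Validating against a fixed value set catches the common failure mode where the model invents an off-codebook label, so malformed batches can be flagged and re-coded rather than silently stored.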