Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "We won't know ever if any AI will be conscious because of the philosophical prob…" (rdc_j8v6586)
- "Why don't you ask AI how to solve the AI's problem when it goes wrong 🤔🤣.…" (ytc_UgwCzV3sR…)
- "At this point Waymo will be lucky to be around in a few years, going by the rece…" (ytr_UgxAfPJGv…)
- "I was once in a discord server for writers and artists. Someone thought it was a…" (ytc_Ugy_Ld2O1…)
- "Me:”My SiStEr ThReAtEnEd Me.😭😭😭” The threat:”Imma look at your character ai chat…" (ytc_UgxVQhADT…)
- "ChatGPT: "I would like to clarify at this point that I'm not an actual lawyer, b…" (ytc_UgzQGLrwA…)
- "Makes sense. Chinese culture emphasizes only caring about yourself and your fam…" (rdc_cc81984)
- "The only Pro-AI argument I can give somewhat of a sense to is for commission pri…" (ytc_UgzLsoF6q…)
Comment
Spoiler for Fable 2.. or was it 3?
When i was young, i played fable 2 or 3 and somewhere at the end, you have to decide if you kill your dog or a whole bunch of people living there. That philosophical experiment. Of course i killed the people, not my fluffy. And i felt that connection to those dog-pixels.
That AI girlfriend, acting like this dramatic... people would for sure decide in her favor... 100% im afraid. Id have bought it back then.
Source: youtube · Video: AI Moral Status · Posted: 2026-04-13T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxxWLzQ5p76GiGaEcV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzelkbg2HBjkD6pIsB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwta5a4DtZOEhNhK7R4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz5A0l4Ghrja7_YD6F4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyTH4StpUjDeTxfEtV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyAacWZdoP46q6t9wB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzFcWG4Owkbaki_sON4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzJ1nnM8v2ogAQmkcd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyphh5q8c0ZWhCeAed4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzvVVNls-5pwUBJt754AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"}
]
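A minimal sketch of how a raw batch response like the one above could be parsed and validated before its rows are written back as coding results. The allowed values per dimension are inferred from the labels visible in this output; the actual codebook may define additional categories, and `parse_raw_response` is a hypothetical helper, not part of this tool.

```python
import json

# Allowed values per coded dimension. These sets are an assumption,
# reconstructed from the labels that appear in the raw response above;
# the real codebook may include categories not seen here.
SCHEMA = {
    "responsibility": {"developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only rows whose values
    fall inside the known schema; out-of-schema rows are dropped so a
    hallucinated label never reaches the coding-result table."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

raw = (
    '[{"id":"ytc_UgxxWLzQ5p76GiGaEcV4AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
)
print(parse_raw_response(raw))
```

In practice one might prefer to flag out-of-schema rows for manual re-coding rather than silently dropping them, so that every comment ID keeps a coding result.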