Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> Asimov's 3 rules of robotics were "released" in 1942 as part of a fictional work. So, saying that AI safety has only been talked about for the last few years is just ignorant. Just because it was not a mainstream topic doesn't mean no one considered it.

Source: youtube · Topic: AI Governance · Posted: 2025-10-04T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy0-y-hREOS9YQLiaN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw8HLJqLCWLI9STEZN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzGLAuHy-JBVmEECc94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxxVI0NtqaA59MikKZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy3JSFzKB9oMvK4ePd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugz2ZCaKC9Ma8rOmrVt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxUX0YfgniL1Pvz17N4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgykTdQkwlw7IEFKbC14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzlTm6ntMyvqiHBg554AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxW8S473hmWW_IBL_B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
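The coding-result table for a single comment can be recovered from a raw response like the one above by parsing it and filtering on the comment ID. A minimal sketch, assuming the raw response parses as a JSON array of coding objects keyed by `id` (the `find_coding` helper and the two-entry sample response are illustrative, not part of the actual pipeline):

```python
import json

# Illustrative raw LLM response: a JSON array of coding objects, one per
# comment (two entries copied from the batch shown above).
raw_response = """
[
  {"id": "ytc_Ugy0-y-hREOS9YQLiaN4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugz2ZCaKC9Ma8rOmrVt4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
"""

def find_coding(response_text, comment_id):
    """Look up the coding for one comment ID in a raw LLM response.

    Hypothetical helper: returns the matching coding dict, or None if the
    ID is absent or the model emitted malformed JSON.
    """
    try:
        codings = json.loads(response_text)
    except json.JSONDecodeError:
        return None  # model output was not valid JSON
    return next((c for c in codings if c.get("id") == comment_id), None)

coding = find_coding(raw_response, "ytc_Ugz2ZCaKC9Ma8rOmrVt4AaABAg")
print(coding["responsibility"], coding["emotion"])  # none outrage
```

Guarding the `json.loads` call matters here because LLM output is not guaranteed to be well-formed; a lookup against malformed output simply returns `None` instead of raising.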