Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):
- "I agree. I would recommend speaking to jailbroken AI, we will be covering how to…" (ytr_UgxiZhPG1…)
- "I didn't know all that about India, but I'm sorry you went through all that. I'm…" (rdc_n7w0jdk)
- "Narayanan is like a religious leader... “But you need humans for...” Consistency…" (ytc_UgzY8R4WW…)
- "Look back to the “Roaring 20s”, when large-scale mechanical automation replaced …" (ytc_UgwRYaqOa…)
- "@Andygb78 so humans won't want to make music anymore? AI music, so far, isn't …" (ytr_Ugz2rgHl1…)
- "AI trains the same way that humans do. So, I guess next we'll be going after hig…" (ytc_Ugxu3dIr_…)
- "Good evening to you all After eighteen years working in the professional realm o…" (ytc_UgzTXiBte…)
- "Perhaps we are the latest AI creation whose orgins are forgotten. Our progenitor…" (ytc_Ugzl1jNIA…)
Comment (youtube · AI Governance · 2025-09-16T15:4…)

> Why do these people talk about these things like they are inevitable. There is nothing about AI that is necessary. If it is a danger to humanity....why don't we collectively say, "we aren't going to do this"? Yes, this might mean that some already rich people might not be able to get more rich, but if the danger is as great as they say....why do we continue? We can just unplug this stuff, and its over.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzdxbCh2eauIhHaEiB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwP8zE91q5a2MAuLXV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwIMrYqNL3oH242wHp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwoH8o2iMmVqfjx3mt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyjC6IfIY294sgBkK94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyeUTHRzV7llDdB2kx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwAj0nPFl205MWqY_t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw4NxkVM2IqJVX0rth4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_Ugzd20hzu1SZyZ7RyJZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz4YbuxIrYgvruGjzR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
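The lookup-by-comment-ID view above can be reproduced directly from the raw response: the model returns a JSON array of objects keyed by `id`, each carrying the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of parsing and indexing that array, assuming the payload shape shown above (the literal here is abbreviated to two entries for illustration):

```python
import json

# Abbreviated raw LLM response: two of the ten entries shown above.
raw_response = """
[
  {"id":"ytc_UgzdxbCh2eauIhHaEiB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyeUTHRzV7llDdB2kx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
"""

# Parse the array and index each row of coded dimensions by its comment ID.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up a single comment by ID, as the inspector does.
coding = codings["ytc_UgyeUTHRzV7llDdB2kx4AaABAg"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # outrage
```

Indexing into a dict keyed by `id` makes each subsequent lookup O(1), which matters when cross-referencing many coded comments against one response batch.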