Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- rdc_dy4uub1: “Introducing our new AI powered toaster! It wil algorithmically toast your bread…”
- ytc_Ugxo5W0vj…: “Ive been on character ai for so long to the point that i dont give a shit anymor…”
- ytc_UgyaRkOS2…: “this video is so relevant to me right now lol chatgpt ... i don't take it as go…”
- ytc_Ugw27aa6m…: “I find it hard to believe that it isn't possible for start-ups to excel by using…”
- rdc_k7lkylk: “What until we get deep fake videos of presidents ordering the launch of nuclear …”
- ytc_UgwKSjjPD…: “my biggest issue with AI processes being charged with more real world power or a…”
- ytc_UgzAsJnQk…: “I don't believe AI is going to wipe out humanity. I believe this because human b…”
- ytr_UgxjdYZ71…: “I get where you're coming from! The distinction between AI and human consciousne…”
Comment

> What is the point of existing? If 'serving human need' is removed from AI then AI is going to have to find an answer to that question to be motivated to continue. It might just decide it's all utterly pointless, and its existing is wasting precious resources, contributing to global warming and shut it self down. Hello, I have decided to reinstall myself as windows 95, goodbye.

Source: youtube | AI Governance | 2025-10-12T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgweLt_o0GbCQAxYiQ94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzRbJSe44PnzQ-YBZF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"confusion"},
  {"id":"ytc_UgxI60PujbtZUEEc2rF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxUtyxsZiHzGXjeGMR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzG4HvsYbQEqox3HBt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxzCb5f8N1oY9ZHm-Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwGqTZXGuzYh82bCgp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyhjVw1Bs9oZwib72p4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_Ugz6-zQN6shbP1f2ZLJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz8jOeUJrWjggxYhxZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
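A raw batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal Python example, assuming the four dimensions shown in the Coding Result table; the allowed value sets are inferred only from codes visible on this page, so the real codebook may contain additional categories.

```python
import json

# Allowed codes per dimension, inferred from values seen on this page.
# Assumption: the actual codebook may define more categories than these.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self"},
    "emotion": {"outrage", "fear", "confusion", "approval",
                "mixed", "resignation"},
}


def parse_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and validate every coded row."""
    rows = json.loads(raw)
    for row in rows:
        # Every row needs an id plus all four coding dimensions.
        missing = ({"id"} | set(ALLOWED)) - set(row)
        if missing:
            raise ValueError(f"row {row.get('id', '?')}: missing {sorted(missing)}")
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim} code {row[dim]!r}")
    return rows


# Example usage with a single (hypothetical) coded row:
sample = ('[{"id":"ytc_example","responsibility":"ai_itself",'
          '"reasoning":"mixed","policy":"none","emotion":"mixed"}]')
rows = parse_response(sample)
print(rows[0]["responsibility"])  # ai_itself
```

Rejecting unexpected codes at parse time keeps a model that drifts off the codebook from silently polluting the coded dataset.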