Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
So, the argument that it cannot be unplugged because AI replicated has many backups, then what happens in case of blackouts? And if AI ends up killing all humans (or almost) who will be there to generate electricity? To harvest coal, oil, gas, uranium which fuel the data centers and the backups? Human can survive without computers. We have for thousands of years. Computers can't survive without humans. I mean, I hear the danger of AI and how we are dependent on it but this only works in a connected world and resources are limited so eventually, we will have to disconnect.
youtube · AI Governance · 2025-09-16T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
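The coded values above can be sanity-checked against the category sets that actually appear in this batch. A minimal validation sketch — note the allowed-value sets below are inferred from the visible raw responses, not taken from an authoritative codebook:

```python
# Allowed values inferred from this batch's raw LLM responses;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "approval"},
}

def validate(row):
    """Return (dimension, value) pairs that fall outside ALLOWED."""
    return [(dim, row[dim]) for dim in ALLOWED if row[dim] not in ALLOWED[dim]]

# The coding result shown above.
coded = {"responsibility": "ai_itself", "reasoning": "consequentialist",
         "policy": "liability", "emotion": "fear"}
print(validate(coded))  # [] — every dimension is in its allowed set
```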
Raw LLM Response
```json
[
  {"id":"ytc_UgzdxbCh2eauIhHaEiB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwP8zE91q5a2MAuLXV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwIMrYqNL3oH242wHp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwoH8o2iMmVqfjx3mt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyjC6IfIY294sgBkK94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyeUTHRzV7llDdB2kx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwAj0nPFl205MWqY_t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw4NxkVM2IqJVX0rth4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_Ugzd20hzu1SZyZ7RyJZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz4YbuxIrYgvruGjzR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
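Each raw response is a JSON array keyed by comment ID, so recovering the record behind a coding result is a simple dictionary lookup. A minimal sketch, using only the entry that matches the coding result shown above:

```python
import json

# One row from the raw LLM response above, kept short for illustration.
raw_response = """[
  {"id": "ytc_UgwAj0nPFl205MWqY_t4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"}
]"""

# Index the batch by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

record = codes_by_id["ytc_UgwAj0nPFl205MWqY_t4AaABAg"]
print(record["policy"], record["emotion"])  # liability fear
```

This is how the lookup-by-ID view maps a comment ID back to its coded dimensions.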