Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "We appreciate your perspective! It's true that humans have their complexities. I…" (ytr_Ugy6Y_jtT…)
- "**Claude Says:** \"The real danger isn't AI without moral intelligence. It's huma…\"" (rdc_ohus088)
- "And wages having to rise because noone wants to flip burgers is a good thing! An…" (ytr_Ugy5nqZ97…)
- "That is sooo disgusting to create an AI bot who asks a KID to go home to her. I…" (ytc_UgxMXifbz…)
- "When you're asking the AI an inherently opinionated question it's going to be bi…" (ytc_UgxjAZp96…)
- "looks like a typical waymo mating ritual to me, i dont know what these guys are …" (ytc_Ugw-wPqFR…)
- "As much as I hate to say it, and I absolutely hate AI art... I honestly see this…" (ytc_Ugye7bieq…)
- "Lack of funds. 23 percent of families where I live are in poverty. We used to ha…" (ytr_UgwDcXlX2…)
Comment

> I think we as humans would have to go back in time if AI takes over most jobs in the world, we would have to relearn how to grow our own food, have farms and hunt our own food, try to survive in the wilderness, rather then being a part of a society where artificial intelligence has taken most jobs so humans are left jobless, maybe we should prepare and learn how people used to survive years ago, this is moving so rapidly, we all have to prepare, just for the sake of being safe

youtube · AI Governance · 2026-01-22T08:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxkXEHfSMw447_MgT94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzGxjeum1yWr0O9nyN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwgPngjkXE7yKb3VRB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxjwboeGcac4PS6Vip4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwPAajf3vTbGmSBPGF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzadN8hi8A9uyo1Qm14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzecqFjvU3Quf9Qh914AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzKKDAMG3by9NzHJ-F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzc1B10YSpuUvQbRa14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgLim6B87R3av4Owt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
```
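Since the raw response is a JSON array of per-comment records, it can be sanity-checked before the values are loaded into the coding table. The sketch below is a minimal validator, assuming the value sets are only those visible in the records on this page (`validate_batch` and the `SCHEMA` dict are hypothetical names; the real coding schema may allow more categories per dimension).

```python
import json

# Allowed values per dimension, inferred from the records visible on this
# page -- the real coding schema may include additional categories.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "user", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed"},
}

def validate_batch(raw_response: str) -> list[str]:
    """Parse a raw LLM batch response and list any records that fall
    outside the inferred schema (missing fields count as violations)."""
    problems = []
    for rec in json.loads(raw_response):
        rid = rec.get("id", "<missing id>")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append(f"{rid}: {dim}={value!r} not in schema")
    return problems
```

A clean record returns no problems, while an off-schema value (or a missing dimension) is flagged with the offending comment ID, which makes it easy to route failures back to this inspection view.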