Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- The cars should be prohibited from being sold as long as they use the term, AUTO… (ytr_UgysPhuxG…)
- When somebody says a robot could never do my job you show him this video😂😂😂😂… (ytc_UgxlXqioI…)
- "I used to drive a cab until the driverless car and android dispatcher made me o… (ytc_UghlFLSF9…)
- I don't get how any of this conversation is supposed to be diminishing fear over… (ytc_UgwlQ8bUa…)
- I think it's more important to use AI as a function, not a being. I think AI's f… (ytc_UgwnueM9x…)
- I think the worst case scenario, that it is... that there's something it's like … (ytc_UgxEk4JjO…)
- You're right, in the short term... But in the long-term AI absolutely will take … (ytc_UgzykdYA7…)
- AI needs to be stopped!! It was NEVER going to be a good idea! I wish we could g… (ytc_UgwAut-ZA…)
Comment
if people dont wont work how do they purchase the services provide by AGI. will the abundance be democratized ? I doubt it. people become worthless. its dark.
anyway, sometimes I wonder - we were all going to die anyway , so what is different - just the how
the irony of AI figuring longevity escape velocity to give us unlimited life, only for AGI killing us because we didn't program controls
youtube · AI Governance · 2025-09-05T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxE9ChmCVCR7RIbItN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxWGdFmhoz6SCLASMx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyMiweQ6vzFHGXxD3Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwYGWoTRD95x74qJW14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxuZwB1avMpfkJUD_54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxUoEh98snuKDkbr_h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwT2-16VdP8sLn0na94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxrHv1I9E3pGyRACl54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwyCqqjAGn-EcQDDyl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxA2ZSd7emkPKsh3Up4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
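The raw batch response above is a JSON array with one object per comment, keyed by comment ID, which is what makes the "Look up by comment ID" view possible. A minimal sketch of that parsing step in Python; `parse_batch` and `ALLOWED` are illustrative names, and the allowed value sets are inferred from the sample rows shown here, not from a full codebook.

```python
import json

# Value sets observed in the sample output above (an assumption:
# the full coding scheme may contain values not seen in this batch).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw batch response into an id -> coding lookup,
    dropping any row whose values fall outside the known scheme."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # skip rows the model emitted without an ID
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

raw = '''[
  {"id":"ytc_UgxUoEh98snuKDkbr_h4AaABAg","responsibility":"ai_itself",
   "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''
coded = parse_batch(raw)
print(coded["ytc_UgxUoEh98snuKDkbr_h4AaABAg"]["policy"])  # regulate
```

Validating against the allowed sets before indexing is the design choice that matters here: model output occasionally drifts outside the scheme, and a dropped row surfaces as a missing ID in the lookup view rather than as a silently miscoded dimension.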