Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Why would super-intelligence progress? It's sort of like asking "who made God?" It's an infinite regress. If a super-intelligent operating system progressed, it would make itself obsolete. If anything I see AI destroying itself (and whatever gets in its way of destroying itself).
youtube · AI Governance · 2025-09-04T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzUDDMnsRnIqjKWMY54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxVtV0A6DuSokP_oR94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzIkOulPc6KhmZGL5x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzKmgU5nCG59oKOMfB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugwg2N14Ern6rZHoJ9V4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyKxG338wt6HSRBTGZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxEu4hlVdBn9Iajdxd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyCJsBy_pQzU2YfQk94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwrHyfH4BlSjtXqi6l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxvQFbTFh1Gl6lx_dZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]