Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- `rdc_ibdfqoa`: "Not solar panels, concentrated solar power (ie those towers woth mirrors around …"
- `ytc_UgxgSowFh…`: "In the end if the robot goes to error to destroy the world then we blame that wh…"
- `ytc_UgzW2gp78…`: "AI art is lazy and dumb i am bad at art but the point of art is to learn, you dr…"
- `ytc_UgxsKXMf_…`: "A lot of this wouldn't be that much of a problem if we weren't living in a socie…"
- `ytc_UgwI5Ria9…`: "It would spread more awareness if you put it in other languages not just English…"
- `ytc_Ugy5LSoFc…`: "Bernie you are a complete moron... They are investing in Ai because China is pus…"
- `ytc_UgxumTtcA…`: "I am absolutely crap at art, and I have thought of stuff in my head that would b…"
- `ytc_UgwbFy2ZY…`: "We need to say no at some point. Limitation is necessary. Human being must value…"
Comment
The makers have of course an obligation to make money for the investors. Part of doing that is promoting AI in the best light, lying about how far it has come. I will believe in what this Dr. Roman says the day I see it. I was not born yesterday and has high degree of common sense.
Source: youtube | Topic: AI Governance | Posted: 2025-09-04T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwcg4il7gZUHtfK2Fp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzTdn4y2ZeiUytqcGF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxFfFgaON5NutyepAF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxEZXyPGmYgT7Y2HWR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz8EMPwk-DBLRxLqS54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzDw43xki5rhK7pHEB4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxRl_vB-gChofNpM3x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzpqmJcpKqiBmB8Tn14AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwxT5n84DofFAoms3h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugylx98HugrqehTPYct4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
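Each record in the raw response carries the four coding dimensions shown in the table above, keyed by comment ID. A minimal sketch of how such a response could be parsed and checked: the value sets below are inferred only from the responses visible on this page (the actual codebook may define more categories), and `parse_response` is a hypothetical helper, not part of this tool.

```python
import json

# Allowed values per dimension, inferred from the responses above
# (assumption: the real codebook may allow additional categories).
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none"},
    "emotion": {"outrage", "mixed", "indifference", "approval", "fear"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID,
    skipping any record with a missing or out-of-schema value."""
    coded = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Hypothetical single-record response, mirroring the format above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"liability","emotion":"mixed"}]')
coded = parse_response(raw)
```

Indexing by ID makes the "inspect the exact model output for any coded comment" lookup a plain dictionary access, and records with values outside the schema are dropped rather than silently stored.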