Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Paul Virilio said when we invent a technology we invent the related disaster - … (`ytc_Ugzqugwni…`)
- Nuke's were/are worse than AI (theoretically, press a button and destroy a whole… (`ytc_Ugz-WUwQP…`)
- Would love to draw a bridget from guilty gear strive as a plague doctor, i don't… (`ytc_Ugy_ygRK3…`)
- Hm, the last AI response does not appear to be ChatGPT, but instead to be Google… (`ytc_Ugyu766U4…`)
- If that's true by 2030 then by 2035 AI will be doing ALL jobs, if we are able to… (`ytc_UgwHQUIRJ…`)
- @petermmm42 Please keep in mind that people have been saying that for decades ab… (`ytr_UgyB9AGh-…`)
- His response with a "plumber" was in my opinion was just to make the point of ge… (`ytc_UgytYiZrq…`)
- Except, no one is shaming anybody for using AI. The problem starts when mental… (`rdc_n7t8wve`)
Comment (source: youtube · posted 2026-03-09T21:3…)

> Human kind doesn't need any invention. If we lived without AI, AI it's not a must. What human kind needs it's cooperation. We need to work together for our own good not for the benefit of a minority always imposing it's way. Technology did not free humanity from work nor hunger, poverty... How AI would? It'll just increase poverty lowering work costs.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
{"id":"ytc_Ugwa7kCZRSoKOK2--aR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUYAL4OO-CPtFZbMt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgycykcpCIS4dgj4DbF4AaABAg","responsibility":"government","reasoning":"virtue","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwuqIcWWBjgi23IqkF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy6_aRK3V-vpHd0-Pl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzVtTbJXyaaoHPkxrh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgzlM9173FQv6DbZY4R4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz19dVcEQ0Md6SPjWp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgykokbYn25Lolrt4kh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_Ugz1M3x7y4LkhE5ae-B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
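A raw response like the one above is a JSON array of per-comment codes across the four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed into a lookup-by-ID structure — the `parse_llm_codes` helper name is ours, and the two-record sample payload is abbreviated from the response above for illustration:

```python
import json

# The four coding dimensions used in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_llm_codes(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID, for direct lookup by ID."""
    records = json.loads(raw)
    return {
        rec["id"]: {dim: rec.get(dim) for dim in DIMENSIONS}
        for rec in records
    }

# Abbreviated sample payload (two records from the response above).
raw = '''[
  {"id": "ytc_Ugwa7kCZRSoKOK2--aR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzUYAL4OO-CPtFZbMt4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

codes = parse_llm_codes(raw)
print(codes["ytc_UgzUYAL4OO-CPtFZbMt4AaABAg"]["policy"])  # regulate
```

Keying by comment ID is what makes the "look up by comment ID" view cheap: each inspection is a single dictionary access rather than a scan of the response array.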