Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Rather support ai art then an artists that believes looting is acceptable ac… (`ytc_UgzphBovh…`)
- its school,where kids going to learn not to force them to learn..really AI?? eve… (`ytc_UgzLEvbLB…`)
- Music Artist here and just adding a few musical additions (odd time signatures, … (`ytc_UgxtQ-58u…`)
- @zoomingby AI managing the profits for the military industrial complex would lea… (`ytr_UgyRAcP3-…`)
- Im more worried about what pushes people to rather bond with an AI than other pe… (`rdc_mdj06g0`)
- @SinisterRaccoon217 Hello Lefty217. Though at this point predictive-text sof… (`ytr_UgwQ_d0XT…`)
- Must say, as a user, if it's AI Music, AI Art, AI whatever. If I, as a user, fee… (`ytc_UgxDBWicQ…`)
- Disabled writer here! My art is part of how I've dealt with my disabilities from… (`ytc_UgypVGTGo…`)
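The sample IDs above are truncated for display; the "Look up by comment ID" feature could be sketched as a simple prefix match. This is a minimal illustration, not the page's actual implementation, and the `store` contents (full IDs and texts) are hypothetical:

```python
def lookup(comments: dict[str, str], id_prefix: str) -> list[str]:
    """Return full comment IDs matching a (possibly truncated) ID prefix."""
    prefix = id_prefix.rstrip("…")  # drop the display ellipsis, if present
    return [cid for cid in comments if cid.startswith(prefix)]

# Hypothetical store mapping full comment IDs to their text.
store = {
    "ytc_UgzphBovhEXAMPLE1": "Rather support ai art then an artists that…",
    "ytr_UgyRAcP3EXAMPLE2": "@zoomingby AI managing the profits for the…",
}

matches = lookup(store, "ytc_UgzphBovh…")
```

A truncated ID from the sample list resolves to every stored ID sharing that prefix, so the caller can disambiguate if more than one matches.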
Comment
AI must be in the hands of Humans with empathy.
It's similar to Nuclear Power and Nuclear Weapons in the hands of the responsible and trained Humans. Safety and Security is no 1 inuclear. Because you can start war and extinct certain group of the planet with humans.
We need to do the same with Ai to priorities Safety and Security otherwise I foresee evil humans starting digital wars and wiping out digital information of other nations etc.
AI is good if it's used ethically, but it seemed governments are hands off and relying to the Oligarchs to develop this technology without putting governance to protect humans or the world from exploitation.
youtube · AI Jobs · 2025-12-14T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyD_pT1SPbEORWBR854AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyyZyZxSzF_AkDt2Mp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz1sMxigyFppQGnF9N4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw4Dcxb6MVCk6q-VKh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyvPM7SAdCAPp0eRIx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxNnWJIvd025-fy1w54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy181EU7sTWU4BZQ654AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugw1mOmdvcaLBKLalA94AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzmDSWGnOQJ3DFpLrx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzKhQcKz6J1R3wX8K14AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
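The raw response is a JSON array with one record per comment, covering the four dimensions shown in the Coding Result table. A minimal sketch of how such a response might be parsed and sanity-checked before storage; the allowed category sets below are only those observed in the responses on this page, and a real codebook may define more:

```python
import json

# Category values observed in the raw responses above (assumption:
# the actual codebook may permit additional values).
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # a record without a comment ID cannot be joined back
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Records with a missing ID or an out-of-vocabulary value are dropped rather than silently stored, which keeps downstream tallies of the coded dimensions consistent.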