Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Ai will not stop being used and poisoning datasets will just make the art worse …" (ytc_Ugx07SSz0…)
- "Isnt this video AI designed? The voice over sounds generated, the stock footage …" (ytc_Ugwm0_Xtu…)
- "cant agree more ... also the point where he talks about school having to adapt a…" (ytr_UgzHo5OZu…)
- "A I will ultimately be able to maintain itself and won’t need programmers and al…" (ytc_UgxnSRNXN…)
- "uhhh, actually this is what AI is, it is infinite. So no matter if it takes cont…" (ytc_UgzksXuR5…)
- "I hate that no one knows about co pilot the ai I mean he's a free ai…" (ytc_UgwNV28Im…)
- "All looks well and good but i guarantee its a breeding ground for indoctrination…" (ytc_Ugw54ofVo…)
- "They all full of CRAP....Stop targeting black folks with this sick facial recogn…" (ytc_Ugw5qO9Xw…)
Comment
i can accept full ai in a few cases.
1) if the ai was flawless, and remote upgradeable or killable.
put an ai where lets say "anyone with a weapon will be killed on sight" than anyone with a weapon or a weapon looking device will be killed regardless of age sex or intent. no flaw that would auto kill individuals without weapons and so forth.
other than that, i believe a human operator should always be behind a drone, no matter how far ai goes.
Source: youtube
Posted: 2012-11-24T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwiBO59xpLPCkecBqd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy_uEjQogI-wb-bmPB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx7nuMjJl6i0N6vTmZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugys3yBMDKkxZIDT7cd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgxNKwh_r49bvXQyVH94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxH2eMZsO_x_AjYfy94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxsRrYrWDDcgzPrJQN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwcPYYDaK_--Y12hiN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxXAQENEamBTwM_Aht4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugykgw-hR28QCN8z1ep4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"}
]
```
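The "look up by comment ID" flow above amounts to parsing the model's JSON array and keying the records by their `id` field. A minimal sketch (the function name `index_by_comment_id` is hypothetical; the two records are copied from the raw response above, truncated for brevity):

```python
import json

# Raw LLM response, as shown above (truncated to two records for brevity).
raw_response = """[
  {"id": "ytc_UgwiBO59xpLPCkecBqd4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugys3yBMDKkxZIDT7cd4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "liability", "emotion": "approval"}
]"""

def index_by_comment_id(response_text):
    """Parse a raw LLM coding response and index its records by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

coded = index_by_comment_id(raw_response)
record = coded["ytc_Ugys3yBMDKkxZIDT7cd4AaABAg"]
print(record["responsibility"], record["emotion"])  # → ai_itself approval
```

The same lookup is what the "Coding Result" table renders: the dimensions (Responsibility, Reasoning, Policy, Emotion) are the keys of one record in the array.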