Raw LLM Responses

Inspect the exact model output for any coded comment. You can look up a comment directly by its ID, or inspect one of the random samples below.

Random samples
- "So true the final puzzle to AI is to learn to fix and maintain itself then we in…" (ytc_Ugw2mTRg-…)
- "Ford recently had a self driving car that wouldnt stop rode the rail till it cau…" (ytc_UgzMsInJv…)
- "Ai is dangerous. Havent there been enough prophetic sci fi movies on this to be …" (ytc_Ugw8VyEIi…)
- "Skytalon Food for thought. First powered flight in 1903. First moon landing in 1…" (ytr_UghlDrX-6…)
- "@Pickle_Rick007just saw a clip of a robot attempting to hang rock. It shut itse…" (ytr_UgyKPaaae…)
- "Will businesses who use AI to generate drawing also be obligated to label their …" (ytc_UgyxBdH48…)
- "simple, make a most favorable employee rule, to tax and cap the profit of pure A…" (ytc_Ugw4g2d8i…)
- "Ai algorithms are not and never will be "conscious". They will present a perfect…" (ytc_UgznqbABI…)
Comment

"Whats worse: a psychopathic general or a autonomous drone?
In any case an artificial intelligence without restraints would certainly figure out that humans are the greatest threat to nature and themselves."

Source: youtube, posted 2015-07-30T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UggLZ6M2z5JqTngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugi3e0GA4HfH8HgCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugizge_QLY4xw3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UghHaxdOpGagangCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgieDwB_j4qUKngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjvbmAc83_c83gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugia_nrfbV5-d3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggRCZjvTN6Mg3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgjRoWWlA3PONXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugh0VoxfRhV-tXgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
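A batch response like the one above can be parsed and keyed by comment ID, which is what a "look up by comment ID" view needs. Below is a minimal sketch in Python, assuming each row carries the four dimensions shown in the Coding Result table; `index_by_id` is a hypothetical helper name, and the two rows are copied from the response above (abbreviated from ten to two entries).

```python
import json

# Raw model output, verbatim JSON as returned by the LLM (two rows shown).
raw_response = """
[
  {"id":"ytc_UgjRoWWlA3PONXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugh0VoxfRhV-tXgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
"""

# The four coding dimensions, matching the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM batch response and key each coding by its comment ID."""
    rows = json.loads(raw)
    return {row["id"]: {d: row[d] for d in DIMENSIONS} for row in rows}

codings = index_by_id(raw_response)
print(codings["ytc_UgjRoWWlA3PONXgCoAEC"]["policy"])  # liability
```

Keeping the raw string around alongside the parsed index is what lets the page show the exact model output next to the per-dimension values.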