Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "I make my oc/persona the most traumatised i can possibly make them then force th…" (ytc_Ugzstvitc…)
- "Metaculus, a reputation-based prediction platform, has community forecasts indic…" (ytc_UgzzdqQls…)
- "These AI apocolypse people are nuts. There's no way 99% of people will lose thei…" (ytc_Ugz0g-oPy…)
- "As A.I. takes over, entire skyscrapers will become nothing but giant storage tow…" (ytc_UgxDDxzsq…)
- "Thank you for sharing such a thoughtful perspective! Treating AI like Sophia wit…" (ytr_Ugx_HPip5…)
- "I was a top computer geek and a cognitive neuroscientist and I warn that AI is N…" (ytc_UgySs4RPm…)
- "Also btw you can't get a license onto art made by AI because of a certain lawsui…" (ytc_UgxZMEaM0…)
- "Here is my opinion on AI safety. We have already hit the point where red tape h…" (ytc_UgzbgvpBa…)
Comment

> DARPA has been experimenting with autonomous armed vehicles since at least the 1980s, when they mounted an M30 machine gun on a four wheel drive cart and called it Fireant. After forty years of development, only now is it being brought to public attention? It's too late, man. They already have robot attack dogs in US cities.

Source: youtube · Posted: 2026-03-10T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyA9kXd7xiiQKMXbCB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzjxjZOUjfaQipRYpp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwwpNN_xweBqUNxhu54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWKwDfLr3d_2ID03d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyEFdTlbF7F4Uc68dt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx4-RFfed9D_BAj6RJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwfXsu0h2aalz08mXd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzMEkDcDlpNUIaXDT54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwyIL12YAbAC72wA9B4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzLy--Rit9lri7HEm94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"mixed"}
]
```
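A response like the one above can be parsed and checked before the rows are stored as coding results. The sketch below is a minimal, hypothetical validator: the allowed category values are inferred only from the labels that appear in this page's samples (the real codebook may permit more), and the `ytc_`/`ytr_` ID prefixes are assumed from the comment IDs shown above.

```python
import json

# Allowed values per dimension, inferred from the sample output on this page.
# This is an assumption, not the tool's actual codebook.
SCHEMA = {
    "responsibility": {"government", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded rows) and keep only
    rows whose ID looks like a YouTube comment/reply ID and whose values
    all fall inside the assumed schema."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue  # unrecognised ID prefix
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

# Example: one well-formed row passes, an off-schema row would be dropped.
raw = ('[{"id":"ytc_abc","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"liability",'
       '"emotion":"outrage"}]')
print(len(validate_batch(raw)))  # → 1
```

Dropping bad rows rather than raising keeps one malformed entry from discarding the whole batch; a real pipeline might instead log the rejected rows for re-coding.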