Raw LLM Responses
Inspect the exact model output behind any coded comment, or look one up by comment ID.
Random samples — click to inspect

- "Mind-blowing! Witness the future of robotics as this AI robot pushes the bounda…" (ytc_UgxLTtae6…)
- "No job is safe from being replaced by some combination of AI plus machinery. Wh…" (ytc_UgxG7qegx…)
- "I think a lot of guys are going to end up with penis injuries in the next few ye…" (ytc_UgxwbvXkX…)
- "Any browser company can do this already, they don't need AI for that. Malware fo…" (rdc_nufufb5)
- "Writing AI prompts is something any -tard can do. That's why so many people are …" (ytc_UgyYdqINf…)
- "you fed your brain with tons of pictures and learned from the masters I asume, b…" (ytc_Ugz_Xm7AL…)
- "The CEO of a tech company warning of the ‘extinction level’ threat of AI is a fa…" (ytc_Ugxnm588c…)
- "Depends on who our is? There are those whom are grandparents, mothers, and fathe…" (ytr_UgxTP7P0S…)
Comment

> Like with nukes, the people at the UN or any government can SAY whatever the hell they want. But the cats out of the bag, the technology exists now. There's no way to put it back into the bag. Trying to limit the spread would work worse than even nukes since nukes have the limitation of part of its required production being a limited resource that can have SOME amount of control. But autonomous warfare, isn't that way.

Source: youtube · Posted: 2026-03-11T02:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[{"id":"ytc_Ugz7bxPrZ4ktDOixWyJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxEzJElYC6nD5dQXql4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugwuy7Lbxw3YDCO2Nj14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw5PAX1-qMY73lXbph4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxqP6yWxaMoI4gI9g14AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy4c5nu3GtJxxzL6C94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzj0utiZyrJ0AMN-8t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy2r56Rc9FWdFf6Kv94AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzsT-3UIkdWaIEn2Eh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyUedzy5nSjUnuZr2p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}]
```
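A raw response like the one above can be turned into per-comment coding records with a small parser. The sketch below is a minimal illustration, not the project's actual pipeline; the allowed category values are only those observed in this page's table and JSON (the full codebook may include more), and `parse_raw_response` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, as observed in the coding output above.
# Assumption: the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "distributed", "unclear",
                       "developer", "government"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "liability", "regulate", "unclear"},
    "emotion": {"fear", "indifference", "outrage", "resignation", "approval"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coded dimensions},
    silently dropping records with missing IDs or out-of-schema values."""
    coded = {}
    for record in json.loads(raw):
        cid = record.get("id")
        dims = {k: record.get(k) for k in SCHEMA}
        if cid and all(dims[k] in SCHEMA[k] for k in SCHEMA):
            coded[cid] = dims
    return coded

# Example with the first record from the response above:
raw = ('[{"id":"ytc_Ugz7bxPrZ4ktDOixWyJ4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
coded = parse_raw_response(raw)
print(coded["ytc_Ugz7bxPrZ4ktDOixWyJ4AaABAg"]["emotion"])  # fear
```

Dropping malformed records rather than raising keeps a batch lookup usable even when the model occasionally emits an unexpected label; a stricter variant could log or re-queue those IDs instead.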