Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgzKoMbhd…: The ai told him several times not to kill himself or don't hurt himself. If he w…
- rdc_n7px44v: Well it is going to raise productivity not replace jobs. The Jobs replaced in te…
- ytr_Ugx9-L38H…: florianschneider3982 Because it an environmental nightmare, musicians lose work,…
- ytr_UgxpzCnzA…: 9:10 oh come on you trained it with us and it wasn’t like you picked the best of…
- ytc_UgyS9_S3I…: Lol 😂 ai don’t have pain and emotions and never will . So no is not real scream…
- ytc_UgxNKmddl…: This might be a paradox but What would happen to AI if you told it or even code …
- ytr_UgxY8zDpS…: @FlatOnHisFace "Look at me, I use fancy words, Im so smart" All while you compl…
- ytc_UgwJ_Exo3…: I don’t understand how training new AI on the products of older AI produces bett…
Comment

> The UN has been trying to come up with some rules for AI weaponry however the countries making these systems (US, Israel, UK and Germany) won't even accept a common sense definition of AI but nor are they offering their own definition so it hasn't been able to progress any further

Source: youtube · Posted: 2026-04-13T21:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
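A coded row like the one above can be sanity-checked against the label sets that appear on this page. This is a minimal sketch, assuming the values observed in the table and raw responses are the full category sets; `ALLOWED` and `invalid_fields` are hypothetical names, not part of the tool.

```python
# Hypothetical validator for one coded row. The allowed values below are
# inferred from the examples shown on this page, not from an official codebook.
ALLOWED = {
    "responsibility": {"government", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"resignation", "fear", "outrage", "approval", "indifference", "mixed"},
}

def invalid_fields(row: dict) -> list[str]:
    """Return the dimension names whose value falls outside the observed set."""
    return [dim for dim, ok in ALLOWED.items() if row.get(dim) not in ok]

row = {"responsibility": "government", "reasoning": "deontological",
       "policy": "regulate", "emotion": "resignation"}
print(invalid_fields(row))  # []
```

A row with a missing or unexpected value in any dimension shows up immediately, which helps catch malformed LLM output before it reaches the dashboard.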
Raw LLM Response
[
{"id":"ytc_UgwXhvVFQMizm1MZAEN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzrQhJFViyGy1Kwho94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz7k0-_DlhaKZHydrh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz6D1qGHg8W6PQANZ94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxKfzsPwe2FXeukGm14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwfzT9gDuZS2ympFpV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyMDukbLJApBgYT9nV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw7YzHzFi49BKbOJKx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugzcqt6w6ma0diPhd9x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyQu6PI7HEfd7PTi0N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]