Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "That’s because the US and the UK are selling arms to the Saudis which is fuellin…" (rdc_gruzv5e)
- "Ai companions are wolves in sheep clothing. This just furthers long term problem…" (ytc_UgwKhOHpw…)
- "He wants to be right. The safest way to do that is to lowball and then say maybe…" (ytr_UgxvGHZWW…)
- "Without skin, she was ugly, but with she was beautiful; this is the reality of u…" (ytc_UgyAJY8rV…)
- "I think what the world governments need to do is to allocate a basic salary for …" (ytc_UgzSVd1W2…)
- "it seems like a lot of people dont get that the ''AI wins" is just sarcasm since…" (ytc_UgxkUbqro…)
- "Soul is an energy, and energy is everywhere. Our bodies are just biological robo…" (ytr_UgxR91ITm…)
- "Only will work if EVERYONE has self driving, along with proper safety for large …" (ytc_UgjzNTXzu…)
Comment
First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
Second Law: A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Source: youtube, 2026-03-11T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
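Each coded comment carries one value per dimension. A minimal sketch of validating a record against the value sets that appear on this page (the allowed-value lists below are inferred from the codes visible here, not a complete schema):

```python
# Allowed values per dimension, inferred from codes visible on this
# page; the real coding schema may include additional options.
ALLOWED = {
    "responsibility": {"ai_itself", "distributed", "developer",
                       "government", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"approval", "fear", "indifference", "outrage",
                "resignation", "unclear"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside the known set."""
    return [dim for dim, values in ALLOWED.items()
            if record.get(dim) not in values]

rec = {"responsibility": "ai_itself", "reasoning": "deontological",
       "policy": "regulate", "emotion": "approval"}
print(invalid_fields(rec))  # []
```

A record like the one in the table above passes cleanly; a misspelled or out-of-schema value shows up in the returned list, which makes batch QA of a coding run straightforward.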
Raw LLM Response
```json
[
  {"id":"ytc_Ugz7bxPrZ4ktDOixWyJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxEzJElYC6nD5dQXql4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugwuy7Lbxw3YDCO2Nj14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw5PAX1-qMY73lXbph4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxqP6yWxaMoI4gI9g14AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy4c5nu3GtJxxzL6C94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzj0utiZyrJ0AMN-8t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy2r56Rc9FWdFf6Kv94AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgzsT-3UIkdWaIEn2Eh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyUedzy5nSjUnuZr2p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
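A raw response like the one above is a JSON array of per-comment records. A minimal sketch of parsing such a response and indexing it by comment ID (the field names `id`, `responsibility`, `reasoning`, `policy`, `emotion` are taken from the response shown; the fence-stripping step is a common defensive measure, not necessarily part of this pipeline):

```python
import json

def index_by_comment_id(raw: str) -> dict[str, dict]:
    """Parse a raw model response into {comment_id: coding_record}."""
    text = raw.strip()
    # Models sometimes wrap JSON in a markdown fence; strip it defensively.
    if text.startswith("```"):
        text = text.strip("`")
        if text.startswith("json"):
            text = text[len("json"):]
    records = json.loads(text)
    return {rec["id"]: rec for rec in records}

# Hypothetical one-record response, for illustration only.
raw = ('[{"id":"ytc_abc","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"outrage"}]')
coded = index_by_comment_id(raw)
print(coded["ytc_abc"]["policy"])  # regulate
```

Indexing by `id` is what makes the "look up by comment ID" view above cheap: one parse per batch, then O(1) retrieval of any comment's coding.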