Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
First stop the wars, al weapons and bullets must be disappear from planet earth-when there are so many problems AI will deep the problems and not help.........People will become dissocialized, fat and stupid (just check how kids in schools are already using AI to solve assignments and exam tests - all knowledge is written in books and not on the internet) all in one fell swoop - all in Alian intelligence...ops Artificial intelligence...Tokens bla bla bla when war starts first thing they will disconnect the internet...AHA moment (Internet was invented for military purposes)
| Field | Value |
|---|---|
| Source | youtube |
| Title | AI Moral Status |
| Posted | 2025-07-25T19:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxbzFzUcLviNTFmK3V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx2vLP3y4OOoOaNPah4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"skepticism"},
  {"id":"ytc_UgxhoR5UHK6THIMTaSF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzZhKhJ4iVVt2hAfQN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxgQLOVOtt98fyt7lR4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwWxPJKhJVI0MzAic94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzDbssCuyZV4i4D89N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzD6hIZA8zXmdwa9gN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxWmftk2HugRn7na3t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxGm-uZtJ_u4Pe3TGp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
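Because the raw response is a JSON array with one coding record per comment, matching a coding result back to its raw record is a simple parse-and-filter. A minimal sketch, assuming the field names shown above; the `coding_for` helper is illustrative, not part of the tool:

```python
import json
from typing import Optional

# A small excerpt of a raw LLM response in the format shown above.
raw_response = '''[
  {"id": "ytc_UgzDbssCuyZV4i4D89N4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxhoR5UHK6THIMTaSF4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

def coding_for(comment_id: str, response_text: str) -> Optional[dict]:
    """Return the coding record for one comment ID, or None if absent."""
    for record in json.loads(response_text):
        if record["id"] == comment_id:
            return record
    return None

coding = coding_for("ytc_UgzDbssCuyZV4i4D89N4AaABAg", raw_response)
print(coding["policy"], coding["emotion"])  # → ban fear
```

The dimension values surfaced in the Coding Result table above (responsibility, reasoning, policy, emotion) are exactly the fields of the matching record.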