## Raw LLM Responses

Inspect the exact model output for any coded comment. Entries can be looked up by comment ID or browsed via the random samples below.
Random samples:

- "It is horrible how big tech develops all-powerful AI deepfake creating programs …" (ytc_UgzhChF7g…)
- "Stop panicking about "75 million job losses"—the 2025 data tells a much differen…" (ytc_Ugyf_jOKl…)
- "The slow movement is to avoid wear and possible future damage that …" (ytr_Ugyx9Hewg…)
- "Is it ethical to use AI in war? I suppose you have never heard of this song “all…" (ytc_Ugw4lZwSM…)
- "Change is hard. If you guys at TYT (and I'm a big fan, mind you) want to be cons…" (ytc_UgwQqIY9U…)
- "I just finished this book and searched for one of your recent posts about AI to …" (ytc_UgxUYxBzY…)
- "From what I've seen, AI struggles most with accuracy and complexity, not creativ…" (ytc_Ugz9LuZld…)
- "ohh my god this guy does not know anything AI. Get a software guy . AI is comput…" (ytc_UgxJ4zIRL…)
Comment

| Field | Value |
|---|---|
| Text | We are creating robots to talk and behave like our partners of life because we want to feel their presence and talk to them after their death. There is an episode of "The Story of God" with Morgan Freeman about it. The robot's name is Bina48. |
| Source | youtube |
| Topic | AI Moral Status |
| Posted | 2017-02-28T22:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
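Each of the four coding dimensions above takes a value from a closed vocabulary. As a minimal sketch, a coded row can be checked against those vocabularies before it is stored; the allowed-value sets below are inferred only from the responses shown on this page and may be incomplete:

```python
# Allowed values per dimension, inferred from the coded samples on this
# page; the real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "user", "government", "ai_itself", "developer"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"unclear", "regulate", "liability", "ban", "none"},
    "emotion": {"indifference", "approval", "fear", "outrage", "mixed", "resignation"},
}

def validate(row: dict) -> list:
    """Return the dimension names whose value falls outside the vocabulary."""
    return [dim for dim, ok in ALLOWED.items() if row.get(dim) not in ok]

# The row from the Coding Result table above passes validation.
row = {"responsibility": "none", "reasoning": "unclear",
       "policy": "unclear", "emotion": "indifference"}
print(validate(row))  # -> []
```

A row with an out-of-vocabulary or missing value would come back with the offending dimension names, which makes malformed model output easy to flag.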
Raw LLM Response
```json
[
{"id":"ytc_UghVriokmiBrdXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggjMob2djzkEHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UggJr8-UN-xM-ngCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UghhNDhzWUUiOngCoAEC","responsibility":"government","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugj9myDUs7y-zngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjfweSgo8G6r3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjXivWrKkGxu3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UghzKagSWsoOAHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Uggj1y11qcrSHHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UggaLH0Jy1BVU3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
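Because the raw response is a JSON array keyed only by `id`, looking up a coded comment by its ID reduces to parsing the array and indexing it once. A minimal sketch in Python, using an excerpt of the response above (the variable and function names are illustrative, not part of the tool):

```python
import json

# Excerpt of a raw LLM response as shown above: a JSON array of coded rows.
raw_response = """
[
 {"id":"ytc_UghVriokmiBrdXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UghhNDhzWUUiOngCoAEC","responsibility":"government","reasoning":"unclear","policy":"ban","emotion":"outrage"}
]
"""

def index_by_id(payload: str) -> dict:
    """Parse the model's JSON array and index its rows by comment ID."""
    return {row["id"]: row for row in json.loads(payload)}

coded = index_by_id(raw_response)
row = coded["ytc_UghhNDhzWUUiOngCoAEC"]
print(row["responsibility"], row["emotion"])  # -> government outrage
```

Indexing once into a dict makes each subsequent ID lookup O(1), which matters when the same response is inspected repeatedly from the comment detail view.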