Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Wouldn't we make it so that robots enjoy being our literal slaves? The difference between a human and a robot is that a human can almost never enjoy being a slave, while a robot can be programmed to enjoy it. They can be programmed to feel good about being servants.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Moral Status |
| Posted | 2017-03-04T11:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgiA1_INbJOFTXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugi1nrPKExbHOHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UggFGTUIov_oOHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghdOolC8joZ6ngCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UghAw59QZBitCngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjT-wD9PuFMo3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UggRBlCDj7mB73gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiLuFIX4HCn7HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjcZKTKJEoieXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UggVd289Q9KLTngCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"indifference"}]
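The raw response is a JSON array with one record per comment, each carrying the four coding dimensions shown in the table above. A minimal sketch of how such a batch response could be parsed and indexed by comment ID (the `index_by_id` helper and the default-to-"unclear" behaviour are illustrative assumptions, not part of the tool; the records here are the first two from the response above):

```python
import json

# Raw batch response as emitted by the model (truncated to two records here).
raw = '''[
 {"id":"ytc_UgiA1_INbJOFTXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
 {"id":"ytc_Ugi1nrPKExbHOHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]'''

# The four coding dimensions used throughout this page.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse a batch response and index coded records by comment ID,
    filling any dimension the model omitted with "unclear"."""
    coded = {}
    for rec in json.loads(raw_json):
        coded[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

coded = index_by_id(raw)
print(coded["ytc_UgiA1_INbJOFTXgCoAEC"]["policy"])  # → regulate
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: one parse of the batch, then constant-time lookups per comment.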