Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
as a student in software engineering I wish I could mention to the scientific community as a whole "stop researching AI unless you are prepared to nurture that AI". From the dawn of human existence we have made tools to make our lives easier, or worn out our tools so we don't wear out our bodies. Making machines to do our work for us is so we don't wear out our bodies, making those machines sentient means they won't want to wear out their bodies either. So we should just plain old not make AI so we can keep the machines as tools we use to better ourselves, rather than turning machines into our alien offspring.
Platform: youtube
Video: AI Moral Status
Posted: 2017-02-27T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugg3xHoUtx6gWngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UggE8ZCLy_Y7-XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UghT7BJ_Jkv_q3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghD0PHvZSddz3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UghzdlAYEf702XgCoAEC","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugi0wlK0xxTZ3XgCoAEC","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UggL_n6lQWteeXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UggLBNtGHpEtHHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgiX-KqMqEVNV3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgivRlFZ-T5UaXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
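The raw response above is plain JSON (an array of per-comment codes), so the coding-result table can be regenerated by parsing it and selecting the row for a given comment ID. A minimal sketch, assuming only the field names visible in the sample output; the `lookup_code` helper and the trimmed one-row sample string are illustrative, not part of the tool:

```python
import json

# Illustrative one-row sample in the same shape as the raw LLM response above.
raw_response = '''
[
  {"id": "ytc_Ugi0wlK0xxTZ3XgCoAEC", "responsibility": "developer",
   "reasoning": "virtue", "policy": "regulate", "emotion": "mixed"}
]
'''

def lookup_code(raw: str, comment_id: str):
    """Parse a raw LLM response and return the coding row for one comment ID,
    or None if the model did not emit a row for that ID."""
    rows = json.loads(raw)
    return next((r for r in rows if r["id"] == comment_id), None)

row = lookup_code(raw_response, "ytc_Ugi0wlK0xxTZ3XgCoAEC")
print(row["policy"])  # → regulate
```

Returning `None` for a missing ID (rather than raising) makes it easy to spot comments the model silently skipped in a batch.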