Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytr_Ugzt7xD45… — It sounds like you're amazed by Sophia! She definitely has some fascinating insi…
- ytc_UgyWgdtJq… — Our world is going to look like an episode of "black mirror". This is human anni…
- ytc_UgyuTxbLm… — AI is scary!!!! People need to wake up. People can't even think for themselves a…
- ytc_UgwB-Zilz… — AI is just a tool to aid people. Developers only spend 10% of their time actuall…
- rdc_j0c6rwm — Until copyright laws adapted, the printing press also let business owners import…
- ytc_UgzSkHtp8… — Why there was not any safety measures in the AI. If someone was talking about ki…
- ytc_Ugym3Q3tc… — If u look closely the last one is not the ai’s fault whereas it is the fault of …
- ytc_UgyOcVbsh… — Yeah, just accept the fact AI is slowly taking over the world, like companies wi…
Comment
You act like robot's in general will just become "sentient" at one point. Not all robots would be sentient, only specific ones would be designed to be that way. And why would someone take the effort to enslave a sentient robot when they can get a non sentient one to do the same work. The non sentient one would also presumably be vastly cheaper to obtain than a sentient one. Like why would a robot that doesn't want to do what it is forced to be doing better than a robot that doesn't care and will just do it. It's hard to imagine any roles a sentient AI would be better at than a non-sentient one.
Source: youtube · Video: AI Moral Status · Posted: 2017-02-24T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UghlGXyQaNsIXngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugh3566kEkH8cngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugiw7wyfDqb7DXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugh_Jnzba6zPMngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjTD1xyiwqHLngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiOCFMcZTm5j3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgjGbkL2h-4AkXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg4D6Jf2shlKXgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugh5cHol1AeHWHgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UghIJiW-jZtI7HgCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"}
]
```
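A raw response like the one above can be turned back into per-comment records with a few lines of Python. The sketch below is an illustration, not the tool's actual ingestion code: the allowed values for each dimension are inferred from the table and JSON shown on this page, and the real codebook may include values not seen here.

```python
import json

# Allowed values per dimension, inferred from the codes visible on this
# page (assumption: the real codebook may define more values).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"indifference", "fear", "approval", "outrage", "unclear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    rejecting any value outside the allowed sets above."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = codes
    return coded

raw = ('[{"id":"ytc_x","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"industry_self",'
       '"emotion":"indifference"}]')
print(parse_codes(raw)["ytc_x"]["policy"])  # industry_self
```

Validating against a fixed value set at parse time catches the occasional off-codebook label the model emits, rather than letting it leak into downstream tallies.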