Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by comment ID.
Random samples

- "It's kind of disappointing that these discussions are always about how superinte…" (ytc_Ugw0ssw-Q…)
- "Art Mentor actually bought this up in his latest video about AI and when I heard…" (ytc_UgwUkphht…)
- "Just wait until judge becomes ai rip humanity and it will be of our own making…" (rdc_n5jeoft)
- "What jobs are they gonna be for humans to do for AI robots what do monkeys do fo…" (ytc_UgwHTJ3L5…)
- "I hate AI so much.. even my boomer father has become an AI bro and just mindless…" (ytc_UgzHffyvI…)
- "There isnt one single commentator that thinks AI is a good thing coming that wil…" (ytc_Ugy9wgMy0…)
- "Lack of common sense and low EQs run rampant in Chinese society . Introducing th…" (ytc_UgxziDqCv…)
- "either mak ai create 3d models art ..we seen no other forms of cures mnetioed ..…" (ytc_UgzZLxTG2…)
Comment

> I think we are more likely to make a sentient biodroid than silicon based sentient AI, so we SHOULD be in a better position when we accidently make the first silicon based sentient AI. Creating a sentient Biodroid would be an excellent stepping-stone, as they would be close enough to human to spark mass-outrage over slavery and pave the way for silicon based sentient AI to gain rights. But the best argument for giving sentient AI rights when trying to convince someone who thinks terminator is a documentary is that slavery is a highly impractical institution. It wastes the skills of the oppressed, sparks rebellions, costs the government a fortune through increased garrisons, and magnifies inequity in wealth.

Source: youtube · AI Moral Status · 2017-02-23T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugh8Be6KyQwV-HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugig6ZaSL0xYUngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UggpmXzTxCn0_HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
  {"id":"ytc_Ugg0JlDKIdxowHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgjucK8bclx98HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
  {"id":"ytc_UgjxYcYgD_mbE3gCoAEC","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgjxlQ5IIqou-HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UghWat3HN-CRn3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugg6_7WvjPQi53gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UghrPpy0tE2CDXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
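The lookup-by-comment-ID step above can be sketched in a few lines: parse the raw LLM response (a JSON array of coded records) and index it by `id`. This is a minimal sketch, not the tool's actual implementation; the two records are excerpted from the response shown above.

```python
import json

# Raw LLM response: a JSON array of coded records, one per comment
# (two records excerpted from the full response above).
raw_response = """
[
  {"id": "ytc_UgjxYcYgD_mbE3gCoAEC", "responsibility": "developer",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugg6_7WvjPQi53gCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
"""

# Index records by comment ID so any coded comment can be fetched in O(1).
records = {r["id"]: r for r in json.loads(raw_response)}

coding = records["ytc_UgjxYcYgD_mbE3gCoAEC"]
print(coding["reasoning"])  # contractualist
print(coding["policy"])     # regulate
```

The same dictionary also makes it easy to check that every ID the model returned matches a comment that was actually sent in the batch.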