Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or browse the random samples below.
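For pulling up a single record offline, here is a minimal sketch, assuming the coded batch responses have been flattened into a JSON Lines file; the `coded_comments.jsonl` path and the one-object-per-line layout are assumptions, not part of the pipeline shown on this page:

```python
import json

def lookup_comment(comment_id: str, path: str = "coded_comments.jsonl") -> dict | None:
    """Return the coded record for one comment ID, or None if absent.

    Assumes each line of the file is one JSON object shaped like the
    batch entries shown below, e.g.
    {"id": "ytc_...", "responsibility": "...", "reasoning": "...",
     "policy": "...", "emotion": "..."}.
    """
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

print(lookup_comment("ytc_Ugg9Dqny3LoDQHgCoAEC"))
```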
Random samples (comment ID, then a preview of the comment text):

- ytr_Ugw7lKd1y…: "If you can't beat them join them" just lean AI and lean how to interact with ai…
- ytc_UgyRiYZFD…: I don't like how these videos are edited. Key words have been removed to mask th…
- ytr_Ugwe2lXKj…: You can deny people the use of your website fore a lot og reasons / Ther is a wh…
- ytc_UgxloEEpl…: the way it was just men too tells you a lot about why its mostly them making ai…
- ytc_Ugyb9tGnM…: Years ago it was tictactoe, then chess players / In 2016 (around then, I was too y…
- ytc_Ugyt8a96g…: Can I switch over to a career in AI at the age of 30 ?. Is it a good idea. plz s…
- ytc_UgyoLMOtF…: 4:06 that's such a deeply stupid presumption, though. Because it'd be a deeply s…
- ytc_Ugy_wn1Bq…: That's another thing I haven't even thought about until now. / Skill transfer! / Alm…
Comment
The comments section is disappointing. It reminds me of the old times, when people were making cases that animal cruelty laws shouldn't be implemented. Because they don't have souls, they were just fancy automatons that just REACTED like they experienced pain.
People, the distinction between our sentience, and what may arise from an A.I. is blurrier than you think. I would advise that some of you watch movies or read stories that involve these themes to get a better understanding. Blade Runner comes to mind, as one example.
Personally, I hope we never create anything that can achieve sentience. It's just too cruel.
youtube · AI Moral Status · 2017-03-01T23:3… · ♥ 54
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
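The table is just a per-dimension view of one record from the batch response below, plus a "Coded at" timestamp stamped when the record is written. A minimal sketch of that rendering, assuming a record dict with the field names visible in the JSON (the helper name and the dimension labels are illustrative, not the page's actual code):

```python
# Hypothetical helper: turn one coded record into the markdown table above.
LABELS = {
    "responsibility": "Responsibility",
    "reasoning": "Reasoning",
    "policy": "Policy",
    "emotion": "Emotion",
}

def render_coding_table(record: dict) -> str:
    """Render the four coding dimensions of one record as a markdown table."""
    rows = ["| Dimension | Value |", "|---|---|"]
    for key, label in LABELS.items():
        rows.append(f"| {label} | {record.get(key, 'unclear')} |")
    return "\n".join(rows)

print(render_coding_table({
    "id": "ytc_Ugg9Dqny3LoDQHgCoAEC",
    "responsibility": "none",
    "reasoning": "deontological",
    "policy": "regulate",
    "emotion": "outrage",
}))
```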
Raw LLM Response
```json
[
  {"id":"ytc_Ugg9Dqny3LoDQHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugjl892grkD1CHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjusG2XXNsQ8ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgjdXJQpASsKnXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UggFqHDoWRfrsXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UggRQk_shtKMS3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgiU0CbkUs7EXngCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ughae_Q7RxIYQHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Uggczad5RakHtngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgilhY784SZqgHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
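Each dimension takes values from a small closed set, so a batch response is easy to sanity-check before it is accepted. A minimal validation sketch follows; the allowed sets are inferred from the records visible on this page and may be incomplete relative to the real codebook:

```python
import json

# Allowed values per dimension, inferred from the records shown above;
# the actual codebook may include categories not visible here.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "regulate", "industry_self", "liability", "unclear"},
    "emotion": {"outrage", "indifference", "approval", "fear", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM batch response and flag out-of-codebook values."""
    problems = []
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append({"id": rec.get("id"), "dimension": dim, "value": value})
    return problems
```

An empty return value means every record in the batch used only known IDs and in-codebook labels; anything else points at the exact record and dimension that needs a recode.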