Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Thing is, by the fifth law of logic (if i recall right, the principle of sufficient reason), it's impossible for a being to create another being that's at a similar or higher level of intelligence (in an artificial way). It may be possible through genetic manipulation, but then the result wouldn't be a robotic being neither would have been an artificial creature, as it would've been "born" somehow and would be alive.
So even when Matrix, I Robot and Westworld present interesting concepts, they're ultimately futile. It's just science fiction, so I think we should just abandon the topic and focus on human rights, because it seems a lot of people lack their own. First fix our current world, then try to fix impossible imaginary future ones.
Platform: youtube · Video: AI Moral Status · Posted: 2017-02-23T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugi3l4d6_ZVSPngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UggXxUS6ImDcVngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UggiiBkWN73X1ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugg5MrhnXcA4ZHgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggRPiq5dwY9P3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugg2foK25E_ACHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjaRkRwKWzpoHgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugg8QyIAn2PW43gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj4vFwy4jRFsngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjrvLoOOdSbhngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
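The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response might be parsed and validated before storing the codes — the allowed value sets below are inferred from the values visible in this section and are assumptions, not the project's actual codebook:

```python
import json

# Allowed values per dimension, inferred from the codes shown above.
# The real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "distributed", "ai_itself", "unclear"},
    "reasoning": {"mixed", "unclear", "deontological", "contractualist", "consequentialist"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"outrage", "approval", "fear", "indifference"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip records missing the comment ID
        # Keep the record only if every dimension has a recognized value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example with one valid and one malformed record (hypothetical IDs):
raw = (
    '[{"id":"ytc_x","responsibility":"none","reasoning":"mixed",'
    '"policy":"none","emotion":"fear"},'
    '{"id":"ytc_y","responsibility":"alien"}]'
)
coded = parse_coding_response(raw)
# Only the first record survives validation.
```

Validating before storage matters here because LLM output is not guaranteed to conform to the requested schema; a record with a hallucinated category would otherwise silently pollute the coded dataset.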