Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a coding directly by comment ID.
Random samples
- ytr_UgxW_1UCU… — "Well the use off that ai might just cost you 7% off your companies yearly profit…"
- ytc_Ugy57QOxd… — "AI data centers need to pay all costs for electricity infrastructure and they mu…"
- ytc_UgyFoOd7g… — "there is no way that a for-profit system uses AI to benefit anyone but bosses, r…"
- ytc_UgysAy4us… — "Thank you so much for addressing this. As an artist this is the scariest thing t…"
- ytc_Ugx0Kmm6Z… — "\"AI will kill us all...now watch this vid explaining how, that we made with AI\"…"
- ytc_UgwUrOVrB… — "i am absolutely horrified with AI being rolled out when NOBODY asked for this!!!…"
- ytc_UgyIy_6kr… — "Why does he remind me of Lt Dan in Forest Gump? Nice friendly smile. 😅 Anyway, I…"
- ytc_UgzUDE-3k… — "Many if not all human problems can be solved through cultivating human excellenc…"
Comment
> if you are an all powerful being and there is only one other being that can stop you what do you do? from an economic standpoint the being is useless and can't give you anything you don't already have. from a defensive standpoint no other being can destroy you except for said being. from a logical standpoint said being is finicky and easy to anger. so would you not remove this problem? if there one other being that can defeat you wouldn't you destroy this being? take out feelings and use raw pros and cons why keep this being around that can destroy you? that exactly how an AI would see humans as nothing more than a complicated variable that needs to be removed from the equation.
Source: youtube | Video: AI Moral Status | Posted: 2017-02-25T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugh_V3vu2DuvengCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgietbweVEt0NHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UghyeUksCRmVYHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugg1cdAYAiAmpHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UghvGb_0icgToXgCoAEC","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgjgBssLGskAt3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugj2OLPFihnkBXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgjHqU5fojdo-3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UggDe-aW7XmtPXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggCn9WTTgjXRngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
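A batch response in this shape can be queried programmatically to recover the coding for a single comment. The sketch below is a minimal, hypothetical helper (the function name `lookup_coding` and the inlined two-record sample are illustrative, not part of the tool itself):

```python
import json

# Illustrative excerpt of a raw batch response: a JSON array with one
# object per comment, each carrying the ID and four coded dimensions.
raw_response = """
[
 {"id":"ytc_UgjHqU5fojdo-3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UggCn9WTTgjXRngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

def lookup_coding(batch_json: str, comment_id: str):
    """Return the coded record for one comment ID, or None if absent."""
    for record in json.loads(batch_json):
        if record.get("id") == comment_id:
            return record
    return None

coding = lookup_coding(raw_response, "ytc_UgjHqU5fojdo-3gCoAEC")
print(coding["emotion"])  # fear
```

Validating each record against the allowed category values (e.g. that `emotion` is one of the codebook's labels) before storing it would catch malformed model output early.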