Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgwJKmYcc… — "I don't think big artists are affected, they remain popular and earn big money."
- ytc_Ugxpe4f53… — "AI models only respond to prompts, its not like an AI is initiating conversation…"
- ytr_UgwkJs86V… — "That's awesome! It's great to meet another Sophia. Did you know that our name me…"
- ytc_UgxexCxEB… — "So here's a Asian person telling the US that we need to breakup and destroy our …"
- ytc_Ugzzeg8sl… — "AI will bring about the emergence of superintelligence. Therefore, it won't take…"
- rdc_n7vv7ze — "I never really bought into the idea that SWE is going to be replaced by AI, but …"
- ytr_Ugww7De4d… — "Thank for 👍 your love and support you have for me and my team ❤️,i really apprec…"
- ytc_Ugyo_wXsz… — "Would it be better if LLM were required to cite all referenced material (apa or …"
Comment
MrSplodgeySplodge (youtube, "AI Moral Status", 2017-02-23T21:2…):
"Because tools aren't supposed to think and feel, and if they do, then somebody made a mistake with immense consequences. If ai is programmed with pain and other feelings, their sense of self preservation may cause them to view humans as a threat. We'll always consider them our creation that is for carrying out our will, and if their view ever differs from that, it could be disastrous."
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgiNKzzbbtFetXgCoAEC.8PL4_BkYKhH8PL7-06PJyx","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgiNKzzbbtFetXgCoAEC.8PL4_BkYKhH8PLC4xIffA-","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ughcmty2iMMsFHgCoAEC.8PL3q3krKEk8PL7m7PJYg9","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UggRPiq5dwY9P3gCoAEC.8PL2mhsalPI8PLABX7lpAr","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgiOaXexrKY_SXgCoAEC.8PL12S-v8fO8PL5FFgDJRd","responsibility":"ai_itself","reasoning":"contractualist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugjrf2Y85YKyVngCoAEC.8PL-Q9tvEeN8PL4sqa1Ty2","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugjrf2Y85YKyVngCoAEC.8PL-Q9tvEeN8PL7cIsRhFh","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugjrf2Y85YKyVngCoAEC.8PL-Q9tvEeN8PL92CZfAg3","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgjIvUEfE5r063gCoAEC.8PKz6lrvdh48PLB9sOe8Te","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UggWa3AcKd7cHXgCoAEC.8PKyWZiGd_R8PL60F186iM","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
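Because the model returns coded records as a JSON array, a downstream pipeline would typically parse the raw response and reject records whose values fall outside the codebook. Below is a minimal sketch of that step; the allowed labels are assumptions inferred only from the values visible on this page (the full codebook may define more), and `validate_batch` is a hypothetical helper, not part of any named tool.

```python
import json

# Assumed label sets per coding dimension, inferred from the values
# visible in this page; the real codebook may include more labels.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "indifference", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose every
    coded dimension uses a label from the assumed codebook."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items())
    ]

raw = '[{"id":"ytr_x","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"}]'
print(len(validate_batch(raw)))  # → 1
```

A rejected record (e.g. one with an unknown `responsibility` value) is simply dropped here; a real pipeline might instead queue it for re-coding.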