Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> If an AI creates a more powerful AI? why would it do so and wouldn't we have told it to do so? and even if we tell AI to develop a more powerful AI, wouldn't we create one with a purpose? why would its purpose be its own survival? Without its purpose being its own survival, why would it develop as evolutionary biology would require for survival? Wouldn't it's survival depend on its efficiency to satisfy a condition given by us? Doesn't this subject revolve around purpose? and wouldn't it be interesting to assume COnciousness based on what purpose something has programmed into itself? are robots capable of doing anything without a clear purpose ( as I would say many humans do )?
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2017-02-23T16:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgiveMjZemHGGHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UghOnqpItWsoN3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UghxIkKCF0da9ngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggQC_X6GCXb-XgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgiZTomR8t9t8XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghPnX8p8kXgNngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgjkdfxV0TC693gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgimBcFcL1grSHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjwqWnr_kYH83gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UghMs1kjBq3vf3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
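The per-comment coding shown above is recovered from this batch response by parsing the JSON array and indexing records by comment ID. A minimal Python sketch of that lookup, assuming each record must carry exactly the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`); the function name `index_codings` is illustrative, not part of the tool:

```python
import json

# Excerpt of a raw batch response, as shown above.
raw = """[
{"id":"ytc_UgjkdfxV0TC693gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UghOnqpItWsoN3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]"""

DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse a batch response and index the records by comment ID,
    checking that every record carries all four coding dimensions."""
    by_id = {}
    for rec in json.loads(raw_json):
        missing = DIMENSIONS - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id')}: missing dimensions {missing}")
        by_id[rec["id"]] = {k: rec[k] for k in DIMENSIONS}
    return by_id

codings = index_codings(raw)
print(codings["ytc_UgjkdfxV0TC693gCoAEC"]["policy"])  # liability
```

Looking up `ytc_UgjkdfxV0TC693gCoAEC` reproduces the Coding Result table for the selected comment (developer / consequentialist / liability / mixed).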