Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Please cite examples of your scarecrow arguments. In the 1980's I remember progr…
ytc_Ugwq4PY5O…
Bro I would rather someone show me bad art than AI because at least it was made …
ytc_UgxplO5Dn…
If there's a robot trying to destroy Humanity follow these two easy steps
1: ge…
ytc_UgyhmOzgl…
I’ve heard people use text AI simply to chat away when bored and generate storie…
ytc_UgyLJx3wc…
The "Level 3" title is a bit of a marketing trick. While Mercedes takes liabilit…
ytc_UgyxB3riy…
Neil should NOT be talking about AGI as if he knows what it will effect. He is s…
ytc_UgwpRrdZs…
I just wish companies would hurry up and make AI and android's as life like as p…
ytc_UgzL30ZRn…
Bs thumbnails ,not just this one,promising the end of the world ,AI won’t put fo…
ytc_Ugz8zlapu…
Comment
So does the allignment part even play a role if it becomes truly conscious? I mean, of course it does, but allignment is an issue among ourselves but we have emotions which keep us on the path of best interest. If a true ai were to feel emotions would it even be able to wipe us out? Like emotionally?
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Timestamp | 2023-08-20T19:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugw_EY3BX_w-A0hGhmR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxZuS0UySeZfg16Z8l4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyJpa6oojedrk4HZmt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyk0KmeH8KhNv1_a7t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgwgWax7H9guiNcYU8R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxUPQQ-rCILthfjp4R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxR89SfqedmHumL5Xt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzGJ3_9cz3r754mgw54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzYAKyBazMyMr8oTup4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx52h-FMvitnonKVVR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
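The raw response above is a JSON array mapping each comment ID to the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how a downstream step might parse and validate such a batch before storing it; the allowed-value sets here are inferred from this one sample, not the project's actual codebook, and the function name is illustrative:

```python
import json

# Allowed codes per dimension -- illustrative sets inferred from this
# sample batch, not the full coding scheme.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"fear", "approval", "indifference", "mixed"},
}


def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index records by comment ID.

    Raises ValueError if a record is missing an ID or uses a code
    outside the allowed set, so malformed model output is caught
    before it reaches storage.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record without id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {value!r}")
        # Keep only the known dimensions, keyed by comment ID.
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded


sample = ('[{"id":"ytc_Ugw_EY3BX_w-A0hGhmR4AaABAg",'
          '"responsibility":"ai_itself","reasoning":"consequentialist",'
          '"policy":"none","emotion":"fear"}]')
coded = parse_batch(sample)
print(coded["ytc_Ugw_EY3BX_w-A0hGhmR4AaABAg"]["emotion"])  # fear
```

Validating against a fixed code list is what makes the "Coded at" record trustworthy: a model that drifts into free-text labels fails loudly instead of silently polluting the table.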