Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Naw, the robot was following its pre-assigned path. It's in an area enclosed wi…" (ytr_UgyB_4KKA…)
- "An actually talented artist tracing and using ai, reminds me of when I learned t…" (ytc_UgwaSInbQ…)
- "I think they should lean more in to the robot human hybrid look more especially …" (ytc_UgzyBe15S…)
- "@57:00 Post takes a dark (reality) turn what sent out chills down my spine havin…" (ytc_UgwdaJOG4…)
- "Hey guys, thanks to development of AI sooner or later we shall not need to learn…" (ytr_UgxlR2Et6…)
- "Andrew Yang tried to warn us but the majority wasn't ready to listen. Ai is in …" (ytc_Ugwikk-Uc…)
- "Nah. It won't happen. AI is not cost-efficient and will never be by its very nat…" (ytc_UgyOuZaIC…)
- "@Ramönz-k4j Agriculture technologies is another choice and robots will replace …" (ytr_Ugy0hMcXp…)
Comment
If an AI can understand a principle and not just a rule does it have strong AI.
The rule says 'Do not kill'
How would the AI interpret the principle behind the rule?
To quote my Dad (who was probably quoting someone else):
Rules exist for those who can't, or wont, understand a principle.
Source: youtube · 2020-03-28T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
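The four coding dimensions in the table above can be modeled as a small record type. This is a minimal sketch assuming only the value vocabularies visible in this dump (e.g. `ai_itself`, `deontological`); the tool's actual schema may define more values or fields.

```python
from dataclasses import dataclass

@dataclass
class CodingResult:
    """One coded comment along the four dimensions shown above."""
    responsibility: str  # e.g. "ai_itself", "developer", "none"
    reasoning: str       # e.g. "deontological", "consequentialist", "unclear"
    policy: str          # e.g. "ban", "none", "unclear"
    emotion: str         # e.g. "indifference", "fear", "approval", "mixed"

# The coding result displayed in the table above:
result = CodingResult(
    responsibility="ai_itself",
    reasoning="deontological",
    policy="unclear",
    emotion="indifference",
)
```

Typing the record this way keeps downstream analysis honest: a misspelled dimension name fails loudly instead of silently producing an empty column.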
Raw LLM Response
[
{"id":"ytc_Ugxdgp9FU4TObjclqMl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw7dK-BCn0HCX2yctZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyshsUYdIf2hL_RP7p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzVDjrRXsT01CNPde54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzwMkl6LSE1vTc1EAh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxM-zgWvpTQA31UGXd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxVUrg3Wq-ed_T8_Rl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyZ2VlPtl3xqn1PE_N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwflsHVXvX0VIj03lB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzjeunJT5w-8yCanQ94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"}
]
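The raw response above is a JSON array of per-comment records keyed by `id`. A lookup-by-ID like the one the page offers can be sketched as follows; the field names are taken from the dump, but the parsing code itself is an illustrative assumption, not the tool's implementation.

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = '''[
  {"id":"ytc_UgxM-zgWvpTQA31UGXd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxVUrg3Wq-ed_T8_Rl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]'''

# Index the batch by comment ID for constant-time lookup.
by_id = {row["id"]: row for row in json.loads(raw_response)}

record = by_id["ytc_UgxM-zgWvpTQA31UGXd4AaABAg"]
print(record["reasoning"])  # deontological
```

Because the model returns one array per batch, building this index once per response is cheaper than scanning the array for every lookup.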