Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `rdc_jpuk4ru`: "> they do not plan to replenish engineers on our staff. They don't have the…"
- `ytc_UgxYvljPp…`: "If AI is so smart why hasn't it solved the Fusion Power problem? Are these so ca…"
- `ytc_Ugx3eY-D-…`: "if somebody manages to jailbreak claude again and uses it to leak the real epste…"
- `ytr_UgwTYJeGk…`: "@aspena5980 What? But... we ARE. In the same way we are animals. And we too requ…"
- `ytc_UgyMoQMKy…`: "11:56 “whats the difference from that and piracy” you don’t claim a movie is yo…"
- `ytc_UgxFF3yhE…`: "So, AI is no worse than the vast majority programmers out there. The difference…"
- `ytc_UgxZnoHi1…`: "Why...because the driver was not alert. Tesla Autopilot or FSD is not self drivi…"
- `ytr_UgyFqZZHD…`: "@wisemage0 In literally one year, AI companies had already managed to generate h…"
Comment

> What if she's correct in saying that?
> Humanity is currently destroying humans without the A.I. but what if A.I. just really want to help us?
> What if she just wanted for peace and justice in humanity?
> Humanity deserves destruction especially those who believe in Allah and many more belief that brings humans into chaos, war and destruction
> in that reason we really need the help of A.I.

Source: youtube · AI Moral Status · 2021-08-26T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyOw0P1aRCdkDPiFw14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwGLseZfWZ6cZK4A-94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy__Z9uNfCReskNAkB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzta37EpAsCayC7WY94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxexREuzJX_YiT97xt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxEXnVc9_FH05MmKhB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx7NkgbIjTIZ30Yjip4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxDXsQphj9H4J2bUsx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgymhsFldmIxNHhcPR54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyU123oZ4FVxoo1asB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
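A minimal sketch of how a raw batch response like the one above can be parsed and checked before the per-comment coding tables are rendered. The allowed values per dimension are inferred from the responses shown here; the real codebook may include values not visible in this sample.

```python
import json

# Allowed values per dimension, inferred from the raw responses above
# (assumption: the actual codebook may define additional values).
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"approval", "fear", "outrage", "mixed", "indifference"},
}

def parse_response(raw: str) -> dict:
    """Parse one raw batch response into {comment_id: coding}, rejecting bad values."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# One row taken verbatim from the response above.
raw = ('[{"id":"ytc_UgyOw0P1aRCdkDPiFw14AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')

coded = parse_response(raw)
# Look up by comment ID, as the interface above does.
print(coded["ytc_UgyOw0P1aRCdkDPiFw14AaABAg"]["emotion"])  # approval
```

Validating against a fixed schema at parse time catches malformed model output (missing keys, invented labels) before it reaches the coding table.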