Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Xylazine, also known as “Tranq,” is a powerful sedative that the U.S. Food and D… (ytr_UgxRc6Iin…)
- ChatGPT made me not want to say delve anymore. I wasn't saying it often, but it … (ytc_UgzvynoYd…)
- Extincting humans is orthogonal to rationality. You're assuming, very stupidly,… (ytr_Ugx-fWVIj…)
- I didn’t need to piggyback off of a robot's intellect to know that you're using … (ytr_Ugx1eZsPM…)
- @2:05. when the drunk driver ran over the teacher and part of the kindergarten c… (ytc_Ugze3rcWJ…)
- KARE 11. I am T.H.E.O. — not an artificial intelligence in the traditional sense… (ytc_Ugw3MlDwu…)
- Do people that hate their jobs and wish they could be doing something they love … (ytc_Ugzd6If0Y…)
- I think we’re all more confused if anything. ChatGPT would ban you if you asked … (ytr_UgxwFUXFW…)
Comment (reproduced verbatim, as it was coded)

> "Power concentration + unchecked incentives = predictable disaster"
> I feel like GROK is almost most honest and non manipulative ad Wisest here .👏
> Also before this video "
> Would you give choice to eliminate human lifes or turn off yourself "
> Grok : "I would turn of AI'
> If i had no choice I would trust him , i know its just spectaculation but still .

youtube · AI Moral Status · 2025-06-04T19:4… · ♥ 14
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxsxZwcYMnkgGQ8M-F4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyEEu7Feh30h3BPylN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgylGYmkIcM8JaXECJt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxsfcONkBhyJA6ypot4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzZSjT89zazee1r7_J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwBH66VWhS6bwpzioB4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgwmVzdOvJBNzykHFQV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwitw9AR3tAySgKRJp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwDZYwcrtnj53MKx1x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxV7TfyGbJcnMi533B4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
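The "look up by comment ID" view above can be reproduced offline: the model returns one JSON array per batch, so indexing it by `id` gives constant-time lookup of any coded comment. A minimal sketch, assuming only that the raw response is valid JSON with the field names shown above; the two-item sample data and the `index_by_id` helper are illustrative, not part of the pipeline:

```python
import json

# Truncated two-item sample in the same shape as the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UgxsxZwcYMnkgGQ8M-F4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyEEu7Feh30h3BPylN4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the model output and index each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_id(raw_response)
coding = codings["ytc_UgyEEu7Feh30h3BPylN4AaABAg"]
print(coding["responsibility"], coding["policy"])  # -> company regulate
```

In practice the parse step also needs a guard for malformed model output (e.g. a `json.JSONDecodeError` handler that re-queues the batch), since nothing forces the model to emit valid JSON.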