Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> Many AI vs humans scenarios have been played out in science fiction novels and movies for decades. Some of it was harmless and entertaining but most of it had consequences that didn't bode well for humanity. Therefore, who shall we blame if the AI take-over actually happens? But by then it'll be too late to object, my little lambs, as you'll already be well on your way to slaughter.

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2021-12-15T23:0… |
| Likes | ♥ 1 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugzb2OyhQ9FQdX_dlQN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzI_M1FPfJJbx1IjFB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwQGjgxuyvl9BcQEul4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxOmCH-jSfIr53ZgAN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwWVwVkUmoXuhhSFw54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxfLKA9bGYyRt_iBGZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyGFnxOS-ZLLxBtdPt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwO1KZ46YN57BkC_5d4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyUytHI0WTGI-nMVaV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz57-0JyhgG3N6M-Ll4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
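The response above is a JSON array with one object per comment, each carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch can be parsed and indexed by comment ID for lookup — the variable names are illustrative, not the dashboard's actual implementation, and only two entries from the response are reproduced here:

```python
import json

# Two entries copied from the raw LLM response above; a real batch
# would contain the full array returned by the model.
raw_response = """
[
  {"id": "ytc_UgyUytHI0WTGI-nMVaV4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugz57-0JyhgG3N6M-Ll4AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "none", "emotion": "outrage"}
]
"""

# Index the batch by comment ID so any coded comment can be
# looked up in O(1), mirroring the "look up by comment ID" view.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgyUytHI0WTGI-nMVaV4AaABAg"]
print(code["emotion"])        # resignation
print(code["responsibility"]) # distributed
```

The dictionary keyed on `id` is what makes the per-comment detail view cheap to render: the coding-result table for the selected comment is just one lookup, not a scan of the whole response.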