Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples
- `rdc_o85fo3h`: "Canceled mine as soon as I saw they'd taken the DoD's offer. And I'm (I'as) a Op…"
- `ytc_UgyunZszE…`: "What they are not telling you is this was a deliberate experiment in a controlle…"
- `ytc_Ugwc1feHv…`: "I'm waiting for the day where AI teaches us the secrets to eating gas station fo…"
- `ytr_Ugx7lROa5…`: "That's a thought-provoking concern! The interplay between AI development and hum…"
- `ytr_UgymxwgTW…`: "Haha, it seems like you're referencing the Terminator movies! Don't worry, on ou…"
- `ytc_UgwrfU78o…`: "They wont because companies WONT own the code. Openai ext will own the code so i…"
- `ytr_Ugyp_Hd1F…`: "Was hoping id find a comment like this. Like damn I love shads historical vids a…"
- `ytc_UgzjeRmR9…`: "No. But then the robot would never run if it could manage from the background, …"
Comment (youtube · AI Moral Status · 2021-12-21T03:1…)

> Before creating AI modeled on humankind let's create a humankind that is less modeled after greedy monkeys. That's all we are and it's nothing to be proud of or propagate. The robotics of today is a merely the magnification of human frailty.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
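Each coded record carries the four dimensions shown above. A minimal validation sketch follows; the allowed sets are only the values observed in this session's output (the real codebook may include more categories), and `validate` is a hypothetical helper, not part of the tool.

```python
# Allowed values per coding dimension, inferred from the codes observed
# in this page's raw responses; the actual codebook may be larger.
OBSERVED_CODES = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"virtue", "deontological", "consequentialist", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference",
                "approval", "unclear"},
}

def validate(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the observed set."""
    return [dim for dim, allowed in OBSERVED_CODES.items()
            if record.get(dim) not in allowed]

record = {"responsibility": "developer", "reasoning": "virtue",
          "policy": "none", "emotion": "resignation"}
print(validate(record))  # → []
```

A record missing a dimension (or carrying an unseen code) is flagged by dimension name, which makes batch QA of model output straightforward.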
Raw LLM Response
```json
[
  {"id": "ytc_UgyRmed2IJjcgo1gLyx4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyM5FkLtZDx9FP8u154AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxTUwP6xf41wMTMPSV4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzLOdrYGM-FHSxSFa54AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyIagn4jv4L4tBa3b54AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugxum6fBNzdxe5kAL9x4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzHr2Epx4lzFcKpS8V4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugy4R-sHSGzGuuWpf3R4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugzu3owYvKtczufx6LV4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy7p1hF1678XQXGe1V4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
```