Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "It's a complete crock of shit. We lawyers at r/lawyertalk and r/lawfirm discuss…" (`rdc_n5gov3y`)
- "I am not a licensed driver, so I have no opinion on how well the two Waymo vehic…" (`ytc_UgyU0FzJk…`)
- "As someone working with AI, I fully support this! The audacity of corporations l…" (`ytc_UgzXGcpb6…`)
- "How do you prepare though? If a fully sentient AI does emerge, and it decides hu…" (`ytr_UgyYQ-mtL…`)
- "The biggest danger of AI is that Government will regulate it, Dangerous? Nope, i…" (`ytc_Ugx0yZ1hU…`)
- "They're hoping that AI will become good enough to replace the higher tier engine…" (`rdc_n5gkrc2`)
- "Why they just not stop free use if thy are so afraid of the future. Free educati…" (`ytc_UgytiPVOB…`)
- "8:46 Popular media seems to be hung up on the "sentience" aspect of AI (aka, sci…" (`ytc_UgzpL-oSe…`)
Comment
You must be dumb to think if we dont develop AI, other countries specially china would stop. Imagine them become military advance 100 times the US. If there is a stoppage, it has to be universal with some teeth to control which is nearly impossible.
Source: youtube · AI Moral Status · 2025-01-20T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzxvkN3k_9tHGPLf6t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzFfHpGluBP0-Qy6FB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw5HtOdTB11jOdVZXR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzAHQ2HG07fMtbeHH54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy9hluCbeVJNB7MOlR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxfD97QdVTI13zmNl14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx5Ex3tk4PX0p3h58V4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwC9OB5SG86o8tC6NN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy9vX_gn4BnQDRotH54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyUxEjo3DKZa081yDd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
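The raw response is a JSON array with one object per comment, each carrying one value per coding dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and checked against controlled vocabularies; the vocabulary sets below are inferred from the values visible in this sample, and `validate_batch` is an illustrative helper, not the tool's actual code:

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# sample response above; the real coding scheme may define more values.
VOCAB = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and return {comment_id: codes}.

    Raises ValueError for records with a missing id or an
    out-of-vocabulary value, so bad codings fail loudly rather
    than silently entering the dataset.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing 'id': {rec}")
        for dim, allowed in VOCAB.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in VOCAB}
    return coded

raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(validate_batch(raw)["ytc_example"]["policy"])  # regulate
```

Keeping validation separate from the LLM call means a retry can be triggered on the whole batch (or just the offending IDs) whenever the model drifts outside the scheme.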