Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing the random samples below.
- "You bring up a crucial point — clear, detailed reports are essential for effecti…" (ytr_UgyPqFl7p…)
- "If it's not possible to find the source of the art used for training AI, then do…" (ytc_UgxDlAZ0l…)
- "As an artist who is not a professional, but who has found pleasure in creating a…" (ytc_UgxLs7rNM…)
- "We can spend our time enjoying the fruits of the AI. We can all do everything on…" (ytc_Ugz5sO8zI…)
- "Well here we go self-driving 18-wheelers. The identity is do they know what huma…" (ytc_Ugw36xDRU…)
- "AI does not create. By assigning AI to do a job, you are basically saying you wa…" (ytc_Ugx1Y-7Hw…)
- "People who use ai imagine that no one who has ever made art they were unhappy wi…" (ytc_UgwDr3iwX…)
- "I think you are downplaying what's really going on. "The same is true for dozens…" (ytc_UgxXFJeoZ…)
Comment — youtube · AI Moral Status · 2025-04-27T10:2… · ♥ 6

> Conspiracy theory: Corporations want us to be afraid of AI because once AGI is created humans will become more efficient and need less corporations because AI will allow us to do lots of things ourselves that corporations previously did. Like creating our own customized TV shows or music.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyVfyBXivp_9In7R_R4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxTmpgAJvviQzTWkEh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwAvhYLl58_6_j_tgZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwSWH9eMEXUiOAYGyZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxjv77kadE4spSnrUN4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxzM-udII6vxPT-2Sh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx_FM5npWBcdSLzgBt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyW_vlv-Pq9J1UtdjB4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwBUyJuw6xVBuxfu0p4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx2jtYJkgYYYhG-k_54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
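The lookup-by-ID workflow above can be sketched in a few lines: parse the raw batch response (a JSON array with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields, as shown) and index it by comment ID. This is a minimal illustration, not the tool's actual implementation; the two entries in `raw_response` are copied from the batch above.

```python
import json

# Excerpt of a raw LLM batch response; each entry codes one comment on
# four dimensions: responsibility, reasoning, policy, emotion.
raw_response = """[
  {"id":"ytc_UgyW_vlv-Pq9J1UtdjB4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxzM-udII6vxPT-2Sh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]"""

# Index codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgyW_vlv-Pq9J1UtdjB4AaABAg"]
print(coding["responsibility"], coding["policy"])  # → company regulate
```

Indexing once into a dict keeps repeated inspections cheap, and a missing ID surfaces immediately as a `KeyError` rather than a silent empty result.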