Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "U think that because it's programed that it can't be harmful, no , once the robo…" — ytc_Ugx3jdWZp…
- "I use ChatGPT to help me with my homework, and with the information it gives me,…" — ytc_UgyaTLprR…
- "He can't really one hundred percent prove that AI will kill us, but the trouble …" — ytc_UgzyMjkxA…
- "Personally, I worry about the impact AI will have on the job market. AI's relent…" — ytc_UgwdPrQuP…
- "It's a whole lot of things, it's outsourcing, greed, AI. There are trade jobs ou…" — ytc_Ugx8lJ6FJ…
- "right now what we have is LLMs calculating strings of text that sound convincing…" — ytr_Ugxf3PGO1…
- "Dude! the AI promptards community is hands down the most toxic and immature comm…" — ytc_Ugzt7WMP7…
- "I think this accident is good for self-driving techniques. The video shows that …" — ytc_UgyTLEzI6…
Comment
"We would still need some sense of control, that is, ensuring it is an AI that values kindness, otherwise we could easily end up with an AI that understands us, but doesn't at all care..."
youtube · AI Governance · 2023-05-10T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_UgzmDQQc44Lj58f4mCN4AaABAg.9pXRwQCL_PU9pYLH6OcpJ_","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzJw0IzJ4H1tkh-tIh4AaABAg.9pXP3OXoQ_A9pXWJwBKLmA","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgzJw0IzJ4H1tkh-tIh4AaABAg.9pXP3OXoQ_A9pXYm1vK9Wa","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgzJw0IzJ4H1tkh-tIh4AaABAg.9pXP3OXoQ_A9pZLJNlcoVo","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugx1jZ6EvdgN4_BEhK94AaABAg.9pXOTqsm72a9pXXkjq23yp","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytr_UgwmP897iusCok7sm894AaABAg.9pXOHBwQZfb9pYMmnEK_xQ","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgwnOErK98mBnSq_CG14AaABAg.9pXLe5SUN8b9pddnBn4TL3","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxPMS3WM6iSFpSDagR4AaABAg.9pXLHovscSd9pYPgxPRsQL","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytr_UgykUL4pD3StP0Vktdt4AaABAg.9pXKi_NTZXU9q-F4aDrIYw","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxA4t3d2TSnwA0ujiF4AaABAg.9pXBE49xQU_9pXKR3o4naP","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
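The raw response above is a JSON array of codings keyed by comment ID, with one value per dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and indexed to support the look-up-by-comment-ID view — the IDs and helper name here are hypothetical, not part of the tool:

```python
import json

# A raw batch response in the same shape as the model output above
# (hypothetical IDs, truncated to two items for brevity).
raw_response = """
[
  {"id": "ytr_aaa", "responsibility": "developer", "reasoning": "deontological",
   "policy": "regulate", "emotion": "approval"},
  {"id": "ytr_bbb", "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"}
]
"""

# The four coding dimensions plus the comment ID, as seen in the output above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse the JSON array and build a comment-ID -> coding lookup,
    skipping any item that is missing an expected key."""
    items = json.loads(raw)
    return {
        item["id"]: {k: item[k] for k in EXPECTED_KEYS - {"id"}}
        for item in items
        if EXPECTED_KEYS <= item.keys()
    }

codings = index_by_id(raw_response)
print(codings["ytr_aaa"]["policy"])  # regulate
```

Validating each item against the expected key set before indexing guards against the usual failure mode of batch LLM output: a malformed or partially coded entry silently corrupting the lookup.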