Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `rdc_cqiprrm`: "If an autonomous robot kills someone, the owner/builder/programmer would be culp…"
- `ytc_Ugx_OGsc_…`: 'TIME: "AI Leaders Debate" / Me: * looks at the debate seeing only Demis as AI Lea…'
- `rdc_mykw771`: "I follow an insta account who finds feeds of people who believe that they can tr…"
- `rdc_jkhnbpx`: "My AI doomsday scenario has always been that it would be co-opted by the oligarc…"
- `ytc_UgzJSjDkR…`: "I’m looking forward to the AI take over. Humans clearly aren’t able to look afte…"
- `ytc_UgyXc_PI_…`: "There's no way these Waymo cars will be able to avoid getting rear-ended as I've…"
- `ytc_Ugw5kypKl…`: "Don’t blame ‘AI’ (transformer models) for layoffs. This ‘AI’ term is misconstrue…"
- `rdc_mdkqjsj`: "I'm not even remotely suggesting that LLMs are \"sentient\" but them being glorifi…"
Comment

> Nobody stopped and thought yes we can but should we? Or when is enough enough. You know that common sense stuff we get as humans ... I'd hope anyway ..but look at the state of the world now. It's 28/11/24 now and ww3 and AI and Diddy and the rest 😢

Source: youtube
Topic: AI Governance
Posted: 2024-11-29T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwNGLfU0vPXSXTntZV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzO_VqkLEHuvarKT5h4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyElk6Ys3Rd9pQ8v814AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxJzylPbq0LdPlz9_h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzgfDVGvrkCJ0RdSkp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwJe8nIZVBZ0ElWQeF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzBU4iAprhfgSHr7XR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_Ugy_I1ahsih6e-3cQU94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwg_n2Y6ZUuei3hnG54AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwAy2WtKDRrL-vKsRN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
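The raw LLM response is a JSON array of coding records keyed by comment ID, so the "look up by comment ID" step reduces to parsing and indexing that array. A minimal sketch, assuming the response is available as a string (abridged here to two of the records above); `lookup_coding` is an illustrative helper, not part of the tool:

```python
import json

# Abridged raw LLM response: a JSON array of coding records.
raw_response = """[
  {"id": "ytc_UgzO_VqkLEHuvarKT5h4AaABAg", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyElk6Ys3Rd9pQ8v814AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding record for one comment ID, or None if absent."""
    records = json.loads(raw)
    index = {rec["id"]: rec for rec in records}  # build an ID -> record map
    return index.get(comment_id)

record = lookup_coding(raw_response, "ytc_UgzO_VqkLEHuvarKT5h4AaABAg")
print(record["responsibility"])  # distributed
```

Indexing into a dict first keeps repeated lookups O(1), which matters when checking many sampled comments against one batch response.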