Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coding by its comment ID, or pick one of the random samples below to inspect.
- "I'm not an artist in the traditional sense, I build images from photography (col…" (`ytc_UgzFOsSZv…`)
- "Elon Musk supports the idea of a universal basic income (UBI), arguing it will b…" (`ytc_Ugx0M3q2E…`)
- "We don't know enough about AGI to say how close or far we are from it. That's pa…" (`ytr_Ugz6Pt_A9…`)
- "Large language models are just a reflection of the common narrative. Chat GPT is…" (`ytc_UgzNPd2YV…`)
- "Well, a lot of the big labs aren't just straight up for-profit companies, that's…" (`ytr_Ugwaf8pzY…`)
- "Do you know what is to be super intelligent without empathy ? If a robot is cons…" (`ytc_Ugi6kqBma…`)
- "I like hoe they’re making all the robots women…I suppose it’s the into way those…" (`ytc_UgwukQyP7…`)
- "why is a question regarding social inequalities and human striving in a world of…" (`ytc_UgyYeZCe2…`)
Comment
Hmm.. 🤔 makes ya wonder how far ahead the military is on this stuff? They’re usually about 15 to 20 years ahead of civilians see. They’re probably already a.i. robots that think like humans and have advanced computers connected to brain tissues. And I bet they blend right in with us already. We’re screwed. 🐉
| Source | Topic | Posted |
|---|---|---|
| youtube | AI Governance | 2025-09-25T09:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyeVqR8euCstj73d8R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxEXtU6Afktx5FIlL94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyBQ4JXvn8nZPW_1Uh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwR6_koRWF4pHhGO294AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzLVJ30CMYeCONSad14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwbh2h-70XRPQQLrHR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgywOQNT6fwhAmvZlEx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"disapproval"},
  {"id":"ytc_UgzI8G12uL_DdiXXH2J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgweXdT2YJ_k5uOctdp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz3SOtOxG3kzYEZJPZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
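A raw response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal, hypothetical example: the allowed category values are inferred only from the sample output shown here, not from the project's actual codebook, and the `raw_response` string is a single-record excerpt of the array above.

```python
import json

# A one-record excerpt of the raw model output shown above.
raw_response = '''[
  {"id": "ytc_UgzLVJ30CMYeCONSad14AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]'''

# Category sets inferred from the sample response above; the real
# codebook may define additional values.
DIMENSIONS = {
    "responsibility": {"none", "developer", "government", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "ban", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "outrage", "mixed", "fear", "disapproval"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID,
    rejecting any record with a value outside the known category sets."""
    by_id = {}
    for record in json.loads(raw):
        for dim, allowed in DIMENSIONS.items():
            if record.get(dim) not in allowed:
                raise ValueError(f"{record['id']}: unexpected {dim}={record.get(dim)!r}")
        by_id[record["id"]] = record
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_UgzLVJ30CMYeCONSad14AaABAg"]["policy"])  # regulate
```

Validating against an explicit category set catches malformed or hallucinated labels in the model output before they reach the coding table.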