Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews; open a sample to inspect the full comment):

- ytc_UgwEmmUaD…: "You missed the first controversy: AI art is an oxymoronic concept that doesn't e…"
- ytc_UgxnwsohV…: "Elon Musk, doesn't believe in God,Bible, he says, humans will become extinct, (h…"
- ytc_UgwiALUr1…: "This guy looks at digging ditches so simplistically. There are many complexities…"
- ytr_Ugx7VmxJf…: "this is what happened to those "evil luddites" ai bros keep talking about, they …"
- rdc_gtd4kt2: "I remember calling my friends on our landline and I'm pretty sure that I live in…"
- ytc_Ugy3Pbu34…: "It will just have to be made illegal to use AI to be used for certain things...s…"
- ytc_UgxwS9Icn…: "It's not even a robot. The person been edited to look like a robot like all thes…"
- ytr_UgyTtnbsD…: "This is flat out wrong. I mean, a hallucination IS coming up with something new.…"
Comment
What do you think about jobs that require human interaction and maybe more compassionate or caring modalities where the accuracy is important, but the more important thing is emotional? I teach dropout recovery students and often they have discipline issues. Making eye contact and a person's energy can be important. What do you think about social workers, teachers, police officers? My instinct is very similar-- to say AI would not be as good at those social emotional tasks as humans, but I am not naive enough to think that organizations and companies won't hire them anyway if they're cheaper.
Platform: youtube
Category: AI Governance
Date: 2025-09-08T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
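The dimensions in the table can be checked against the label sets that appear in the raw responses. A minimal validation sketch, assuming the allowed values per dimension are the ones visible on this page (the real codebook may define more categories):

```python
# Allowed values per dimension, inferred from the codings shown on this
# page; the actual codebook may include additional categories.
CODEBOOK = {
    "responsibility": {"company", "developer", "user", "ai_itself", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear", "liability", "industry_self", "ban", "regulate"},
    "emotion": {"outrage", "fear", "indifference", "resignation"},
}

def invalid_fields(coding: dict) -> list[str]:
    """Return the dimensions whose value falls outside the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if coding.get(dim) not in allowed]

# One record from the raw response below, reproduced for illustration.
sample = {"id": "ytc_UgwndxoSQxoIQWv_OId4AaABAg",
          "responsibility": "company", "reasoning": "virtue",
          "policy": "none", "emotion": "outrage"}
print(invalid_fields(sample))  # → []
```

A record that fails this check (e.g. a model hallucinating an off-codebook emotion) can be flagged for re-coding rather than silently stored.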
Raw LLM Response
```json
[
{"id":"ytc_UgwndxoSQxoIQWv_OId4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxPtCNZJk_QLEp0kml4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyhluISH7ccqxIVWIV4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzHWV57lqkLnbxtuWZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzwfS4ECucHfa4StBR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgywIpBhUwSiakX5-yx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyFmsdr0K1EF-BVDLN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyUvUv1FAgPt_QxExF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyCH8Z3inYBBm4Sdsd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgyVFwG2fUTVpYITA754AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
```
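The "look up by comment ID" step above amounts to indexing this JSON array by its `id` field. A minimal sketch, assuming the raw model output is a JSON array shaped like the one shown (only two records reproduced here for brevity):

```python
import json

# Raw model output: a JSON array of per-comment codings, as in the
# "Raw LLM Response" panel above (abridged to two records).
raw_response = '''
[
  {"id": "ytc_UgwndxoSQxoIQWv_OId4AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyhluISH7ccqxIVWIV4AaABAg", "responsibility": "unclear",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]
'''

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

record = codings["ytc_UgyhluISH7ccqxIVWIV4AaABAg"]
print(record["emotion"])  # → indifference
```

Duplicate IDs would silently overwrite earlier entries in this dict; a production loader would want to detect and report them.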