Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
so here is the thing, I work in Radiology and I am certified to give radiation. Not even the nurses can push that button and depending on the state doctors who push that button have to have certifications to do so. Radiologist has already kicked AI to the curb when it comes to the program reading lung nodules. And just two months ago I witnessed AI misdiagnose a head bleed. That could have sent someone to brain surgery. You can make all the predications you want but the reality is....... Radiologist were supposed to be obsolete all ready. So with that said, right now I do not see how this makes healthcare safer. It will only become massive expensive paperweights as Moxie has already proven. But then again when those billionaires do not use Biblical principles the human life becomes devalued.
Platform: youtube · Topic: AI Governance · Posted: 2026-04-07T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwAxu-wKxr_S_6cBxR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzxVp_NX6OTtCHwCe54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzNQg2-2yNcl9EZ3ph4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxKe9ZLv_CPMl2_3654AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzmoNDDvryZ399CrJ54AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxc-Sc2YN0YP68uAJ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzfHavtnSqwRtI2Hz94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugxs0rvltAxuYgIgx6J4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyXbPKjgOPLm_7Av4t4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwmWJ-xtMUpjX0jLVt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
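The raw response is a JSON array of per-comment codings keyed by `id`, with one value per dimension. A minimal sketch of parsing such a response and looking up one comment's coding, assuming the dimension values visible in the sample above are the allowed categories (the real codebook may define more):

```python
import json

# Allowed values per coding dimension, inferred from the sample response above.
# Assumption: the actual codebook may include additional categories.
CODEBOOK = {
    "responsibility": {"none", "developer", "company", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}.

    Rows without an id are dropped; dimension values outside the
    codebook are omitted so downstream counts stay clean.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        coded[cid] = {
            dim: row[dim]
            for dim in CODEBOOK
            if row.get(dim) in CODEBOOK[dim]
        }
    return coded

# Usage with a single-row response (id taken from the sample above):
raw = ('[{"id":"ytc_UgzfHavtnSqwRtI2Hz94AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"fear"}]')
coding = parse_raw_response(raw)["ytc_UgzfHavtnSqwRtI2Hz94AaABAg"]
print(coding["policy"])  # liability
```

Filtering against the codebook at parse time is a deliberate choice: an LLM occasionally emits an off-schema label, and silently storing it would skew any tally over the dimensions.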