Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
the best videos on this topic are by Daniel Schmachtenberger, John Vervaeke and …
ytc_UgyJMOcfj…
If you want to mitigate the risk, you have to limit the power of chip capable of…
ytc_UgwZpM4t3…
Are the videos with people horrified AI data centers having horrible effects on …
ytc_Ugz1XEQEx…
I work for an AI tech company in China. I would say right now more than half of …
ytc_UgxW6pfE5…
you telling a robot to make the whole image for you is completely different from…
ytc_UgwtMicH1…
Since 2000, I've been saying that human augmentation is inevitable to survive in…
ytc_UgwLxOYyr…
Yes, there is. Overseas generics manufacturing is a shitshow (sometimes literal…
rdc_grrq4mb
Welp the era of USA leading g the pack is over. Let's see how things turn out in…
rdc_e2wcyol
Comment
If AI is so great why doesn't someone ask it to design an affordable, practical fusion reactor? Or a warp drive? Or merely tell us what's wrong with our physics?
youtube
AI Governance
2023-07-07T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwzsvT0R8QaLUyNUIx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugxtb04KfZMnmnwKn4x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx5zeaDZkjk9MLbG_54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzOy-FBaa-ajhMXoMZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugykacu35_BkzEzD8hN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyEH-u-qtlab019hkB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxl48isDCChCc0PY594AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzKB-h2aVWk7JxR03x4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzR0GPh_1T1t2ExF0h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzX0cewoR8nHZWY4Td4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
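A minimal sketch of how a raw response like this could be parsed into per-comment codings and sanity-checked. The allowed value sets below are inferred from the values visible on this page, not an authoritative schema, and `parse_codings` is a hypothetical helper name:

```python
import json

# Allowed values per coding dimension — inferred from the samples shown
# on this page, not the tool's definitive schema (an assumption).
ALLOWED = {
    "responsibility": {"developer", "government", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"liability", "ban", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "approval"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into
    {comment_id: dimensions}, dropping records with unknown values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        dims = {k: v for k, v in rec.items() if k != "id"}
        if cid and all(v in ALLOWED.get(k, set()) for k, v in dims.items()):
            coded[cid] = dims
    return coded
```

Indexing by `id` this way is also what makes the "look up by comment ID" view above cheap: a single dictionary lookup per comment.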