Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by its comment ID or by picking one of the random samples below.
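For readers reproducing the lookup outside the interface, a minimal sketch in Python, assuming the raw responses are archived one JSON object per line in a hypothetical raw_responses.jsonl file (the file name and helper function are illustrative, not part of the actual pipeline):

```python
import json

def lookup_raw_response(comment_id: str, path: str = "raw_responses.jsonl") -> dict | None:
    """Return the stored coding record for one comment ID, or None if absent."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

# Example: the first ID from the raw response shown further down this page.
print(lookup_raw_response("ytc_UgzMq2ziu_2iNKV65hl4AaABAg"))
```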
Random samples (click to inspect):

- How many times this will be recycled? Models should be unbiased, not "*rig… (rdc_n59bnt8)
- And here we go again. Just because it's a new technology doesn't mean people hav… (ytc_UgxjFmnE2…)
- Let's be honest, ChatGPT does not talk like this by default. Zane must've given … (ytc_Ugy241szI…)
- You've given poor old ChatGPT a Hobson's choice, or perhaps double bind would be… (ytc_Ugwe2g1ie…)
- The problem isn't us the problem is you everyone has their own perspective ai is… (ytr_UgzfpoyJb…)
- “I guarantee that in ten minutes your daughter died of boredom” it hasn’t actual… (ytc_UgysQOXaA…)
- And what did hawking know about AI exactly? He was a cosmologist, not a computer… (rdc_jifbbp5)
- The big problem with self driving cars is that we need to get to the "no human m… (ytc_UgzIIDJXt…)
Comment
Humans may not ask permission from animals when making highways, but we also aren't powerful enough to do everything right. I don't think AI will have that limit. So, either they'll kill us with no remorse, do the right thing and help us ascend with them, or the more moral thing and not let humans interact with them, not helping humans, just isolating us from them, essentially.
youtube · AI Governance · 2025-06-27T04:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
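The coding result follows a fixed four-dimension schema. As a reference, a minimal sketch of that schema in Python, using only the category labels that appear in this sample (the full codebook may define more labels, and the type names are illustrative):

```python
from typing import Literal, TypedDict

# Labels observed in this sample; the real codebook may contain additional categories.
Responsibility = Literal["ai_itself", "developer", "government", "user"]
Reasoning = Literal["consequentialist", "deontological", "virtue", "mixed"]
Policy = Literal["none", "liability", "regulate", "ban"]
Emotion = Literal["fear", "outrage", "approval", "resignation", "indifference", "mixed"]

class CodingResult(TypedDict):
    id: str
    responsibility: Responsibility
    reasoning: Reasoning
    policy: Policy
    emotion: Emotion
```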
Raw LLM Response
```json
[
  {"id":"ytc_UgzMq2ziu_2iNKV65hl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxwXUycfAL08EIUqSN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugwi9YAd7uaF6nC9Z5t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgztowwEIBT87H7g_e54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxbedqhrI_GBLh-SuF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugx0I7niEbXoy8U2CDN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyCY6rRpwwxSDhioFh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzhK_q6DSh7mSBfOPx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzee0Buij42pw4iPb54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwze1hEHWm71pAYtgV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
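Each raw response is a single JSON array covering a batch of comments. A minimal sketch of how such a batch might be parsed and indexed by comment ID before storage (the helper name and validation rules are assumptions, not the pipeline's actual code):

```python
import json

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch(raw: str) -> dict[str, dict]:
    """Parse one raw LLM batch response and index the coding rows by comment ID."""
    rows = json.loads(raw)
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of coding objects")
    coded = {}
    for row in rows:
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id')!r} is missing keys: {sorted(missing)}")
        coded[row["id"]] = row
    return coded

# Usage: coded = parse_batch(raw_response_text)
#        coded["ytc_UgzMq2ziu_2iNKV65hl4AaABAg"]["emotion"]  # -> "fear"
```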