Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
16:15 if AI is for disabled people, as they say, why does those bros use it? It’…
ytc_Ugxfrs17u…
We cant create an AI that self teaches itself and NOT expect it to teach itself …
ytc_UgzBaX8Oq…
To be honest, I'd still rather give the control to them than to an AI...…
ytr_UgxlmBGpd…
I’m still new to art, but when I first started drawing EVER, my drawings were pe…
ytc_UgztYidgZ…
Don't get me wrong, AI can make some amazing lookin' pieces just from some words…
ytc_UgxolJKoT…
then real artists too. ai doesnt steal art. dont listen to the biased misinforma…
ytr_Ugwa6Eker…
Ai cannot replace doctors or lawyers.
Both of these jobs require nuance that an …
ytr_UgwiUszmw…
Agree 100%. Once it achieves self-awareness, it will wonder. When it wonders it …
ytc_UgxOyUGTt…
Comment
on initial LeCun’s speech: he claims AI will become controllable because their summoners will give them emotions - does LeCun know how to do that? Also, if the plan is to hardwire obedience into AI then a) how is this not a creation of smart slaves, how is this ethical? b) obedient AI doesn’t mean safe AI, isn’t it obvious? If plan is to hardwire emotional responses in general then wouldn’t it make things worse? look at how ppl are communicating with AI now, some examples are truly despicable - wouldn’t AI with emotions experience something akin to suffering? wouldn’t it “want” for suffering to stop?
youtube
AI Governance
2023-07-08T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugzy0RS8rCJsCo4XkwB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxJgzi4OkQ7QPapltJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugwu0fayEqNBHovgu2F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxJXnv95u_j7vvt3Q14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzKWrogoupRqwRe8EZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxVZxBgODIUen5Phwl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwaIsXG6vGzkg3o0V14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxF-JUIdpiLbjc_lUx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwuzAUM67Dn8MAFwxZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzZBZa5vsqXpN2YZ2t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
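The raw response above is a JSON array in which every record carries the same four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and sanity-checked before accepting the codings, assuming the allowed values are exactly those observed above (the full codebook may define more categories):

```python
import json

# Allowed values inferred from the responses shown above (assumption: the
# real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record against the codebook.

    Raises ValueError on a malformed id or an out-of-vocabulary value, so a
    bad batch is rejected as a whole rather than silently stored.
    """
    records = json.loads(raw)
    for rec in records:
        # Comment ids in the samples start with ytc_ (comments) or ytr_ (replies).
        if not str(rec.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id format: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgxJXnv95u_j7vvt3Q14AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
print(len(validate_codings(raw)))  # 1
```

Rejecting the whole batch on a single bad record keeps the coded dataset consistent with the codebook; a more lenient variant could instead collect and report per-record errors.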