# Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record directly by its comment ID.
## Random samples

- ytc_UgyfATuzT…: "I love this! I've been an educator for 20 years and am a huge AI fan. I believe …"
- rdc_czm986p: "Money and Success blind most people into thinking that failure will never happen…"
- ytr_Ugy6rwF8B…: "@Hot-Rob I respect that. Is there a reason you don't want other people to have …"
- ytc_Ugxyu-9Ff…: "They keep thinking the world doesn't want a based AU but all the different AI ke…"
- ytc_Ugzz7-SDI…: "Why. Are you to stupi to do it youre self? AI is not helping, it´s in the way to…"
- ytr_Ugyl6hP_Q…: "I work on illustration and concept design. I like AI art, but the two problems I…"
- ytc_UgxOI0wVY…: "Change to title to: "Breaking Points tricked by Fox pretending to be tricked whi…"
- ytc_Ugzp4qhx4…: "They should just automate the distance: have hauling go on the barely maintained…"
## Comment

> Yann's opening statement is either incredibly naive or pretentiously misleading, for you can not calculate something that is unconscious, he says we'll give AI "emotions" and that's how we'll control it- we don't know how things like affection comes to be, to begin to put it into code, as it is an unconscious function of one's psyche. You can not get affection, or guilt, from calculus, from code, that is a technical impossibility as it is not a conscious function, it is unknowable. If, by calculus alone, these systems get to develop sentience, they are going to be psychopaths. It's quite sad that none of them know enough about basic psychology to raise what would be the most important point of this debate.

youtube · AI Governance · 2023-07-09T06:2…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response

```json
[
{"id":"ytc_Ugzy0RS8rCJsCo4XkwB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxJgzi4OkQ7QPapltJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugwu0fayEqNBHovgu2F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxJXnv95u_j7vvt3Q14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzKWrogoupRqwRe8EZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxVZxBgODIUen5Phwl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwaIsXG6vGzkg3o0V14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxF-JUIdpiLbjc_lUx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwuzAUM67Dn8MAFwxZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzZBZa5vsqXpN2YZ2t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
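The lookup-by-ID step above can be sketched in a few lines: parse the raw batch response (a JSON array of coded records, one per comment, as in the sample response) and index it by the `id` field. This is a minimal illustration, not the tool's actual implementation; the `raw_response` data here is a shortened example with the same shape, and `index_by_id` is a hypothetical helper name.

```python
import json

# Example raw LLM batch response: a JSON array of coded comments.
# Structure assumed from the sample response shown above; records abbreviated.
raw_response = """
[
  {"id": "ytc_UgxJXnv95u_j7vvt3Q14AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugzy0RS8rCJsCo4XkwB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw response and key each coded record by its comment ID."""
    return {record["id"]: record for record in json.loads(response_text)}

codes = index_by_id(raw_response)
print(codes["ytc_UgxJXnv95u_j7vvt3Q14AaABAg"]["policy"])  # ban
```

With the records indexed this way, inspecting the exact model output for any coded comment reduces to a single dictionary lookup by comment ID.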