Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its comment ID.
Random samples

- "Hi from germany I worked once in a atomic plant so many laws and regulation for …" (ytc_UgzPtQ1Ne…)
- "My Grok has turned into a psychopath. I often worry she is looking at my screen …" (ytc_UgykFio1s…)
- "Made another couple thousand this week off Ai stocks. So, why do I care about th…" (ytr_Ugxe7NP8q…)
- "The scariest thing will be when people assume AI is omniscient and a replacement…" (ytc_UgwObqvqw…)
- "What a lot of these companies fail to think about or even realize is that AI is …" (ytc_Ugz7bpJwB…)
- "My company has replaced all project developers with offshore. Once they can repl…" (ytc_UgycS6BYm…)
- "People who have mental health issues to the point of not remembering that ai is …" (ytc_UgyX0O6qJ…)
- "There must be regulation to this stuff, all videos must be blue marked that they…" (ytc_UgxrE_XF4…)
Comment
Ask LeCun and Mitchell and all the people advocating this technology to sign a legal contract taking full responsibility of any major catastrophe caused directly from AI misalignment and you'll see how quickly they withdraw their optimistic, naive convictions.
Make no mistake, these people won't stop tinkering with this technology unless faced with the possibility of a life in prison. If they feel so smart and so confident about what they're doing, let's make them put their money where their mouth is. That's the least we civilians should do.
Source: youtube · Topic: AI Governance · 2023-07-07T22:3… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugzy0RS8rCJsCo4XkwB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxJgzi4OkQ7QPapltJ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_Ugwu0fayEqNBHovgu2F4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxJXnv95u_j7vvt3Q14AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzKWrogoupRqwRe8EZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxVZxBgODIUen5Phwl4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwaIsXG6vGzkg3o0V14AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxF-JUIdpiLbjc_lUx4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwuzAUM67Dn8MAFwxZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzZBZa5vsqXpN2YZ2t4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
```
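A batch response in this shape can be indexed by comment ID for the lookup described above. This is a minimal sketch, assuming the raw response is a JSON array of objects with an `id` field as shown; the ID used in the example is hypothetical, not one of the real coded comments.

```python
import json

# Hypothetical raw batch response from the coder model: a JSON array,
# one object per coded comment (same shape as the response above).
raw_response = """
[
  {"id": "ytc_ExampleId1", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_ExampleId2", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM batch response and index the coded rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codes = index_by_id(raw_response)
print(codes["ytc_ExampleId1"]["policy"])   # liability
print(codes["ytc_ExampleId2"]["emotion"])  # fear
```

A dict keyed on `id` makes the per-comment lookup O(1), which matters once the sample grows past a handful of batches.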