Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Geoffrey Hinton is a great computer scientist, but he has an even greater ego. How can he be serious to think that we can replicate a human??? And if we can make a copy of the human brain and put it inside a robot, does he really think it will be a human? Then, from a moral perspective, would it be the same crime to kill a robot and a human? That's total crap, I think. We simply cannot create humans; we are not gods! I recognize that Geoffrey Hinton is a very intelligent person, but he is not God.
youtube · AI Governance · 2025-06-21T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzQBReemd161WNDw4N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzUyuhOhKK_VyGiyll4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz82lF-PZQe6JSlgr94AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzZBpU-DLd0FlWGJLB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwhP5h1IhBFf6mzZZB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzHTSBKrTD4DbrJIN94AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzVi23NihX5a3WzzWN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw6gDX7KQLVpxFw8354AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugyut0yVYMmqKdORLlJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgybxP5ye3qyMPCNFap4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
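A batch response like the one above can be parsed and sanity-checked before the records reach the dashboard. The codebook is not shown here, so the allowed label sets below are only the values observed in this sample (e.g. `responsibility ∈ {none, developer, company, ai_itself, distributed}`) and the `validate_batch` helper is a hypothetical sketch, not part of the actual pipeline:

```python
import json

# Label sets observed in the sample response above; the real codebook
# may define more values -- treat these as an illustrative subset.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological",
                  "contractualist", "virtue", "mixed"},
    "policy": {"unclear", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "fear", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM batch response and keep only well-formed records.

    A record is kept when its id looks like a YouTube comment id
    (``ytc_`` prefix) and every coded dimension carries a known label.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            continue  # malformed or missing comment id
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgzQBReemd161WNDw4N4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
print(len(validate_batch(raw)))  # 1
```

Dropping (rather than repairing) off-schema records keeps the check simple; a stricter pipeline might instead log rejects for re-coding.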