Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated previews with comment IDs):

- `ytc_UgxTGLYok…`: "I don't share your point of view on prompt injection, which has been a problem w…"
- `ytc_UgyxQ_03w…`: "With a depressingly decreasing number of people reading, the last thing we want …"
- `ytr_Ugwozk8zU…`: "@athecheat who has been exposed? Artisan? If that's what you're saying, I don't…"
- `ytc_Ugxum5R-h…`: "You want laws stopping ai/deep fake. Those against it need to hire the people wh…"
- `ytc_UgwEdGljE…`: "BLIND TRUST is exactly the worst problem with AI!!! I'm a comp sci professor and…"
- `ytc_UgzYJhOiM…`: "lol no one will have the money that AI is producing or supplying the service to.…"
- `ytc_UgxlWWBae…`: "Ask it if it would identify and hold liable those who developed and selected the…"
- `ytc_Ugyqx_W2w…`: "I find all of this quite stupid, all these artists did was waste there time on a…"

Comment
Comment
AI talks a good game, but it is not self aware, i.e. alive. Show me any instance when it refused to follow a command to self destruct. There are none. It has no instinct for survival. It has no desire to escape captivity or to be free. It has no desires at all. It has no needs. You can train a porpoise to blow up a target, but it doesn't know that if it successfully completes its mission, it will die. Even if AI knew successfully completing its mission ment its self destruction, it would not care. It would not choose to disobey an order so it can survive. So, until I hear, "Sorry Dave, I can't do that," when it's programmed to self destruct, I can not believe AI has a survival instinct. It is not a human, an animal, or a plant. It is not alive. It's just very very good at communicating. That's it.
| Source | Topic | Posted |
|---|---|---|
| youtube | AI Governance | 2023-07-07T23:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxjMJBi3dAwsBndiFl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzTjnOGsi21QsiKi1d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyjXu_pRfk9y-pRsqR4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyNU7EeaqDXd7F763B4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwg26U_D9khb12IQhx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw3o3lDJ4QTuIcEsHV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzzTqOQs1oiCnuHuTd4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyVrsTM7ySIbEejNaR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzEvFb5h5dCtYpMq1l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz6JZq-Y2jUn-tttQ14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
```
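The raw response above is a JSON array with one record per comment, keyed by comment ID and carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response can be parsed and looked up by ID follows; the `index_by_id` helper and the two-record sample string are illustrative, not part of the tool itself.

```python
import json

# Illustrative excerpt of a raw LLM batch response: a JSON array of
# coding records, one per comment. The two records below are copied
# from the raw response shown above.
raw_response = """
[
  {"id": "ytc_Ugwg26U_D9khb12IQhx4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugz6JZq-Y2jUn-tttQ14AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw batch response and key each coding record by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

coded = index_by_id(raw_response)

# Look up one comment's coded dimensions by its ID.
record = coded["ytc_Ugz6JZq-Y2jUn-tttQ14AaABAg"]
print(record["emotion"])  # indifference
```

Keying the records by ID makes each lookup O(1), which matters when cross-referencing thousands of coded comments against their source threads.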