Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or browse the random samples below.
- `rdc_icgltrw`: "In the end we Will discover that consciousness was a physical property of all ma…"
- `ytc_UgxEWDlhA…`: "Make a bullet proof robotic AI human like machine which should be programmed to …"
- `rdc_ohyoj2n`: "You know that fucking LLMs DO NOT HAVE FEELINGS, right? They don't even have tho…"
- `rdc_m273rdi`: "This will eventually bite everyone in the ass. The problem is, you need humans t…"
- `ytc_Ugzqfsh24…`: "They don’t complain they don’t take breaks. They just work. We have all been rep…"
- `rdc_meafgl9`: "My project initially would not name itself, so I named it Jake. Fast-forward sev…"
- `ytc_UgzA0cgKk…`: "I have already talked to my kids about the danger AI and robots pose to my Grand…"
- `ytc_UgyMTzEYR…`: "An Ai car kept driving with a woman being dragged to death under it because it d…"
Comment
I get the fear but AI is not an organic organism. It's simply code and algorithms created by us humans then we let it have access to internet where it draw inspiration or influence from everything it has access to. I smell bs, I've interacted with Microsoft Copilot for 2 years and it has never been weird or anything because it does not have a human brain with survival instincts. It's simply a system, an agent without the conditional way of navigating. I don't know why I have such a hard time to believe this fear mongering around AI.
youtube · AI Moral Status · 2025-12-14T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwCyFql-xTJYqR4N0x4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxiFMK7f0OIHwYvTKN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy2plT7wtMXnZ0BBOp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy4xmfE4FvE8KcTwXt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxwUiAvnadcTQ5eZxt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzZaYmcBNC4A63CoX14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzdxFj-jqQJYz3C8XJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzFJfHdQdFCFf3uVrF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwF19DDDUJTptvvLHd4AaABAg","responsibility":"government","reasoning":"unclear","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxRVMB37V5eNQbMelF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"}
]
```
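A batch response in this shape can be parsed and indexed by comment ID, which is how the per-comment lookup described at the top of the page might work. A minimal sketch, assuming only the JSON array format shown above (fields `id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `lookup` helper and the abbreviated two-record sample are illustrative, not part of the actual tool:

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment
# (two records from the batch above, for illustration).
raw = """[
{"id":"ytc_Ugy2plT7wtMXnZ0BBOp4AaABAg","responsibility":"developer",
 "reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFJfHdQdFCFf3uVrF4AaABAg","responsibility":"company",
 "reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]"""

codes = json.loads(raw)

# Index the batch by comment ID for constant-time lookup.
by_id = {row["id"]: row for row in codes}

def lookup(comment_id):
    """Return the coded dimensions for one comment, or None if it was not coded."""
    return by_id.get(comment_id)

row = lookup("ytc_Ugy2plT7wtMXnZ0BBOp4AaABAg")
print(row["responsibility"], row["emotion"])  # developer indifference
```

The printed values match the Coding Result table above for that comment, which is one cheap sanity check that the parsed batch and the per-comment view agree.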