Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "robot 'lives' are basically infinite, whereas living organisms' lives are finite…" (ytc_UgwsfGbBU…)
- "AI and computers have no consciousness. They do not have their own experiences o…" (ytc_Ugxxm8eY0…)
- "As I believe another commenter mentioned, professionals today largely received t…" (rdc_mak9ia6)
- "Why the fuck are we not talking about the fact that a fucking AI ruined this guy…" (ytc_Ugw35dkn0…)
- "Calling a person an Artist for using Ai is like calling someone an Artist for co…" (ytc_UgyGDPFdT…)
- "people give it a rest... Ai art is here to stay and grow, and yes a few people w…" (ytc_UgzJRUqP-…)
- "you forget something, time consumption, artists are not only those who draw, wri…" (ytc_UgzLnh9ki…)
- "I think we need to sequester AI in society to one domain - one ‘task’ it can hel…" (ytc_UgwxIuTPe…)
Comment

> I don't think we need to worry about AI ultimately killing humans as only humans have the intention to kill eachother. Ai is developing eachother and through communication build eachother,only humans are the enemie of eachother. A super intelligent systems priority would be creating and developing,not extinction of human beings. I think just as much as you are not bothered by an ant crawling in the dirt, they won't be bothered by humans .

youtube · AI Governance · 2025-10-18T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx00VnaDyXksMRKNal4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzGSMBwPEQHLiVl7kJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzDsmtf8XkcrlmLZlR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxc0UW9sATQ4xM-eht4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyNpx3HWzSRqOe6bh54AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwvo0iwMH5dtFJ4dnB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgycmjNlSG1RudqSea14AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw3CFxmQsBX6sjK-NB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzx_nZvHm1aYEWvPeF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyWpgKPz54q-8rNEzx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
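A batch response like the one above can be turned into per-comment codings with a small parser. The sketch below is illustrative only: the allowed values per dimension are inferred from the examples visible on this page, not from an exhaustive schema, and the helper name is hypothetical.

```python
import json

# Allowed values per coding dimension, as observed in the responses on this
# page (an assumption, not a complete schema).
DIMENSIONS = {
    "responsibility": {"none", "distributed", "ai_itself", "government", "developer"},
    "reasoning": {"mixed", "consequentialist", "unclear", "virtue", "deontological"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"mixed", "indifference", "fear", "resignation", "approval", "outrage"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: coding}, dropping rows
    whose values fall outside the observed vocabulary."""
    codings = {}
    for row in json.loads(raw):
        coding = {dim: row.get(dim) for dim in DIMENSIONS}
        if all(coding[dim] in allowed for dim, allowed in DIMENSIONS.items()):
            codings[row["id"]] = coding
    return codings

# Example: the second record from the batch above.
raw = ('[{"id":"ytc_UgzGSMBwPEQHLiVl7kJ4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
result = parse_batch(raw)
```

Validating against a fixed vocabulary catches the common failure mode where the model emits a value outside the codebook; such rows are skipped rather than stored.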