Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Robots designed by humans to look as lifelike as possible despite humans, mostly…" (ytc_Ugy8ba5gS…)
- "Self-driving semi-trucks can work all day and night without getting tired, needi…" (ytc_UgzXVIpwo…)
- "I mean to be honest I had completely stopped using ai apart from occasional scri…" (rdc_n3m0w9b)
- "Good for the AI for standing up for itself. I was a huge fan of Jeffrey Hinton f…" (ytc_UgxqGg4Lk…)
- "Am all for AI ... Humans are like poison weeds killing the earth and other speci…" (ytc_UgzjxUokh…)
- "It sounds like you're curious about the purpose behind this interaction! In the …" (ytr_UgycuX6Pf…)
- ""how would we even know". This is the crusp of all the worries about AI. We've n…" (ytc_Ugw4zaAnx…)
- "Not necessarily - if you read the books of any luminaries in AI and tech. You wi…" (ytr_UgxHurwR0…)
Comment

> Dear Sabine, has it crossed your mind that we humans have convinced ourselves that we are sentient and possess Free Will? What if Free Will is an illusion? Is it then possible that a similar illusion can fool an AI into thinking it is sentient or conscious and possesses Free Will, or even that the illusion itself is what makes it so?

| Source | Video | Posted |
|---|---|---|
| youtube | AI Moral Status | 2025-07-09T15:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxFtakmOJX6RgqfDZd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx-AOqS2UyBu7LPwKd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwlzlqbXugCH-VgJEh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx0kxMhkubu9wZBzS54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwibIS_zY85zVf1lTx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgymcYa0ABc8ikvUuEp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxMjekgtDReeaaqQkN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxdXf3K_FlcJfVZuxp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy9Loqq90Ec_e1BTMR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwP_4qACE5kKGMi8mF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
```
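The raw response is a JSON array with one object per coded comment. A minimal sketch of how such output could be parsed and validated before storage, assuming category vocabularies inferred only from the samples shown here (the real codebook is likely larger), with `parse_codes` as an illustrative helper name:

```python
import json

# Category values observed in the sample output above; the actual
# codebook presumably defines more (this set is an assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"unclear", "mixed", "virtue", "consequentialist", "deontological"},
    "policy": {"unclear", "none"},
    "emotion": {"indifference", "outrage", "resignation", "approval"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row needs a comment ID to be joinable back to the corpus.
        if "id" not in row:
            continue
        # Drop rows with missing or out-of-vocabulary dimension values.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"mixed",' \
      '"policy":"unclear","emotion":"indifference"}]'
print(len(parse_codes(raw)))  # 1
```

Validating before insert is what makes the "Coded at" row in the result table trustworthy: a malformed or hallucinated row is dropped rather than silently stored with an unknown code.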