Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I don’t think he is dumb at all. I am putting my life in the hands of a driverle…" (`ytc_Ugyksc065…`)
- "I use AI tools and I don't care that I use AI tools like ChatGPT.…" (`ytr_UgwZ9Kc2W…`)
- "Cant blame an qpp. Its parenting at stake. Were talking teen. Hope this opens m…" (`ytc_Ugxl6hjyb…`)
- "I believe ChatGPT's example is a very strange one. Anyone saying that they can p…" (`ytc_Ugx5c6SgG…`)
- "Why do men think long scruffy beards make them look cool or intellectual? And …" (`ytc_UgxSEhS7W…`)
- "Bullshit, 99% will be replacing the military with killer robots run by an rogue …" (`ytc_UgxkKJ12o…`)
- "What's wrong with having the workers push a button when they feel something inst…" (`ytc_UgyDkZZeZ…`)
- "Assuming the self driving cars can pass certain safety standards to make sure th…" (`rdc_dmpnvzn`)
Comment
All the actual work is done by the words "consciousness" and "pain".
If I hand you a pile of computer code, how do you tell if its conscious?
You could make a low level simulation of a particular human brain, that acted like the particular human being simulated, and that would be conscious for the same reason the human is.
But for most designs of AI, "is it conscious?" is a hard philosophy problem.
Source: youtube — AI Moral Status — 2020-07-08T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugym08kqdNxUkx2-10h4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw_FVBepk0HmLi4XIl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-doPVTFSsH45POn14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy_U1c8hw1dbnQUZP54AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwSfROM85Ux7Gs0y7F4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzOEG9iNvpbccy3GwZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzX6dYAXDanVaf0hUh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwRMrZoIrY08Mv59Ht4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxKIra1BpAyZvQy4YZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwnY8UO0f3NEOf4jd94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"}
]
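Before a raw response like the one above is folded into the coding table, it helps to check that the model actually returned a JSON array in which every record carries all five coding keys. A minimal sketch in Python, assuming the array-of-objects shape shown above; the `parse_response` helper and its error messages are illustrative, not part of the tool:

```python
import json

# The five keys every coded record must carry, taken from the raw
# response shown above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_response(raw: str) -> list[dict]:
    """Parse a raw model response and verify each record has all five keys."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing {sorted(missing)}")
    return records

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_x","responsibility":"user","reasoning":"virtue",'
       '"policy":"unclear","emotion":"outrage"}]')
records = parse_response(raw)
print(len(records))  # → 1
```

A record that fails the check can then be logged by its comment ID and re-queued, rather than landing in the table with blank dimensions.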