Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgxtcV0m2… : Automation is great! It's capitalism that's awful! Truck drivers are doing a nec…
- ytr_UgxmRUOhE… : Unless you know what you are talking about. Then it sure does, most "artists" le…
- ytr_Ugwyst-lM… : There is always a lag of years in tech innovations and I can expect it to be par…
- ytc_UgwLn4qti… : They always talk about AI as if it was an autonomous, uncontrollable entity. Luc…
- ytc_UgxX77Dyb… : Totally ignorant thing to say. Once these accidents start happening from AI gene…
- ytc_UgzBPlOEU… : ai artists are mad cause they cant scam people if you start pointing them out. t…
- ytc_Ugy-LW8_e… : Its a half truth... Yes, AI loves to role play - BUT. Its designed to write LIKE…
- ytc_UgyaRPozL… : “Shutting down the internet” might kill more people than the AI would. Most crit…
Comment
I’m assuming Alex meant this, but perhaps it was a happy accident: I think the implication that the “belief” of consciousness can be logically inferred by an AI suggests that humanity may also be simply convincing itself of the same (a layer of subtext thereby implying that consciousness is the ability to convince and be convinced).
But maybe I’m just making logic assumptions based off past information, like some kind of robot.
youtube · AI Moral Status · 2024-07-28T04:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzNGlYdnZvX4azzyC94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwuEm_5tqZzijSnMtV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgygCCjC4fzOBowmT5B4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxB44yr2IRR-IlOhSd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwBRL07Sa-5L_HJEiZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx5sSvm46XjxYcRSYt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyeT7urd73ugdmYmMB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy6XnlqmvpP6HI_qTd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzC0CGIIPbtj39STZR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugwz6NMFZ2oEHOsuO3p4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"}
]
```
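A raw response like the one above has to be parsed and sanity-checked before the per-comment codes are stored. Below is a minimal sketch of such a validation step. The allowed values per dimension are assumptions inferred only from the codes visible in this batch (the real codebook may include more), and the `validate_batch` helper name and the `ytc_`/`ytr_` ID-prefix check are hypothetical, not part of the original tool.

```python
import json

# Assumed codebook, inferred from the sample batch above; the real
# coding scheme may allow additional values per dimension.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "user", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"unclear"},  # the only value observed in this batch
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str):
    """Parse a raw LLM coding response and split rows into (valid, errors).

    Each error entry is (comment_id, list_of_bad_dimensions).
    """
    rows = json.loads(raw)
    valid, errors = [], []
    for row in rows:
        # Collect dimensions whose value is missing or outside the codebook.
        bad = [k for k, ok in ALLOWED.items() if row.get(k) not in ok]
        # Comment IDs in this dataset appear to start with ytc_ or ytr_.
        if bad or not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            errors.append((row.get("id"), bad))
        else:
            valid.append(row)
    return valid, errors
```

Rows that fail validation can then be queued for re-coding rather than silently written into the results table.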