Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
5 years ago. AI couldn't create any images. How long till it figures out contras…
ytc_UgwjWy6j0…
I majored in history, so I'm curious to know whether there are any historical pa…
ytc_UgwBWekUy…
From my experiments with ChatGPT and Bard, it is my observation that AI routinel…
ytc_UgxmZZoX4…
i think EV should have exposed light indicator for automated acceleration/break …
ytc_Ugxx8vw6L…
Warning from Gemini, given that same question: "As AI becomes more sophisticated…
ytc_Ugyw55Y0B…
A.I reminds me so much of Y2K. Nothings going to happen, as large language model…
ytc_UgwId7DjK…
Schwartz: "Hey, are you sure these cases are real?"
ChatGPT: "Yeah, they're tot…
ytc_UgzOWIKaE…
@officialtollilion6627 so far Ai can't do comics, absolutely useless in that top…
ytr_UgzT7wwNE…
Comment
So, this is a pretty common take I hear a lot, but there's no reason to assume AI consciousness would look or behave anything like human consciousness. Besides the fact that not everyone's experience of consciousness is the same, and AI don't have to deal with biological demands and chemical imbalances it's pretty likely that AI won't have any of the problems human consciousness does/can have.
*Edit: besides things that have to do with the inherent brain structure like personality disorders, autism, or ADHD
youtube
AI Moral Status
2023-07-03T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_Ugy-P0EvcZYPiOSr3GJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxxIjCPOkl0-oT_Gqp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwMCNsndG_EzQm0ZzV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxQG6onATysv-_xZoF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyBG1vfGeiDFTIpUHh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzivrBdKCRNSSvpEOd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw7BInOiKjcUk3m2e94AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyfpts3f89Y1Cqka7N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxVWiHHpnnppk1Q4iJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxf93kWXqMK9mfNLd54AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"resignation"}]
```
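The raw model output above is a JSON array with one record per comment, keyed by comment ID. A minimal sketch of the "look up by comment ID" step, assuming only this record shape (the `index_by_id` helper is illustrative, not part of the tool):

```python
import json

# Two records copied from the raw response above, as sample input.
RAW = '''[
{"id":"ytc_Ugy-P0EvcZYPiOSr3GJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxxIjCPOkl0-oT_Gqp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]'''

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and index the coded records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(RAW)
code = codes["ytc_UgxxIjCPOkl0-oT_Gqp4AaABAg"]
print(code["reasoning"])  # mixed
```

Indexing by ID makes the per-comment "Coding Result" table a single dictionary lookup rather than a scan over the array.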