Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I like the progression of AI and stuff but it's just as liable to criticism as r…" (ytc_UgxQUTBVn…)
- "AI is simply programmed to try to be as polite as possible in some situations. i…" (ytc_UgyCq0gOq…)
- "Me, waiting 4 months for a ref sheet from an artist who usually takes 1 month. P…" (ytc_UgyY1IHVm…)
- "Thank you for your comment! It seems like you are referencing the popular scienc…" (ytr_UgyUHb9M6…)
- "Wow, they legit just said “you should not be angry, disappointed, etc. for someo…" (ytc_UgzjaVIrl…)
- "It is not biased it is performing as it is supposed to. The hospital AI took fin…" (ytc_Ugy6p6g0F…)
- "The sad ending would be having poor Ai performance in many fields to lower the c…" (ytc_UgzEY6aPl…)
- "This just the b.s the showing us. They got the real stuff in the back. The media…" (ytc_Ugx95H_W9…)
Comment
I always figured AI consciousness would be a hardware thing, not software. Like, considering how resource-intensive partitioning off individual human skills is, it's difficult for me to imagine simulating the entire complexity of the brain. I feel like nature would have given us something simpler if it didn't need all that complexity to run the "software" of consciousness. That being said, we've been making a lot of strides recently with computer chips that function like neurons, and ones that use actual neurons in the design. If we're ever getting sentient AI, it'll be from tech like that.
youtube · AI Moral Status · 2023-07-06T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
  {"id":"ytc_UgwUNIAo4RB4iwO7vlF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwcvi6e66Y3YZjDLkZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwHZd0AFqTkwMawewt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzDWIA0k10faQ0GpHh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyoOIaCvZwnw0XfpZJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy_cbM_SDt6FajwGed4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwvsmlhYNKhD-AgPkJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzF_F4QwPMYKbeEdSZ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyYXVUS8lxIasbBaad4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwRaw-ALGdiqhZyL1d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
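The raw response is a JSON array with one coding object per comment ID, carrying the four dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed for per-comment lookup — the two entries below are copied from the raw response above, and the variable names are illustrative, not part of the tool:

```python
import json

# Two entries from the raw model response shown above, used as sample input.
raw = '''[
  {"id": "ytc_UgwUNIAo4RB4iwO7vlF4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwRaw-ALGdiqhZyL1d4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"}
]'''

# Index the codings by comment ID so any coded comment can be looked up.
codings = {entry["id"]: entry for entry in json.loads(raw)}

coding = codings["ytc_UgwRaw-ALGdiqhZyL1d4AaABAg"]
print(coding["responsibility"])  # developer
print(coding["emotion"])         # mixed
```

An index like this is what a "look up by comment ID" view needs: one parse of the raw response, then constant-time retrieval of the four coded dimensions for any ID.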