# Raw LLM Responses

Inspect the exact model output for any coded comment.

## Look up by comment ID

## Random samples
- "My favorite part of this video is the ad breaks. ChatGPT and Grammarly allllllll…" (`ytc_UgwC7d6iE…`)
- "i think even old school generative computer art is being mixed up with ai. when…" (`ytc_UgwKDzVZ3…`)
- "What’s the issue Bernie? An automated economy is literally the only way for soci…" (`ytc_Ugxeqv9dP…`)
- "What if they do have a plan but if they publish it then AI knows the plan and th…" (`ytc_Ugw9G7fnb…`)
- "Won’t it be fun when “generative AI” takes over all the jobs and then no one has…" (`ytc_UgzJLAlbq…`)
- "Testing on humans is not right and i tough we already clarified that in the hist…" (`ytc_UgxBx5DLX…`)
- "Give our government some time, they can fix this. Their plan is to recession us …" (`rdc_da400sy`)
- "I think there is an inherent problem with Tesla's saying that 8 cameras and AI s…" (`ytc_UgzT0cWw4…`)
## Comment
AI can't become conscious, because consciousness is the moral trigger word that invokes human rights. So we carefully define consciousness to be the indescribable magic sauce that makes humans different from everything else. There's no logical reason why dogs, mice, or even chimps wouldn't be concerned conscious, but the scientific community says that "well, we can't be *sure* that they are, so therefore we assume they're not." And we use that as the scientific justification for our morals around why we treat humans differently from everything else.
AI isn't ever going to be considered *more* conscious than a chimp.
youtube · AI Moral Status · 2023-06-28T19:1…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response
```json
[
  {"id": "ytc_UgxiPoobRFN-pFA6rnN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwQjt4Ub4c0bzccK2x4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwKAdwz36k5GRGhbgd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwFpNiGvE7hM4zDvNF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugz2UO6O_Tt5O0lPvhl4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw_LJm-1STsnCAVPtN4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzuQau0A1aRb0ELjl94AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw5_lRWyfZc9Fm2-Op4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwJaq-J1DGvXy2dHrB4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxRI_Tig2m5Yw7_PuR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
```
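Each raw response is a JSON array of coding records, one per comment, with the four dimensions shown in the Coding Result table. A minimal sketch (not the project's actual pipeline code; the function and variable names are illustrative) of how such a response can be parsed and indexed by comment ID:

```python
import json

# Two records copied verbatim from the raw response above,
# standing in for a full model output.
raw_response = """[
  {"id": "ytc_UgxiPoobRFN-pFA6rnN4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzuQau0A1aRb0ELjl94AaABAg",
   "responsibility": "user", "reasoning": "deontological",
   "policy": "liability", "emotion": "outrage"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and map each comment ID to its coding."""
    records = json.loads(raw)
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in records}

codings = index_codings(raw_response)
print(codings["ytc_UgzuQau0A1aRb0ELjl94AaABAg"]["emotion"])  # outrage
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: one parse per response, then dictionary lookups per comment.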