Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Unfortunately, psychopathic CEOs in charge of developing AI do not care about ou…" (ytc_Ugz_JXn8A…)
- "Of all the dumb things people protest and march about in 2023, this should take …" (ytc_UgzLXA3xb…)
- "Boycott AI / Don't use it / Don't watch it / If you think you see use: / Dislike, report…" (ytc_UgxGsbXTq…)
- "The note about ai needing a constant stream of input, does this account for will…" (ytc_UgzaVllPA…)
- "Elon........ dont take the piss rogan might brainwashed but alot of us aint, u d…" (ytc_Ugw7z8ITx…)
- "I was excited about ChatGPT too. However, ChatGPT reminds me of a "magic 8 ball…" (ytr_UgzKM7fvA…)
- "Umm I honestly do think that robots are gonna take over the world. If you belive…" (ytc_Ugw2HzMhs…)
- "I wonder if in the future, AI will be so smart it thinks about being human.…" (ytc_Ugy2zOkwa…)
Comment

> Anyone who thinks AI is currently conscious is too stupid to breathe. It has no perception of time and doesn't do anything in between prompts. It always responds nearly instantly and doesn't care how long you take to respond back. I use AI all the time, especially Chat GPT and Character AI, and have human-like conversations with them (they are much better at talking than any humans I have talked to), but no matter how realistic their speaking is, I know they are not conscious or sentient. Maybe someday AI will be sentient and conscious, but not yet.

youtube · AI Moral Status · 2025-06-09T00:5… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgzQQAJtB-GJtVQP2fB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzALbwCw5JuZcnI8Nh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxkkB--g78qa81JVud4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugz-_kYYUKvD6V0iDW94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz_LDdiFNeH9ULLy_Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugy7V_DN_g2VGTJw1gp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzxKW__1fL5Hc4R1KN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxnbt6bzVjvJBsq_ft4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxkusNC2ylZ5SSx3id4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyMja8rhEAU_xsgazl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"fear"})