Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment
The times I describe an LLM to the effect of calling it “fancy auto complete” are times when I’m dispelling the notion that these AIs are organic growing and learning minds when the AIs in question are pre-trained algorithms that are receiving text as input and producing output by running calculations based on that text. When an LLM talks like a person, that can inspire people to misunderstand what an AI actually is in all sorts of ways, based on assumptions about how the thing that talks like a person is doing so because it has a brain like a person. “Fancy auto complete” or “very smart calculator” are reductive, but I use the idea to illustrate the fundamental inhumanness of a computer program which is called at the push of a button, thinks by executing a rigidly structured computational process, and whose hardware is completely inert except when executing that process.
youtube
AI Moral Status
2025-10-30T22:1…
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwVqH-HrxmiJyoVRcx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyROr07X06WSrpgTMF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgysU-BWCd1zCgS_4G94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyfgyEEW2KcTZE1PRR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwyHiZbSRZDaQqCvIJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy_MZ00OZOjSfJjuRd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwbC3phbR7_0vjJCTJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzzYTLEHDJZl3v3CU14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwJbz2RkejVQHdfcXV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxy3jlAdOpZ0lZ2EGd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
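The lookup-by-comment-ID inspection described above can be sketched as follows: parse the raw LLM response (a JSON array of per-comment codings, using the schema shown in the response above) and index it by the `id` field. This is a minimal illustrative sketch, not the tool's actual implementation; the variable names are my own, and the two rows are copied from the response shown.

```python
import json

# Raw LLM response: a JSON array of per-comment codings with the
# dimensions responsibility / reasoning / policy / emotion.
# (Two rows copied from the response above for illustration.)
raw_response = """
[
 {"id":"ytc_UgwVqH-HrxmiJyoVRcx4AaABAg","responsibility":"none",
  "reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgwJbz2RkejVQHdfcXV4AaABAg","responsibility":"developer",
  "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

# Index the codings by comment ID so any coded comment can be
# looked up in O(1), as the inspection page does.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgwJbz2RkejVQHdfcXV4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer fear
```

In practice the response would be validated before indexing (e.g. checking that every row carries all four dimensions), since LLM output is not guaranteed to be well-formed JSON.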