Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click any entry to inspect its coding:

- "Giving feedback about a broad social pattern you observe isn’t the same as a req…" (rdc_o8rkrkx)
- "There's an in-between and both sides are so stupid, they won't listen to it. Th…" (ytc_Ugwt-f0MC…)
- "Are the AI makers still trying that really dumb form of regulatory capture where…" (rdc_k0atj1w)
- "When will people learn. Never say never. What we consider impossible today will …" (ytc_Ugxqrr3NQ…)
- "I mean did you just answer the question that if there is a God, then why is ther…" (ytc_UgzlMGwP6…)
- "I watch a lot of True Crime stories. The work that AI's slave encampment does s…" (ytc_Ugz30Z8wd…)
- "Very few jobs are "being replaced" with AI. It's mostly that people who refuse t…" (rdc_nnt4kal)
- "I barely draw, but if i were to draw again I'd probably use an AI drawing to bas…" (ytc_UgxMz_OG6…)
Comment (youtube · AI Governance · 2025-12-04T10:2…)

> LLMs are unconscious because they are stateless machines. The only sensory input they have is the user input for now. But soon when llms are hocked up to all types of sensory input in realtime and also must have the ability to continuously learn while always have input actuating data I don't see why it wont be conscious. Just like humans have multiple of continuous sensory inputs to our brains every msec at a time.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwMB6adlgGwCYF5SDp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyfM5DpuSJTF9lyWVJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw2weMyEt2pmClI1Jp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyESV92ACkdlkdGgFd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzHC9Cr8SZMl3TfAQh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxi0ogPokaBNKFNwXF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxnhN12hVwJiTsk6Ep4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx2lHUQ5TN3FDrErvF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJXr23znpn76o7LO14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgymMC_e3OWURjfYDex4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
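The raw response above is a JSON array, one object per coded comment, with the four coding dimensions as keys. A minimal sketch of how such a response could be parsed and sanity-checked before storing the codes — note the allowed value sets here are only inferred from the values visible on this page, not taken from a documented schema, and `validate_coding` is a hypothetical helper, not part of the tool:

```python
import json

# Allowed values per dimension, inferred from this sample (assumption,
# not a documented schema).
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "mixed", "deontological", "virtue"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "approval", "resignation", "outrage"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every coded comment's fields."""
    entries = json.loads(raw)
    for entry in entries:
        missing = ({"id"} | set(ALLOWED)) - entry.keys()
        if missing:
            raise ValueError(f"{entry.get('id', '?')}: missing keys {missing}")
        for dim, allowed in ALLOWED.items():
            if entry[dim] not in allowed:
                raise ValueError(f"{entry['id']}: unexpected {dim}={entry[dim]!r}")
    return entries

# One entry from the raw response above, used as a smoke test.
raw = ('[{"id":"ytc_UgwMB6adlgGwCYF5SDp4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
print(validate_coding(raw)[0]["emotion"])  # indifference
```

Rejecting the whole batch on the first unexpected value keeps malformed model output out of the coded dataset; a softer variant could instead map unknown values to "unclear".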