Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I used to make at least 10k extra a year from freelance gigs, and as soon as AI …" (ytc_UgyGok4ZZ…)
- "Is it, though? >The right of the people to be secure in their persons, hous…" (rdc_nmdt6py)
- "Work. Otherwise we'll have to offer further incentive to those who work, just go…" (ytr_UgyMUJRV_…)
- "AI companies need a story to tell investors about why they lost all their money.…" (rdc_nip9n2y)
- "Hmm... AI researchers making fair use of publicly available data - is stealing a…" (ytc_UgyHyUFVW…)
- "I would not have problem with automated production IF and only if the companies …" (ytc_UgyFMWbAq…)
- "Nah it actually doesn't, I've made multiple AI bots and people use em so that's …" (ytr_Ugw4RihfN…)
- "How was this conversation accomplished? Alex didn’t seem to be pressing any voic…" (ytc_UgzRU5XKN…)
Comment
To touch on the points at 1h:10m
Breath, thirst, hunger. All costs of doing things, and all with an increasing call to action if ignored. With digital systems the only similar cost is electricity. How do we get AI to value something that's on tap 24/7?
In neurology, we can break down all behavior until it gets to a single discomfort. That discomfort is caused by neurotransmitters. What relevant process does AI have? I think it would be fair to say true discomfort can't be felt by something lacking those processes.
In psychology we have around 9 fundamental emotional needs; these, too, cause discomfort when unmet.
What would drive the discomfort if there isn't any need to feel it?
What does satiation look like in a digital mind?
youtube · AI Governance · 2025-07-05T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxBhPJopMR5oCn8wUd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzWtHJ7_UIJTnPITxd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxnTpjyZmzKRD8Py-94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxCjlMVkDg_V8LKb9l4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyTYj12b9jxJLLeGCh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwbhT-sNcMtN0I7AHF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxwyrswocWBqZkm9vB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwS2wm4MSh8Ygiguod4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw1KiymkJ-1zbZ8qhp4AaABAg","responsibility":"society","reasoning":"contractualist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzymLm3HobJDwN960d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
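The raw response above is a JSON array with one record per comment, and the page's "Look up by comment ID" feature amounts to indexing that array by the `id` field. A minimal sketch of that lookup, assuming the record schema shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and using a made-up two-record payload for illustration; the helper name `index_by_comment_id` is hypothetical, not part of the tool:

```python
import json

# A small stand-in for a raw LLM response: a JSON array of coding records,
# shaped like the real output above (illustrative IDs only).
raw_response = """
[
  {"id": "ytc_UgxnTpjyZmzKRD8Py-94AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgxBhPJopMR5oCn8wUd4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw response and key each coding record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgxnTpjyZmzKRD8Py-94AaABAg"]["responsibility"])  # → ai_itself
```

Keying by `id` makes each coded dimension an O(1) lookup, which is all the inspector needs to map a comment back to the exact model output that coded it.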