Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Financial and geopolitical interests will decide how it goes, not you, the polit…" (ytc_UgzQkJwPV…)
- "The rich will build walled compounds with armed security, they won't give a flyi…" (rdc_et6zqac)
- "last night I was doomscrolling on Twitter, seeing how many AI artists I was seei…" (ytc_Ugy4NKBEl…)
- "You talk about these companies that pop up. Those will be consumed quickly by th…" (ytc_Ugxh9uDB0…)
- "That's just Fucking WRONG!! No income, no purchasing! Get a damn robot to fix TH…" (ytc_UgwjcUrFg…)
- "Which AI leader is lying in public? If Steven speaks up I am guessing he will g…" (ytc_Ugwb8BQKr…)
- "Technology has been taking jobs way before AI was a thing. The 1% get richer wh…" (ytc_Ugz8YdXPn…)
- "Copyrighting AI art could kill it faster than any poison or nightshade or whatew…" (ytc_UgyG7SODn…)
Comment
1:02:25 there’s a level at which I think that AGI is gonna happen whether we want it to or not, and the biggest question is going to be whether the US does it or China. Now I am surmising that I am quite a bit older than both of you, so when I was at my community college around 1980, give or take, taking computer courses while I was a high school student, I ran into the 1967 poem "All Watched Over by Machines of Loving Grace" by Richard Brautigan. Of course, being 17 years old, it made an impression on me. It speaks of returning to nature while being cared for by computers.
Maybe the thing that saves us is that we convince AGI that we are so interesting that we should be protected. There’s kind of a running theme in fantasy and sci-fi that human beings try to turn every other living creature into a pet. Maybe that is an attitude that we can transfer to AI.
Source: youtube · Video: AI Moral Status · Published: 2025-11-03T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxl-irZ24TQH6hWA-x4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx2N4VSLpWYiaAU9Nl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxTRDRI6ihW6Y7mXL14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugw3hgWijWat3sIFkE14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz6RubrE5SGRB6CNSp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgywsXEBKXwBoOtxibl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwy4DLEMkskCHsxpHJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxz4cdA5FdLHOEzscN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugx29BBr5-nygEAtEH94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzeWfR6lzNkl1wAnbV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
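The coded dimensions above imply a fixed codebook per field. A minimal sketch of how such a raw batch response could be parsed and validated before accepting it (the `ALLOWED` mapping below is a hypothetical schema inferred only from the values visible in this sample; the real codebook may permit other values):

```python
import json

# Hypothetical codebook, inferred from the sample output above.
ALLOWED = {
    "responsibility": {"government", "developer", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM response and reject any row with an unknown code."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"resignation"}]')
print(len(validate_batch(raw)))  # 1 valid row
```

Failing fast here, rather than at analysis time, keeps malformed or hallucinated codes out of the coded dataset.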