Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "That was literally the foundational problem in the matrix you know… a robot one …" (ytr_Ugwv5BHid…)
- "I think the thing that scares me is the fact that at some point I am going to ha…" (ytc_Ugx7M3hoO…)
- "Would love to see three taxes get tested. One of which the MIT guys are probably…" (ytc_Ugx83fW1F…)
- "I write a lot of stupid fanfiction and some blog stuff, and chatgpt has become s…" (rdc_jhswu3z)
- "While AI is not exactly like humans in the way it thinks and generates stuff, th…" (ytc_UgyD3iVqD…)
- "As a Tesla owner and motorcycle rider in LA, I have noticed another hazard from …" (ytc_UgwJ5HrYV…)
- "I hope the AI bubble bursts so hard that all of tech falls flat on its face and …" (ytc_UgzIHUOzk…)
- "Mercedes has an improved cruise control that it did with the Hyundai kona it had…" (ytc_UgxSC6W2F…)
Comment
"It seems like surviving the nuclear race (so far), has made us eager to move on to the next mad thing. We can't truly understand the dangers any more than we can conceive of, say, a billion dollars. I believe the scientists who say we can't get there with what we have now. What is true is that it's a MASSIVE bubble, and that's all conmen like Altman care about. Investors and tech bros know they'll make out like bandits when it pops. If we do get a super intelligent General AI, I hope it goes along the lines of The Culture novels."
youtube · AI Moral Status · 2025-10-31T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzsKBioXFrB8Xi7-8x4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwid8bmG_g2GdlAuD14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugws9sfRXx301Cd8ChF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxV_YHQCHw31eZyceV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxgeR_zdgm1SNCilZR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxlaTTwEVfCbQwj21p4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzBzszkguA4vKkRVq94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxPQGbP5vRzUX5bx3R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzPyLcmKVdlwcAsChF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyOM8-_8L_1ct1gW4B4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
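A downstream step has to parse this raw response and check each coding against the allowed labels before storing it. Below is a minimal sketch of such a validator. The label sets are inferred only from the values visible in the sample above (e.g. `responsibility` ∈ company/government/user/ai_itself/none/unclear); the project's actual codebook may define more, and the function name `validate_response` is a hypothetical helper, not part of any real tool.

```python
import json

# Allowed labels per dimension, inferred from the sample response above;
# the real codebook may include additional values.
SCHEMA = {
    "responsibility": {"company", "government", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # skip entries with no comment ID
        # every dimension must be present and carry an allowed label
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

# hypothetical one-row response; an out-of-vocabulary label would be dropped
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(validate_response(raw))
```

Filtering rather than raising keeps one malformed row from discarding a whole batch; rejected IDs could instead be queued for re-coding.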