Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Current LLM are basically the ultimate politician. They tell you what they predi…" (ytc_UgwHLr-um…)
- "The fact that you can override the speed of the self-driving but it will continu…" (ytc_UgyNIk_Qs…)
- "The godfather of DETERMINISTIC AI - not probabilistic. It's not intelligent. I…" (ytc_UgxnvHUSP…)
- "TL;DW: "Real artists" want to believe they are special, "AI art Stans" mere exis…" (ytc_UgyYpwti5…)
- "This is only CGI for graphics expert's boredom. Don't fooled with that. Unless, …" (ytc_Ugy5su-_D…)
- "It is up to a small group of people to decide what will happen to humanity in th…" (ytc_Ugy3EiVYo…)
- "4:00 the reason why it can't generate without theft is because it's not an actua…" (ytc_Ugw5iR9bb…)
- "No, I'm good. Despite selecting google assistant on my phone, Gemini turned its…" (ytc_UgxTaYANF…)
Comment
Elon was actually a part of an international org like this until billionaires started racing for AI and he hopped on the bandwagon to hopefully make his version of a “safer option” before them (that certainly went well lol @mechahitler). Sounds like pro elon propaganda but the point is that there are people who want this shared goal and that it is possible and should be reintroduced. The problem is people like Sam Altman who hate the idea of regulating AI in any capacity and lobby to pick these orgs apart.
youtube · AI Moral Status · 2025-10-30T19:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_Ugy0-5WwJf846xl2_8R4AaABAg.AOuwg1tZiz3AOv0aJqIVPm","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzLLCL_Vfj6k2zOIMF4AaABAg.AOuwY_h_9-zAOvD6madUt7","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzLLCL_Vfj6k2zOIMF4AaABAg.AOuwY_h_9-zAOvX4W8g2mc","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwtpQPsnnkNjjYcD1V4AaABAg.AOuwMtPwH8WAOv2OrhBzc0","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgwtpQPsnnkNjjYcD1V4AaABAg.AOuwMtPwH8WAOv453FfsUH","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgzF4EYAm-1EQ2_o6pl4AaABAg.AOuwKO6z7Y_AOuzBQqw93a","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugxxy_KXEeL86_ndwU94AaABAg.AOuwCw6KFq0AOv-FwOjsUr","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytr_UgzE5c4Fl9aH1FMeCvZ4AaABAg.AOuvwwah6o3AOv151PvJyv","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytr_UgzE5c4Fl9aH1FMeCvZ4AaABAg.AOuvwwah6o3AOv3D0qgiDC","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzE5c4Fl9aH1FMeCvZ4AaABAg.AOuvwwah6o3AOv4GELK44z","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
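The raw response above is a JSON array in which each record codes one comment on four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch might be parsed and validated is below; the `CODEBOOK` sets are inferred only from the values visible on this page (the real codebook may define more categories), and the function and sample IDs are hypothetical:

```python
import json

# Allowed codes per dimension, inferred from the values seen in this page's
# output; an assumption, not the project's actual codebook.
CODEBOOK = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response; keep only records whose id is present
    and whose dimension values all fall inside the codebook."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # malformed record: skip rather than fail the batch
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

# Hypothetical example: second record uses an out-of-codebook value.
raw = (
    '[{"id":"ytr_x1","responsibility":"developer","reasoning":"contractualist",'
    '"policy":"regulate","emotion":"mixed"},'
    '{"id":"ytr_x2","responsibility":"alien","reasoning":"unclear",'
    '"policy":"none","emotion":"fear"}]'
)
print(parse_batch(raw))  # only ytr_x1 survives validation
```

Skipping malformed records instead of raising keeps one bad line from discarding an otherwise usable batch, which matters when model output is only mostly well-formed.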