Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by inspecting one of the random samples:

- "Exactly - who are you going to call when your heating and hot water breaks down …" (ytc_UgxQ5Xs5G…)
- "Psychological tests (which is what that compass looks like) are almost without f…" (ytc_UgzWUEVl6…)
- "So does the value of art go up in the future? The strum of a guitar, the stroke …" (ytc_UgymDQWOW…)
- "The only solution I see is to restructure our societies entirely. We can not aff…" (ytc_UgzHL2Sxg…)
- "When you’re so delusional that the way you complain about people stealing artwor…" (ytc_UgylfIDi1…)
- "17:30 life is short. why does my identity need to be tied to an occupation that …" (ytc_UgxUDtZh0…)
- "Lol a robot will never be able to take the role of a human being.. because if th…" (ytc_UgzdDfm9b…)
- "Most humans lie when they say they're excited about something or other, too. No …" (ytc_UgxkmA3xn…)
Comment
It's a big problem that tools that aren't stable and finalized to a point where legislation about usage can be put in place is now spread globally with very little thought about consequences from the developers. In a well run world the developers would have been sued out of existence for potential harm.
The software model that many developers use where they take a program to early beta and then release it so the users can help them finalize with the money they earned it is bad enough for normal apps but is devastating for something as revolutionary as AI.
youtube · AI Responsibility · 2023-12-06T11:3… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyydKYHLsIvkR2_RbR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwuOAmwgDuzTapIaZ54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx9VJZuN5PUniV9XW14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwvz7-mGBDh04IBAQJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwvM0jOWGSPMnQ4DlJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwrJIp35ViXOlt4X-l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxOTbAeSoMpc8LdC2d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwVRiMnjUmTlhSDq_Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxTVIlWp2f2HSXs6Ax4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzcwqKyO76eQAlHhsZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
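The raw response above is a JSON array with one record per comment, keyed by `id` and carrying the four coding dimensions. A minimal sketch of how such a response could be parsed and looked up by comment ID (the `lookup` helper is illustrative, not part of the tool; the inline record reuses one entry from the response above):

```python
import json

# The four coding dimensions present in each record of the raw response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# A raw LLM response in the format shown above (one entry kept for brevity).
raw = '''[
  {"id": "ytc_UgwuOAmwgDuzTapIaZ54AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]'''

def lookup(raw_response: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID, or raise KeyError."""
    records = json.loads(raw_response)
    for rec in records:
        if rec.get("id") == comment_id:
            # Keep only the known coding dimensions, dropping the id field.
            return {dim: rec[dim] for dim in DIMENSIONS}
    raise KeyError(comment_id)

rec = lookup(raw, "ytc_UgwuOAmwgDuzTapIaZ54AaABAg")
```

Each returned record maps directly onto the "Coding Result" table: one row per dimension, with the coded value in the second column.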