Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or browse the random samples below.

Random samples
- "@williamafton1376the problem with the advancements in AI is AI will be able to …" (ytr_UgwXsmozF…)
- "That's an interesting and humorous thought! If we look at it from a philosophica…" (ytr_UgzyJXtQm…)
- "I'm just saying, whse gonna create something like this on their own? Nobody woul…" (ytc_UgyZhEODm…)
- "The incentive systems of current AI all reward engagement highly. I think it wou…" (ytc_UgxVl_ePA…)
- "Did you guys see how the robot was trying to type Kill al the humans but instead…" (ytc_UgiT6JNbc…)
- "Esploytation of somebody's work... That's #Capitalism and we fought against ever…" (ytc_Ugy7TgJF0…)
- "Flash versus steel in metal come on now how the hell you expect to hurt that dam…" (ytc_Ugx-eotPY…)
- "Better invest in something else in AI, these equipment looks unreliable and not …" (ytc_Ugy6VvO5F…)
Comment
I do not agree with some of your conclusions, but i have great respect for your efforts, and I value deeply your contribution to this very necessary discussion.
Nobody knows where this swarm is going. If one listens to Altman then it seems that he understands, that AI may not/ must not remain a mirror of our humanity. If AI doesn't kill us, then we will somehow manage to eradicate ourselves. To be really, really honest, i feel we need to fear ourselves more.
Platform: youtube
Posted: 2025-12-04T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
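The table above reports "unclear" on every dimension even though the raw batch response below assigns specific values to other comment IDs. One plausible cause (an assumption about this tool, not confirmed by the page) is that the viewer falls back to a default record when the displayed comment's ID is missing from the parsed batch. A minimal sketch of such a lookup, with hypothetical function and field names:

```python
# Hypothetical lookup: find a coded record by comment ID, defaulting every
# dimension to "unclear" when the ID is absent from the parsed batch.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_coding(records: list[dict], comment_id: str) -> dict:
    """Return the coding for comment_id, or all-"unclear" if not found."""
    for rec in records:
        if rec.get("id") == comment_id:
            # Also default any individual dimension the record omits.
            return {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return {dim: "unclear" for dim in DIMENSIONS}
```

Defaulting rather than raising keeps the results table renderable even when a batch silently drops a comment.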
Raw LLM Response
[{"id":"ytc_UgzCX0-xmR8UNch4v214AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgxZGa02IcH-J2PvWEV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},{"id":"ytc_UgyizSSelWkX-3DBUSp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},{"id":"ytc_UgzBk5ZDYxK6--WUYqJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgygQOYm7T8WsPMCLRd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},{"id":"ytc_UgwPvEzL0TAT55aGqxt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},{"id":"ytc_UgzyUu5A2zQpXs549OB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},{"id":"ytc_UgzMwBazFTJ6Vp0Er6F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},{"id":"ytc_UgwpXv43-tJNqn7a5oB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"resignation"},{"id":"ytc_Ugz-iSiR2jH-v0zUv1l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"})
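Raw model output like the array above can arrive malformed (for example, a stray closing delimiter), so a pipeline typically parses it defensively and coerces out-of-schema values. A minimal sketch, assuming the allowed category values inferred from the sample output (the real codebook may define more):

```python
import json

# Allowed values per dimension, inferred from the sample output above;
# this is an assumption, not the tool's actual codebook.
SCHEMA = {
    "responsibility": {"none", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"resignation", "fear", "outrage", "indifference", "approval", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response; coerce out-of-schema values to "unclear"."""
    records = json.loads(raw)  # raises json.JSONDecodeError on malformed output
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                # Keep the record rather than dropping it; flag the dimension.
                rec[dim] = "unclear"
    return records
```

Coercing to "unclear" instead of raising means one hallucinated label does not discard an otherwise usable batch; a stricter pipeline could log each coercion for review.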