Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “Why do you think it will be able to in 100 years? Don’t say “you don’t know what…” (ytr_Ugxj5DCPW…)
- “Humans are so dumb they think animals are not sentient or self aware. What the h…” (ytc_UgzcD3BO3…)
- “Surely there’s a roof on this AI business though? Say we replace every job with …” (rdc_j6g0ulq)
- “Stop your phone from listening to you by plugging in a pair of earbuds with the …” (ytc_Ugz_AdB_r…)
- “1. Most Data Centers today are already working hard to move to renewable energy …” (ytc_Ugw6owrZ5…)
- “Mmmm. Sounds like this guy is here to keep the OpenAI investment money rolling i…” (ytc_UgyVJO61W…)
- “i just want to know the "kneeling down" and "barking" part was your idea or ai?…” (ytr_UgxGzqteR…)
- “As someone who works in IT service desk, I'm not worried yet. This call did 0 tr…” (ytc_Ugzba4cl0…)
Comment
To predict the future, look to the past (at least for the human part). "AI will benefit everyone" is a lie. As with oil, these people will amass fortunes and won't share them with anyone, even if they don't need the money. AI will be used by a few, maybe even by one, to dominate the rest of the world. And there are real fascists in the game, some who won't hesitate to kill others just because they are different, or because they simply wouldn't bend. So even the first part of the story does not seem encouraging at all. Are AIs systems that care about others? Or are they just systems geared toward survival?
youtube
AI Governance
2025-10-22T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
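A coded record like the one above can be held as a plain dict; the microsecond-precision `Coded at` timestamp parses directly with the standard library. This is a minimal sketch — the field names are assumptions, not the tool's actual storage schema:

```python
from datetime import datetime

# One coded record, mirroring the table above (field names are assumptions).
record = {
    "responsibility": "company",
    "reasoning": "consequentialist",
    "policy": "regulate",
    "emotion": "outrage",
    "coded_at": "2026-04-26T23:09:12.988011",
}

# datetime.fromisoformat handles the microsecond-precision ISO timestamp.
coded_at = datetime.fromisoformat(record["coded_at"])
print(coded_at.year)  # → 2026
```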
Raw LLM Response
```json
[
  {"id":"ytc_UgxC4CSkSB2-Cnn347t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyucFbl-PAegc4b23x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyJuoOtFECFbPHey3h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzQjqknbxaEaUkgdxB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxrPCz8vna0jAbqlO14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxEAYKLnqeocaw6vlR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxrmfBfjfXpVJjGxft4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzqt4wb7bgn7vvOLOd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzCMRZnSqbjoKZynDB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
  {"id":"ytc_Ugx67Cuvw5-b4nfIS-t4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
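Raw responses in this batch format can be checked before ingestion: parse the JSON array and confirm every code falls in the allowed set for its dimension. A minimal sketch — the `ALLOWED` sets are inferred from the values seen in responses above, not an official codebook, and the sample IDs are made up:

```python
import json

# Allowed values per dimension — inferred from observed responses; treat
# these sets as assumptions, not the project's actual codebook.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"fear", "approval", "outrage", "indifference", "mixed", "unclear"},
}

# Hypothetical raw model output in the same batch format as above.
raw = '''[
  {"id":"ytc_example1","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_example2","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]'''

rows = json.loads(raw)

# Collect (id, dimension) pairs whose value is missing or out of range.
bad = [
    (row["id"], dim)
    for row in rows
    for dim, allowed in ALLOWED.items()
    if row.get(dim) not in allowed
]
print(bad)  # → [] (an empty list means every code is in the allowed set)
```

Rows flagged in `bad` could then be re-queued for recoding rather than written to the results table.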