Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "As soon as I saw salt and AI I knew it was the bromination case. I was waiting f…" (ytc_UgydsZRkj…)
- "The predictive algorithims that power chat GPT wont ever become consious, but w…" (ytc_UgxA2w141…)
- "Prime example of why I keep saying humanity is not ready for AI or robots becaus…" (ytc_UgwZpchBg…)
- "@tylermoore4429 Many people deny AI existential risk is even real, so this doesn…" (ytr_Ugz_o_4UN…)
- "asteroids are full of precious metals, dinosaurs are warm blooded and still thri…" (ytc_UgxLqUFlg…)
- "you want too much money for no skill work and lousy production and attitude. AI…" (ytc_Ugz01J_g0…)
- "I don't see the value in all the statements in this video made by AI (I think th…" (ytc_UgxkceJPH…)
- "Freaking love this stuff. First off, this isn't the "self aware" Terminator AI t…" (ytc_UgwEVgBuy…)
Comment

"The next step in evolution is to scram and let robots take over, I would accelerate AI research. I am sure robots will take better care of what we left of this planet. They would also have no time problem with exploring the universe. And in a few decades they would also write better poetry than we ever could. The robots are not our enemies, they are our own next stage."

Platform: youtube
Posted: 2015-08-11T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwm9I9NcRQElvQfqu54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwRhW6ydR3WoIlU3gl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgghtrugE12abngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugic-8CdfbK863gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgiiVzQEVXTO8XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgigNAG8ggHJ7HgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugigkb4gWN8_I3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugi_4VKjBann7HgCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugi9Gszi21MTEngCoAEC","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggnLXyVGHuX8XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
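Raw responses in this shape can be machine-checked before the codes are stored. The sketch below is not the tool's actual code; the allowed value sets and the `ytc_`/`ytr_` ID prefixes are inferred from the values visible on this page and are assumptions, not the full codebook.

```python
# Minimal validation sketch for a raw LLM coding response.
# ALLOWED is inferred from the sample above, not a complete codebook.
import json

ALLOWED = {
    "responsibility": {"none", "ai_itself", "government", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear", "regulate", "liability", "ban"},
    "emotion": {"indifference", "outrage", "fear", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse the model's JSON array and reject rows with unknown codes."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs on this page start with ytc_ (comments) or ytr_ (replies).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={row.get(dim)!r}")
    return rows

# One row taken verbatim from the response above.
raw = ('[{"id":"ytc_UggnLXyVGHuX8XgCoAEC","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]')
rows = parse_coding_response(raw)
print(rows[0]["emotion"])  # outrage
```

Rejecting unknown codes at ingest time (rather than silently storing them) makes it easy to spot when the model drifts outside the coding scheme.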