Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
The biggest thing holding AI back from actually being dangerous and acting on it…
ytc_UgxX6AwjI…
People actually think this is real wow what is this world coming too, it's a fak…
ytc_UgyJmm_cJ…
I’m not sure that’s true. Sure the AI companies internally are ahead but that’s …
rdc_o80oeh3
Yeah sometimes I spend WAY too much time hunting down a bug the AI introduced. S…
ytr_UgxHSsrP8…
why is this even a debate??? No mater how convincing an AI may be, it will never…
ytc_Ugi6zFhOr…
there is no human intellectual labor because intellectualism is about thinking n…
ytc_UgxuVtABW…
What in the world do we need "artificial intelligence" for when the human intell…
ytc_Ugx9bf7_z…
who makes this killing weapons like AI, Robots , nuclear bombs, guns to kill hum…
ytc_UgzWIrcQ5…
Comment
The ban would be completely futile. Every nation capable of it 'will' research it whether the ban is in place or not, so it just becomes an exercise in who can operate in the most clandestine manner. They're right when they say it's the new nuke, in that it's the next iteration of mutually assured destruction. AI's on the way whether we like it or not, it's too lucrative and effective for any nation with skin in the game to ignore, so rather than trying to bury our heads in the sand the best bet is to be the cutting edge that develops the best AI. Welcome to your new space race.
youtube
2018-04-03T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugw0QS7E7tno3PFcub94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"approval"},
 {"id":"ytc_UgyPqjdvlZmjGugvEnh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgwqmRfpWMiI7S74MDB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgwKiYnjFrFjmgY2BYl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwhbH5nbdQPIETSJ7h4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyIfVkRJOUFGBPM6K94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ugx9lQHy65E6ilBa9_J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwyfaA-ijnkCtVPcqF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgykYB7zSFgCalhpSFd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgwjWdvb4RrIuqg9T1x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
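The raw model output above is a JSON array with one coded record per comment, so producing the per-comment result table is a matter of parsing the array and keying it by comment ID. A minimal sketch of that lookup, assuming the response text is valid JSON as shown (the `index_by_id` helper is hypothetical, not part of the tool; the IDs and values are copied from the response above):

```python
import json

# Raw LLM response: a JSON array of coded comments.
# Two records copied verbatim from the response above, for illustration.
raw_response = """
[{"id": "ytc_UgyIfVkRJOUFGBPM6K94AaABAg",
  "responsibility": "government", "reasoning": "consequentialist",
  "policy": "none", "emotion": "resignation"},
 {"id": "ytc_UgwhbH5nbdQPIETSJ7h4AaABAg",
  "responsibility": "user", "reasoning": "deontological",
  "policy": "none", "emotion": "fear"}]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the model output and key each coded record by its comment ID."""
    return {record["id"]: record for record in json.loads(response_text)}

codes = index_by_id(raw_response)

# "Look up by comment ID": fetch one comment's coded dimensions.
print(codes["ytc_UgyIfVkRJOUFGBPM6K94AaABAg"]["emotion"])  # resignation
```

Keying by ID this way also makes it easy to join each record back to the original comment text and platform metadata when rendering a result table like the one above.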