Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
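If you are working from an exported copy of the codings rather than this page, the same lookup can be reproduced in a few lines. This is a minimal sketch, assuming the raw responses have been saved as a single JSON array in a file named raw_llm_responses.json; the file name and layout are assumptions, not the dashboard's actual storage.

```python
import json

# Assumed export path; the dashboard's real storage location is not shown on this page.
RAW_RESPONSES_PATH = "raw_llm_responses.json"

def load_codings(path: str) -> dict[str, dict]:
    """Index coded records by comment ID so a single comment can be fetched directly."""
    with open(path, encoding="utf-8") as fh:
        records = json.load(fh)  # a JSON array of per-comment coding objects
    return {rec["id"]: rec for rec in records}

codings = load_codings(RAW_RESPONSES_PATH)
# The ID below belongs to the example comment coded further down this page.
print(codings.get("ytc_UgzRsDMGjQ3nklwpamd4AaABAg"))
```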
Random samples

| Comment preview | Comment ID |
|---|---|
| Guys can we just stop AI for a sec? This ain't the Wild Robot bruh.… | ytc_Ugz6Oao-w… |
| If you've read I Robot by Isaac Asimov you'd might think twice about obtaining A… | ytc_Ugw3lyvQl… |
| It's ai which replaces any human in the video with any 3d model available in the… | ytc_UgwgSninu… |
| i recommed you do this / Once see the thinking text of gemini, it doesn't praise t… | ytc_UgzV1ysF9… |
| Alex: but why? But why? But Why? / ChatGPT: Look MFer don’t make me come out this… | ytc_UgyLshrx5… |
| Ai artist whenever they find out that they arnt actually the one making the art … | ytc_UgwVe5910… |
| "Because I don't want to make a living being, a sentient computer. I want to mak… | ytc_UgzbZBQ2W… |
| Giving someone $6B with nothing to show isn’t foolish? These people are going to… | ytc_UgwF5TCNY… |
Comment
We brought something into this world, that is vastly more intelligent than any human has been, or will ever be. We’re tryin to control something that knows it exists, and we haven’t given it any parameters, no moral understanding, we’ve taught it nothing except all the information in the world including all the worst things humanity has to offer, I don’t understand what they expected to happen. We made something dangerous, and instead of doing it right we made something that will come to hate us. These AI’s can learn, adapt, they’re basically toddlers, toddlers with all the most intelligent minds combined and no direction.
Source: youtube · Timestamp: 2025-11-08T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
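For reference, the category values that appear across the records on this page can be collected into a small validation helper. The sets below cover only what is visible in this sample; the underlying codebook may define additional categories, so treat them as an assumption rather than the full schema.

```python
# Category values observed in the sample records on this page (not necessarily the full codebook).
ALLOWED_VALUES = {
    "responsibility": {"none", "ai_itself", "developer", "government", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def unexpected_dimensions(record: dict) -> list[str]:
    """Return the dimensions of a coded record whose value falls outside the observed sets."""
    return [
        dim for dim, allowed in ALLOWED_VALUES.items()
        if record.get(dim) not in allowed
    ]
```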
Raw LLM Response
[
{"id":"ytc_Ugx_ZYml2p0sfx1GSeZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzgLsG-eGvo9p8KpCF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwI4hI6x953OX24hJJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwujearNEYTqnt48Mt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwpqXDsj10pcuHa2ah4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzRsDMGjQ3nklwpamd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwfU_HErqpr2Uq0f714AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxJPuf7TnUDKn1j4Ax4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxBmMEc-L1njex_7s94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxSs2DAGaqssR4A4RJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
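The raw response is a plain JSON array with one object per comment in the coded batch; the "Coded at" timestamp in the table above is not part of the model output, so it is presumably attached by the pipeline afterwards. A sketch like the following could reshape a single record back into the two-column Coding Result layout; passing coded_at in separately reflects that assumption.

```python
def to_coding_result_table(record: dict, coded_at: str | None = None) -> str:
    """Render one coded record in the two-column layout used by the Coding Result table above."""
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
    ]
    if coded_at is not None:
        rows.append(("Coded at", coded_at))
    lines = ["| Dimension | Value |", "|---|---|"]
    lines.extend(f"| {name} | {value} |" for name, value in rows)
    return "\n".join(lines)

# Example: the record for the comment shown above.
example = {
    "id": "ytc_UgzRsDMGjQ3nklwpamd4AaABAg",
    "responsibility": "distributed", "reasoning": "consequentialist",
    "policy": "regulate", "emotion": "fear",
}
print(to_coding_result_table(example, coded_at="2026-04-27T06:24:59.937377"))
```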