Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Banning AI and automated weapons research will only ensure that people who don't care about the law or morals will reach the goal first. An incredibly naive idea. It's the equivalent of letting the axis powers create nuclear weapons before the allies can, if the allies would at all.
The weapons will be as such that only the same kind of weapon can combat it effectively, given how they can consistently outperform humans.
Source: youtube
Timestamp: 2018-04-08T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
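Each coded comment gets one value per dimension from a small vocabulary. A minimal validation sketch, using only the values observed in the raw response further down this page (the tool's full codebooks may be larger, and the helper name is ours, not part of the tool):

```python
# Per-dimension vocabularies observed in the raw LLM response on this page.
# These are illustrative; the actual codebooks may define more values.
OBSERVED = {
    "responsibility": {"none", "government", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "indifference", "outrage", "approval",
                "mixed", "resignation"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the observed sets."""
    return [dim for dim, allowed in OBSERVED.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above:
row = {"responsibility": "none", "reasoning": "consequentialist",
       "policy": "none", "emotion": "outrage"}
print(invalid_fields(row))  # → []
```

A record with a misspelled or out-of-vocabulary value would surface by dimension name, which makes it easy to flag malformed model output before it reaches the table.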
Raw LLM Response
[{"id":"ytc_UgyMsw_5sghugRgZzM14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzEBedvKqcFUkb1b894AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwGsrZA0IgP6Jft_094AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwS27Cr1Mx5vHWqzM54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyAoRXYxzT8z4j8qbt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx0XiMOC3M8BPV2Gb14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyWWGEI1-_J0amIte54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwQcBfCdpeceoxdRZN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwovzul0dbt9NL-bSV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwgrJW8y9MaeT3wjKJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]