Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- Unless the bordering countries start building their version of the Surovikin lin… (rdc_mcrlayx)
- also… •ai uses water (I’ve heard this many times, im not sure if this is fact) … (ytc_UgwYj03od…)
- Meanwhile ChatGPT seems to think my truck has a totally different engine when I … (rdc_n0m03gl)
- It should be obvious that Ai is (literally) a soulless mimic of the human condit… (ytc_Ugx-7bIRC…)
- That's subpar facial recognition technology. The developers need to come up with… (ytc_UgwdY8JVc…)
- @RobotTed I look at it like I would if I hired a live in nanny or housekeeper. I… (ytr_Ugz2BJmqX…)
- Tucker at the end: "AI has created something that is far smarter than humans" ??… (ytc_UgyKZ1MgQ…)
- The holy crusade against AI art is absurdly laughable to me, but I respect your … (ytc_Ugw6t1ZTW…)
Comment

> I’m a common sense guy. Biological beings will not survive. It will either have to assimilate or be wiped out or maybe left alone in its little corner of the universe to just try and get by as best it can. Are we just incubators? Maybe no one knows or can comprehend. I see so many scenarios but make no mistake we are already locked in. Human beings can’t help themselves. Who did curiosity kill? Yep..us. Don’t worry. As we are already locked in none of us really have a choice anymore. The reason we are here is because we’re survivors..if we die out then something else will try. If we make it to ai super then that might just mean we succeeded but it may not be the success you’re thinking of. I just hope we at least have a period of utopia until anything goes wrong..but would that be wrong?

Source: youtube · Posted: 2024-05-06T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxOoa4FixJpMwwykFh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwbTffeyqRVjB9IRtt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzJUYkVej3QLEHaGz54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw5AtRpQx9GGlC3XbR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw-UVakVVQMoBoX6Mh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx2ztaVpwAnjqbAWgt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugww57KxxAYLwN4m8RR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwSK9D5K5WHkjTXiJh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyvbulXDmxAGY2qU3R4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzpKz6c7Gv_AsM-yaZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
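A raw response like the one above can be checked mechanically before the codes enter the dataset. Below is a minimal validation sketch in Python; it assumes the model always returns a JSON array, and the allowed value sets are inferred only from the codes visible in this dump (the actual codebook may define more categories).

```python
import json

# Allowed values per coding dimension. NOTE: these sets are assumptions
# reconstructed from the codes visible in this dump, not the full codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "government", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "none"},
    "emotion": {"fear", "resignation", "approval", "outrage"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it has an "id" and every coding dimension
    holds one of the allowed values.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example with a hypothetical comment ID:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
print(len(validate_codes(raw)))  # 1
```

Records that fail validation can then be queued for re-prompting rather than silently dropped, which keeps the coded sample aligned with the sampled comments.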