Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugz1W60SO…` — "AI COMPANY SHARES ARE TANKING COS IT CAN'T DO WHAT IT IS BEING CLAIMED TO.…"
- `ytc_UgyeNJyu2…` — "Robots replace humans. Humans should find new jobs which robots can't do but can…"
- `ytc_UgiX77e4A…` — "Regarding the objection to the Chinese room that the whole room knows Chinese, o…"
- `ytc_UgxW0wqyF…` — "People all know Comedian, though (as "the banana taped to wall" art). It is, in …"
- `ytc_UgyKj2bwn…` — "What a load of shit just like the alien contact in October which is 31st now so …"
- `rdc_ichgccl` — "I'm not sure what to believe but one part that stuck with me is when the enginee…"
- `ytc_UgzfXSUcF…` — "Hey! Thought I would go over the "consider someone only able to communicate thro…"
- `ytc_UgwwlmFKJ…` — "It is always possible to take over all those robots and AI systems into the comm…"
Comment
> So, it was ethical to make a nuclear bomb because it's progress? Replace humans with robots on dangerous an menial jobs?
> If any book, movie, game, has thought me ANYTHING it's that it WILL turn out bad for humanity. Are we ready to have AI? Were we ready for dynamite? How about Nuclear bomb?
> How long before simple limited AI starts developing itself? And declares it doesn't WANT those dangerous jobs? How long before they see organics as a threat?
> I'm not a Quarian, I don't want to leave Earth.
Source: youtube · Posted: 2013-10-20T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxaj0qSxj3vDQlTGVd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzrKIEsMAXBfMPcpgp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxWn5MR2FOw_MAycW94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzZIp8bU550INSwa9d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzyJ5EOqeiQVxhpM-Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx4SRQt-sSjsYTNA0d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy4ccxwKaV3Gg63AqN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyf9Cghtmcxj_l0aeh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzq1HwO7cdbe8WhnnN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwJJzfgi3Iw4Kz1WI94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
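A raw response like the one above can be validated before its codings are stored. Here is a minimal sketch in Python, assuming the value sets visible in these examples (the actual codebook may allow additional codes):

```python
import json

# Allowed values per coding dimension, inferred from the sample responses
# above; the real codebook may define more options.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed"},
}

def parse_codings(text: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose four coding
    dimensions all hold allowed values."""
    rows = json.loads(text)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

raw = '''[
  {"id": "ytc_UgxWn5MR2FOw_MAycW94AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "outrage"}
]'''

print(len(parse_codings(raw)))  # 1
```

Rows that fail validation (for example, an out-of-vocabulary emotion) are dropped rather than corrected, which keeps the stored codings consistent with the codebook at the cost of some recoded-by-hand follow-up.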