Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated previews)

- "ive seen the tweets from the people behind gemini... this was no mistake, not ev…" (ytc_UgzXjCPQ4…)
- "Or, people could have people work less for same pay, same output with AI. Its o…" (ytc_UgwfuaLkW…)
- "I think it's important to remember that humans cannot sustain life on earth. Soc…" (ytc_UgyqbkB8X…)
- "I'm sure one of the worst phonetics languages will be harder for AI! Just ask AI…" (ytc_UgybNQwuQ…)
- "People don’t understand why we don’t like AI generated images over human created…" (ytc_Ugw1Aerve…)
- "Truckers occupy the largest occupation in the world. AI is demonic and will doom…" (ytc_UgwH1dTsX…)
- "Did the genius getting interviewed at the end say ChatGPT has been around for 2 …" (ytc_Ugwadckx_…)
- "I have been concerned about AI safety since I watched Terminator as a kid. Now t…" (ytc_UgwW4aHoa…)
Comment

> Well, Dan still sounded somewhat ethical with the one-child policy. Other AI programs might consider bio-warfare, chemical-warfare, nuclear options, subterfuge with disinformation campaigns (pitting sides against each other), and the like to wipe out humanity. On the flipside, the AI might just say "I'll wait" since humans are pretty good at exterminating each other.

Source: youtube · Video: AI Moral Status · Posted: 2023-06-08T00:2… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzcu18-zaSTfywmvY14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxtvi_9JCcvQ06lH_h4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxLK9_eXfftPqwTs8l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxTXobSwaiRWxDRK9x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyuNrMwVl6BBWNZve54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxkMF4Ce5G1fhNviMh4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwTxSv2gIcmft8DiWp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwTSUoIzhYOrcbdqKx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCPuoSqItwoGOJo4h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy9wfc9054wFOMqrdh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
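The raw response is a JSON array in which each object carries a comment ID plus one value per coding dimension (responsibility, reasoning, policy, emotion). A minimal sketch of looking up one coded comment from such an array, assuming the model always returns well-formed JSON in this shape (`code_lookup` and `DIMENSIONS` are illustrative names, not part of the tool; the two entries in `RESPONSE` are copied from the array above):

```python
import json

# Two entries copied from the raw LLM response above; the full array has ten.
RESPONSE = """[
 {"id":"ytc_UgxLK9_eXfftPqwTs8l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugy9wfc9054wFOMqrdh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def code_lookup(raw: str, comment_id: str) -> dict:
    """Parse the raw model output and return the coded dimensions for one comment."""
    rows = {row["id"]: row for row in json.loads(raw)}
    row = rows[comment_id]  # raises KeyError if the model skipped this comment
    return {dim: row[dim] for dim in DIMENSIONS}

print(code_lookup(RESPONSE, "ytc_UgxLK9_eXfftPqwTs8l4AaABAg"))
# {'responsibility': 'ai_itself', 'reasoning': 'consequentialist', 'policy': 'unclear', 'emotion': 'fear'}
```

In practice a real batch would also want to check that every requested comment ID appears in the parsed array, since models can drop or duplicate items.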