## Raw LLM Responses

Inspect the exact model output for any coded comment: look one up by its comment ID, or pick one of the random samples below.
Random samples:

- So disgusting how underaged users are actually using both ChatGPT and Grok to ch… (ytc_UgzH3SMOA…)
- >I simply wanted to know which specific job titles declined or grew the most … (rdc_nntq65x)
- I suspect that if A.I. decides to bring peace to the earth , humanity will becom… (ytc_Ugz5LFLNj…)
- I think context in what the prefer is important. Statistically white men have mo… (ytc_Ugx3GuniN…)
- Solution : regulate the AI industry. But Americans will never do that. Their pol… (ytc_Ugx3_F6js…)
- They called it a suicide immediately because they were *told* to and the scene w… (ytc_UgwdXb0ku…)
- "Is AI Coming for Your Job?" It already did. The owner was already retiring, so… (ytc_UgzgpzwS4…)
- Can somebody explain to my why this text was so problematic? I understand that t… (ytc_Ugz3M34zH…)
### Comment

> Just look at the history. Following through was the best option.
> Should we have stopped developing guns? Then bad guys would have kept building them.
> Should we have not developed nukes? Then Russia would be in control
> Should we not develop AI? China, Russia, Terrorists will have more power
> Just like Nukes, we need to be at the front.
> Believe it or not, but the world has evil people in it, regardless of what technology we have

Platform: youtube · Posted: 2019-04-16T20:2…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
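
As a rough illustration, a coded record like the one above could be checked against the coding scheme before it is displayed. The allowed value sets below are inferred from the examples shown on this page and may well be incomplete; `ALLOWED_VALUES` and `validate_coding` are illustrative names, not part of the actual pipeline.

```python
# Hypothetical validation sketch. The value sets are inferred from the examples
# on this page and may not cover the full coding scheme.
ALLOWED_VALUES = {
    "responsibility": {"none", "user", "company", "developer", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems found in one coded record (empty if it looks valid)."""
    problems = []
    for dimension, allowed in ALLOWED_VALUES.items():
        value = record.get(dimension)
        if value is None:
            problems.append(f"missing dimension: {dimension}")
        elif value not in allowed:
            problems.append(f"unexpected value for {dimension}: {value!r}")
    return problems
```

On the record shown in the table (government / consequentialist / regulate / fear), this check would return no problems, since every dimension carries one of the expected values.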
### Raw LLM Response

```json
[
  {"id":"ytc_Ugy3boToPB_xWnwDHgh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyh2gJc47ez_S9dRKN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyIE2I8RCmsT7k9A9Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzldFL3hAVhJj1xO9B4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzXHaMIBlkxJAOtj8d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzFQY3tdogCJuB7cOR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz9HDU1etCXTMNqZPJ4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy5xzYdlGdJWiWktBp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwP-7b0tk7S3HzwoAB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyyVZ6vNhke3sRzYqV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
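
A minimal sketch of the lookup this page performs, assuming the raw response is stored as the JSON array shown above: parse it and pull out the record for one comment ID. The names `raw_response` and `find_coding` are illustrative only, not the tool's actual implementation; the sample record mirrors the coded comment highlighted above.

```python
import json

# Illustrative stand-in for one stored raw LLM response (a JSON array of records).
raw_response = """
[
  {"id": "ytc_UgwP-7b0tk7S3HzwoAB4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def find_coding(response_text: str, comment_id: str) -> dict | None:
    """Parse a raw LLM response and return the record for comment_id, or None if absent."""
    records = json.loads(response_text)
    return next((r for r in records if r.get("id") == comment_id), None)

print(find_coding(raw_response, "ytc_UgwP-7b0tk7S3HzwoAB4AaABAg"))
# {'id': 'ytc_UgwP-7b0tk7S3HzwoAB4AaABAg', 'responsibility': 'government', ...}
```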