Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or pick one of the random samples below to inspect it.
- "I feel so bad for her and people who are AI are super disgusting and despicable…" (ytc_Ugw6ziQfX…)
- "Which is something AI bros don’t have. Clear by their “AI can do it faster” argu…" (ytr_Ugy3wAWf1…)
- "You indeed are wrong about this. AI art is objectively terrible for artists and …" (ytc_UgyNTSkUu…)
- "ChatGPT can’t encourage suicide! The second you mention something about suicide …" (ytc_Ugw9JeT5W…)
- "@davidfuqstell that the the sorority of people that dislike ai. Ai has such a b…" (ytr_UgzQEaEJP…)
- "I hope AI destroys us all. I will plead for my life just so I can get a front r…" (ytc_UgxnGfr_P…)
- "We are all God, including those who create AI, like the owners of the Roman Colo…" (ytr_UgwJKhVSA…)
- "I agree with most of what you wrote, but the first part is based on a dangerous …" (rdc_mvbiu9p)
Comment rdc_mvbiu9p (youtube, 2022-03-20T18:2…)

> The A.I systems would probably still get orders from humans. No government would risk having an A.I system that just does what it wants.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgxXY1A1jB1RwuOhmTN4AaABAg.9gQBCkCK7Cd9uAjGtMeUtm","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyXvOkouEfwO_U0wvh4AaABAg.9Zln4KNMEfe9ZnzvganPAj","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwbV_FSiG4KcX6TVEV4AaABAg.9YHhhmDzbkU9Zo-3rckqbA","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyoGNtEJG_8qhOlEbB4AaABAg.9Xm91E0VQzj9nbinY_9OOr","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgxnMP_6vZ3HHi0jTJt4AaABAg.9Q_k7KHtBrY9Q_n9C6apOe","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwScv2WH9lqjQNscEV4AaABAg.9Q55hprlhym9Q8svJjCvBR","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyF-FYilKZnSJt-vlB4AaABAg.9Q3ZrWSSZnD9Qmc5-E14rP","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyobqgoESh_kyIlhrx4AaABAg.9Q-qnIpXeO19Tlk9XkcAlv","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxfJdxh82VFqzLi7j14AaABAg.9PUlJucnC4T9PUvivVwTdm","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwyuYpc9UsJCgNig1x4AaABAg.9PRXs3NgrZt9T6l1VKX94A","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
```
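A raw response like the one above can be parsed and sanity-checked before its codings are stored. The sketch below assumes the four dimensions shown in the table and restricts each to the values that actually appear in this page's samples; the real codebook may define additional categories, and the validator function and its name are illustrative, not part of the tool.

```python
import json

# Allowed values per dimension, inferred only from the samples on this page
# (assumption: the full codebook may permit more categories).
CODEBOOK = {
    "responsibility": {"government", "company", "developer", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"unclear"},  # only "unclear" is observed above
    "emotion": {"fear", "indifference", "resignation", "unclear"},
}

def validate_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    items = json.loads(raw)
    valid = []
    for item in items:
        # Each entry must be an object with a comment ID.
        if not isinstance(item, dict) or "id" not in item:
            continue
        # Every dimension must be present with a known value.
        if all(item.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(item)
    return valid

raw = ('[{"id":"ytr_example","responsibility":"government",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
print(len(validate_coded_batch(raw)))  # 1
```

Entries that fail validation are silently skipped here; in practice the tool would presumably log or re-prompt for them, since a dropped entry means a comment stays uncoded.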