Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Human control is arguably the most dangerous aspect of an autonomous weapon system. You could program a robot to follow a moral code, the geneva conventions, and to minimize civilian losses. A human sitting at the control center thousands of kilometers away will probably not hesitate to sacrifice some innocent lives for "the greater good". Don't forget that an A.I can't be evil if you don't program it to be, while there are plenty of horrible humans.
Source: youtube · 2022-03-20T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgxXY1A1jB1RwuOhmTN4AaABAg.9gQBCkCK7Cd9uAjGtMeUtm","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyXvOkouEfwO_U0wvh4AaABAg.9Zln4KNMEfe9ZnzvganPAj","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwbV_FSiG4KcX6TVEV4AaABAg.9YHhhmDzbkU9Zo-3rckqbA","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyoGNtEJG_8qhOlEbB4AaABAg.9Xm91E0VQzj9nbinY_9OOr","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgxnMP_6vZ3HHi0jTJt4AaABAg.9Q_k7KHtBrY9Q_n9C6apOe","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwScv2WH9lqjQNscEV4AaABAg.9Q55hprlhym9Q8svJjCvBR","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyF-FYilKZnSJt-vlB4AaABAg.9Q3ZrWSSZnD9Qmc5-E14rP","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyobqgoESh_kyIlhrx4AaABAg.9Q-qnIpXeO19Tlk9XkcAlv","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxfJdxh82VFqzLi7j14AaABAg.9PUlJucnC4T9PUvivVwTdm","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwyuYpc9UsJCgNig1x4AaABAg.9PRXs3NgrZt9T6l1VKX94A","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
```
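The raw response is a JSON array with one coded record per comment, each carrying the four dimensions shown in the table above. A minimal sketch of parsing such a response and looking up one comment's coding by ID (the two-record `raw` payload here is a shortened stand-in for the full array above; field names match the response exactly):

```python
import json

# Shortened example payload in the same shape as the raw LLM response:
# a JSON array of per-comment records with four coded dimensions.
raw = """
[
  {"id": "ytr_UgxXY1A1jB1RwuOhmTN4AaABAg.9gQBCkCK7Cd9uAjGtMeUtm",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgyXvOkouEfwO_U0wvh4AaABAg.9Zln4KNMEfe9ZnzvganPAj",
   "responsibility": "user", "reasoning": "deontological",
   "policy": "unclear", "emotion": "indifference"}
]
"""

records = json.loads(raw)

# Index the records by comment ID so any coded comment can be
# retrieved directly, as the inspection page does.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytr_UgyXvOkouEfwO_U0wvh4AaABAg.9Zln4KNMEfe9ZnzvganPAj"]
print(rec["responsibility"], rec["reasoning"])  # user deontological
```

The dict index makes each lookup O(1); for the full dataset the same pattern applies after loading the complete array from wherever the responses are stored.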