Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Actually, the comparison of fully autonomous weapons to animals such as guard dogs really apt. And one that I hadn't really seen or thought of before. Ultimately, people's concerns about fully autonomous weapons are their unpredictability as well as their potential for misidentification and attacking the wrong target. All things we already have with guard dogs and other animal pets. As well as rules and frameworks around consequences and responsibility for their actions should such mistakes occur. They could thus make form the basis that informs such frameworks for fully autonomous weapons as well.
Source: youtube · 2024-06-30T17:1… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgzxCbxuXWUJhrsNqfh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzgqLtQK0O0DmPYdx94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw1aCNv_FoIevkv7dl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzAV70__PRqJhqUIM14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyRq0iMi7R8RwiRkol4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzBfNnJAbOo9kot-Bt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzFT5MOQ_2KuftEBLt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy9BdJCKhbu16V96v14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzAg7w-Ema6FaE5OG14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwCPbK-1BoUO5KgUYV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
```
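The raw response is a JSON array with one coding object per comment, keyed by comment ID. A minimal sketch of looking up one comment's coding from such a batch response (the function name `coding_for` and the two-row sample payload are illustrative, not part of the tool's actual API):

```python
import json

# Illustrative two-row excerpt in the same shape as the raw LLM response above.
raw = (
    '[{"id":"ytc_UgzxCbxuXWUJhrsNqfh4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_UgwCPbK-1BoUO5KgUYV4AaABAg","responsibility":"distributed",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)

def coding_for(raw_response: str, comment_id: str):
    """Parse a batch coding response and return the row for one comment ID.

    Returns None when the model's response contains no row for that ID.
    """
    for row in json.loads(raw_response):
        if row.get("id") == comment_id:
            return row
    return None

# Look up the coding shown in the table above.
print(coding_for(raw, "ytc_UgwCPbK-1BoUO5KgUYV4AaABAg"))
```

Because the model returns codings for a whole batch, a per-comment view like this page only needs to filter the array by `id`; an ID with no matching row signals that the model skipped or mis-keyed that comment.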