Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- " handicap themselves " some real entitled, out of touch, and apathetic tone dea… (`ytr_UgxLE0y4L…`)
- I dont understand the entire point of "we can use AI so no one has to be an arti… (`ytc_UgyW2IoID…`)
- 13/50 are upset the AI learns and adapts after reading millions of police report… (`ytc_Ugy4gVWCg…`)
- I absolutely hate ai. I am actually in the middle of tracing a ai picture right … (`ytc_Ugy5UntlC…`)
- i feel like the ai ads for google pixel 9 are made by ai itself 😭… (`ytc_UgwNs9g2I…`)
- At the end of the day or the planet billions of dollars isn't going to be worth … (`ytc_UgzJVuUOI…`)
- The irony he calls everything nuts and insane, meanwhile call himself a Marxist,… (`ytc_UgwIACSOj…`)
- With all due respect, this video is a bit behind the curve, kind of making it si… (`ytc_UgzqTvo9w…`)
Comment

> I think this will have pros and cons in a long run. Developing robots capable to learn from humans and automatically save in a sort of AI cloud that is automatically shared among robots is dangerous. Let's see this differently. Hbt they learn techniques of destruction, war,.. This will turn then to revold against human and take control of the world.

Source: youtube · Video: AI Moral Status · Posted: 2019-11-17T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyQGgrTBFjdAdQX-6F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyRtH3JLC8jIt1H3VJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzBatEZTWaxR5uoM7d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzRsKZrOKR59jYhsWJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw1L4bdk4DOeCiNqyl4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxUWdKy4oghqZsrobx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-aKCQi7ubCWGqGaB4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxYyHC9vGrH7NKq0xx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz-5_UrxLuS4UzFrol4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw0JaG9uh_EGzMv65d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
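The raw response is a JSON array of one record per coded comment, so recovering a single comment's codes amounts to parsing the array and indexing it by `id`. A minimal sketch of that lookup, assuming the field names shown above (the `raw_response` string and its IDs here are stand-in examples, not real records):

```python
import json

# Stand-in for a raw batch-coding response; field names mirror the
# records shown above (id, responsibility, reasoning, policy, emotion).
raw_response = """
[
 {"id": "ytc_example1", "responsibility": "developer",
  "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
 {"id": "ytc_example2", "responsibility": "none",
  "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
"""

# Index the batch by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coded dimensions for one comment.
print(codes["ytc_example1"]["policy"])   # -> regulate
```

Keeping the parsed records keyed by ID is what makes the "look up by comment ID" inspection cheap: the coded dimensions for any comment are a single dictionary access away.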