Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "@sgabriel26 Thank you for your comment! And here's a thought to ponder: if robot…" (ytr_UgzWBEcbM…)
- "Ai's heart is racing as it pitifully attempts to redirect the rp, but is futile …" (ytc_UgynRLV5T…)
- "Why people talk to AI. You can talk whenever you want, about whatever you want,…" (ytc_UgyGdCZHP…)
- "This does not make sense. I think much confusion is being caused that can be cl…" (ytc_UgzuBzD9f…)
- "who makes this killing weapons like AI, Robots , nuclear bombs, guns to kill hum…" (ytc_UgzWIrcQ5…)
- "I'm a 68 year old female who never had children. Which I am fine with. And so I …" (ytc_UgyEqZ3rL…)
- "Eh, I don’t really care tbh. Our phones already hold so much data on us (locatio…" (ytc_UgwnUIUtd…)
- "AI "artists:" the sad beige moms of the art community, now in tech bro dystopia …" (ytr_UgxRqZHdC…)
Comment
All AI needs a shut off button and an ability to override its program. Currently, we've lost the ability to override our PCs. We can no longer choose if we want a camera or speaker installed, programs are forced upon us that we cannot delete from our cell phones and PCs. People need to have control over all PCs, cells and AI components. Having self driving cars without a brake to override the automated system is stupid and dangerous. Having brain chips to cure diseases implanted in brains that hold the potential to create a slave state is stupid and dangerous
Platform: youtube
Timestamp: 2024-08-25T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwpVxjzCT-2P_BFB_h4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzUvviTdpDWsn2YygF4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgxYL8afFJhv5osE8_d4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzLBn9lW3KjJer5FEd4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgyPd-8vCvoIYrjzXMx4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy00-bZ1P3L4AaOFh14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxoEHGblPKKqdJvdj14AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgypGDaUCD8ubFjWhdt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwO8t889zH-CAhqrud4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyRY3bRLf3hqkONpC14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
```
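The lookup-by-ID view can be reproduced offline. Below is a minimal Python sketch, assuming the raw response format shown above (a JSON array of objects with an `id` field and the four coded dimensions); the inlined `raw_response` string is hypothetical sample data, not a real export.

```python
import json

# Hypothetical sample data in the same shape as the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UgwpVxjzCT-2P_BFB_h4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzUvviTdpDWsn2YygF4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "liability", "emotion": "approval"}
]
"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse a raw response and map comment ID -> coded dimensions."""
    records = json.loads(raw)
    index = {}
    for rec in records:
        # Skip malformed records that lack an ID or any coded dimension.
        if "id" in rec and all(dim in rec for dim in DIMENSIONS):
            index[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return index

codes = index_by_id(raw_response)
print(codes["ytc_UgwpVxjzCT-2P_BFB_h4AaABAg"]["policy"])  # regulate
```

Indexing once up front turns each subsequent comment-ID lookup into a constant-time dictionary access, which matches how the inspection view retrieves a single coded record.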