Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytr_UgzOt2HuV…`: "I disagree with you totally! Have you driven a Tesla with FSD AI lately???? You’…"
- `ytr_UgyolKZJV…`: "I don't think the world is made better by hiding the truth, not in the long run.…"
- `ytc_Ugw8mY0ct…`: "No it will take a bit of time before the AI runs enough of everything to kill us…"
- `ytr_UgzNNqMkv…`: "I agree, yet these AI elites don't care about indigenous people or people in gen…"
- `ytr_UgxYDN-m1…`: "@Pseudo_ShorkX3 You misunderstand my point. I am not mad and I don't care if she…"
- `ytc_UgwCOC5jw…`: "Ai art to me is somthing like nuro that vedal made / As in the ai it self can be a…"
- `ytc_UgyMuG-wf…`: "The real reasons for this man’s talk is exposed at the very end. He wants more f…"
- `ytc_UgysunZ_O…`: "Tbf, this is probably the fewest number of humans AI will end up k*lling for the…"
Comment
I've been thinking of that since the beginning of this AI debacle. Companies should be focused towards increasing the workforce not the other way around. Are they really thinking of a dreamworld where they could just deploy infinite AIs at will and treat them as second class programmable ready to use customer?
youtube · AI Harm Incident · 2024-07-29T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
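The "Coding Result" table above is a single coded record rendered as a two-column Markdown table. A minimal sketch of that rendering step, assuming the record is a dict with the four dimension keys shown in the raw JSON below (the function name `to_markdown` is hypothetical):

```python
def to_markdown(rec: dict) -> str:
    """Render one coded record as a two-column Markdown table,
    matching the Dimension/Value layout shown above."""
    rows = [
        ("Responsibility", rec["responsibility"]),
        ("Reasoning", rec["reasoning"]),
        ("Policy", rec["policy"]),
        ("Emotion", rec["emotion"]),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {k} | {v} |" for k, v in rows]
    return "\n".join(lines)
```

Passing the record coded above (`company` / `deontological` / `regulate` / `outrage`) reproduces the table row for row.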
Raw LLM Response
```json
[
{"id":"ytc_Ugxm7ENojjvkF12DvLB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyZMhLpDCn4D5jSUZt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxHPTbCD7wyErFM7JN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwUKSeAiYp0lRJ5jqh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxKlPjNG7aKu82glWN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzDtVKFLRUw7SZ4DbR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx2rbKNDHKC-hUyz_t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy7_X9JdCJW5fuRos94AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgytNdnkuR-ZnmMgq9Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxU9dPXuKcGOp-LgB14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
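Before a raw batch like the one above is written to the coding results, each record has to be parsed and checked against the codebook. A minimal sketch of that validation step, where the allowed values are only those observed in this page's output (the real codebook may define more categories, and `validate_batch` is a hypothetical helper, not part of any shown pipeline):

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Assumption: the actual codebook may contain additional categories.
CODEBOOK = {
    "responsibility": {"company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "mixed", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of codings) and keep only
    records that have an id and a valid value for every dimension."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid
```

Records with an out-of-codebook value (a common LLM failure mode) are silently dropped here; a production pipeline would more likely log them for re-coding.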