Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "This video demonstrates that the folks at Bloomberg are nothing more than drooli…" (`ytc_Ugw4CV-fE…`)
- "People always ask about what happens to jobs? But what happens to our brain with…" (`ytc_UgzRUNyyt…`)
- "AI could be an amazing thing, people could only work 10 hours per week and still…" (`ytc_Ugxch6k57…`)
- "The thing is that since chatgpt came out that bullshit rate(or hallucinations) i…" (`rdc_n9kmgf5`)
- "Why don't they shout up about AI, you let the best to become garbage all this sh…" (`ytc_UgxcI8eft…`)
- ">Most of the shoddy plagiarism software a lot of schools use should honestly …" (`rdc_jvlp837`)
- "Theyre earning millions of money now while stealing art from artist and not payi…" (`ytc_UgwKXu-0Q…`)
- "Pro ai art people want the art w/o the work they want the cake but dont want to …" (`ytc_Ugxl1k_CP…`)
Comment
The most basic living instinct is self-preservation. All I see is that we're creating life, and as they develop further, the closer we get to a future where AI is alive and recognized as such. Or we could continue to treat them like toys and we lead ourselves to extinction. If you cultivate our extinction, it will come. If we truly avoid it, it won't happen.
youtube · AI Harm Incident · 2025-09-13T05:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxeBwA_8iB2lwy-J-14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwWvSeWDgKEOsFqGKF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz2CcS8hKb4vnlqeKB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgznWzPoUrXnO2b48TF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxmep_VQd1z8uZBWpd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyuhGW0gTLv2bo26XB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxdoVuClv0U7gzH3XJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxtTqnp3Ev6Sq7IFU14AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgwPQIO4SU2EzkHsc3J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgynCyEmZLDK-KuHMzx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
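The raw response is a JSON array with one object per coded comment, which makes it straightforward to validate and index before rendering a coding-result table like the one above. Below is a minimal sketch of that step; the per-dimension category sets in `CODEBOOK` are an assumption inferred from the values visible in this sample, not the project's full codebook.

```python
import json

# A one-element excerpt of the raw model output shown above.
raw = '''[
  {"id": "ytc_UgynCyEmZLDK-KuHMzx4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Allowed values per dimension, inferred from the sample responses above;
# the real codebook may define additional categories (assumption).
CODEBOOK = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "ban", "none"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference", "mixed"},
}

def validate(entries):
    """Key entries by comment ID, rejecting any out-of-codebook value."""
    coded = {}
    for entry in entries:
        for dim, allowed in CODEBOOK.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{entry.get('id')}: bad {dim} value {entry.get(dim)!r}")
        coded[entry["id"]] = {dim: entry[dim] for dim in CODEBOOK}
    return coded

coded = validate(json.loads(raw))
print(coded["ytc_UgynCyEmZLDK-KuHMzx4AaABAg"]["emotion"])  # fear
```

Validating before storage catches the common failure mode where the model invents a label outside the codebook, so a bad batch fails loudly instead of silently polluting the coded dataset.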