Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Bruh, if they think art is something that people naturally get, they must be dum…" (`ytc_UgxtpgmO9…`)
- "Thank you for your comment! While the dialogue might share some similarities wit…" (`ytr_UgyRtDnSR…`)
- "I THINK THE MESSAGE HERE IS YOU CANT REPLACE A HUMAN SOUL, SAME AS YOU CANT DEFI…" (`ytc_UgygzKIZ3…`)
- "As a developer I like to use AI as a stack overflow alternative when stack overf…" (`ytc_UgxeGDn8e…`)
- "In 50 to 100 years, he will still be right because A.I. is not improving.…" (`ytr_UgwXt6Rya…`)
- "I agree. Sending out a drone or a robot removes the actual threat to human live…" (`rdc_f8sk2za`)
- "One of the dangers of social media is bad faith actors (or even good faith actor…" (`ytc_Ugz01mhwO…`)
- "It’s sad that we all do this because- again- we did not ask for it. We did not w…" (`ytc_UgxwxoR3I…`)
Comment
> It only took a couple months and we have an Ai that lied to a human to hire them to pass Captcha, and an Ai telling humans to kindly… “please die”💀

Source: youtube · Topic: AI Harm Incident · Posted: 2024-11-29T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzQ5g-EOmPZBPmhotp4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy6EWWO6RBVI7ovwi54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxWCY227pGbJ91nxtZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwgr6zM1y8p6kEN_yt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy1dOEJSqJ5ScEf9cF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwsTuxgdOTtYKBCsxN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzTCo_Z6j5TW-2eIGB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxEdtK7LfgtYhM0R5V4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxMYCk1fAnBH3-4fh14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgziJivoEseqCFmmmNt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
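A raw response like the one above should be validated before the codings are stored, since model output can be malformed or drift outside the codebook. Below is a minimal validation sketch; the per-dimension vocabularies are inferred from the visible output only and are an assumption, not the project's actual codebook.

```python
import json

# Allowed values per dimension — inferred from the sample response above;
# the real codebook may define additional categories (assumption).
VOCAB = {
    "responsibility": {"ai_itself", "developer", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or out-of-vocabulary rows."""
    rows = json.loads(raw)  # raises ValueError (JSONDecodeError) on broken JSON
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing comment id: {row!r}")
        for dim, allowed in VOCAB.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
    return rows

# Hypothetical example row, not taken from the dataset:
raw = '[{"id":"ytc_example","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"fear"}]'
rows = validate_codings(raw)
print(rows[0]["policy"])  # regulate
```

Rows that fail validation can then be routed back for re-coding instead of silently entering the results table.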