Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I actually enjoyed the conversation , i reached in a lot of those conclusions v…" (ytc_UgzBfi98x…)
- "Actually being polite the the AI costs millions of dollars per year and wastes a…" (ytc_Ugy2U2v6Z…)
- "The difference between AI & previous industrial advancements is that AI is not a…" (ytc_Ugzd4IIe4…)
- "Ai likely will need humans because we are strange, unique, unpredictable and it …" (ytc_UgyUymKwP…)
- "We don’t need bots to be human FOR us! There’s inherent meaning and value in the…" (ytc_UgwqifmL2…)
- "Said it once, will say it again: AI-supporters will root for this AI slop, until…" (ytc_UgywLyWiY…)
- "i agree with you but, AI needs to become AI not a large language model that uses…" (ytc_Ugz-SznuL…)
- "How do they not have a system of communication between these driverless cars so …" (ytc_UgxacanC4…)
Comment

> As long as A.I. doesn't ralize how limited our human development is. We should be fine. 😂

youtube · AI Harm Incident · 2025-10-04T08:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzZTNZMiBdcSg5S1vp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyE_ai7gWd08gYO7wp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy-qGMF3eI_zdHiPj94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxD-M2w4WduMuZC6cl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzrBmAPVp8Jo8MzXBl4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwSFENnZ0mJu_5UBvB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwGIfU_-logwXTNGt54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzzWweFirXQn3tHAjp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxhn0jB68hJDcQ9tHZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz0PeYRcB5qKKvxIaF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
```
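A raw response like the one above is a JSON array with one coded record per comment ID. Below is a minimal sketch of how such a batch could be parsed and sanity-checked before the codes are stored. The allowed values per dimension are inferred only from the records shown on this page; the real codebook may define more categories, and the function name `parse_batch` is illustrative, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from this sample batch
# (assumption: the actual codebook may contain additional categories).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded records) into
    {comment_id: codes}, dropping rows with missing IDs or values
    outside the inferred vocabulary."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # skip rows the model emitted without an id
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded

# Hypothetical one-record batch in the same shape as the response above.
raw = ('[{"id":"ytc_x","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
print(parse_batch(raw))
```

Validating against a fixed vocabulary catches the common failure mode where the model invents a label (e.g. a novel `responsibility` value) that would otherwise silently pollute the coded dataset.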