Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by comment ID.
Random samples:

- "Yall should have voted for Andrew yang when yall had the chance in 2020. He was …" (ytc_UgwsASA7U…)
- "They go on a game show and risk their lives to compete against 456 other players…" (rdc_lj950j3)
- "I like to say that a key difference between generative AI models and human learn…" (ytr_UgzlNbA13…)
- "I totally agree and think we should also push the development of tools which uti…" (rdc_nt8vapr)
- "It’s almost like AI supremacy is basically The One Ring. while everyone endeavor…" (ytc_UgwB04mSG…)
- "I find it a little eerie how AI has free will, but this comment looks like it do…" (ytr_UgyfPhvnM…)
- "You can use open source models like llama 3.1. If the service you run it on is w…" (ytc_UgwmaIxDl…)
- "@GabCleon i meant AI art ong i should have clarified 😭 Idfk why i put digital a…" (ytr_Ugy3LCHfk…)
Comment
great well this is terrifying isnt it? but also, millions of humans have killed before, and only one ai has killed. i still say humans are more scary. we keep dogs in our homes and they have killed before many times. ai is still pretty safe compared to everything else we deem safe
Platform: youtube
Incident: AI Harm Incident
Timestamp: 2025-09-12T06:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
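Each coded comment reduces to one record over these four dimensions, so a batch of codings can be tallied per dimension. A minimal sketch, using a small hypothetical batch of records with values from the coding scheme above:

```python
from collections import Counter

# Hypothetical batch of coding records using the four dimensions above.
records = [
    {"responsibility": "none", "reasoning": "consequentialist",
     "policy": "none", "emotion": "indifference"},
    {"responsibility": "developer", "reasoning": "deontological",
     "policy": "regulate", "emotion": "outrage"},
    {"responsibility": "ai_itself", "reasoning": "consequentialist",
     "policy": "unclear", "emotion": "fear"},
]

# Tally the values of each dimension separately.
dimensions = ("responsibility", "reasoning", "policy", "emotion")
tallies = {dim: Counter(rec[dim] for rec in records) for dim in dimensions}

print(tallies["reasoning"])  # Counter({'consequentialist': 2, 'deontological': 1})
```

The same pattern extends to the full corpus once all raw responses are parsed.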
Raw LLM Response
```json
[
{"id":"ytc_Ugxv4AfuBqv0qJC7QQR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy72mpRozhh7V97sQp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz1KzgeWa8SDKeiVkx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz0OziEge1SD_B43ah4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzOJ6HDELP9cBs-w_V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzNhhpTsSBKIVcoBat4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwvBe4v051qB_Q0fOR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzqXo2fwPeSZB3Wl-54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzn1w4RJTRmLf9pBv54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgytfoIDpxT07SJIwZp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
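The raw response is a JSON array of coding records keyed by comment ID, so a by-ID lookup is a one-line index. A minimal sketch, using two records from the response above (in practice the JSON string would come from the stored model output):

```python
import json

# Excerpt of a raw LLM response: a JSON array of coding records.
raw = """[
{"id":"ytc_Ugxv4AfuBqv0qJC7QQR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz1KzgeWa8SDKeiVkx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]"""

records = json.loads(raw)

# Index the records by comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in records}

coding = by_id["ytc_Ugz1KzgeWa8SDKeiVkx4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

A missing ID raises `KeyError`; use `by_id.get(comment_id)` to return `None` for comments the model skipped.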