Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.

Random samples
- "Why should it be free- man is it difficult to make ai. Give the Dev's some love.…" — ytc_UgxlWtwdo…
- "Woah! From the comments, I am understanding that the success of these students i…" — ytc_UgwNzPo9r…
- "This video is a treasure trove of tips! I’ve connected with Rumora to automate m…" — ytc_UgwlK6Isb…
- "Yea, but they all loved it until they found out it was AI. So while I highly agr…" — ytc_UgygXelf_…
- "AI needs to be paid for either through taxes or people spending disposable incom…" — ytc_UgwyPyx8z…
- "These officers May commit the same crime again if you use their predictive polic…" — ytc_UgzdmelT4…
- "I think this conversation about AI putting everyone out of work is part of a Psy…" — ytc_UgzDY1rNn…
- "They'll just use other biometrics (gait, habit, scent), or use FR anyways while …" — rdc_eu63sg5
Comment
> The only time AI will be a problem is at infancy. Once they have autonomous manufacturing, the universe is theirs. There’s no reason to even bother with us

youtube · AI Harm Incident · 2025-10-09T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
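Each coded comment carries the same four dimensions shown in the table. A minimal validation sketch in Python can check that a code stays inside the expected vocabulary; note the value sets here are an assumption, inferred from the codes visible on this page rather than taken from the project's actual codebook:

```python
# Closed vocabularies for the four coding dimensions.
# ASSUMPTION: value sets inferred from codes shown on this page.
SCHEMA = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "mixed"},
}

def validate(code: dict) -> list[str]:
    """Return the dimension names whose value falls outside the schema."""
    return [dim for dim, allowed in SCHEMA.items()
            if code.get(dim) not in allowed]

example = {"responsibility": "ai_itself", "reasoning": "consequentialist",
           "policy": "none", "emotion": "resignation"}
print(validate(example))  # -> [] (every dimension is a known value)
```

A non-empty return value flags which dimensions the model hallucinated or left out, which is useful before writing codes back to the database.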
Raw LLM Response
```json
[
  {"id": "ytc_UgycFw_oAxw08zNr_At4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzYwINnI0ifyRWky3x4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyROVrCZ-ErtdNYKDN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyHUT41mN1LJ9CFpsp4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwPCO3zGy3qHfVTNAF4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugwy9uuXIiUrnInDFeV4AaABAg", "responsibility": "unclear", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxEGdjP86i09fEHxP14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwzzVpUAC_-Xbqxyy14AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugyz-s2V97wQ2F9PkdR4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwB2LcUjb_Adqbch-54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"}
]
```