# Raw LLM Responses

Inspect the exact model output for any coded comment. Responses can be looked up by comment ID; a few random samples are listed below.
- `ytc_UgzNzYUwS…`: The robot in the thumbnail looks kind of like the one from the daft punk music v…
- `ytc_Ugxvq7mWj…`: We desperately need an update to this video. Preferably one that combines the up…
- `ytc_UgxPZ_FsW…`: I appreciate the concerns voiced here in regards to what "the policy says" versu…
- `ytc_UgwASrUdd…`: In some instances, driver turnover has reached up to 300% annually. Even in the …
- `ytc_UgwBtYlId…`: I mean, I dunno, without AI slop we wouldn't have channels like AI Slopalooza (I…
- `ytc_Ugyz9AZsN…`: There is kind of vague spectrum of definition of AI from the learning AI via en…
- `ytr_UgzHPhAzm…`: @TheDaringCreatives as one who makes the machines; good. Less loss, less mistake…
- `ytr_Ugw81eCJ4…`: Khan academy has an ai tutor which they are charging $5 a month for. They have …
## Comment

> You can be as intelligent as you want, materials for self-reproducing, anti-human killer AI cost money, and that is the decisive limiting factor. Also, nothing can be everywhere at the same time, so there are always random mistakes, random faults, random survivors.

youtube · AI Harm Incident · 2025-11-09T15:5…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
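A coded result like the one above can be checked programmatically before it is stored. A minimal sketch, assuming the label sets are exactly those visible in this page's raw responses (the real coding pipeline's schema may permit additional values):

```python
# Allowed label sets per dimension. These are inferred from the raw
# responses shown on this page, not taken from an authoritative schema.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"indifference", "mixed", "approval", "outrage", "resignation", "fear"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

record = {"responsibility": "distributed", "reasoning": "consequentialist",
          "policy": "none", "emotion": "resignation"}
print(validate(record))  # → []
```

A record missing a dimension, or carrying a label outside the set, yields one problem string per failing dimension.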
## Raw LLM Response

```json
[{"id":"ytc_UgwtE8v0OLbTd8DHZl54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzv83AzMi9Yxc_5Z-F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwXfmAMiyeZJ6d11Bx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxA73NBfFzwRyRxuq14AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxjGgnaXbybhhRM8dB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgztlIosD1oxwDFE1X94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxAjdViIsU1QbgKhKR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwfMqzTl5mLYK7VIvV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyb50IccIOGRLqh4s94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwAHlGjyaXT1m1j7H94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}]
```
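Looking up a coded comment by ID, as this page offers interactively, amounts to parsing the raw JSON array and indexing on the `id` field. A minimal sketch using two records from the response above (the raw response carries full, untruncated IDs):

```python
import json

# Two records copied from the raw LLM response on this page.
raw = """[
 {"id":"ytc_UgwtE8v0OLbTd8DHZl54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxAjdViIsU1QbgKhKR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]"""

# Index the batch by comment ID for constant-time lookup.
by_id = {item["id"]: item for item in json.loads(raw)}

coded = by_id["ytc_UgxAjdViIsU1QbgKhKR4AaABAg"]
print(coded["emotion"])  # → resignation
```

The same dictionary doubles as a duplicate check: if `len(by_id)` is smaller than the length of the parsed list, the model emitted the same comment ID twice.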