Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
@fierce1340 Yeah, I will die on that hill, cause it doesn't matter what the fuc… (ytr_UgxhwUDAa…)
Go get me a soda from the fridge sophia...Sophia: sorry cant do that. Useless au… (ytc_UgyenqooJ…)
Technicians can never be replaced; in my view they will have the pow… (ytr_Ugx8HBVbc…)
I don't know. I've heard all this stuff about how infallible and great Eliezer's… (ytc_Ugzd-ma0u…)
22:51 - in that case, I hope, Midjourney wins. What kind of broken and crooked l… (ytc_UgxBWCWW6…)
I will never relinquish the the steering wheel nor will I be driven by a driverl… (ytc_Ugw6TrgIk…)
7:56 I think they are doing this to teach AI. That should be enough of a thought… (ytc_Ugwk7sB4L…)
When ASI or even AGI, and a robot with 100% human dexterity, exist at the same t… (ytc_Ugw7ixWkm…)
Comment
So humans are racing to make more and more intelligent AI, because a ton of money is on the line and we need to keep ahead of the "competition". It's our own predominant value systems that are going to wipe us out AI is just the means.
youtube · AI Harm Incident · 2025-07-27T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
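Read as a record, the table above pairs four coding dimensions with one value each. The value sets below are a rough sketch inferred only from the samples displayed on this page, not from any documented codebook, with a small validator to match:

```python
# Dimensions and values observed in the coded samples on this page.
# NOTE: inferred from displayed data only; the real codebook may allow more values.
CODING_DIMENSIONS = {
    "responsibility": {"none", "distributed", "ai_itself", "company", "creator", "user"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "ban", "liability", "regulate", "industry_self"},
    "emotion": {"indifference", "resignation", "fear", "outrage", "approval", "mixed"},
}

def validate(record: dict) -> bool:
    """Check that a coded record uses only observed values for every dimension."""
    return all(record.get(dim) in values for dim, values in CODING_DIMENSIONS.items())
```

A record such as the one in the table (`distributed` / `virtue` / `regulate` / `resignation`) passes this check; an unknown value in any dimension fails it.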
Raw LLM Response
[
{"id":"ytc_UgzDiR_nCcLdP3sB1VN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzC6vD6bzZcj4AvmAh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugx6Qm8chzGNjpYV-Wh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxFyWamhfaXvnBpu4V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzPV-XjudsgjUsrd1N4AaABAg","responsibility":"creator","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy8ZKyuYpCs6vea40V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxKmmwPpMe9zgBVb8d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugzv0KzvWUPMoWtEpVd4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgxE0n3AoY1WnWZQNMl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugy7oJag4TP1_d0jLCd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
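The raw response is a JSON array of per-comment codes, so the "Look up by comment ID" feature reduces to indexing the parsed array by the `id` field. A minimal sketch, using two records from the response above (standard-library `json` only; `raw_llm_response` here is an assumed variable holding the response text):

```python
import json

# Two records copied from the raw LLM response shown above.
raw_llm_response = """[
  {"id": "ytc_UgzDiR_nCcLdP3sB1VN4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugzv0KzvWUPMoWtEpVd4AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "regulate", "emotion": "resignation"}
]"""

# Parse the batch response and index each record by its comment ID.
codes_by_id = {record["id"]: record for record in json.loads(raw_llm_response)}

# Look up a single coded comment by ID, as in the UI above.
code = codes_by_id["ytc_Ugzv0KzvWUPMoWtEpVd4AaABAg"]
print(code["responsibility"], code["policy"])  # distributed regulate
```

The same dictionary supports a membership check (`comment_id in codes_by_id`) to detect IDs the model skipped or hallucinated in a batch.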