Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
| Comment (truncated) | ID |
|---|---|
| What happens when Quantum Computing meets Artificial intelligence. Data and dig… | ytc_UgzqskMAb… |
| Brilliant idiots! Not the AI agents but the people wanting to "grant" them right… | ytc_Ugz60YSWM… |
| The AI could lock/encrypt us out of our own systems. I also think its impossible… | ytc_UgzABmZvo… |
| Most of this just shows how little attention people pay to driving. we should ma… | ytc_UgyD-lnV-… |
| I literally cannot fucking believe this is a conversation we are having. AI "wri… | ytc_UgzbJQwMH… |
| I will say this, problematic / controversial / illegal artists will always be be… | ytc_UgxkyS8D8… |
| LLMs *always* "make stuff up". That's what they do. It's just that sometimes, th… | ytc_UgzqqvUub… |
| AI is not the problem ... human beings are the problem .... a species thats soo … | ytc_Ugy15PwR-… |
Comment
I mean, If that means free healthcare for all that sounds great. Imagine paying an OpenAI subscription of 20$ as your doctors fee, instead of thousands and thousands to for profit hospitals and Big Pharma.

Source: youtube
Topic: AI Harm Incident
Posted: 2024-06-02T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
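The Coding Result table is one record from the raw response below, rendered as a Markdown table. A minimal sketch of that rendering, assuming the four dimension fields shown (the snippet is illustrative, not the tool's actual code):

```python
# Hypothetical rendering of one coding record as the dimension table above.
record = {
    "responsibility": "none",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "approval",
}

lines = ["| Dimension | Value |", "|---|---|"]
for dim, value in record.items():
    # capitalize() turns the JSON field name into the table's row label.
    lines.append(f"| {dim.capitalize()} | {value} |")

print("\n".join(lines))
```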
Raw LLM Response
```json
[
{"id":"ytr_UgyKib66b8s5YkeJGnR4AaABAg.A4Cnn1DJ6g4A4cWbquRXTr","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgyTBsoIjnKqYD_a-sx4AaABAg.A4CggS7ktZiA5uoYd_7TjJ","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgyTBsoIjnKqYD_a-sx4AaABAg.A4CggS7ktZiA6ZHFOwhYwr","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwn1haP5cFdZs_AVdN4AaABAg.A4AnJcNnh1cA4CDMBnl1P-","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugwn1haP5cFdZs_AVdN4AaABAg.A4AnJcNnh1c4CIFXhmBh3","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytr_UgyrXvIHrsU1GKX3MLp4AaABAg.A48XmVFEoKtA7Mx2e42YYt","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgyXpXXhU6y1SMq-8pt4AaABAg.A48VmxuDnpJA48WdkgKczf","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugydt8PlJDYYAwWn-Ch4AaABAg.A48HSbRsNp1A48UXZ0LsU9","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugydt8PlJDYYAwWn-Ch4AaABAg.A48HSbRsNp1A4CFrEdOApF","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugy778tA6QSF-ISWArl4AaABAg.A482dS1MC8UA48Y2pekjVg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
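Downstream, a raw response like the one above can be parsed, sanity-checked, and indexed by comment ID, matching the "Look up by comment ID" workflow. A minimal sketch; the two records are excerpted from the response above, and the allowed value sets are only those observed in this batch, not the full codebook:

```python
import json

# Excerpt of the raw LLM response above (first two records).
raw = """[
  {"id": "ytr_UgyKib66b8s5YkeJGnR4AaABAg.A4Cnn1DJ6g4A4cWbquRXTr",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgyTBsoIjnKqYD_a-sx4AaABAg.A4CggS7ktZiA5uoYd_7TjJ",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"}
]"""

records = json.loads(raw)

# Values observed in this batch only -- NOT the full codebook (assumption).
OBSERVED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "fear", "indifference", "resignation"},
}

# Reject any record whose dimension value falls outside the observed sets.
for r in records:
    for dim, allowed in OBSERVED.items():
        assert r[dim] in allowed, f"unexpected {dim}={r[dim]!r} for {r['id']}"

# Index by comment ID so a single coding can be looked up directly.
by_id = {r["id"]: r for r in records}
rec = by_id["ytr_UgyTBsoIjnKqYD_a-sx4AaABAg.A4CggS7ktZiA5uoYd_7TjJ"]
print(rec["responsibility"], rec["policy"])  # company regulate
```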