Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Ai is now being sarcastic almost too much I have to tell it to tone itself down…
ytr_Ugz4VJIYf…
It's not that AI is anthropomorphic. It is that we anthropomorphize AI. Its elo…
ytc_UgzGc2rCt…
AI makes more sophisticated music than musicians in general these days. People, …
ytc_UgwN1_mdk…
I eagerly hope they give it an altruistic and truly useful application to …
ytc_UgxKbgTbp…
You do realize AI takes prompts right? Yes it does learn, but it also goes from …
ytr_UgyAy02U2…
I am an artist (i draw) and my opinion is: ai art is not art…
ytc_Ugz5FFkEm…
A.I LOOK UP TOM CRUISE, PLAYING GOLF, A.I.? WHEN DONE PERFECTLY VERY HARD TO CON…
ytc_Ugy1pUoKy…
I'm 50/50 on AI taking over. On the one hand I've seen all these videos about wh…
ytc_UgypIriCA…
Comment
You mean the train that he saw? That led to him taking control of and stopping the car. That train? The one that we JUST SAW in the video, with flashing lights, clearly obstructing the road crossing in front of the full self driving car. Or was there another train that I missed, like the Tesla missed that huge freaking obvious one?
If the car missed that easily visible train, and the crossing lights etc, maybe I'm just a bit picky, but I'll blame it.
youtube
AI Harm Incident
2024-07-26T11:5…
♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_Ugy61j8tocUKFB3Cnvx4AaABAg.ACy5yFFY1l1ALCnD8Kt9IO","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxJHlItRVUirb_aoHl4AaABAg.AGRmbwMAb9wAGRr-anq0pA","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgyTvYOyNfCfrSAy1894AaABAg.9pmBXSLdsH69pwqjFwoceP","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxmbctsjRqVIsYc1PR4AaABAg.A47jbv7cazMA6M8yP9YEB5","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgwlDleO-o4zaQCG9tF4AaABAg.A3nbtI_ayEuA4ZUz7G0UOY","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UgwlDleO-o4zaQCG9tF4AaABAg.A3nbtI_ayEuA4Zkai4vUlc","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytr_UgwlDleO-o4zaQCG9tF4AaABAg.A3nbtI_ayEuA4ZyUYZqCqW","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgwPUTkEG51eGmXdNMx4AaABAg.A3nY614oKzUA3ncMdsM9K8","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytr_UgwPUTkEG51eGmXdNMx4AaABAg.A3nY614oKzUA5CMphenFhD","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugz0j3RPi6242EsIZhx4AaABAg.A3BN2X3Cse0AFisC46Eab2","responsibility":"government","reasoning":"unclear","policy":"regulate","emotion":"resignation"}
]
```
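A raw response like the one above can be validated before the codes are stored, so that malformed rows or off-schema labels are dropped rather than silently coded. The sketch below is a minimal example, not the project's actual pipeline; the `SCHEMA` sets are inferred only from the values visible in this dump (the full codebook may define more categories), and `parse_coded_batch` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, inferred from the samples above.
# Assumption: the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"none", "distributed", "company", "ai_itself",
                       "user", "government"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "fear", "outrage",
                "mixed", "resignation"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows that match the schema.

    Rows missing an "id" or using a label outside SCHEMA are discarded.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid
```

For example, a batch containing one well-formed row and one row with an unknown `responsibility` value would yield only the well-formed row, which can then be linked back to its comment by the `ytr_…`/`ytc_…` ID.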