Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Yeah, and then the AI has to talk to my users who have now idea how they do thei…" (ytc_Ugx53RpYB…)
- "We don't need AI to give life ruining medical advice. Human doctors can do that…" (ytc_UgzaT1IoZ…)
- "I don't feel any draw towards a chat bot, I really don't get it. But I have a go…" (ytc_Ugwbp9aOo…)
- "To be fair to the tech bros with the Apple example, they could totally use gener…" (ytc_Ugzt55ZZS…)
- "@JJs_playgroundyour thinking is flawed and narrow. You assume that robot will b…" (ytr_Ugw1mzxg1…)
- ""if generating videos with ai is horrible, isn't making videos on youtube the sa…" (ytc_Ugzx8D7LU…)
- "It’s fascinating but we shall see what this produces. We should always explore n…" (ytc_UgyPdJ-ia…)
- "AI is a pandora box akin to the nuclear age….not always for the betterment of so…" (ytc_UgzkZ0fp9…)
Comment
God the fact that this is such an easy thing to fix (ie keep the radar/ultrasonic sensor) is insane. Id rather pay the extra $100 Tesla saves by removing sensors knowing the car will have an accurate way of checking for objects around me. Cameras are not eyes and no matter how AI/ML gets, roads and situations constantly change and we need exact sensor measurements of items in the way
youtube · AI Harm Incident · 2025-02-08T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyQ3m5S1-uxH4d_Jeh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz3NEMttcsox0b2H994AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLBRuw_HnUTAZEd3p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxH6Ogyc-SqjY-ULa54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgywbTV4MkOlNu0v8aJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw3HAuHzFzDAa5i-G14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgxEzTMaEjEJDFnzfM94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyTrRs2KlDKdLZthxt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy5V6AuaWqfQtx4Ay14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxiTMgze-JnQAYraH54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
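A raw response like the one above can be parsed and indexed by comment ID before it is displayed. The sketch below is a minimal, hypothetical example of that step; the allowed code values are inferred only from the responses shown on this page, and the real coding scheme may include additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample responses
# above (assumption: the real vocabulary may be larger).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        # Skip records whose codes fall outside the assumed vocabulary.
        if any(rec.get(dim) not in ok for dim, ok in ALLOWED.items()):
            continue
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
print(parse_codings(raw)["ytc_example"]["policy"])  # regulate
```

Indexing by ID is what makes the per-comment lookup shown in the table above (Responsibility, Reasoning, Policy, Emotion) a constant-time operation.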