Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- One area they didnt cover was how increased dependence on AI puts us into cognit… (ytc_UgyGlWPQo…)
- Working at tech companies you are forced to basically train AI by using it as mu… (ytc_UgwRcy_ha…)
- I begin to find it cringe for how these Tesla people really suck up to this bran… (ytc_UgyQRke7Y…)
- What if we just destroy A.I. right now , and carry on with our lifes… (ytc_UgyuGgRoH…)
- Maybe if consumers actually raised their standards, "slop" wouldn't be so entici… (ytr_Ugwph_3TM…)
- Wait we are already using AI for these purposes. I thought we already knew that … (ytc_UgzAzS4J5…)
- Like I'd ever be in the same country as a robot with a gun with bullets.… (ytc_UgzuwkM6C…)
- Carefully crafted systems have been in place for nearly 40 plus years now. They … (ytc_UgxwRw4XG…)
Comment

> until proven that those incidents were actually autopilot hitting them then everything in this video is full of crap.
>
> No you dont want radar or lidar, because you want every car to have autopilot, and yes its is called autopilot correctly it shouldt be called almost autopilot or some bullshit, its autopilot and its in beta.
>
> You of all people should understand that if this tech hits the masses we the riders will be super safe, because i'd much rather trust a robot than morons that drive like crazy every day. At least with a machine its consistent.

youtube · AI Harm Incident · 2022-09-04T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxCxYZo9m_LHfHEkp14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyKUOLxpSSl60xSwC54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyD6nQYTbMDdVNZHn94AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy3SrPcKbufoo9yxx14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyJBR2a-RPrCK4RpP14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyt-Fa7mbWvHZ2gW5R4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxlZO5_AoMLXPOG5Qp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgysN3F3pzx6MrDVMVZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxu19QGgTea-JCpAHB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx51pLGEND4ettC6SR4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
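The lookup-by-ID flow can be sketched in Python: parse the raw LLM response as JSON, validate that each entry carries all four coding dimensions, and index the codings by comment ID. This is an illustrative sketch, not the tool's actual implementation; the `index_codings` helper and the two-entry `raw_response` excerpt (taken from the response above) are assumptions for demonstration.

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings.
# IDs and values are copied from the response shown above.
raw_response = '''[
  {"id": "ytc_UgxCxYZo9m_LHfHEkp14AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy3SrPcKbufoo9yxx14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

# The four coding dimensions from the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse the model output and index codings by comment ID,
    dropping any entry that is missing a coding dimension."""
    codings = {}
    for entry in json.loads(raw):
        if all(dim in entry for dim in DIMENSIONS):
            codings[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return codings

by_id = index_codings(raw_response)
print(by_id["ytc_Ugy3SrPcKbufoo9yxx14AaABAg"]["responsibility"])  # → ai_itself
```

Indexing once up front makes the "look up by comment ID" operation a constant-time dictionary access rather than a scan over the response array.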