Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Elon want to keep hes steps forward with X ai thats why he say to all who are im…
ytc_UgzSOEGk2…
I hope people take from this not that AI is conscious but that AI is the produc…
ytc_UgzFHY5KI…
Respectfully, sampling has never eliminated jobs the way ai bros want to elimina…
ytr_Ugwwd3qRq…
The folks of silicone valley strike me as legends in their own minds or master’s…
ytc_Ugx6MUZod…
2:55 need an example please.(edit: halting problem can not decide to stop or ru…
ytc_UgznwOiEN…
The company will end up being run by AI... humans won't be necessary..money won'…
ytr_Ugzciflc6…
@100c0c heh, those pesky ‘artists’. Always doing crime. I think this tech is gre…
ytr_Ugy-D6N1J…
For all of you who told us craftsmen to "find better jobs" our life's work, pass…
ytc_Ugx1aBsha…
Comment
Yes it seems like a possible hypothesis with the added factor of night time, hopefully this is addressed, so sad to lose a life, but it IS the drivers fault. Old school cruise control is level 1, Autopilot is level 2 AI(maintain lanes, speed, can pass slower traffic in about half the systems, requires driver attention), FSD (currently) is level 3 AI (can navigate from point to point and requires driver attention). Which were the tesla vehicles on? all require driver attention. Any further AI levels would require government approval.
youtube
AI Harm Incident
2023-06-15T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
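A coded record like the one above can be checked against the dimension vocabulary. The following is a minimal sketch, assuming the allowed values are only those that appear in this dump (the actual codebook may define more); the `validate` helper and the `ALLOWED` mapping are hypothetical names, not part of the tool.

```python
# Sketch: validate one coded record against the dimension vocabulary
# observed in this dump. ALLOWED is assumed from the values shown here;
# the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "ban", "industry_self", "unclear"},
    "emotion": {"resignation", "fear", "indifference", "approval",
                "outrage", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the vocabulary."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

record = {"responsibility": "user", "reasoning": "consequentialist",
          "policy": "none", "emotion": "resignation"}
print(validate(record))  # → []
```

An empty list means the record passes; a record with a missing or unknown value would have each offending dimension named in the result.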
Raw LLM Response
[
{"id":"ytc_UgzMCBuziw4Ezp0Kmoh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx7pmrAT3DWktPyFpt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwWivrymhl4wUY4gnl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyCJq7SLVLDXpHIfwN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzqyu8i58Wr9TilSw94AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwF7kvlsv2wJuBtbuF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwZWArYDxNKoU8kNDh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyDOq1-JQnZ7nc5DCh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxkv38k3jP8uOnaEKF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6YHYopMkwhS4TmQp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
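The raw response above is a JSON array of per-comment codings, which makes the "Look up by comment ID" view straightforward to reproduce. A minimal sketch, using two rows copied from the array (the `by_id` name is hypothetical):

```python
import json

# Parse the raw LLM response and index the coded rows by comment ID,
# mirroring the lookup-by-ID view. Two rows from the array above:
raw = '''[
 {"id":"ytc_UgzMCBuziw4Ezp0Kmoh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_Ugx7pmrAT3DWktPyFpt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]'''

by_id = {row["id"]: row for row in json.loads(raw)}
print(by_id["ytc_UgzMCBuziw4Ezp0Kmoh4AaABAg"]["emotion"])  # → fear
```

Because each object carries its own `id`, the whole batch response can be joined back to the sampled comments in one pass.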