Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- ytc_UgwGrFBNs… · What is the point of investing in and building something that puts everyone out …
- ytc_UgzDtwsqH… · A.I telling you the threat about A.I. How come he doesn't blink with his eyes 🤔🤔…
- ytc_UgxQHos-z… · AI is like a nail gun instead of a hammer. It's faster it does a better job you …
- ytc_Ugx4wAgjV… · ai can also only draw on the "right side of the brain ", like the typical still …
- ytc_Ugw1hhL5r… · In my country, hiring real developers is far cheaper than using ai tools. I thin…
- ytr_Ugxnhqav7… · Robots and ai will do the heavy lifting and down jobs so even idiots like you ca…
- ytc_UgxN6B47Z… · AI Altman spoke at a Federal Reserve conference Tuesday in Washington. “A thing…
- ytc_UgzJ--lj7… · I take a lot of issues with 6:00 - 7:25 . Factually this is just not true and I …
Comment
That Chevy driver is a major bum at driving like how can you move the steering wheel without checking your surroundings?
At the same time though I can't feel sorry for Joel either. He blindly trusted a system known for not only not avoiding, but also causing accidents. Also, he was prolly distracted considering he didn't notice a bright orange car rapidly closing in next to him.
If you feel like driving is a burden to you, I recommend public transport, because AI cannot be taught about human error, because human error is random, can't be precisely predicted, while AI requires sophistication to work and exist, therefore it assumes sophistication from it's environment.
youtube · AI Harm Incident · 2023-06-18T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugwwa1qIGLY2whxRdn54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy3z5YKEM3rBXPqhrh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwp1AlnMD3VG4Kt_914AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx58zHyLo64toBodvd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzSsLLVEEYQrDTh7eN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyRsijE4I_8dHhbytt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxpzXjDEF2VYxlsetV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzwrSl82YX4C6Zngut4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwzS1VZjVVTIrCBSkh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwUXowHfFoXKu_G9IJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"})
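Note that the raw response above is not valid JSON as captured: the array opens with `[` but closes with `)`. A strict parse therefore fails for the whole batch, which would be consistent with the all-"unclear" values in the coding-result table. Below is a minimal sketch of how a downstream parser might handle such a response. The `ALLOWED` value sets are inferred from the sample response above (the full codebook may differ), and `parse_raw_response` is a hypothetical helper, not part of the actual pipeline.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Hypothetical: the real codebook may permit other values.
ALLOWED = {
    "responsibility": {"company", "government", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}


def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}.

    Malformed JSON (e.g. a stray ')' where ']' belongs) yields an empty
    mapping, so every comment in that batch falls back to "unclear".
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return {}
    coded = {}
    for rec in records:
        # Replace missing or out-of-codebook values with "unclear".
        coded[rec["id"]] = {
            dim: (rec.get(dim) if rec.get(dim) in allowed else "unclear")
            for dim, allowed in ALLOWED.items()
        }
    return coded
```

With this fallback, a single malformed batch degrades to "unclear" codes rather than crashing the coding run, and the raw response is preserved for later inspection exactly as the model emitted it.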