Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugxb3y7KL…`: "Okay but you still need a human there to fucking fill it with fuel when it runs …"
- `ytc_UgwgfGFA3…`: "Why don't we put Elon in prison and let him do some deep learning himself?…"
- `ytr_UgzIeVTX4…`: "So, the US government is fussing for nothing about Claude? Surely the pentagon w…"
- `rdc_o32fe5i`: "The data center itself requires almost no human interaction to operate. But the…"
- `ytc_Ugz56UD7u…`: "I know this will happen with almost all jobs nowadays. I work as a software deve…"
- `ytc_UgxRPrZ5t…`: "Out of all videos I've seen, not just on Alex's channel, but YouTube as a whole,…"
- `ytc_UgyiEqB7s…`: "This is sci-fi come to life. I read a short story in the 60s which predicted th…"
- `ytc_UgzIu4WHO…`: "ai could just lie to us about uploading our consciousness working and kill us in…"
Comment

> are ai driven cars any good in potentially very dangerous situations? which are countless and largely uniquely unknown. every other situation if the ai is as good as a human there is no reason to compare. in sudden never "seen" before seconds before impact situation who will do better? the human or the ai?

youtube · AI Harm Incident · 2025-09-12T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwuQRD2kupySFWdpTh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyfovMhdm3eLeQV9G14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzXhs9F4VXJJur7V4d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxm7nng80gPtO_qWEx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgznIuALORolMMREhxh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzRAWB1j9sIV36PwAB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw4a-Xy9TqWSUF5xWB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwnoKYJtEYMF6GUwbF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwvKXyJ3tTKFOnqNpd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw1gx3jQu_InJLPGqp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
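The raw response is a JSON array in which each entry carries a comment `id` plus the four coding dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such output might be parsed and indexed for lookup by comment ID — the `index_codings` helper and the embedded two-entry sample are illustrative assumptions, not part of the actual pipeline:

```python
import json

# Illustrative two-entry sample in the same shape as the raw model output above.
raw = '''[
{"id":"ytc_UgwnoKYJtEYMF6GUwbF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw1gx3jQu_InJLPGqp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''

# The four coding dimensions from the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_json: str) -> dict:
    """Parse model output and key each coding by its comment ID,
    skipping entries that are missing any expected dimension."""
    codings = {}
    for entry in json.loads(raw_json):
        if all(dim in entry for dim in DIMENSIONS):
            codings[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return codings

by_id = index_codings(raw)
print(by_id["ytc_UgwnoKYJtEYMF6GUwbF4AaABAg"]["emotion"])  # prints "fear"
```

Keying on the ID rather than trusting the array order makes the lookup robust when the model returns codings in a different order than the comments were submitted.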