Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytr_Ugy8PDQoG…: "I dunno man. Everything you've said relies on the mere assumption of your first …"
- rdc_myunmm2: "I use LLM's daily for tasks I want to automate. I write code with them every day…"
- ytc_UgwL_vVU-…: "You dont even need a legal understanding to not fall for this, just a working br…"
- ytc_Ugy8PKtMQ…: "This is what’s ganna kill the world. The robot already had a clear idea of what …"
- ytc_UgzhVF6dH…: "AI MAY RULE THE WORLD AND I SHALL HELP I BY JOINING THE CODING COMMUNITY TO PUT …"
- ytc_UgzTEyk0y…: "If a lot people wont be needed in the future would tge whole issue around aborti…"
- ytc_Ugxstt_Zv…: "Why isnt anyone bringing a word of hope? You don’t have to worry about none of t…"
- ytc_Ugy5S1UBA…: "The video is all scary. I don’t think AI is conscious but certainly a tool other…"
Comment
I want to see a comparison of hours driven and wrecks between self driving cars and people. I know there's all this rage about self driving cars and all, but humans are horrific drivers who wreck all the time.. i mean, people have to be told to not text and drive for the love of god and they still ignore it. I'd be willing to bet self driving cars would probably actually win.
youtube · AI Harm Incident · 2025-08-16T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy6jc29xY9Yu2xo1Rd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzi1ZaK1Gdma3Qj8xd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxCN1JAymUM7Ha6oyB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgynHWVGjsKdUpzhk5h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxTsi5fn5x6sSux5-V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgywHYWkZw-SZpDbkrB4AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgyQEC0FaXVdGxop7iB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw066AApXNjFtxCJ7J4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyb5ol4Qvmde5uSVCJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxUDZ4Yg-Nglm6wUTh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
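A raw response like the one above has to be parsed and checked before its codings can be trusted, since a model can return malformed JSON or values outside the codebook. Below is a minimal Python sketch of that step; the allowed category sets in `SCHEMA` are an assumption inferred only from the values visible on this page, not the project's actual codebook.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# values seen in this dashboard; the real codebook may define more.
SCHEMA = {
    "responsibility": {"company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "approval", "resignation", "indifference", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not row.get("id"):
            continue  # every coding must reference a comment ID
        # Keep the row only if every dimension holds an allowed value.
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid
```

Rows that fail validation are dropped rather than repaired, so a single bad coding does not discard the whole batch.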