Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "maybe i missed it but i didn't really hear any arguments for why the risk for ai…" (ytc_UgziVby8m…)
- "The ai that he is using is mindgrasp and u also have to pay for it…" (ytc_UgwxFvNj9…)
- "Sure it's flawed, but it's the best test he could think of for testing AI. It's …" (ytr_UgiyiP9uz…)
- "AI Art will never be copyrighted, so everyone feel free to take from these thiev…" (ytc_UgxETBj4x…)
- "FBI: how can we invade their privacy even more? Amazon: what if we disgue the c…" (ytc_Ugw-gkdUB…)
- "Level 68 equivalent Microsoft AI engineering director here - Yes, it actually do…" (ytc_UgwNNZ6e9…)
- "So we are gonna continue to head down a path where humans don't connect anymore …" (ytc_Ugzu0HZd8…)
- "Try Clever AI Humanizer! It's honestly made my writing feel much more authentic …" (ytc_UgxPDqgVh…)
Comment

> I think that a self-driving car driving next or behind a car non-self-driving car will be as dangerous as having two cars without self-driving capability, because human factor would not be completely removed from the equation. For them to provide safer driving environment, self-driven cars have to be driven in a dedicated lane with other self-driving cars

Source: youtube · Topic: AI Harm Incident · Posted: 2015-12-08T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
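A record like the one in the table above can be sanity-checked against the codebook before it is stored. The allowed value sets below are inferred only from the records visible on this page; the real codebook almost certainly has more categories, so treat them as placeholders.

```python
# Allowed values per dimension, inferred from this page's records only
# (an assumption -- the actual codebook may define more categories).
ALLOWED = {
    "responsibility": {"none", "user", "company", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "fear"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside the inferred codebook."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding shown in the table above passes the check:
print(invalid_fields({"responsibility": "none", "reasoning": "consequentialist",
                      "policy": "none", "emotion": "indifference"}))  # []
```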
Raw LLM Response
```json
[
  {"id":"ytc_UggmIyJ8SloWNngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghTisOhXvg2MXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggJ7uf4xwzHrHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ughd4nDqmE0otngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghTs3eIZEp4CXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiT1_uxg4Qf93gCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgiiYSCGtUOQQ3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ughs2ea7-kE5XHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugj_gIAyUkWWl3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggO5i8Su4Fd-HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
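A batch response in this shape (one JSON object per coded comment, each carrying its comment ID) is straightforward to join back to the source comments. A minimal sketch, assuming the response is a JSON array exactly like the one above; the function name is illustrative, not part of the tool:

```python
import json

# A two-record excerpt of the batch response shown above.
raw = """[
  {"id": "ytc_UggmIyJ8SloWNngCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiT1_uxg4Qf93gCoAEC", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

def index_by_comment_id(payload: str) -> dict[str, dict]:
    """Parse a batch coding response and key each record by its comment ID."""
    records = json.loads(payload)
    return {rec["id"]: rec for rec in records}

coded = index_by_comment_id(raw)
print(coded["ytc_UgiT1_uxg4Qf93gCoAEC"]["emotion"])  # outrage
```

Keying on the `id` field is what lets the page above resolve a clicked sample to its stored coding.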