Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- “Blake, you know exactly how artificial neural networks and tokenization works, y…” (ytc_UgynSvMfw…)
- “This also isn’t fair bc the atheist side went first so the whole time the believ…” (ytc_UgyvyL0c8…)
- “I find it laughable that the fashion industry would use AI models instead of hum…” (ytc_UgxupasrP…)
- “Huh? A.I causing mass joblessness? There will be a justifiably pissed crowd of h…” (ytc_UgxHehf0-…)
- “AI might destroy us, but think of the profit we will make in the process!…” (ytc_Ugxyf_KfE…)
- “It the same what happen when first ´facial recognition´ was introduce and you co…” (ytr_UgzsrUpp1…)
- “Sentient AI ain't happening until we give them full bodies with inputs to have s…” (ytc_Ugz1R_7le…)
- “Hey you probably won’t see this.. but I would like to ask what you use to draw i…” (ytc_UgymcVREj…)
Comment

> But autopilot was NEVER intended to be used as an autonomous system. It’s like a fancy cruise control. The driver is still 100% responsible. The flaw is with the user who ignores the warnings and instructions that you have to watch the car driving and be ready to take over at any moment. By definition, every single crash during autopilot is caused by driver error.

youtube · AI Harm Incident · 2024-12-14T02:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
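Each coded comment carries the same four dimensions shown in the table above, so a record can be checked mechanically before it is stored. A minimal sketch, assuming the value sets inferred from the codings visible on this page (the actual codebook may define more categories; `invalid_fields` is an illustrative helper, not part of the tool):

```python
# Allowed values per dimension, inferred from the codings shown on this
# page; the real codebook may allow additional categories.
CODEBOOK = {
    "responsibility": {"user", "company", "developer", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "ban", "liability", "regulate", "unclear"},
    "emotion": {"outrage", "indifference", "approval", "mixed"},
}

def invalid_fields(coding: dict) -> list[str]:
    """Return the dimensions whose value falls outside the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if coding.get(dim) not in allowed]

# The coding result from the table above passes cleanly:
record = {"responsibility": "user", "reasoning": "deontological",
          "policy": "none", "emotion": "indifference"}
print(invalid_fields(record))  # []
```

A record with a misspelled or missing dimension would show up here before it reaches the results table.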
Raw LLM Response
```json
[
{"id":"ytc_Ugxdmhhe2xJy38HHznJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyFZdXyfXOsO7ZwS2N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzWKi7yOLCXXhIjoYl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzj1gf7aLSQPCHrLE14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwRC2hQ9Ro2054MSoR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyPLsMmoebbolnGKoF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzRAyEwWX46oCbrWNR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugye6sm5CimNgUuqd6N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgySqzulxczhECGzJZB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwR6MlY7-XSeYc0CHt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```