Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "These AI things are pretty scary, when something is capable of impersonating som…" (ytc_Ugyeyqu34…)
- "@Puglet12 translation: i like this, very cool, fuck AI art. and i love everythin…" (ytr_Ugwi9FJT0…)
- "AI as a whole can be used to do a lot of good but also a lot of bad. Using it to…" (ytc_UgyT_5gUj…)
- "We strive to imbue our AI models with a wealth of knowledge and wisdom. If you'r…" (ytr_Ugw0kpd7h…)
- "robot: BRO U HAD TO MESS UP F#CK THIS SH@T IMMA PLAY CALL OF DUTY WITH THIS BOX …" (ytc_UgyUT_SWC…)
- "I think copyright and Ai makes sense only for images, videos and text that have …" (ytc_UgzJ_RDZj…)
- "idc if this ai art looks better than what I can make. cause I enjoy my art style…" (ytc_UgzAe5J1J…)
- ""All it takes is time, and Ai will be out there walking amongst us quicker than …" (ytr_Ugx9HE811…)
Comment
The reason autopilot is overhyped is because the other side completely discredits it with content like this… there’s no apparent desire for factual representation of the future of autonomous driving by either side. Humans are now more distracted and dangerous than ever before behind a wheel of a car. Autonomous driving will probably never be perfect, but like subways, trains and airplanes, it will almost certainly net out to many lives saved, fewer accidents and fewer injuries. People will still die and accidents will happen, but policy isn’t made on an individual level it’s made on a population level. If policy makers and manufacturers made every decision based on individuals, we’d still be stuck in the Stone Age.
youtube · AI Harm Incident · 2025-01-04T18:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgzaEYvJE2SSYW9L2Kl4AaABAg","responsibility":"manufacturer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyzFpnNOkVcKmH_aaF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwzp58DTN1IBX_8ERB4AaABAg","responsibility":"manufacturer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxGdm19dqdWYCXBTRh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgywLlKfnvOanu3C2f54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy4HRAiHIiHg30CIKB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwHooEFPaqbXC2ePTh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzqXpcLVaK5rSykubZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgztIu2__P6nFjw0cbp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNGo6TFyukcaQc9mR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
```
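A response like the one above can be turned into per-comment codes with a small validating parser. The sketch below is illustrative, not the tool's actual implementation: the `SCHEMA` category sets are inferred only from the values visible in the samples on this page (the real codebook may include more), and `parse_coded_batch` and the `ytc_x1` id are hypothetical names.

```python
import json

# Allowed values per dimension, inferred from the sample output shown
# above; the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"manufacturer", "developer", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed",
                "approval", "resignation"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of records) into
    {comment_id: {dimension: value}}, dropping any record that is
    missing an id or carries an out-of-schema value."""
    coded = {}
    for record in json.loads(raw):
        cid = record.get("id")
        if not cid:
            continue
        codes = {}
        for dim, allowed in SCHEMA.items():
            value = record.get(dim)
            if value not in allowed:
                break  # reject the whole record on the first bad value
            codes[dim] = value
        else:  # only reached when no dimension failed validation
            coded[cid] = codes
    return coded

# Hypothetical id "ytc_x1", used only to demonstrate the parser.
raw = ('[{"id":"ytc_x1","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"approval"}]')
print(parse_coded_batch(raw))
```

Rejecting whole records rather than single fields keeps each coded comment internally consistent, which matters when the dimensions are analyzed jointly (e.g. responsibility crossed with policy preference).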