Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Watching this video brings my mind back to the case of that killer guy and young…
ytc_UgxzBdIzq…
Once I tried to get an AI to generate a completely black image
It did not work…
ytc_UgwWhgzg6…
How stupid you should be to compare enthusiasm about AI with tulips fever of 17t…
ytc_UgxxQYD0u…
So, because my hand shakes and I cannot manually draw it, you despise me if I we…
ytc_UgyCLEpHD…
Even putting in air quotes feels wrong. These people are thieves and nothing mor…
ytc_Ugw-jVAi5…
i think the ethics would make driverless cars impossible. someone will sue the …
ytc_UgiKVLZeD…
@Speaker-Beater You're just hating AI for the sake of it. so, why even bother ,…
ytr_UgyDJT-Mk…
i want to know what's going on in the thoughts of pro-ai people.
seriously, is i…
ytc_UgyJUAiwA…
Comment
A lot of ifs.
Hey, Elonia. Yeah, humans only have optical sight. But they also don't have wheels, so if your argument was "humans don't have LiDAR, our cars don't need them either", then take the wheels off.
In the real world, single cameras have no depth perception. And even with two eyes, living organisms make mistakes.
Self-driving technology cannot be allowed ANY room for error. That's why redundancies exist to begin with.
youtube
2026-04-09T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugy9tdBbB9hrFjKYKmV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzlKLZNeTJmLUULum54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgyoEaRVuI0jq5zye7F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugx2ZUtG539KTnT_nrN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyACA_tP1esiLUe6mx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_Ugw28IIUnhqMgH8l6Jd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugw5piqVZRMOW_40pWl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugy_6aFmUbFEvAQ33up4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyACopbj6pnHeKvo5l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgzlEYEuPjH6G9ATpcl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"}]
```
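The raw response above is a JSON array with one record per comment, each carrying the comment ID plus one value per coding dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be loaded into an ID-indexed lookup, as the "Look up by comment ID" feature implies — the field names come from the response itself; the function name and the `"unclear"` fallback are assumptions for illustration:

```python
import json

# Two records copied from the raw batch response shown above.
raw = '''[
  {"id": "ytc_UgzlKLZNeTJmLUULum54AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy9tdBbB9hrFjKYKmV4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"}
]'''

# The four coding dimensions seen in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(payload: str) -> dict[str, dict[str, str]]:
    """Parse a batch response and key each record by its comment ID."""
    records = json.loads(payload)
    out = {}
    for rec in records:
        # Tolerate a missing dimension by coding it "unclear"
        # (an assumed fallback, not confirmed by the source).
        out[rec["id"]] = {d: rec.get(d, "unclear") for d in DIMENSIONS}
    return out

codes = index_by_id(raw)
print(codes["ytc_UgzlKLZNeTJmLUULum54AaABAg"]["policy"])  # → regulate
```

Keying by ID rather than list position keeps the lookup stable even if the model returns records in a different order than the comments were submitted.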