Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

| Comment excerpt | ID |
|---|---|
| No one needs AI, no one wants AI but that isn't going to stop them from pushing … | `ytc_Ugwha8waD…` |
| Do you prefer to look at CGI/Special effects or "Ai". Been around for 70 yea… | `ytc_UgzNDKkia…` |
| @IndexRed "They are fixing it" LOL. It's not a bug, it's a feature. The AI doesn… | `ytr_Ugwn8E3EX…` |
| I was waiting chatgpt to get angry at some point or start crying. That was a tou… | `ytc_UgzeCizeR…` |
| @XetXetable yea sorry i wasnt being clear but i have a basic knowledge of how it… | `ytr_UgycFxL0L…` |
| There are a lot of non-sequitors and inconsistencies in this persons reasoning. … | `ytc_Ugzka9Rms…` |
| What I'm reading is to remove gradient descent and make the LLM just descent to … | `ytr_Ugzvpkq4X…` |
| I don't have a smart phone, 5 yrs now. I took programming logic, using C+, did w… | `ytc_Ugz9XhcJS…` |
Comment

> So what she's trying to say is someone has to die for this autonomous driving project to work.
> They are never going to be able to replicate a professional human driver.
> That's why I believe it should be mandated that a human being is be behind the steering wheel at all times ( And the driver should be paying attention).
> How many people have to get killed before this 5 year old computer brain learn not to drive over PEOPLE in the street?

youtube · 2018-03-21T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy0iRDurE5kbF4u47t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxUw_ET2UI1DH2_rDl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzirg4tn7e-n962m7p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxl7LgqTG_8lxdhl_t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxV8ZzJyidy7LHYPH94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxQ7IMYr_Toyf73iRN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwFQTYdACGwhyoXqDd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzexROj7pjdcfp_2Hd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw3XOT-yQgUw7lYYRd4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwHFJjq3Ja3_hjQjfJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
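A raw batch response like the one above has to be parsed and sanity-checked before the codings are stored, since an LLM can emit an out-of-vocabulary label. The sketch below is a minimal, hypothetical validator: the dimension vocabularies in `SCHEMA` are inferred from the example rows (the real codebook may define more categories), and `validate_batch` is an illustrative name, not part of any real tool.

```python
import json

# Assumed vocabularies per coding dimension, inferred from the sample
# rows above; the actual codebook may differ.
SCHEMA = {
    "responsibility": {"company", "user", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "resignation", "approval", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id", "")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Example: one well-formed row (hypothetical ID).
raw = ('[{"id":"ytc_x","responsibility":"company",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
print(validate_batch(raw)["ytc_x"]["policy"])  # ban
```

Rejecting the whole batch on one bad label is deliberately strict; a gentler variant could skip invalid rows and re-queue them for re-coding.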