Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The problem is an obsolete economic model that is unable to reward hard work unl…" (`ytc_UgyDtmyNn…`)
- "I’m all for medical advancements through AI. What I worry about is AI replacing …" (`ytc_UgwfxlWnl…`)
- "38:35 BIG AGREE Even if people reduce the way of thinking of them as "it's just …" (`ytc_UgzwKIBkT…`)
- "Something I find very interesting that I haven’t seen brought out yet is how the…" (`ytc_UgxtdxnfD…`)
- "Pretty sure AI corporations were THE big funder for Trump, or am I tripping ball…" (`rdc_nufujhg`)
- "Driverless trucks are replacing true line driver’s. This is abhorrent and could/…" (`ytc_UgwT0XZC7…`)
- "Scam nonsense. AI fear mongering is designed to get governments to push for reg…" (`ytr_UgwjAzmKj…`)
- "When ChatGPT gives me incorrect answers or bad outputs, instead of saying “fuck …" (`rdc_ktrfzfv`)
Comment
Something else to consider: I am against any self-driving vehicle but especially large trucks. I can guarantee 100% that situations will happen when self-driving trucks kill people. Consider if there is a wreck or a disabled vehicle or any other reason why suddenly there I are things in the road without enough space for the 80,000 pound truck to stop. Maybe it is dark at night and raining and the truck just came around a curve at 60 mph. Let’s say the computer driver must decide whether to hit a small thing (a person) or hit a big thing that will destroy itself the truck. Do you trust the computer to run into the big thing possibly destroying itself, or will it go for the least damage and hit a person?
youtube · AI Jobs · 2025-05-31T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx71p4t5Kj9qNuV-I94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzWWnNhyDyclD69bbh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy5qQ40Drxssy34dXh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy9_pxF86NWlPiDOrB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwIllj-paCyvLoztpx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyGZ7kXUN7oAIbjIHJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzOoXODREXStYv7PgR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxCstFVgLP9yjzC_Fd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwRMbsIljqY-RiJ7ph4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzFTDkoaU3JsK9hbvl4AaABAg","responsibility":"company","reasoning":"mixed","policy":"ban","emotion":"outrage"}
]
```
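The raw model response above is a JSON array with one coding record per comment, which makes the "look up by comment ID" view straightforward to implement: parse the array and index it by `id`. The sketch below is one minimal way to do that, not the app's actual code; the four dimension names are taken from the coding-result table, and the two records are copied from the response above.

```python
import json

# Two records copied verbatim from the raw LLM response shown above.
raw_response = '''
[
 {"id":"ytc_Ugy5qQ40Drxssy34dXh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgwRMbsIljqY-RiJ7ph4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
'''

# Dimension names as they appear in the coding-result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse the model output and index coding records by comment ID."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        # Skip malformed records rather than failing the whole batch.
        if not all(key in rec for key in ("id", *DIMENSIONS)):
            continue
        by_id[rec["id"]] = {key: rec[key] for key in DIMENSIONS}
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_Ugy5qQ40Drxssy34dXh4AaABAg"]["policy"])  # ban
```

Looking up `ytc_Ugy5qQ40Drxssy34dXh4AaABAg` returns the same values shown in the coding-result table (company / consequentialist / ban / fear).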