Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I wonder if AI can do everything and even if we're laid off can the rich can kee…" (ytc_UgzcwhjSf…)
- "Is it just me or does it sound absurd to have a protection in place to ensure as…" (ytc_Ugx1qEjRX…)
- "Cue all the dumb morons in chat poo pooing AI LLMs as a nothing burger or compla…" (ytc_Ugz6Jf-iL…)
- "If AI were in good hands, I might be for it but I've only seen maybe 5 good thin…" (ytc_Ugy1tFv_h…)
- "Not sure how the chances are so low from most people . Literally every time they…" (ytc_UgwvaONye…)
- "AI doesn't need to do your job to take it. What I mean is, if AI does 5% of 20 p…" (ytc_UgzqNNKsk…)
- "Automation is inevitable but with human supervision. Unmaned trucks are insane w…" (ytc_UgwQbp0dr…)
- "it is very sad tbh. While artists say 'they could have asked our permission', th…" (ytr_UgxY8zDpS…)
Comment
I've been saying all along that a company with deep pockets would have to be insane to do this autonomous driving stuff. The liability is just untenable. Even if an accident is not the car's fault, you don't think that any defendants lawyer isn't going to go after the big car company? You can bet that any driver is going to blame the car any time there is an accident. "the car made me have an accident". And can you imagine how many drivers will quit paying attention when they are tempted with any kind of driver assist? I don't think there is any way to get around making the human driver absolutely responsible for the safety of their car. And besides, why the big push to put all the paid drivers out of work? There are a lot of people for which driving is about all they are qualified to do.
youtube · 2018-03-20T02:5… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyKdUGZOndQw1MyFZ94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxifudyIYMPcGtewXl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgztJbYq4fH6VepwXvp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyp1kPe1vCMGMh0GpJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy6ZeglMCzQeQduNAp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwV8xqaS0fjaV076_d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwxgPdhbN_8-yBVVwZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzdWGg9Ow5NK4X-7nF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyFBknqQ1EYwuo2Ud94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxDlyE5Y3a55JJ5ExJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]
```
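A response like the one above can be parsed and checked against the coding schema before the rows are indexed by comment ID. The sketch below is a minimal example; the allowed values per dimension are inferred from the visible output (the full codebook may contain more labels), and the function name and skip-on-invalid behavior are assumptions, not the tool's actual implementation.

```python
import json

# Allowed labels per dimension, inferred from the sample output above.
# Assumption: the real codebook may define additional labels.
SCHEMA = {
    "responsibility": {"company", "distributed", "none", "user", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"indifference", "outrage", "approval", "fear", "resignation"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID.

    Rows carrying an unknown label are skipped rather than raised,
    since LLM output can drift from the codebook.
    """
    coded = {}
    for row in json.loads(raw):
        values = {dim: row.get(dim) for dim in SCHEMA}
        if all(values[dim] in SCHEMA[dim] for dim in SCHEMA):
            coded[row["id"]] = values
    return coded

# Hypothetical row for illustration (not a real comment ID).
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
print(validate_batch(raw))
```

Skipping rather than raising on unknown labels is a deliberate choice here: a batch of ten codings should not be lost because the model invented one off-codebook label, and the dropped IDs can simply be re-queued for recoding.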