Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "My brother used ai ONLY OF HIS ONLINE BOOK uhm the pic thingy that is on the fro…" (ytc_Ugx3CMjHI…)
- "What “human lives have been ended” by AI? The gross lack of any specifics make t…" (ytc_Ugx90c1zP…)
- "Pre-crime is augmented with artificial intelligence, which will judge your socia…" (ytc_UgwEup1VB…)
- "Im an refrigeration n operating engineer here in nyc at a skyscraper. Our appli…" (ytc_UgyOJVwbZ…)
- "Good AI is ok, but the bad kind is not, so a very big restriction; Musk is evil and his AI …" (ytc_UgzykB_LN…)
- "As soon as we, if still possible, learn to allow LLMs to self-diagnose itself us…" (ytc_Ugzvd7ON_…)
- "I have a theory in my head that the interface to the web goes through a cycle of…" (rdc_nualny8)
- "Ur teaching a kid to be a competent responsible capable person in society. Now w…" (ytc_UgxlAwZXu…)
Comment
No, sorry. The reason we are not ready for self-driving cars, and never will be, is simply that we don't want to become robots guided by a higher-level intelligence; we have already lost so many degrees of freedom over the last decades, or centuries, to state rules, and that's enough. We want to be free to drive our own car exactly how we think we should drive it (within traffic laws, clearly), just as we want to be able to cut our own beefsteak with our own hands the way we want, without a robot cutting it and feeding us as if we were children or prisoners, and just as we want to be able to choose where to walk in our free time without a robot sitting us on top of it and carrying us around like puppets.
youtube
2023-07-30T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwdQ8rmz-7exCoCrMF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy6WptzMVjBxvOoSDt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgySybH3AOEPuK0i6D14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyGe1Nurkd6qbd-XY94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyoPh3giNJtLTihOcp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxLf4WHBSIqYOO_ouJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz6yVCwj2WM_hqJSK94AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugy2-zH4UsWxsbKO9Tl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgztLBf8caqzf0Zp9UN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzMa-OKbQooJax1eVh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
```
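The raw model response is a JSON array of per-comment codings, one object per comment, each carrying the four dimensions shown in the Coding Result table. A minimal sketch of how the comment-ID lookup could work (the function name and the example ID `ytc_abc` are hypothetical; the field names match the response above):

```python
import json

# The four coded dimensions each response object is expected to carry
# (taken from the Coding Result table above).
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_raw_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw batch response into {comment_id: coding} for lookup by ID."""
    codings = {}
    for rec in json.loads(raw):
        # Keep only the expected dimensions; fall back to "unclear" if one is missing.
        codings[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return codings

# Hypothetical single-record response, mirroring the shape of the array above.
raw = '''[
  {"id": "ytc_abc", "responsibility": "none", "reasoning": "deontological",
   "policy": "none", "emotion": "fear"}
]'''
codings = parse_raw_response(raw)
print(codings["ytc_abc"]["emotion"])  # -> fear
```

Building the dictionary once per response makes each subsequent ID lookup O(1), which matters when a batch is inspected comment by comment.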