Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I do nowadays stray from "Is AI ok?" discussions since most of people in fact do…
ytr_UgxxQ6hO5…
Well, I personally think that AI will take away your job in the future unless yo…
ytc_UgyyNB33k…
I don't know how frequent a behavior this is in the legal field but in academia …
ytc_UgyVjrHHt…
Ai + Garbage Capitalism = 💀
Ai + New Evolved Non-Capitalistic Free Human Society…
ytc_UgzingD3N…
So, instead of humans creating "terrible new viruses," it will be AI creating th…
ytc_UgwMDfMGf…
I am happy to see the analysis of AI from the view of an artist, generally your …
ytc_UgzrC2WHz…
Ai is just white collar workers dealing with what blue collar workers have for c…
ytc_UgziqtRMR…
I had to quit my job at a supermarket, because they had an ai making the schedu…
ytc_UgxD28U28…
Comment
The open topic is how AI will be dealt with by the courts, which deal in laws written with the "reasonable person" in mind. If there is no reasonable person involved in a disputed situation, can the current legal system "cope"? How much of a veneer of human intervention will be acceptable so that a "reasonable person" argument can prevail? As in sci-fi, will the computer become sovereign due to the actual or perceived lack of human input? It seems already a boiling-frog situation to me, as no loud political voices are being heard at either the political-representative or AI-company level. I think that software developers with legal and ethical training will be at a premium. P.S. In the good old days, a coup d'etat would involve soldiers taking over the radio station etc. I'd like to know who the generals have got in their sights nowadays.
youtube
AI Jobs
2024-01-15T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
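A coded record like the one above can be sanity-checked before it is stored by validating each dimension against its allowed labels. This is a minimal sketch; the `ALLOWED` label sets below are an assumption inferred from the values visible in this sample, and the real codebook may contain more labels:

```python
# Assumed label sets, inferred only from values visible in this sample;
# the actual codebook may define additional labels per dimension.
ALLOWED = {
    "responsibility": {"none", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "liability"},
    "emotion": {"fear", "indifference", "approval"},
}

def invalid_dimensions(record: dict) -> list:
    """Return the dimension names whose value is not an allowed label."""
    return [dim for dim, labels in ALLOWED.items()
            if record.get(dim) not in labels]

# The record shown in the Coding Result table above:
coded = {
    "responsibility": "distributed",
    "reasoning": "deontological",
    "policy": "liability",
    "emotion": "fear",
}
print(invalid_dimensions(coded))  # -> [] (all labels valid)
```

Running the check on a record with an unknown label (e.g. `"responsibility": "shared"`) would return that dimension name, flagging the record for review.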
Raw LLM Response
[
{"id":"ytc_UgwS1GOHjIGps7ijcCN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzLCHU8LsoBxVOEUNh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyO2-q7ST3fUWzTRnh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzKUp6HyGSX1d1sr1l4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzXj6P9XYiQLpMbwJp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyEEwbnPcZa3ze8YKZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy8iI8G61qn73fcOed4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxFGt2uA71vY0NLDI94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwyWYKqaBe2TZcVKf54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgywY8dCABLEaT4vCqJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
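The raw response above is a JSON array of coded records, so "look up by comment ID" amounts to parsing it and indexing on the `id` field. A minimal sketch, assuming the output is well-formed JSON like the sample (the two records inlined here are copied from the array above):

```python
import json

# Raw model output: a JSON array of coded comments, as shown above.
raw_response = """
[
  {"id": "ytc_UgzKUp6HyGSX1d1sr1l4AaABAg",
   "responsibility": "distributed", "reasoning": "deontological",
   "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwS1GOHjIGps7ijcCN4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and index the coded records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
record = codes["ytc_UgzKUp6HyGSX1d1sr1l4AaABAg"]
print(record["policy"])  # -> liability
```

In practice a model may wrap the array in extra text or emit malformed JSON, so a production loader would catch `json.JSONDecodeError` and surface the raw string for inspection, which is exactly what this viewer enables.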