Raw LLM Responses
Inspect the exact model output behind any coded comment: look a comment up by its ID, or click one of the random samples below to inspect it.
- "Hmmm that explanation whilst valid does not really justify politeness when inter…" (ytc_UgxJemwF6…)
- "the more ai art grows and takes over human made things, itll start learning from…" (ytc_Ugzejos7A…)
- "those types of people are forgeting the definition of "art". they should pick a …" (ytc_UgxWl5mes…)
- "Generative AI is like someone telling the same knock knock joke over and over ag…" (ytc_Ugyjv3nUv…)
- "Using AI is not a sin, it's just another means of gathering information. Accept…" (rdc_o5r90xj)
- "Born in1975. Time before computers growing up watching movies about killer AI an…" (ytc_UgyOaIHkh…)
- "We know this AI will be dangerously out of our control, yet still, they push ahe…" (ytc_Ugy4ie2Ad…)
- "No company wants regulation unless they benefit from it. The benefit here is to …" (ytc_UgxLg9UoY…)
Comment
Of course people and companies have a moral and legal obligation other than to make profit. They already today have the obligation to make safe products and not to harm. Or did I miss something on criminal law, law of damages/torts and punitive damages?
BTW - superior AI will probably have less interest in harming humanity than we ourselves. Usually with intelligence moral insight and commitment grows.
youtube · AI Governance · 2026-02-07T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
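A coding result like the one above can be sanity-checked against the dimension vocabularies before it is stored. The allowed value sets below are inferred from the sample raw responses on this page and are assumptions, not the tool's authoritative codebook; `validate` is a hypothetical helper.

```python
# Sketch of a sanity check on one coding result. The vocabularies are
# inferred from sample responses and are assumptions, not a real codebook.
ALLOWED = {
    "responsibility": {"company", "developer", "distributed", "expert", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "resignation", "indifference", "mixed"},
}

def validate(coding: dict) -> list[str]:
    """Return the names of dimensions whose value falls outside the vocabulary."""
    return [dim for dim, allowed in ALLOWED.items() if coding.get(dim) not in allowed]

# The coding shown in the table above passes cleanly.
print(validate({"responsibility": "company", "reasoning": "deontological",
                "policy": "liability", "emotion": "approval"}))  # -> []
```

Any non-empty return value flags a row where the model drifted from the expected label set.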
Raw LLM Response
```json
[
{"id":"ytc_UgzI1ykoDYDiNF7W57J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwGZBzZBKakMrIu1Cp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgzFH0O8k4ilpDH3ltp4AaABAg","responsibility":"expert","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxYo1BcT_aEpRXW2-d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxXb7iZsWqXjh5Wa054AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwYlkRgSCBvJjhi7WB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzN94R-3AujKahQCJt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgwzmNOCGN60o_N9OZ54AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwezLuYEzEKkc3rGnN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy6ELLHakTV3swHM3t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```