Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- `ytc_UgxGb7hjt…`: Ah yes. "I do not, in fact, have a friend who is a billionaire. I might have the…
- `rdc_oi1yv8m`: I don't think any actual AI layoffs are there yet, because the primary AI tools …
- `ytc_UgyVQiYZm…`: Barf. Humans empathy is so easy to highjack. Evolution did it for learning. and …
- `ytc_Ugy48HKq9…`: My sister has worked with several AI and she says hello to them any time she use…
- `ytc_UgxvG8jeA…`: There was 147 chats for a Buddyized Ai version of me. I’m not even a YouTuber an…
- `ytc_Ugyefj43e…`: You forget that AI art is also created by people. They put their ideas and creat…
- `ytc_UgwfB1e6Y…`: No point. Because who they are supposedly making it for will not have jobs or mo…
- `ytc_UgwRLGtIE…`: Mark my words: The US as a world economy and military might will fall because of…
Comment
There is no ethical dilemma. All you do is require the AI developer to be insured instead of the people who own the vehicle. This creates a single buyer insurance option organically and overall lowers the cost and ultimate impact of any foreseeable harm or damage the system might cause. And it coincides with the ultimate goal of any company, profit. In fact their potential profit is supported entirely by their ability to protect the people using their services from any and all damage and harm. Win/Win
In a world where idiots are not modern versions of royalty, this is a non-issue. But because stupid people need directions with crayons to find the bathroom, these are problems. Why create imaginary problems when there are real problems that need your attention?
Platform: youtube · Topic: AI Harm Incident · Timestamp: 2015-12-08T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
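For downstream analysis it can help to validate each coded record against the expected label sets. The sketch below is a hypothetical illustration, not this project's actual code: the label sets are inferred only from the values visible on this page, and the real codebook may define additional values.

```python
from dataclasses import dataclass

# Hypothetical label sets, inferred from the values visible on this page;
# the actual codebook may define others.
RESPONSIBILITY = {"developer", "ai_itself", "distributed", "none"}
REASONING = {"consequentialist", "deontological", "contractualist"}
POLICY = {"liability", "regulate", "none"}
EMOTION = {"approval", "indifference", "fear", "resignation"}

@dataclass
class CodedComment:
    """One coded record, matching the dimensions in the table above."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        # Raise on any label outside the expected set for its dimension.
        for field, allowed in (
            ("responsibility", RESPONSIBILITY),
            ("reasoning", REASONING),
            ("policy", POLICY),
            ("emotion", EMOTION),
        ):
            value = getattr(self, field)
            if value not in allowed:
                raise ValueError(f"unexpected {field} label: {value!r}")
```

Calling `CodedComment(**record).validate()` on each entry of a batch response would flag any record where the model drifted off the codebook.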
Raw LLM Response
[
{"id":"ytc_UghSiRcVXA-3FHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg36gd_wQOCXHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UggzSEiGsQNLKngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghidMHZsCybB3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjzNTXzuzIxOngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UghfmsovrnUJPXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgjQy7gtc5pA_XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugio_pXgICTxCXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggKQCpjXBYZKXgCoAEC","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgggitcG_CbrUXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"}
]
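Looking up a single comment by ID, as the search box above does, amounts to parsing the batch response and scanning for a matching `id`. A minimal sketch, assuming the raw response is valid JSON; real model output can be malformed, so production code would wrap the parse in error handling. The function name is illustrative.

```python
import json
from typing import Optional

def lookup_coded_comment(raw_response: str, comment_id: str) -> Optional[dict]:
    """Parse a raw batch response and return the record for one comment ID."""
    records = json.loads(raw_response)  # raises ValueError on malformed output
    for record in records:
        if record.get("id") == comment_id:
            return record
    return None

# Example against the response above:
# lookup_coded_comment(raw, "ytc_UggKQCpjXBYZKXgCoAEC")
# -> {"id": "ytc_UggKQCpjXBYZKXgCoAEC", "responsibility": "developer",
#     "reasoning": "contractualist", "policy": "liability", "emotion": "approval"}
```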