Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Automation isn't just coming it's here. Y'all better learn a skill that isn't ea…" (ytc_Ugwc_sEj8…)
- "Gpt4 was lame to begin with. That's why everyone moved on with their lives. It c…" (ytc_UgwD_iUtb…)
- "When I see good AI art I want to know more about the original artist whose art w…" (ytc_Ugyzs46br…)
- "That sweater says different. I'm kidding. One positive outcome is that people wh…" (ytc_Ugycdxbct…)
- "What is funny here is ChatGPT getting defensive and denying it gave him that adv…" (ytc_Ugyie8XK3…)
- "I’ve been relatively chill with AI for the past while. Yes how it’s being used b…" (ytc_Ugzsu6ydW…)
- "What if the robot had a virus and changed opponent. Those shots would've made th…" (ytc_Ugxq2jP63…)
- "Implement the AI plan with real, human teachers who truly understand the technol…" (ytc_UgxbyFuw2…)
Comment
This story does not increase or decrease my wanted for autonomous car. I am an engineer and I know like as the host said sensors and electronic fail. We as engineer try as we must to make systems safe but as long as a human is making them they will never be 100% safe and will fail. So for me this story is a matter who is responsible because no matter what a humans, animals or just objects will appear out of nowhere and will cause an accident or death of something or someone. If car manufactures are going to take responsible of driving away from human than they need to take all responsible for when someone dies or property has been damaged because it is their system that was not good enough to stop the accident.
Source: youtube · 2018-03-21T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugwt3BiPlp9ro1_hrdV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyZciDCmXZTzdJXt9R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgywelGNAIfKRx4HQzh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyu-4QIfI2bE30ktBh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzdMJJiR5Z07vC0dpx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx0WubZVZkPIVv2T8h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugz30mNkDrc7bou2Uo94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz1XmMUdyiR38KXsat4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzqq1iviYPq9eVz1wJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzvXVSUeIuyd2bDDwN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"resignation"}
]
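The "look up by comment ID" step above can be sketched in a few lines: parse the raw LLM response (a JSON array of per-comment codes, as shown) and index it by `id`. This is a minimal illustrative sketch, not the tool's actual implementation; the variable names are ours, and the two entries are copied from the raw response above.

```python
import json

# Raw LLM response in the format shown above: a JSON array where each
# object carries one comment ID plus its coded dimensions. Only two of
# the ten entries are reproduced here for brevity.
raw_response = """
[
  {"id": "ytc_Ugwt3BiPlp9ro1_hrdV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzvXVSUeIuyd2bDDwN4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "liability", "emotion": "resignation"}
]
"""

codes = json.loads(raw_response)

# Build an index keyed by comment ID so any coded comment can be
# fetched directly instead of scanning the array.
by_id = {row["id"]: row for row in codes}

record = by_id["ytc_UgzvXVSUeIuyd2bDDwN4AaABAg"]
print(record["emotion"])  # resignation
```

Because IDs are unique per comment, the dict lookup is O(1), which matters once the response covers a full batch of samples rather than a handful.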