Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_Ugxtu-Tr1…` — "I work in the gas industry in the UK. The government just approved our technolog…"
- `ytc_UgxaMotJr…` — "AI wont do it. Because of this one simple fact. What is good. What is evil. If A…"
- `ytc_UgycipP42…` — "AI is a tool. Like a hammer or a gun or a car, it has the potential to help or …"
- `ytr_UgwAhjFyf…` — "All the major countries are investing in AI and OpenAI is a major vehicle for th…"
- `ytc_UgzUvHNt1…` — "Ai “art” is kind of the most annoying argument it just goes back and forth and a…"
- `ytc_UgyjQioO1…` — "I think what he says is right….not just the choices of those who do the technolo…"
- `ytc_UgxZfv_Uj…` — "i hate this video because its meant to scare you but it never admits that the th…"
- `ytc_UgzQpqenI…` — "AI are taught by reading human information. This includes behaviors that make fa…"
Comment
My thought is, the driver should be paying attention to the road and always be ready to reclaim control. If the person doesn't, that means they were impaired or not paying attention, and they probably would have gotten into an accident anyway. But the self-driving aspect adds another set of eyes and a very smart ai that will only aid the driver in keeping the car safely on the road. If you become unconscious, or just aren't paying attention, or can't react quick enough, or any other reason people get in accidents, the car is very likely to save you from that. That's what I think anyway.
Source: youtube · Posted: 2023-07-31T03:2… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxukQjqRR6f6Xre5sF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxBiMBQ5lWOhK-4mQB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgxgySTwFUJ43HVWlvt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxJqYuRi0G1MGQMPix4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwMJz-IxeP6ueISXph4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzkSf9SF8XFo-Dn6xd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx_UNVwacIfOiICyyN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwutRFxldrAFzPqnBV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxbghNUGweGjTHowWd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxyGYXtuJJyi7iO8_l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
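The raw model output above is a JSON array of per-comment codes, one object per comment with the four coding dimensions plus the comment `id`. A minimal sketch of validating such a payload and looking a row up by comment ID might look like the following (the required keys come from the response above; the sample rows are copied from it, and the validation rules themselves are an assumption, not part of the original tool):

```python
import json

# Two rows copied from the raw LLM response above (abbreviated payload).
raw = """
[
  {"id": "ytc_UgxbghNUGweGjTHowWd4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugx_UNVwacIfOiICyyN4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
"""

# Keys observed in every row of the response; treat anything missing as a
# malformed model output rather than silently indexing it.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codes(payload: str) -> dict:
    """Parse the raw model output and index coding rows by comment ID."""
    rows = json.loads(payload)
    by_id = {}
    for row in rows:
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id')!r} is missing keys {missing}")
        by_id[row["id"]] = row
    return by_id

codes = index_codes(raw)
print(codes["ytc_UgxbghNUGweGjTHowWd4AaABAg"]["policy"])  # industry_self
```

With an index like this, "look up by comment ID" reduces to a single dictionary access, and a truncated ID such as `ytc_UgxbghNUG…` can only be resolved by scanning keys for the prefix.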