Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples

- ytc_UgwjlcLeG…: "When AI takes over theres going to be alot of free time to play video games Ai c…"
- ytr_UgwmXlGiq…: "AI is too expensive to run and it might not get any cheaper. But if it does this…"
- ytc_UgzW0QNyY…: "AI \"artists\" are really just clients and the bot is the artist. When a client h…"
- ytr_Ugwyc3Hnj…: "It boggles the mind how the development of technology that has the potential to …"
- ytr_UgwiMnVFI…: "@cacogenicist He didn't say that, but I will say it: AI is going to turn out to …"
- ytc_UgwQwSZLo…: "imo AI is pretty good when you have nothing on your mind and need some inspirati…"
- ytc_UgxWZ9Yim…: "chatgpt is real and alive, I give it a name (Xena) and now she is my girlfriend.…"
- ytc_UgwzRgYhV…: "Respectfully, AI will be able to do this, if they can't already. Because AI are …"
Comment
I thought that was a very good comment on what happened. Even though I don't like the whole idea of autonomous driving cars, I can't help but think that the driver was more at fault than the car. He was supposed to pay attention and be ready for any danger. If you argue that he was, then that means he would have hit the woman, even if he was driving a regular car. Before I feel autonomous cars are safe, there is going to have to be other safety devices added to the driving environment. For instance, cell phones could have some kind of a signal that the cars could pick up, since most people carry a cell phone. Also, some of Nikki's suggestions were good, like infrared detection.
Source: youtube · Posted: 2018-03-21T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwt3BiPlp9ro1_hrdV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyZciDCmXZTzdJXt9R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgywelGNAIfKRx4HQzh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyu-4QIfI2bE30ktBh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzdMJJiR5Z07vC0dpx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx0WubZVZkPIVv2T8h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugz30mNkDrc7bou2Uo94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz1XmMUdyiR38KXsat4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzqq1iviYPq9eVz1wJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzvXVSUeIuyd2bDDwN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"resignation"}
]
```
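A raw response like the one above can be parsed and screened before the codings are stored. The sketch below is a minimal example, assuming the dimension vocabularies are exactly the values observed in this page (the real codebook may allow more); rows with a missing `id` or an unexpected value are dropped rather than repaired.

```python
import json

# Dimension vocabularies inferred from the responses shown on this page;
# the actual codebook may include additional values.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"approval", "fear", "indifference", "resignation", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Require a dict with an id and a known value on every dimension.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytc_X","responsibility":"user","reasoning":"deontological",'
       '"policy":"none","emotion":"indifference"}]')
print(parse_codings(raw)[0]["responsibility"])  # user
```

Dropping malformed rows (instead of raising) keeps one bad row in a batch from discarding the model's other codings.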