Raw LLM Responses
Inspect the exact model output for any coded comment; look it up by comment ID.
Random samples

- `ytc_Ugw3JdRP7…`: "Im extremely against AI developing and art. You can have AI for some things but …"
- `ytr_UgxECMSp9…`: "@cxms-d5u If not Oppenheimer, someone else would have built the bomb. The scienc…"
- `ytc_Ugx5RNKtl…`: "The cop seriously did not consider for even one second that the AI identificatio…"
- `ytc_Ugzi3iQng…`: "The very thought of this topic really alarms me, particularly when you consider …"
- `ytc_UgwTBA9zb…`: "AI is already building on itself. Each model trains the next, writes code for th…"
- `rdc_n5hr3m4`: "Upvoting and agreeing. Not a lawyer, but in liability risk management and work …"
- `ytc_UgwgvlN6D…`: "I just don't call it art. It's an ai image, and as long as people recognize that…"
- `ytc_UgwDf5jAv…`: "My husband worked for a corporation that isn’t as flashy as Amazon but went all …"
Comment

> I think the only concievably possible way for self driving cars to work, is if there are NO human driven cars, and all cars were controlled by a single computer, that can give orders to all cars at once. Of course, this then gives way to hackers, and terrorist attacks that would cause utter devestation. Honestly, I think we're doing pretty good as is, I think the best option would be computer assisted driving, which is already advancing.

youtube · AI Harm Incident · 2018-06-04T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
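The table above pairs each coding dimension with the value the model assigned to this comment. As a minimal sketch, a record like this can be rendered into that two-column Markdown layout (the `record` dict is copied from the table; the `to_markdown_table` helper name is an illustrative assumption, not part of the tool):

```python
# One coded record, taken from the Coding Result table above.
record = {
    "responsibility": "distributed",
    "reasoning": "consequentialist",
    "policy": "regulate",
    "emotion": "fear",
}

def to_markdown_table(rec: dict) -> str:
    """Render a coded record as a two-column Markdown table."""
    rows = ["| Dimension | Value |", "|---|---|"]
    rows += [f"| {dim.capitalize()} | {val} |" for dim, val in rec.items()]
    return "\n".join(rows)

print(to_markdown_table(record))
```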
Raw LLM Response
[
{"id":"ytc_UgzmTWvkx_16kZxRipB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwU3JKTd7DXea630ch4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy5Al8HpdWeQXVA14V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzR-I1zg7Fd24xkO2B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyTRsoOWGCfnwZIeM14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyLGAxBlRBrU6Q9HzF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzGRQqzWI3R8ooBq5t4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzTIXhunuFErqKh5k54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgypFlEyseWL_KumX5F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMlxjfJsg85RHvnPZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
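The raw response is a JSON array of records keyed by comment ID, which is what makes the lookup-by-ID inspection possible. A minimal sketch of parsing and indexing such a response, assuming the allowed category values are exactly those seen in the records above (the real coding scheme may include more, and `index_responses` is a hypothetical helper name):

```python
import json

# Allowed values per dimension, inferred from the coded examples above
# (assumption: the actual codebook may define additional categories).
SCHEMA = {
    "responsibility": {"distributed", "ai_itself", "company", "user"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def index_responses(raw: str) -> dict:
    """Parse a raw LLM response and index records by comment ID,
    dropping any record with a missing or out-of-schema value."""
    indexed = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            indexed[rec["id"]] = rec
    return indexed

# First record from the raw response shown above.
raw = ('[{"id":"ytc_UgzmTWvkx_16kZxRipB4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')

coded = index_responses(raw)
print(coded["ytc_UgzmTWvkx_16kZxRipB4AaABAg"]["policy"])  # regulate
```

Indexing by ID also makes it easy to spot comments the model skipped or mislabeled: any ID present in the batch but absent from the returned dict failed parsing or validation.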