Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- "don't worry, stupid people like politics will never allow AI to take full contro…" (ytr_UgykPqCUS…)
- "Blumenthal and his fake Vietnam combat history is fascinating. Anyway the federa…" (ytc_UgxWgspHl…)
- "No... Judging by this, the stockholder boards will take that seriously and posit…" (ytr_Ugw2_svn6…)
- "I am not entirely opposed to the use of AI to monitor and direct learning. I wou…" (ytc_UgznDFPdA…)
- "Id prefer they get sensor vision right cause ai vision is inherently flawed. My …" (ytc_Ugz0BmI-a…)
- "The biggest hurdle is the demand side. If most companies automate ,even 75%, the…" (ytc_UgzI6Uxz8…)
- "Given that AI just follows algorithms, if no developer programmed it to self pro…" (ytc_UgwW-SFU5…)
- "Yeah its real the 2nd point mene khud ak din curious hoke pucha tha even khud ai…" (ytc_Ugw6CmICP…)
Comment

"I disagree. We should not accept any advance in technology like this without question. Self driving cars, when there are accidents (and there already have been many), present many ethical dilemmas, and we need to be sure the people in charge of solving them will do right by us. We need to think more about what could happen, not just blindly accept progress. I would love to see the day where most cars are self driving, roads are safe, and travel is enjoyable. But not at just any ethical cost."

youtube · AI Harm Incident · 2022-06-26T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytr_UgzyoQHfkvKymBmesal4AaABAg.A-sXnd6yXDiA2Eam8enQJJ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzyoQHfkvKymBmesal4AaABAg.A-sXnd6yXDiA2EbmVWX5QB","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzyoQHfkvKymBmesal4AaABAg.A-sXnd6yXDiA2F4IIrYzpR","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxO_Ujk5rSvOjWRKBB4AaABAg.9qpyK0rFpPmA2EdMqzaPey","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytr_Ugxr6E9-mHJqZTtbMkB4AaABAg.9n3_k2CWmk5A2EdhXW4VlG","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgwCHoJvEcws5sbtNMJ4AaABAg.9gAd7y-h4HM9usT1V03sv4","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugxhzhse5PJvVG9QIF54AaABAg.9eKDGjvyiy7A2Eet4FGYqd","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugw4jM93_9cAtGe9wgN4AaABAg.9Wp06dt0zPM9ckPtZde_5N","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgyfZZIYhGkJqOVkj3p4AaABAg.9NJbV2HYicl9UBo94mRsqU","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugxx9whhaDWkfEJOjy14AaABAg.9GhF4K2osdN9KjCloO1jaE","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
```
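A response like the one above can be parsed into a per-comment lookup table with a few lines of Python. This is a minimal sketch, not the tool's actual implementation: the allowed codes below are only the values visible in this dump (the full codebook may define more labels), and the ID `ytr_example` is hypothetical.

```python
import json

# Codes observed in this dump; the real codebook may include additional labels.
CODEBOOK = {
    "responsibility": {"none", "ai_itself", "distributed", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"indifference", "fear", "approval", "outrage"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID, validating each dimension."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in CODEBOOK}
    return coded

# Hypothetical single-record batch for illustration.
raw = ('[{"id":"ytr_example","responsibility":"distributed",'
       '"reasoning":"contractualist","policy":"regulate","emotion":"fear"}]')
print(parse_batch(raw)["ytr_example"]["policy"])  # regulate
```

Validating against the codebook at parse time catches the occasional malformed or out-of-vocabulary label before it reaches the results table.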