Raw LLM Responses
Inspect the exact model output for any coded comment.
Comments can be looked up by ID; the entries below are random samples.
- "How can we stop this? Even if half the world woke up to this danger and began pr…" (`ytc_UgyqQPf2_…`)
- "New rule of the internet: rule 416 if a artist sees AI art they are legally allo…" (`ytc_UgwLOnQnI…`)
- "I feel like us as parents have the right to look at our kids, phones and compute…" (`ytc_UgwcT-RXt…`)
- "Freedom of Access/Sharing/Commenting/Information is anticapitalist and antiprope…" (`ytc_UgzdpUyiO…`)
- "Perhaps its not a bad thing that robots taking over these jobs, but politicians …" (`ytc_UgxLGeK_I…`)
- "An AI is just an electric train that goes around and around on the tracks that h…" (`ytc_Ugz_HrcKm…`)
- "Think about how many lonely people will get hooked into these chat bots. Those p…" (`rdc_m5ll1ck`)
- "@alko_xo no, did you know a self driving car ran over a women and then dragged h…" (`ytr_UgzTtgXSN…`)
Comment

> Absolutely right. We're ok with killing millions because they won't get vaccinated or they drive carelessly, but self-driving cars have to be 100% perfect or we think they're evil. I would WAY prefer driving on roads with self-driving cars that cause a handfull of accidents than roads with loads of very fallible fellow humans who kill thousands.

| Field | Value |
|---|---|
| Platform | youtube |
| Posted | 2023-08-20T22:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwCZ7IUKC2kPv7mjVZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxyBugT38gDf6DPkr94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwkNL79IGkgLwu3f5V4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyPKLWf-iKeHolPNHh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"approval"},
{"id":"ytc_UgxvvqmVxx36U_sWnZx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwF9H0-dWEpMBO9cFN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwvP9XtheBWqtKfGZp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyUAuO4yjKLQFT_Fp94AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwlZBhUFRkYAaTKre94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyLTkAZcFoq5ECfjll4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
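A raw response like the one above is a JSON array of per-comment codes keyed by `id`. A minimal sketch of how such a response might be parsed and validated before display, assuming the value vocabularies below (inferred from the values visible on this page, not the project's actual codebook, which may define more categories):

```python
import json

# Assumed vocabularies per coding dimension, inferred from values seen
# on this page; the real codebook may differ.
VALID = {
    "responsibility": {"ai_itself", "user", "company", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "indifference", "approval",
                "resignation", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}.

    Rows with a missing id or an out-of-vocabulary value are dropped,
    so a malformed model output cannot corrupt the coding table.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        if all(row.get(dim) in allowed for dim, allowed in VALID.items()):
            coded[cid] = {dim: row[dim] for dim in VALID}
    return coded
```

Looking up a single coded comment is then a plain dictionary access on the parsed result, e.g. `parse_codes(raw)["ytc_UgwCZ7IUKC2kPv7mjVZ4AaABAg"]`.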