Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The main problem we have is that it’s getting to the point where no one wants to do anything. I don’t like self driving cars for 2 reasons. You are putting your hands into a machine that can have someone at the other end say, I am bored so let’s have it get into an accident. Who then is responsible for the crash. Why do we need things to be so easy. Over time things get easier because of much practice. When older people don’t want to drive any longer, they should have family and or friends. We are working on distancing our selves from everyone.
Humans will not get better in social settings if we keep distancing ourselves from others.
Source: youtube
Topic: AI Governance
Posted: 2025-09-04T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz8oxavMk6uMeDdMfJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxesuBTqbKd2w5qIvx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxsKPTRUwOgBWgDel94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz7w08ShSlb9umJd7d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzn8gYUkZeS2JnebOh4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgytV9QLXPrBAkIINRx4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgyeSul8TyMh1px2RXB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz2vD3Xzt-2hYMl3Yp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxv_TFb614GuiqTcLx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw_NkyO2-AAKYjRZzd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
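As the response above shows, the model returns one JSON object per comment, keyed by `id` and carrying the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response could be parsed into a per-comment lookup — the `coding_for` helper and its fallback value are illustrative assumptions, not part of the actual pipeline:

```python
import json

# The four coding dimensions shown in the "Coding Result" table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def coding_for(raw_response: str, comment_id: str):
    """Return the {dimension: value} coding for one comment ID, or None.

    Assumes the raw LLM response is a JSON array of objects, each with
    an "id" field plus the four dimensions (as in the example above).
    Missing dimensions fall back to "unclear" — a hypothetical default.
    """
    for entry in json.loads(raw_response):
        if entry.get("id") == comment_id:
            return {dim: entry.get(dim, "unclear") for dim in DIMENSIONS}
    return None


raw = '''[
  {"id": "ytc_Ugzn8gYUkZeS2JnebOh4AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "none", "emotion": "fear"}
]'''
print(coding_for(raw, "ytc_Ugzn8gYUkZeS2JnebOh4AaABAg"))
# → {'responsibility': 'distributed', 'reasoning': 'virtue', 'policy': 'none', 'emotion': 'fear'}
```

The ID-keyed lookup makes the coding robust to the model returning entries in a different order than the comments were submitted.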