Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugw9tGKzr…`: Let me give you an analogy: Making AI art is like commanding someone to make an …
- `ytc_UgwhdlHLx…`: 37:18 him openly criticizing Elon Musk saying he has no moral compass without ev…
- `ytr_Ugybw9qmh…`: @firewhite Well some phones nowadays have LiDAR sensors on the back if you have …
- `ytc_Ugxhf-zmx…`: AI finally gave some people the thing they've wished for the most - the ability …
- `rdc_kgpyx3o`: + he is accusing OP, when the policy/rule states the use of AI needs to be prove…
- `ytc_UgzQY-6PD…`: The car cannot defend itself in a court of law if a human does something high ri…
- `ytc_Ugy5meBiy…`: https://youtube.com/shorts/xzUOWLAz19s?si=-6Dr6llftixUFKvk *9 / 10 / muharram k…
- `ytc_Ugw8Jr4xU…`: Bro is forcing chatgpt into evolution he is trying bring humanity to doom close …
Comment
Hmm..did you know that driverless Uber/taxi cars were being presented as a real thread to the driver occupation since 2014, it's 2025 and nothing! Absolutely nothing! 11 years later and there's a few areas with self driving but 99.99% of the cars are human driven still! I do fear that there'll be some truth to what these AI folks are talking about and that by 2027 we'll be in deep water with Super human AI exterminating humanity by many means which we'll be too distracted to notice. However, when looking at the self-driving vehicle trend, then this all talk just seems to me like rallying up support and investments and we probably still have a decade and plus until there's a real threat. Until then people will definitely lose jobs as that is what's happening atm already.
youtube · AI Governance · 2025-09-04T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgweJhuDRG8hzGnYpW94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxKjz5w0ijLeKVpgpB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyW9mPbjk1iDsasGRR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzWZvHc1QUQplZL_YZ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwmievZBQ5wo4maJHR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzmraWXKXjpV0eYgIZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyJnOB0THp_tC5r3Vt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzGe5EWsEQKwkhAas54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzzV_cfdmeWrYV06HR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx0MKKtzUni3DSRcYt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"}
]
```
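A raw response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal example, assuming the batch is a JSON array of records with the four dimensions shown (`responsibility`, `reasoning`, `policy`, `emotion`); the value sets listed in the code are only those observed in this batch, and the full code book may define more.

```python
import json

# Dimension values observed in this batch (assumption: the code book may allow more).
OBSERVED = {
    "responsibility": {"none", "ai_itself", "distributed", "company", "user"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "approval", "mixed", "outrage", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index the coded records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Every record must carry all four coded dimensions.
        missing = [dim for dim in OBSERVED if dim not in rec]
        if missing:
            raise ValueError(f"{rec.get('id')}: missing dimensions {missing}")
        coded[rec["id"]] = {dim: rec[dim] for dim in OBSERVED}
    return coded

# Look up one record by its comment ID.
raw = ('[{"id":"ytc_UgweJhuDRG8hzGnYpW94AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = parse_batch(raw)
print(coded["ytc_UgweJhuDRG8hzGnYpW94AaABAg"]["emotion"])  # fear
```

Indexing by ID makes the per-comment detail view (coding table plus raw response) a dictionary lookup rather than a scan of the whole batch.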