Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples — click to inspect
- „The only thing you can do now is with AI. Is to use it to your advantage” Ther… (`ytr_Ugy7uXax1…`)
- Someone said it won’t last cause humans can multitask …😭😭😭 no buddy we make more… (`ytc_UgwnmD2yC…`)
- Another situation is that I found ChatGPT does not allow for opposite theories t… (`ytc_UgxAMxS5o…`)
- @nevaehlynch3399 Hey easy there! Sure they may be lazy but playing around with … (`ytr_Ugz35Pt5g…`)
- Thank you for the links for safe use of AI. I’m a teacher and trying to plan for… (`ytc_UgwnQQ83I…`)
- 16:50 Something isn't quite right here ... you tell us not to have service and t… (`ytc_Ugwmq5gok…`)
- Easy: A.I. will - they already do half the clicking/Reviews/etc everywhere... th… (`ytc_Ugy8KrbR2…`)
- AI will create a future full of unemployed people, increased poverty and crime (… (`ytc_UgyjJRAzO…`)
Comment

> My question is, why are we using AI for this? There are so many applications for AI that can save lives we need to focus on, especially in medicine and emergency rescue- because the AI in those fields need serious sample changes due to some heavy biases. But we thought, hey, you know that thing people put a lot of effort and soul into for fun? who are already often underpaid if they go into it as a career? I've got an idea, let's teach AI to do *that.*
> It's like teaching a service dog to juggle- why would you ever, when it can be doing better shit...?

Metadata: youtube · AI Responsibility · 2023-10-31T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugz4n76WlABrdfmr-nF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy9ALNsSWJaCRTESgp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwztswS-UYHP3RipIJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxizAG985Dl838LbYZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzQRlEHx3g5edCOaBx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgygETVQcOwF5rBl5gB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugx6FehiAGpvTp885wZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgycmS2DVCRhf9OGRqJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx0fC5oWkwoDi7xCMt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzO9pkRXhSA6ZNCFDR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
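A raw response in this shape can be parsed into a per-comment lookup with a few lines of Python. This is a minimal sketch: the function name `parse_llm_response` is ours, and the allowed value sets are inferred only from the sample above (the full codebook may permit additional labels), so treat the `ALLOWED` dictionary as an assumption.

```python
import json

# Assumed value sets, inferred from the sample response above --
# the actual codebook may allow more labels per dimension.
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "mixed", "resignation"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}.

    Raises ValueError if any dimension carries a label outside ALLOWED,
    so malformed or off-schema model output fails loudly instead of
    silently entering the coded dataset.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row[dim]!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

Keying the result by comment ID mirrors the interface above: looking up `ytc_Ugx0fC5oWkwoDi7xCMt4AaABAg` in the parsed dictionary returns the same four dimensions shown in the Coding Result table.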