Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Universal income might be a very good idea. This is because AI is going to cause enormous inequalities - it's massively favouring intelligent, educated people in the 1st world. Universal income is a way of mitigating against the coming massive inequalities. Otherwise, I can see problems brewing like terrorism and war from the 3rd world because they won't be able to afford anything.

youtube · Cross-Cultural · 2023-08-10T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugwe8mh5yAlVTCaXOyd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwdSoQ-O-5kpzs_mu14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUtXh20mR5CG8zzg94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgypSixbkmyE51HU9Ep4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgzDOs6670zkriCvyAN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzcbvNCmQqduil8X6Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy6uyWIDx8d09oTTuV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzldzy3zbQGx6lCfEV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz__WTHihozScBSphl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmESAAaT1GSNyvzcR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}]
```
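The lookup-by-comment-ID described above can be sketched as follows: parse one batched model response (a JSON array with one object per coded comment, matching the schema shown) and index the codings by `id`. This is a minimal sketch inferred from the response format; the actual pipeline's parsing code may differ, and the `raw` string below is a shortened sample built from two records in the response above.

```python
import json

# Shortened sample of one batched coding response (two records copied
# from the raw response shown above). The real response holds one
# object per comment in the batch.
raw = """[
 {"id": "ytc_UgzUtXh20mR5CG8zzg94AaABAg",
  "responsibility": "distributed", "reasoning": "consequentialist",
  "policy": "regulate", "emotion": "fear"},
 {"id": "ytc_UgzcbvNCmQqduil8X6Z4AaABAg",
  "responsibility": "user", "reasoning": "deontological",
  "policy": "none", "emotion": "outrage"}
]"""


def index_codings(raw_json: str) -> dict:
    """Parse one batched response and index the coding rows by comment ID."""
    return {row["id"]: row for row in json.loads(raw_json)}


codings = index_codings(raw)
# Look up the coding for a single comment by its ID:
print(codings["ytc_UgzUtXh20mR5CG8zzg94AaABAg"]["policy"])  # → regulate
```

Indexing by `id` makes each inspection an O(1) dictionary lookup, which is what a "look up by comment ID" view needs when the responses are stored as batched arrays.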