Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> They say 'we' need to do something about AI. Who is 'we'. Seems like power is in the hands of corporations and billionaires and their interests are all about profit and not the best interests of mankind as a whole. I think what we really need is a redistribution of power/wealth so that AI can benefit mankind as a whole and not just the wealthy few

youtube · AI Harm Incident · 2025-09-28T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugwwm-u8875qkXIkOGV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugyteeo7HsGTTPJQHjh4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugx1fYEf0HarN5XlJqR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzYVgzng1vPNQgtLut4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxMlU91B5E1JOHWCWF4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwIAqm0N_JSILdW3CF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyRhkHr0oAcgV5PSD94AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugx5O0bnKiDRaXtwnaZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzL_kReq3Ewzj4UGCB4AaABAg", "responsibility": "distributed", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxj9lGeiZiiahUesVF4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]
```