# Raw LLM Responses

Inspect the exact model output for any coded comment. Coded comments can be looked up by comment ID or drawn from a random sample.

## Random samples
- "Imagine supporting something that literally just scrapes & steals other peoples'…" (ytc_UgwX5Icyt…)
- "Well they are not tipical artist but ai / I know he only did press a button but he…" (ytc_UgwoH7I1G…)
- "@conradodiaz6647 Thank you for your hilarious comment! Is this what A.I. is? Wel…" (ytr_Ugw00RjeB…)
- "Anyone would be able to tell the difference between her and a real person co…" (ytc_UgxT6wT5r…)
- "Medical AIs are total bs nightmares, cost driven results. BUT just wait till AGI…" (ytc_UgxB35mhJ…)
- "I wish there was a triple thumbs up for this video, and I think everyone on eart…" (ytc_UgyzW4uMr…)
- "AI will simply close the gap between google sheets/excel and real programming. I…" (ytc_Ugy97vQyb…)
- "It’s an old technology. They have been using facial recognition at the US airpor…" (ytc_Ugz_PC4oM…)
## Comment

> Why wouldn’t AI decide to upgrade and advance humans for its own mutual benefit? Why would AI decide to destroy humanity? It’s own creator? Makes no sense, would take too much energy, time and resources. When working with humans for its benefit

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2024-05-20T19:2… |
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
{"id":"ytc_Ugzpqw-MkW9vRFNlTCR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwSbf2HgpSbOYMtt_l4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyCDoQMYKka_IG92ZF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwKoOSIcuCr0JraFO94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz6gkVNWpIIDr97aeJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyTDzQd1R0JdrX_IPp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxcmbXoEVyjvwuOxx14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzPlxYKF4517wzh4Bp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwU-MTSnvzm81EwLgF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw2MmeHZ5a9aAvfvUV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
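The raw response is a JSON array with one object per comment, keyed by `id`, so a by-ID lookup reduces to parsing the array and indexing it. A minimal Python sketch (the two entries are copied verbatim from the response above):

```python
import json

# Two entries copied verbatim from the raw batch response above.
raw = """[
  {"id":"ytc_Ugzpqw-MkW9vRFNlTCR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz6gkVNWpIIDr97aeJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]"""

# Index the codings by comment ID for direct lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Fetch the coding for the comment shown in the Coding Result table.
record = codings["ytc_Ugz6gkVNWpIIDr97aeJ4AaABAg"]
print(record["responsibility"], record["reasoning"], record["policy"], record["emotion"])
# → ai_itself consequentialist none approval
```

The printed dimensions match the Coding Result table above, which is exactly the correspondence the inspector surfaces for each coded comment.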