Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID, or inspect one of the random samples:
- ytr_UgyKqzwW8…: It's not snobbish to dislike ai art and people who claim they're "AI artists," y…
- ytc_Ugw_9E27p…: "ai art is the best art from there is it makes it so anyone can become and artis…
- rdc_jg94tmp: **Here's the best I could do. Took some back and forth but I finally got an insu…
- ytr_UgzKM9L4l…: Nobody is saying this AI is racist that would be ridiculous.. they are saying it…
- ytc_UgzuEc6JW…: I love how they say AI is coming into school no matter what…. Actually no we dec…
- ytc_UgzmGsqMo…: EVERYONE talks about AI as though it's perfect. It is NOT. It may be one day, bu…
- rdc_cxnh92y: UCL Philosophy hype! Fond memories of Kalderon's GoM course years ago. I flunked…
- ytc_Ugw0-nApv…: Consider this... As part of a more comprehensive UBI landscape, what if AI “wor…
Comment
One thing that struck me is at around the 29 minute mark he seemed to absolve himself of any moral responsibility. Listening further on I realised he hadn't but it really hit me and got me thinking. Is lack of emotional intelligence/moral responsibility intrinsic in AI? Since, according to him he won't be needed, and AI would be having this conversation with Steven (if it would ever bother to spend time conversing with a human). I know some would say many people in power are morally bankrupt but a world with AI running things (which is essentially what he's speaking of) those morals would never have been there in the first place.
youtube · AI Governance · 2025-12-03T13:2…
Coding Result
| Field | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
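The four coding dimensions can be written down as a small record type. The sketch below is illustrative: the value sets are inferred only from the sample batch in the raw response that follows, the actual codebook may define additional values, and `CodedComment` is a hypothetical name rather than part of the app.

```python
from dataclasses import dataclass

# Value sets observed in the sample batch below; the real codebook may be larger.
RESPONSIBILITY = {"developer", "company", "ai_itself", "distributed", "none"}
REASONING = {"deontological", "consequentialist", "virtue", "mixed"}
POLICY = {"regulate", "ban", "liability", "industry_self", "none"}
EMOTION = {"fear", "outrage", "approval", "resignation", "indifference", "mixed"}

@dataclass
class CodedComment:
    """One comment's codes across the four dimensions shown in the table above."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def __post_init__(self) -> None:
        # Fail fast if the model emitted a value outside the known vocabulary.
        assert self.responsibility in RESPONSIBILITY, self.responsibility
        assert self.reasoning in REASONING, self.reasoning
        assert self.policy in POLICY, self.policy
        assert self.emotion in EMOTION, self.emotion
```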
Raw LLM Response
```json
[
{"id":"ytc_UgzI4XAeJ73z7Tucfnl4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzhP1PwZQQC60suf3p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzvAtOa8RWHSwNPWH94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyB9BEtV4eLAtOlVU94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx9MTvW-QJoAXCbN_p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx3HRE-jtU7qNoxusl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzTEEBputmyrobOynl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwhtMj68xVfuqCUn8h4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx6mHegomCDazz7v1t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugwt4c6Jx4f2tu77v3x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
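Because the raw response is a JSON array of per-comment codes, recovering the coding for any one comment is just a matter of indexing the batch by `id`. A minimal sketch of that lookup, assuming the response text is available as a string (the variable names here are illustrative, not part of the app):

```python
import json

# Two rows copied verbatim from the model output above.
raw_response = """
[
  {"id": "ytc_Ugx6mHegomCDazz7v1t4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugwt4c6Jx4f2tu77v3x4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
"""

# Index the batch by comment ID so any coded comment can be looked up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_Ugx6mHegomCDazz7v1t4AaABAg"]
print(code["responsibility"], code["reasoning"], code["policy"], code["emotion"])
# -> developer deontological liability mixed
```

Note that this entry's values match the Coding Result table above, which suggests it is the row for the displayed comment.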