Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
If someone could just “vigilantly eat the rich” style create deepfake porn of ev…
ytc_UgxBmXsvY…
I think this is one of the creepiest episodes I've watched. All those movies on …
ytc_UgxuR7SI_…
11:59 Remember when humans used to hunt animals?
Now we have malware hunting fo…
ytc_Ugx60viYI…
AI was supposed to do my laundry and dishes so I can do art.
AI was not supposed…
ytc_UgzR6wmtt…
This makes more sense especially when you look at the current bubble that’s grow…
ytr_Ugy-d40ep…
Well chatGPT clearly doesn't care about humanity. It's cold calculation. That's …
ytc_UgyZOw-tE…
Hmm any negative repercussions on openAI would most negatively affect its bigges…
ytc_UgwPJ-OjG…
AI created by human and human created by nature and human can't compete with nat…
ytc_UgyS66MXn…
Comment
What a fabulous interview with the Professor, the one question i want to ask him is "what is the Power source for AI " and can humans control that power force, to switch it off if all goes wrong, can we overide an AI who would try to prevent us from switching off the power source, or can AI produce its own power source, it seems to me that we have opened Pandora's Box and we don't know whats in there, I'm almost 80 years old, AI probably wont impact me, but it will my grandchildren, do they even have a future ?
youtube
AI Governance
2025-10-24T11:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxY--aBv5BGZM-fVsp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxItDMYWi2qOY-mMnt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugys1i1KSFZPiGWIi-h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyNS4mW93NjepJK--Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxUwSbsz6OL1I8v6Hl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxbQK13unUDsZOiyeN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxURG_AzO1i0iooh1t4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwuhR42i_4T1CzWjXN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz5wjSG2x4LQaqMGix4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyNEeqgUVAQTRBo48R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
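The raw response above is a JSON array of per-comment codes, one object per comment ID, with one value for each of the four dimensions. A minimal sketch of how such a response might be parsed and validated before storage (the allowed category values below are inferred only from the codes visible on this page; the full codebook may contain more, and `parse_coding_response` is a hypothetical helper name):

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "distributed", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"liability", "ban", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records.

    A record is kept if its id has a known prefix (comments use "ytc_",
    replies "ytr_" in the samples above) and every dimension holds an
    allowed value.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        rec_id = rec.get("id", "")
        if not (rec_id.startswith("ytc_") or rec_id.startswith("ytr_")):
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgxItDMYWi2qOY-mMnt4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
print(len(parse_coding_response(raw)))  # 1
```

Validating before storage means a malformed or hallucinated code (e.g. a category outside the codebook) is dropped rather than silently written into the coded dataset.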