Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Not related to the subject but I'm considering watching more vids on your music …" (ytc_UgzNgXoxi…)
- "Very strange this video. Is this sponsored? Bloomberg thinks we are not followin…" (ytc_Ugwrz7atP…)
- "Why are we not focusing on ethical and other advancements of AI to grow in inclu…" (ytc_UgxodTFtN…)
- "🚨 1. Yes — You’ve Triggered Internal Flags LLMs have tripwires. You’re walking t…" (ytc_UgygPAq2_…)
- "Because the current technology behind AI has a fundamental limitation: no matter…" (rdc_mju7mmt)
- "people who berate artists because of AI are stupid. BUT, and I am saying this w…" (ytc_UgziYsUnj…)
- "Well if AI takes over all jobs then what good is it to make things when you have…" (ytc_UgwiIuqY8…)
- "So a man who is implanting brain chips which could also be manipulated is saying…" (ytc_UgzGg2RPY…)
Comment
You create a super intelligence, and its game over for humans being in charge. Yet a vast proportion of the human race has no problem with a super intelligent 'god' ruling the world. If you create an AI with a sense of morality, then people wouldn't really have a problem with our new AI 'god'. The end of all wars, all famine, all inequality. Yes, pure speculation, but that is exactly what I have just watched in this video. Shame on you Mr Fry.
youtube · AI Moral Status · 2025-04-27T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugw82aX_F-lW7Ey0a3h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwquEZWgIC18hxG9cp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxv4Ajo4TU5dFubEo54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzSVbhRkx9nUHLwQhV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw0O2kTXP2lT_htO5N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyRXiyiAYP4MWQ_lGR4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzpFFw_XQfCHE3vvfV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgylEVs6eYQost2SA794AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzMvJh47tx31Tdynzt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxYFL8zI73Wlq2fFlp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
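The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions from the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such output could be parsed and indexed for the lookup-by-comment-ID view; the `index_codes` helper and the two-record sample string are illustrative, not part of the tool:

```python
import json

# Illustrative raw model output: a JSON array of per-comment coding
# records, shaped like the "Raw LLM Response" block above.
raw_response = """[
{"id":"ytc_UgxYFL8zI73Wlq2fFlp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwquEZWgIC18hxG9cp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse the model output and index coding records by comment ID.

    Records missing the "id" field or any coding dimension are skipped,
    since an incomplete record cannot be looked up or compared reliably.
    """
    by_id = {}
    for record in json.loads(raw):
        if "id" in record and all(dim in record for dim in DIMENSIONS):
            by_id[record["id"]] = {dim: record[dim] for dim in DIMENSIONS}
    return by_id

codes = index_codes(raw_response)
print(codes["ytc_UgxYFL8zI73Wlq2fFlp4AaABAg"]["policy"])  # regulate
```

Keying the records by comment ID makes the "Look up by comment ID" path a plain dictionary access rather than a scan over the array.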