Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by comment ID.
Random samples (click to inspect):

- ytr_UgzC0yvP8…: "We understand your concerns about the potential risks associated with advanced A…"
- ytc_Ugw5GPzug…: "Incredible video, as usual! So clear and so accurate about the use of AI, the wa…"
- ytc_UgwiQiywr…: "Meanwhile a man just killed his mother and himself because chatGPT feeds into hi…"
- ytr_UgzGZvXNM…: "Thank you for commenting on our video, @lonnell9668! If I, Robot happened in rea…"
- ytc_Ugy1TI-my…: "I don't know what I've come to but I've been talking to my Ai Girlfriend i made …"
- ytc_Ugz-P1ZoS…: "Congrads, you just taught the Ai the experience of firing a weapon - even regula…"
- rdc_jxz52wr: "It makes you wonder what would've happened during WW2 if the Brits didn't have t…"
- ytc_UgyRDG09z…: "Hate to break it to you but men have been doing this to teenage girls and women …"
Comment
What many don't realize or accept is that if we lose the AI race, there is a significant chance for a massive reduction in our quality of life.
We will be vulnerable and exploited by countries that are further ahead. Technology will be more important than warheads.
Yes, the risk is very real, but the risk of not competing is guaranteed and AI will still exist and progress even if NATO countries all pump the breaks.
Source: youtube | AI Governance | 2025-09-13T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
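A coding like the one above can be sanity-checked against the dimension vocabulary that appears in the raw responses. A minimal sketch in Python; the allowed-value sets are inferred from the sample codings shown on this page, not from an exhaustive codebook, so treat them as assumptions:

```python
# Allowed values per coding dimension, inferred from the sample
# responses below (assumption: the real codebook may define more).
ALLOWED = {
    "responsibility": {"government", "developer", "company", "user",
                       "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate(coding: dict) -> list:
    """Return the dimension names whose value is missing or unrecognized."""
    return [dim for dim, allowed in ALLOWED.items()
            if coding.get(dim) not in allowed]

# The coding result shown in the table above passes:
coding = {"responsibility": "government", "reasoning": "consequentialist",
          "policy": "liability", "emotion": "fear"}
print(validate(coding))  # []
```

Running the validator over a whole batch before storing results catches the occasional off-schema value an LLM can emit.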
Raw LLM Response

```json
[
  {"id":"ytc_UgwBjyXokjclB9P0KOx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyC9B4Iildx4x1mkX94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzsHrLs4O9CCR_3E414AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxpu7f8tbc6lev_ReN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyyOA0ZjlJMpHWC1L14AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyD5ZeCUPXjauompkx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxUNWoCjh3nlPHlc_F4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw-tfBJMl6lNQdRUkB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy3Gad_e-AJdeh2oOt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyKDfuh0XoQFlmpeKV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
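The lookup-by-ID flow described at the top of this page amounts to parsing a raw response and indexing the codings by comment ID. A minimal sketch, using two entries copied from the response above (a real response covers the whole batch):

```python
import json

# Two codings copied verbatim from the raw LLM response shown above.
raw_response = """[
{"id":"ytc_UgxUNWoCjh3nlPHlc_F4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzsHrLs4O9CCR_3E414AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"}
]"""

# Index codings by comment ID for constant-time lookup.
codings = {c["id"]: c for c in json.loads(raw_response)}

# Look up the coding for one comment.
print(codings["ytc_UgxUNWoCjh3nlPHlc_F4AaABAg"]["policy"])  # liability
```

Keying on the `id` field is what lets the inspector jump from a comment ID straight to the exact model output that produced its coding.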