Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytr_UgzBOXooJ…: there needs to be a term like 'dunning-kruger effect' but for people that think …
- ytc_UgxJKM23p…: Every time a self driving car Is involved in any fault the CEO and/or stockholde…
- ytr_Ugw9JwkKs…: If ChatGPT stopped development exactly where it is right now, it would already b…
- ytc_UgzKEJ__V…: Boycott any company employing AI technology and laid off employees. Thats the in…
- ytr_Ugw8zJpIj…: You are right but for now it reminds me of "Alexa, make a picture I can sell for…
- ytc_UgxTwNLtg…: The explanation of AI is a long-winded way of explaining that 1 group wants to e…
- ytr_UgzYYKkeq…: manu ochenta A.I. is building itself right now the smartest scientist on the pla…
- ytc_Ugxg2b350…: If we all rapidly lose our jobs to AI then nobody, including tech investors, is …
Comment

> The problem with AI is it doesn't have any mercy, wisdom, compassion, love, or empathy. It's also biased because of who initially programmed it, and currently manages it.

youtube · AI Governance · 2025-06-23T13:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugxmfr-yt6SnOy5g-gR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzBB4jz5PeLKVkCZfV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwFBXboTnYCB54gjPZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyCDDXy2Au46mlEQkx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxEzmSkmWLZ0qNo6jp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxENBZs2kOTR2jtBIB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwZ2u5-_cFaLadGjM14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw9rtdwhxG7VrkvnvB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwHLshQ4nwE6xpeaAl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzSCnWx1Tr3Sjiab_54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
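The raw response above is a JSON array with one object per coded comment, keyed by `id` and carrying the four coding dimensions from the table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch could be parsed and validated; the `index_by_id` helper and the shortened two-row sample are illustrative, not part of the tool:

```python
import json

# Shortened two-row sample in the same shape as the raw response above.
raw_response = """
[
 {"id":"ytc_UgxENBZs2kOTR2jtBIB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_Ugw9rtdwhxG7VrkvnvB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse a batch coding response and index the rows by comment ID,
    rejecting any row that is missing a coding dimension."""
    coded = {}
    for row in json.loads(raw):
        missing = DIMENSIONS - row.keys()
        if missing:
            raise ValueError(f"{row.get('id')}: missing dimensions {missing}")
        coded[row["id"]] = {k: row[k] for k in DIMENSIONS}
    return coded

coded = index_by_id(raw_response)
print(coded["ytc_UgxENBZs2kOTR2jtBIB4AaABAg"]["policy"])  # liability
```

Indexing by ID is what makes the per-comment lookup above possible: given a comment ID, the viewer can pull that comment's coded dimensions straight out of the batch response.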