Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
the ai read the green and commo manifesto and the bible, no wonder it hates us. …
ytc_Ugx6isM_B…
Ai ,if done correctly is a huge step forward for humanity. It does work that fe…
ytc_Ugzvb8Fbl…
Ok but how do we fix it AI doesn’t understand race or even know what it is it’s …
ytc_UgxK2bjj3…
All this proved to me is that Legal eagle is incapable of talking about a Musk o…
ytc_Ugz1l6tgW…
I mean no one's gonna stop developing AI even if some stop others will pursue it…
ytc_Ugy29oz7T…
AI might have the capabilities to replace a lot of jobs for fraction of the cost…
ytc_UgzAUD39I…
My company keeps telling me that we're using AI to "augment" workers and not rep…
ytc_UgxpOpLG-…
Would companies be more successful if AI just did all the work and didn't requir…
ytc_UgyvP7FKC…
Comment
What is he even saying ? A lot of nothing.
It kind of sounds like:
“The other companies are bad, and they’re a.i are dangerous- let me make another “safe” one, trust me, i care about you”
isn’t that what they said about chat gpt at the beginning? And every company ever?
The way he said “they use fear, but not me” fallowed by- “think of your children” and talking about apocalypse almost made me laugh.
Another man saying a lot of pointless words and using the same tactics as all of them.
youtube
AI Responsibility
2025-07-06T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwKwYaEtyhoNxiE2CF4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzfhG6NwNkQSGbPKlh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwU59UlteFqjGqfkDB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz6IG2cwYB7_cj4ieZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzjokweHPVb3VoUIIN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzkOZ9gP_HOkPWjKnJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugymv6MtTbDOOkEH0qF4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyiB5OAnSUVxckdrqV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzqE-NQYeV3u8KQ_xx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwVitd1fr_9U2-h3194AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
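The raw response above is a JSON array in which every record carries the same four coding dimensions shown in the result table. A minimal sketch of how such output could be parsed and sanity-checked, assuming the allowed category sets inferred from the sampled values (the full codebook may contain more categories):

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# This is an assumption; the actual codebook may define additional codes.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"liability", "regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and flag any record with an out-of-vocabulary value."""
    problems = []
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                problems.append(
                    {"id": rec.get("id"), "dimension": dim, "value": rec.get(dim)}
                )
    return problems
```

`validate_codings(raw_response)` returns an empty list when every coded value is in vocabulary, which makes it easy to reject or re-prompt on malformed codings before they reach the dashboard.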