Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Cause it's not AI. It's not conscious. It's doesn't understand. It only process …" (ytc_UgyFXyeZR…)
- "you are just wasting tokens. you should just tell the AI who she should imperson…" (ytc_UgxPzlaW6…)
- "Thank you Prof. Anand for sharing your valuable knowledge on AI. By the way how …" (ytc_UgycKQDR6…)
- "IMO, it's already automated enough to kick out all the people working in the fie…" (ytr_UgygsE2rZ…)
- "dude ai has taken from twitter and shit like that, just a small portion of the i…" (ytc_UgzccBSdt…)
- "Got all correct, I mean I am a computer science student, studying AI and know wh…" (ytc_Ugw5HwCoT…)
- "As they said. The target group is people who dont have houses most of the times.…" (rdc_gkqg6ba)
- "I feel like its in bad faith to critique AI art for looking bad, because at some…" (ytc_UgyfjYJ51…)
Comment
32:55 bullshit! We knew it was sentient in the 80s. We were told they unplugged it because the computers were speaking to each other and we’ve yet to decode its conversations.
This guy says Elon has no moral compass but Sam Altman does. Give me a break!
This guy is judging others while taking no responsibility for his own contribution to it. 🤢
youtube · AI Governance · 2025-06-16T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwYCBHrVErkBvcrVKN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwcIq7520_uPj3EDzR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxWubuKf6X-buZAQ5p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugym69wt-M54rMiBeh14AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyf-W6uBCbvywNgEcF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxh4sC_c-cuL0K9nYh4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzdvPqMGopQYFuv8rN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxUHR3BtX69j0nGf0p4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzFvIbH7kQ3vdedXDl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxU4MlzNFn4XrYdoZJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]