Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- So if we are living in a simulation, then why does it matter if Ai just takes ov… (ytc_UgwpwAxMp…)
- @Yku30 "AI art wont replace the "essence of humanity". You will still be able to… (ytr_Ugw6VyeBN…)
- Ai doesn't exist it's virtualized intelligence it means that they program a mach… (ytr_UgzDC2vTb…)
- Current customer service sucks, talk to automated crap first and it is always a … (ytc_UgxWs5mGN…)
- If Ai was more intelligent than human intelligence, wouldn't it know that it sho… (ytc_UghLJVnLM…)
- At 2:23 your narrator said incorrectly read the 24.1% as 26.1%. I feel like AI w… (ytc_UgxC5JcEL…)
- They apparently haven't heard of the Optimus program at Tesla (which cou… (ytc_UgzzGXsYS…)
- wow amazing one of the best videos on this guy's worst take about ai cgi aka art… (ytc_UgxPvsHwU…)
Comment
So I went and chatted with Bing AI as soon as I finished this video. I started asking about "Sydney" and the listening in on it's developer. After a few questions of that nature, she fucking cut me off and made me change the topic before I could ask another question. That's fucking creepy.....
youtube · AI Governance · 2023-08-12T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwBegLhmG7awG66KBh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxpHkG05xYGJriMSa94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzWcAgg46REpzuIhA54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzjDIRhoIkQxMdgoJ54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw929onJgsKr4K8TRZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugws-VP9eBU6Ed0R_qB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyvG_WZaNsIhvjQ6dF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzg1Qk8K5eEOI7lQsx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxPMjB-m0CJLvgItk14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwiNeezCODNLj4-mdV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
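The raw response above is a JSON array of per-comment codes across the four dimensions shown in the coding-result table. A minimal sketch of how such a response might be parsed and validated before loading it into the table; the allowed value sets are inferred from the responses shown on this page (an assumption, not the tool's actual schema), and `parse_coding_response` is a hypothetical helper, not part of the tool:

```python
import json

# Allowed values per coding dimension, as observed in the responses above.
# ASSUMPTION: the real coding scheme may define additional values.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"unclear", "regulate", "ban", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Drop records missing an id or containing out-of-scheme values.
        if "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}]')
print(parse_coding_response(raw)[0]["responsibility"])  # → ai_itself
```

Validating against a fixed code set catches the common failure mode where the model invents a label outside the scheme, rather than silently storing it.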