Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgzWfXWpw…`: I wish Ezra had listened to Jon Stewards recent podcast episode about AI. They c…
- `ytc_UgxFw-NtI…`: Google employees are suffering AND Google is also suffering and AI is getting ou…
- `ytc_UgxyXPeCH…`: AI would be used best to replace manager and administrators, you know, repetitiv…
- `ytc_UgwjYYyKz…`: On the 'time to do other things.' When it comes to art, if you don't actually wa…
- `ytc_UgzB0q7z4…`: Wait chatgpt Said they won't use the Things i write There and and it will Not BE…
- `ytr_UgwWYURTG…`: @kidz4p509 I mean yea, what’s called “AI” isn’t really truly thinking for itself…
- `ytc_UgwCXv05F…`: It gets worse...I suffered a real rape that was deepfaked to look consensual and…
- `ytc_UgxMXxstG…`: (9:45) Where are you going to learn how to use AI tools if you just graduated fr…
Comment
The weakest idea is that government will get its tax revenue from AI. AI will lobby for less taxes. AI firms will eventually run out of consumers, yet remaining shareholders will still demand profit. With no people to buy products, non-AI firms will have no use for AI to maximize their output. The AI-dependent firms will be forced to just sell to each other in a circle jerk (openAI will buy Nvidia chips in return for Nvidia purchase of its stock, e.g.). The system will only persist until some entity demands real commodity value rather than just stock or a promise of investment
youtube · Viral AI Reaction · 2025-12-01T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
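The four dimensions in the table can be sanity-checked against the label sets that appear in this sample's raw responses. A minimal sketch follows; the allowed sets below are inferred from this one sample, not a documented schema, and `invalid_fields` is an illustrative helper, not part of the tool.

```python
# Label sets inferred from the sample output; not a documented schema.
ALLOWED = {
    "responsibility": {"company", "government", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def invalid_fields(coding: dict) -> list[str]:
    """Return the dimension names whose value falls outside the inferred set."""
    return [dim for dim, ok in ALLOWED.items() if coding.get(dim) not in ok]

# The coding shown in the table above.
row = {"responsibility": "company", "reasoning": "consequentialist",
      "policy": "liability", "emotion": "fear"}
print(invalid_fields(row))  # []
```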
Raw LLM Response
[
{"id":"ytc_UgwxNpKmktLv4Yj3MAh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyF6oAE-3fTtCQm_md4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwPXN-onL25BwFLzG14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxhzz4LmdthQD5octF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwBzBTRwOZ6d6uMknl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxKoLT4jckLiWWxLad4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxZGok5KTbNtF_YaQ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzucU6BmZyYlp9mDZJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwcKoidNS_fMXYkke94AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzGkSZs2iXEOruY1W54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
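The lookup-by-comment-ID behavior described above can be sketched as follows, assuming the raw response is a valid JSON array in the shape shown; the single-entry sample reuses one row from the array above.

```python
import json

# Raw model response text (one entry from the array above, for brevity).
raw = '''[
  {"id": "ytc_UgzucU6BmZyYlp9mDZJ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]'''

# Parse the array and index each coding by comment ID, so the exact
# model output for any coded comment can be looked up directly.
codings = {entry["id"]: entry for entry in json.loads(raw)}

coding = codings["ytc_UgzucU6BmZyYlp9mDZJ4AaABAg"]
print(coding["policy"], coding["emotion"])  # liability fear
```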