Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "I[m a graphic designer with an art degree and 27 years of experience. Get out no…" (ytc_UgzYipn_V…)
- "Pay attention what I telling you maniatic COWARD, look a professional sport boxe…" (ytc_UgyBVEyA-…)
- "It's probably some guy in India remotely driving these Waymo cars, and they driv…" (ytc_UgyqmM8kw…)
- "Ai is just a robot that makes images so like we need to stop ai from taking over…" (ytc_Ugzpg6F7H…)
- "AI can perform speech recognition on an 11-minute long TED lecture and an NLP mo…" (ytr_UgwuDNZI-…)
- "@AinAemAet Wow, way to entirely miss the point. Charlie's point was that AI gen…" (ytr_UgwXqd1aJ…)
- "To simplify point 2, the difference is intent. You, as the artist, make every de…" (ytc_UgwDwuPSc…)
- "AI is not artificial. Just gather information, and amateur intelligence used the…" (ytc_Ugx2OmZFj…)
Comment
There is this book called Homo Deus by Yuval Harari and one of its ideas is about how companies make their profit by adapting their products to the customer and not the customer to the product, based on the "customer is always right" thing. The author gave an example of a car built by the greatest artists, with the best technology, and the best materials, and was promoted as the best car in the world and the only car people would ever need, yet the customers wouldn't buy it. Why? Because the customer is always right. Same with AI. Despite all the promotion for AI, its services are still not demanded, thus aren't bought. Because the customer is always right and to him AI is not something he would need to buy.
Platform: youtube | Video: Viral AI Reaction | Posted: 2025-03-31T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
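The four dimensions above form a small coding schema. A minimal validation sketch for one record, assuming the allowed values are exactly those that appear in this page's raw LLM response (the actual codebook may define additional categories):

```python
# Allowed values per dimension, inferred from the sample output on this page;
# the real codebook may permit more categories than are observed here.
SCHEMA = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "mixed", "outrage", "resignation"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

# The coding result shown in the table above.
coding = {"responsibility": "company", "reasoning": "consequentialist",
          "policy": "unclear", "emotion": "mixed"}
print(validate(coding))  # []
```

A check like this can catch malformed model output (a missing key or an out-of-vocabulary label) before the coding is stored.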
Raw LLM Response
```json
[
  {"id": "ytc_UgxzrWU5fMjh4p408y94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy5TxF8SBzWzVfyDD14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw3OTo-wg9o--bUwqh4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx7khJgTnXmR8vz9uJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwm_71BuiWQ1DFUgBR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugxj_L5CnFveZYiMFO94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwbtgzydN-dPYskdlV4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxTt6h83ahIVsF_hrp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyNuMky569yD90yk0d4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwMyyJirJnyyjHy3Op4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "outrage"}
]
```
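Because the model returns one JSON object per comment, looking up a coding by comment ID reduces to parsing the array and indexing it. A minimal sketch, using two entries copied verbatim from the raw response above:

```python
import json

# A subset of the raw LLM response shown above (two entries, copied verbatim).
raw = """[
  {"id":"ytc_Ugwm_71BuiWQ1DFUgBR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxzrWU5fMjh4p408y94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]"""

# Index each coding record by its comment ID for constant-time lookup.
codings = {rec["id"]: rec for rec in json.loads(raw)}

record = codings["ytc_Ugwm_71BuiWQ1DFUgBR4AaABAg"]
print(record["responsibility"])  # company
print(record["emotion"])         # mixed
```

The same pattern extends to the full batch: parse once, index by `id`, then resolve any coded comment on the page by its ID.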