Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up with its comment ID.
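The lookup itself is simple once the responses are on disk. A minimal sketch in Python, assuming each raw LLM response is saved as a flat JSON array of per-comment objects (the file name `raw_responses.json` and that layout are assumptions, not the tool's actual storage):

```python
import json

def find_raw_coding(comment_id: str, path: str = "raw_responses.json") -> dict | None:
    """Return the raw coding object for one comment ID, or None if absent.

    Assumes the store is a single JSON array of objects, each carrying an
    "id" field -- the same shape as the raw response shown below.
    """
    with open(path, encoding="utf-8") as f:
        entries = json.load(f)
    return next((e for e in entries if e.get("id") == comment_id), None)

# Example: pull the coding for the first entry in the sample response below.
row = find_raw_coding("ytc_Ugy5Y1md8zBTQDShez94AaABAg")
if row is not None:
    print(row["responsibility"], row["emotion"])  # -> distributed fear
```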
Random samples (click any to inspect):

- "@Tate_THG Not necessarily. If proper legal protections for artists were passed, t…" (ytr_UgxfHvxqa…)
- "I quit going to arena concerts because ticket master has gotten way too greedy w…" (ytr_Ugyj9zSPr…)
- "But did you not say in one of the interviews the challenge with AI in customer c…" (ytc_UgwI2FP5L…)
- "8:00 We have a lot of work to do regarding the catastrophes that AI can cause. F…" (ytc_UgxrS1dtr…)
- "My artist friends are upset over AI, as well as I. I'm a professional Videograph…" (ytc_UgxAv-aaB…)
- "I think it's an extremely interesting concept. However, I don't think it should …" (ytc_UgxA47KsM…)
- "Drugs are much more dangerous than AI and AI is much more dangerous than nukes. …" (ytc_UgyoaZoDa…)
- "Is that the wonder of Ai or a indicator of empty (pop) music culture that we get…" (ytc_UgymkUED4…)
Comment
This is an interesting theory; however the model you have presented does have a few holes. The problem with an ai takeover will become consumption. As in the products a company generates be it physical, service-based or virtual are ultimately designed to be consumed by an end user who tenders currency for them but if the only ppl able to consume these products are a fraction of a fraction of the population then these companies will inevitably consume themselves.
Secondly, there will inevitably be issues with scale. As the demand for ai increases so will it's cost due to availability dictated by the power grid, availability of microprocessors etc..... there will exist a point in which there is no profit in replacing humans with ai.
Not entirely nor as quickly but over time it is certainly possible if not probable.
Source: youtube · Video: Viral AI Reaction · Posted: 2025-12-06T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
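Each dimension takes one label from a small set. A validation sketch in Python, with the label sets inferred solely from the values visible on this page (the real codebook may allow more; treat these sets as assumptions, not the tool's schema):

```python
# Label sets inferred from the values visible in this section; the actual
# codebook may define additional labels.
ALLOWED = {
    "responsibility": {"distributed", "developer", "company", "government",
                       "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "disapproval", "indifference",
                "resignation", "approval"},
}

def validate_coding(row: dict) -> list[str]:
    """Return a list of problems with one coded row; empty means it passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = row.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems
```

Running `validate_coding` over every object in a raw response is a cheap way to catch the model drifting off the codebook before the codings reach analysis.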
Raw LLM Response
```json
[
{"id":"ytc_Ugy5Y1md8zBTQDShez94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwLHeUXnyzg1t_leAd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugytl0F5Sa8J6Shq6ud4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_UgzreNYqFmJ6GNV8d8x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBZtDhqRvHKS8ffuN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxQWtFPcBmC5_IpNXR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxPyLv4b_rfAc-mLwl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyjF5aIOdQLfpDEHc14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwLHdqBDzxjiFaI5HF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-KQ7cTDzDpDrOLQd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
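Raw responses like the one above are stored verbatim, so downstream code has to tolerate output that is not quite valid JSON. A hedged sketch of a tolerant parser, assuming the common failure modes are markdown fences or leading chatter (the function name and the recovery heuristic are assumptions, not part of the tool shown above):

```python
import json
import re

def parse_batch_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response into a list of per-comment codings.

    Models sometimes wrap JSON in markdown fences or prepend commentary,
    so we try a straight parse first, then fall back to the outermost
    [...] span found in the text.
    """
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        match = re.search(r"\[.*\]", raw, flags=re.DOTALL)
        if match is None:
            raise ValueError("no JSON array found in model output")
        return json.loads(match.group(0))
```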