Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "@logickedmazimoon6001 Maybe but perhaps the video wan't that deep to begin wtih?…" (ytr_Ugzf-Bp5u…)
- "Universal basic income is just enough to survive. Who's going to pay the taxes t…" (ytc_UgxoU8fbw…)
- "I actually don't care about how AI trained, how much carbon they produced, how m…" (ytc_UgxQMMZcN…)
- "Microsoft controlling OpenAI is literally the worst-possible nightmare-scenario …" (ytc_UgzOB1tJg…)
- "AI does not even understand what 2 + 2 is. It finds best / most common approxima…" (ytc_UgxmdDPrX…)
- "How much do you think about the cost of face recognition?can you still make more…" (ytc_UgxX9DGpt…)
- "Humans will always beat ai, because without us there wouldn't be anything like a…" (ytc_UgzsPnIyy…)
- "This is absolutely the biggest train wreck headed at us full speed ahead - i und…" (ytc_Ugz0ESt-w…)
Comment
Why would you use the biggest monster of the 19th century as a comparison to use, that was first, second Stuart says "we have to much compute pwr for AI" so how can you say, that when ol'Elon is building "Colossus 2" which will build x1000 times the AI agents just to achieve AGI.
youtube · AI Governance · 2025-12-05T04:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzqRLfPyNq_rtB2UJR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzJ--esxeq3hD08ZXJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxpb9AAhKpa2-E4sRd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz-OQIZ0xnSA-LGtxR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwBfpLGYzduObO0BGV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxlAD_sH5TSl9ZY5DN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzMUYwFj2EgAxV-FLl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxJB9BECY7Y3s3H2Xd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzj3RDhtJCDiXSH8z14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzDXc3czTi_lkXQVjx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"outrage"}
]
```
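The raw response above is a JSON array of per-comment coding records, one object per comment ID, with one value for each of the four dimensions shown in the coding-result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and indexed by comment ID follows; the set of allowed category values is inferred only from the samples visible on this page, so the real codebook may define additional categories.

```python
import json

# Category values inferred from the sample records above — the actual
# codebook may include more (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer",
                       "government", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability", "ban"},
    "emotion": {"outrage", "indifference", "mixed", "fear"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codings by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the inferred category sets.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim}: {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Look up one coded comment by ID (mirrors the "Look up by comment ID"
# control above), using a single record from the sample batch:
raw = ('[{"id":"ytc_Ugz-OQIZ0xnSA-LGtxR4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
coded = parse_llm_response(raw)
print(coded["ytc_Ugz-OQIZ0xnSA-LGtxR4AaABAg"]["emotion"])  # fear
```

Validating each record at parse time catches malformed model output (a missing dimension or an out-of-vocabulary label) before it reaches the coding-result view.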