Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytr_UgzuYp57A…: "Absolutely! Tech billionaires own the government. They bought the election to g…"
- ytr_UgyQOpISH…: "Advanced would be Harvard CS50 for example? Or do you have a specific University…"
- ytc_UgxwQPz03…: "sine and soul are two different words, however they're the same string length. S…"
- ytc_UgxVxtMbh…: "Resilience, adaptability, high rate of learning. Great advice. Just advise kids …"
- ytc_Ugygm5dcX…: "I used copilot in a .env... That had the password I use everywhere. Is that te s…"
- ytc_Ugyr6akwC…: "How many people of 8 billion citizens of this planet will be able and capable to…"
- ytc_UgyM-V7Km…: "So let me get this straight. Robots will do everything? So how will companies ma…"
- ytc_Ugz5XYNIU…: "As a teacher myself, i am quite sickened by this. God help the kid who daydream…"
Comment

Apparently, no one seems to realize that if the entire economy were automated, all companies - including AI firms - would go bankrupt, since they could no longer make a profit, and everything would end up as public property. It would be like a kind of robotic communism.

- Platform: youtube
- Topic: AI Governance
- Timestamp: 2025-12-05T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyUxRpcW3m4Oa8MQOt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxNgHvetyJmP3wNpPp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyMDC8m8jDdWEVovjR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz4lg1qUiS4XAVXo-t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxRG72du5S7mL9FC2B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyQVBtuL3R9eNXG3yt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyq8p95FKR0z5RJl5d4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxnrwHG1jZaYPrmbth4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzdTgHylS6wkMQUOsd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyK9fPBALTcFds3HAR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
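A raw response in this shape can be parsed and indexed by comment ID for lookup, as the page's "Look up by comment ID" feature suggests. The sketch below is a minimal, hypothetical example: the helper name `index_codings`, the `DIMENSIONS` tuple, and the two-record sample string are illustrative assumptions, not part of this tool's actual implementation.

```python
import json

# Illustrative sample in the same shape as the raw LLM response above
# (hypothetical IDs and values, not real data from this page).
RAW_RESPONSE = """
[
  {"id": "ytc_example1", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_example2", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw coding response and return {comment_id: {dimension: value}}."""
    records = json.loads(raw)
    index = {}
    for rec in records:
        # Skip malformed records rather than failing the whole batch.
        if "id" not in rec or not all(d in rec for d in DIMENSIONS):
            continue
        index[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return index

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_example1"]["emotion"])  # → approval
```

Indexing by ID rather than list position keeps the lookup robust when the model returns records in a different order than the comments were submitted.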