Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "There is no way that a robot could deal with senile person having a CT scan ene…" (ytc_Ugw8LRrsO…)
- "The answer to Ai LMS is to structure its core to be God fearing and thus wise. I…" (ytc_UgwnXRT2u…)
- "That's largely true for manufactured goods, but wide swathes of the economy stil…" (ytr_UgzH_yHtU…)
- "legal and ethical iffiness aside, a lot of the people using hastags like Support…" (ytc_Ugw2tMf6w…)
- "In my opinion using technologies like CGI , VFX & AI in the movie industry is ve…" (ytc_Ugzm9vUQ9…)
- "I find it very evil how willing these companies are to replace people. Why do th…" (ytc_UgwEYNH4r…)
- "I think LLMs will continue to have a place, but when used for things like orders…" (ytc_UgzlTtlIJ…)
- "I strongly believe we use AI in a wrong way. It should be used for search engine…" (ytc_UgxTPLcU9…)
Comment
The way I think AI would control people is by taking over their money. Entirely through the web.
Imagine you get an email from a stranger. They have some confidential information on you. They threaten to publish this data on the web unless you pay them a LOT of money. Money it knows you don't have or can't pay.
It then gives you an alternative. Instead, if you do something it wants, you'll be spared the humiliation or even be paid for it.
Slowly, step by step, it could control people to do what it wants.
youtube · AI Governance · 2025-06-16T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzDiZU493yEun7ATSB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzIAAizQJRZBPaJIth4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzekWLeHeiRbQwqyTN4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzjCy_k7-vjywMvCp14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxatEKB_4tImoezFsp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyowGAVf4v7z_9d6cV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgymaSC8979G1MjsnGB4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyIvnGaE9CrNLLshl94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwFQh-P2b2k3VIHIJF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyNCgv5_tk1CMZER8R4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}
]
```
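The raw batch response is a JSON array with one record per comment, keyed by comment ID. Below is a minimal sketch of how such a response could be parsed and sanity-checked before being displayed as a coding result; `parse_batch` is a hypothetical helper (not part of the tool), and the `ALLOWED` value sets are inferred from the sample output above rather than from the full codebook.

```python
import json

# Dimension values observed in the sample response above.
# Assumption: the real codebook may define additional codes.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "industry_self", "regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index coded dimensions by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec.pop("id")
        # Reject values outside the known coding scheme so malformed
        # model output is caught before it reaches the result table.
        for dim, value in rec.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = rec
    return coded

raw = ('[{"id":"ytc_UgzDiZU493yEun7ATSB4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
coded = parse_batch(raw)
```

Looking up a coded comment by ID then reduces to a dict access, e.g. `coded["ytc_UgzDiZU493yEun7ATSB4AaABAg"]["emotion"]`.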