Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
I think it could be a good idea to have a law that increase a lot taxes for comp…
ytc_UgwNG40aD…
Oh yeah, the scrum master replaced, the active updated asap memorandum of unders…
ytc_UgwZ__QM1…
I've been saying this exact thing (but less eloquently expressed) every time I c…
ytc_UgzuBMXrF…
I, for one, welcome our new AI overlords. I've seen humans in charge, and I'm no…
ytc_UgwlJrae4…
I hope AI burns. I don’t care what it does to the economy or my 401k short term.…
ytc_Ugz2OMkI6…
No seriously 🤔 a robot doesn’t have emotion or feeling wow who would’ve guessed!…
ytr_Ugx6S0NND…
Even if it looks pretty, why should I care about a piece of art you didn’t even …
ytc_UgxBK4VOd…
You can also wear jackets that reflect light to help save your life, or add ligh…
ytc_Ugzrf5MMF…
Comment
Interestingly many supporters of UBI (universal basic income) are in Silicone Valley and other multinational companies that minimise tax by all means possible. I suspect that Amazon, Netflix, Google, Msoft, etc. would welcome the direct transfer of our UBI to them so we could order all our services and goods from them directly. This would be the same when large local factories during the Industrial Revolution had village shops where their employees could hold and order against their accounts. There is little discussion how the winners in the new AI led technical revolution will be compelled to fairly share the gains that will support and provide a decent existence for 8 Bn people.
youtube
AI Governance
2025-06-29T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwLTMYSCHPmU2Mg20t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgyMIr0x_B2Tdl9tekB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxMu8bfoFVecQJl_Jp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwitozW5xnMnrNG0Bx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyvAvEyUwyxRbgzym94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxYGzoQZc0ncmX7LeJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyydyrWx2hbU-FUi-N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzONAYYbCckqyPyFvN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxzjXXFqrPjwD059GV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyO9lP79FJPWtp28WV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
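A raw response like the one above is a JSON array with one coding object per comment, each carrying the four dimensions shown in the result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID (the helper name `index_codings` and the two abbreviated sample entries are illustrative, not part of the actual pipeline):

```python
import json

# Illustrative raw LLM response: same shape as the array above,
# trimmed to two entries for brevity.
raw_response = """
[
  {"id": "ytc_UgyvAvEyUwyxRbgzym94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgwLTMYSCHPmU2Mg20t4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "fear"}
]
"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index codings by comment ID,
    skipping any entry missing an ID or an expected dimension."""
    codings = {}
    for entry in json.loads(raw):
        if "id" in entry and all(dim in entry for dim in DIMENSIONS):
            codings[entry["id"]] = entry
    return codings

by_id = index_codings(raw_response)
print(by_id["ytc_UgyvAvEyUwyxRbgzym94AaABAg"]["policy"])  # liability
```

Indexing by ID this way also makes it easy to detect comments the model silently dropped from its response: any submitted ID absent from the index was left uncoded.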