Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
A certain amount of moral code plus empathy needs to be written into AI. .. not …
ytc_Ugw0EyTyG…
So many excuses for the Tesla, so much confidence in “AI” …Tesla owners are like…
ytc_UgwkPUAe4…
2 things, #1 if you believe the AI when it tells you this is the code you need t…
ytc_UgxnaPFfG…
Can AI replace crooked and corrupted politicians and businesspeople with fair an…
ytc_UgwASOous…
Best way to beat this is NEVER to use them!
AI is already ruining life.…
ytc_UgzHw1DLH…
There is no economically viable way to run ai language models. As soon as this i…
ytc_UgwSasZ2l…
Assholes get me on here. 3:31 you dumb ass. “What to learn” what job, how about …
ytc_UgxhZCd0r…
He went from ai is a bubble to ai will kill us all after reading one book. I do …
ytc_UgywYasgp…
Comment
If there are no human jobs, social costs will go through the roof, and the only way to pay for that will be to super-tax the AI heavy companies. That could be one thing to hold back the advancement to Super AI in the general space.
A second thought. With no humans, what would Super AI do 🤔
youtube
AI Governance
2025-09-06T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugx_A6m8N3g4r28Jo3x4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzCaYHeNXcBMZB4DOR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzkQawZF_qtl6ImS1Z4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx0-QowgPO4emYQoeh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgyaMNSCxX5z5--azBZ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzWkMgeLJVHduB8Zqx4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgygwZE1vB-ypYX_pyh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw5hlnquJoSEvlaAyR4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzfQ5YzKixyPmj0_NN4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxqr3vDcZqrH2KE93l4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
```
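The raw response above is a JSON array with one coding record per comment, keyed by comment ID. A minimal sketch of the "look up by comment ID" step, assuming the model output parses as plain JSON (the variable names and the single-record string here are illustrative, not the tool's actual code):

```python
import json

# Illustrative excerpt of a raw LLM coding response (one record from the
# array shown above; the real response contains ten such objects).
raw_response = (
    '[{"id":"ytc_UgzCaYHeNXcBMZB4DOR4AaABAg",'
    '"responsibility":"company","reasoning":"consequentialist",'
    '"policy":"liability","emotion":"fear"}]'
)

# Index the parsed records by comment ID for O(1) lookup.
records = {rec["id"]: rec for rec in json.loads(raw_response)}

# Retrieve the coding for a specific comment, as the inspector does.
coding = records["ytc_UgzCaYHeNXcBMZB4DOR4AaABAg"]
print(coding["responsibility"], coding["policy"], coding["emotion"])
# → company liability fear
```

This lookup reproduces the "Coding Result" table for the selected comment: the `responsibility`, `policy`, and `emotion` fields match the dimension rows displayed above.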