Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytr_UgzEOt-VM…: Most systems are very basic, Unless the crime is discovered the day or week it h…
- ytc_UgiVAEnmc…: Well at the end of the day those robots, humanoids, and AI's will only ever be l…
- ytc_UgwBEB3G2…: I’ve been thinking about this for a while - what does man do when he doesn’t hav…
- ytc_UgwO0wFEW…: 2027? wayyyyy too soon for a skynet-human-extinction scenario (less than 5% chan…
- ytc_Ugx8m0452…: The funniest things about "AI artist" as a description is, that both "AI" and "A…
- ytr_UgwEYQ4Ps…: They've found through testing the latest AI models are more creative than the av…
- ytc_UgwHvIP3A…: From what moment on has a human conscience? From conception? Has an embryo consc…
- rdc_dt9khre: But if nobody has jobs. Then who will buy stuff that these large automated facto…
Comment
The real issue is how and when will some entity work on a new system on how to remunerate people to be able to acquire basic living necessities and also travel and hobbies. Actual Money system will be wiped and digital currency, but will currency still make sense, or a worldwide form of digital reward, far-fetched maybe. But at the speed ai is going, a team must be working on the human living system issue. I will be happy to not be working and enjoying nature and explore the world and people as we have lots of things to discover, spending more than 75% of our life working is not what I call life, and now we have reached the rewarding stage that human can stop working and controlling machine and ai to do his job is wonderful.
youtube · AI Governance · 2025-09-05T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzxRCNf7iGX-Q6ihgp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxB7V-AAEXABYtCZp54AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzkLdbSwH3TxmteiJh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwOZBw31wfLBqNrWJZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgyJTHUu1jPzlRxYJoV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy0yCcBvEQ528UcMcp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx23fgMhxzjkjJS7dp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxeHe5CH8EQHjdSlwZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx9NVRD5Mb7H5NIz6J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz-7jOWrYHphDHW0OV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
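Since the model returns a JSON array of coded objects keyed by comment ID, looking up the coding for any comment amounts to parsing the array and indexing it by `id`. A minimal sketch, using two rows taken from the response above; the `lookup` helper and variable names are illustrative, not part of this tool:

```python
import json

# A raw LLM response: a JSON array of coding objects, one per comment.
# These two rows are copied from the response shown above.
raw_response = '''[
{"id":"ytc_UgzkLdbSwH3TxmteiJh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyJTHUu1jPzlRxYJoV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]'''

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coded dimensions for one comment, or None if it was not coded."""
    return codings.get(comment_id)

print(lookup("ytc_UgzkLdbSwH3TxmteiJh4AaABAg")["policy"])  # prints "regulate"
```

The same index can back both views on this page: the by-ID search resolves a pasted comment ID directly, and the random-sample list just draws IDs from the index's keys.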