Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by browsing the random samples below.
Look up by comment ID
Random samples — click to inspect
People not understanding what chatgpt and the other "AI" tools actually do, and …
ytc_UgxJog9AY…
Ai can be programmed to handle such cases with care, which is not currently take…
ytc_Ugw037wI8…
And yet he went over to the side of the demons...his robots Optimus are using a…
ytc_UgwgvN9PY…
universal basic income isn't enough for the average American it doesn't even cov…
ytc_Ugzaay4ZQ…
Ah, yes, today's culture who has an app for that, trusts A.I. and asks Google ho…
ytc_Ugx0wXxuG…
The only people ACTUALLY supporting generative ai in the current silicon valley …
ytc_UgwtPwowX…
And how do humans generate coherent sentences? Once we've been 'trained' to use …
ytr_Ugw0R9Mf0…
I love how ypure having an arc vs the AI bros.
Kudos to you because youre winni…
ytc_Ugx0ZuykR…
Comment
If AI takes our jobs at the extreme, it’s not job losses we should worry about. It’s how we use our time and the whole socioeconomics model around that nobody has any idea about.
When Humans are lost and bereft of goals, leadership and where civilisation is heading we start Wars to decide who to conquer, rule and benefit fully.
AI will free us to slaughter each other and it won’t care one iota who wins because it will already have won once we’re at each others throats to cull the human race.
So begins chapter 1 of my new book 😂
youtube
AI Governance
2025-12-31T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzUmaAYckuxi098Y2p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyBxq6NsOpz_8f-8dl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzpvpoIuS_CDvHBVcV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzZwjf8uhOgK2dTDl94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy0K5is34m2KMJPng94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxmR6W7N1WQMpBx8et4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxwsfkb5fSWU3Wlhu14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyIgCdtobjYPsiRn1d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyT4O15vgc8mEiv51t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx2Mc_a6REgVlbhPYt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
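The raw response above is a JSON array of per-comment records, one per comment in the batch. A minimal sketch of how the "look up by comment ID" view might work is to parse that array and index the records by their `id` field (the `raw_response` string below reuses two records from the output above; the function name `index_by_comment_id` is illustrative, not part of any shown codebase):

```python
import json

# Two records copied from the raw LLM response above (same schema).
raw_response = """
[
  {"id": "ytc_UgzUmaAYckuxi098Y2p4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugxwsfkb5fSWU3Wlhu14AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a batch coding response and map comment ID -> coded record."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugxwsfkb5fSWU3Wlhu14AaABAg"]["emotion"])  # -> fear
```

Once indexed this way, a lookup by comment ID returns the full coded record (responsibility, reasoning, policy, emotion) that populates the dimension table shown above.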