Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Remember, AI is not be your lottery. But AI would help you get more brainstormin…" (ytc_Ugz1oyYKh…)
- "I felt bad for chatgpt when it started giving just the yes no answers. It sounde…" (ytc_Ugz6kmNqn…)
- "It’s a combination of many things. But mostly parents, students and system, at l…" (ytc_Ugw3u064b…)
- "AI would classify that the trump regime and DOD are dangerous to Americans and …" (ytc_Ugzzs085g…)
- "also what do you think about chatbot kind of ai, sure they can be used to outrig…" (ytc_UgxGBCBuJ…)
- "So there is 0 chance AI is as bad as the inbred class. In fact they already had …" (ytc_UgyrUqpep…)
- "I think we essentially have AGI with GPT5 in terms of... could it fool you into …" (ytr_Ugxv3cR9Y…)
- "Well if super intelligence becomes a thing and goes rogue at least we will be ab…" (ytc_Ugy-fHE2_…)
Comment

> Once you have the existence proof that AI can do a job, then there is no real value in keeping the job occupied by a human long-term. Mostly applies to pure back-office jobs. Some will elevate, but we need social help en masse to coordinate the transition.

youtube · AI Governance · 2026-04-23T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
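Each coded comment carries one value per dimension. As a minimal sketch of how a coding like the table above could be sanity-checked, the snippet below validates a record against per-dimension vocabularies. The allowed values are inferred only from the codings visible on this page; the real codebook may define more categories.

```python
# Allowed values per coding dimension, inferred from the codings shown on
# this page (assumption: the actual codebook may contain more categories).
ALLOWED = {
    "responsibility": {"company", "government", "developer", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "unclear"},
    "emotion": {"approval", "disapproval", "fear", "outrage", "resignation",
                "indifference", "mixed"},
}

def validate_coding(coding: dict) -> list:
    """Return (dimension, value) pairs that fall outside the allowed sets."""
    return [(dim, coding.get(dim))
            for dim in ALLOWED
            if coding.get(dim) not in ALLOWED[dim]]

# The coding from the result table above passes cleanly.
coding = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "liability", "emotion": "resignation"}
print(validate_coding(coding))  # [] -> every dimension is in-vocabulary
```

A non-empty return value pinpoints exactly which dimension the model answered outside the codebook, which is useful before bulk-loading codings into a database.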
Raw LLM Response
```json
[
  {"id": "ytc_UgxLyMFeSGD2PB6Eaqd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyVfDF40vgP9iuIu114AaABAg", "responsibility": "government", "reasoning": "mixed", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugx5nfGdKZPTpdSrvnx4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "disapproval"},
  {"id": "ytc_UgzQ8EJmG907qtfukiV4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgwHpN_eHyOqD3CsWah4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_Ugzpwy4uzTQsh098R5p4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzwrWP4wflaTreSbQ94AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgyetzeULfoEz1fBPSd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwP1ec9ztJoroclnYx4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugz0G0MElVH-9mPqFfl4AaABAg", "responsibility": "ai_itself", "reasoning": "contractualist", "policy": "unclear", "emotion": "fear"}
]
```
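The "look up by comment ID" workflow above can be sketched in a few lines: parse one raw LLM response and index the codings by `id`. This is a minimal sketch assuming the raw responses are stored as JSON arrays shaped like the one shown; the two records here are copied from that output, and any storage or retrieval details beyond parsing are left out.

```python
import json

# A raw LLM response in the format shown above (two records kept for brevity).
raw_response = """
[
  {"id": "ytc_UgwHpN_eHyOqD3CsWah4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_Ugzpwy4uzTQsh098R5p4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse one raw LLM response and map comment id -> coded dimensions."""
    records = json.loads(raw)
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in records}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgwHpN_eHyOqD3CsWah4AaABAg"]["emotion"])  # resignation
```

With the index built, inspecting the exact model output for any coded comment is a single dictionary lookup rather than a scan over raw response text.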