Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- Great video AJ. The greed and ego of humanity is causing this. If AI take over, … (ytc_Ugyx0kRX6…)
- If it has "genuine intelligence" then we'll have to give them rights because th… (ytc_Ugi-qHjeu…)
- There are humans who have no emotional attachment to others. Men are taught not … (ytc_UgySCjRX5…)
- Seems to me we are on the evolutionary path of transferring our consciousness in… (ytc_UgyxGYAq1…)
- something i find really funny about a lot of these people defending ai generated… (ytc_UgzTtwOjT…)
- The real question is at which point do we stop training AI, and it starts traini… (ytc_Ugypo3e_P…)
- You made a misstatement. You said the system was designed to brake. The fatal ac… (ytc_UgyVGp0Qa…)
- Anyone from the Trump administration sets a big red flag! AI is dangerous and … (ytc_Ugwypb_Jl…)
Comment
My job is safe at least for a while. I work in finance, resolving escalated complaints. It is legislated that a human has to talk with the complainant. I don’t see any government, in the face of giant unemployment figures, changing legislation to allow AI to take over. I’m about to start an MBA specialising in AI integration into the workforce. I have a partial scholarship, but it will still be a $40k+ debt. Hoping this combination of skills, a slow to act government led by fossils, will see me out for at least the next 15-20 years, then I can retire early and i’ll be okay. It is still a decision weighing on me, what if I’m wrong and end up with a giant student debt and no job? Tough time to be alive.
youtube · AI Governance · 2025-09-09T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzzbx9vHzsE0VXqyQ54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzRIFFmsvdWwWl9YcR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwaAHIgvku5oj8adxp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgwHoQ7iALtVtNFaUPt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxiDDxIhEzMHFZfqi54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyu7QELa2d2YSmv0Ax4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwrMudl86G997DWrRl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzjL4fSeerPRGbjkDZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyRJiiJV2_OtddlXq94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw250M2RVuRl9i-9s14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
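A raw response like the one above can be turned into a lookup table keyed by comment ID. The sketch below is a minimal illustration, not the pipeline's actual code: it assumes the response is a JSON array of flat objects with the four dimension keys shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`), and it only checks structure, since the full code vocabularies are not listed on this page. The two embedded records are excerpted from the response above.

```python
import json

# Two records excerpted from the raw LLM response shown above.
RAW = """[
{"id":"ytc_Ugzzbx9vHzsE0VXqyQ54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzjL4fSeerPRGbjkDZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"approval"}
]"""

# Dimension names taken from the Coding Result table; value sets are not
# validated here because the page does not enumerate them.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> dict[str, dict[str, str]]:
    """Return {comment_id: {dimension: value}}, skipping malformed records."""
    out = {}
    for rec in json.loads(raw):
        # Keep only records that carry an id plus all four dimensions.
        if isinstance(rec, dict) and "id" in rec and DIMENSIONS <= rec.keys():
            out[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return out

codings = parse_codings(RAW)
# Look up one coded comment by its ID, as the page's "Look up by comment ID"
# control does.
print(codings["ytc_UgzjL4fSeerPRGbjkDZ4AaABAg"]["responsibility"])  # government
```

Indexing by ID this way also makes it easy to join the coded dimensions back onto the original comment text and metadata (platform, topic, timestamp) shown in the card above.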