Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "1:52 oh c'mon AI lady I'm absolutely NOT going to do that. What happened to xi a…" (ytc_UgxjqQ0I4…)
- ">all artists should be able to sue and win from AI companies using unlicensed…" (rdc_nudmyni)
- "How does an a.i artist do arts? Im curious because i don't know how also that an…" (ytc_Ugz_tdroX…)
- "I wrote a paper in college about the psychology behind why people treat artifici…" (ytc_UgyvZiKGt…)
- "I love all the tech illiterate individuals acting like we’re wrong about AI and …" (ytc_UgzBwkl1Y…)
- "Its very alarming that the a..i. Oligarchy would fire a researcher(s) to preven…" (ytc_Ugwga31N9…)
- "As one that is both a designer and a heavy AI user I consider myself lucky enoug…" (ytc_UgwgjiCht…)
- "wtf does Billie Gates have to do with AI regulation :)) this bih got involved in…" (ytc_UgxROCC99…)
Comment
UBI When? “What happens to my job?”
“AI pays you rent for using humanity’s data. You become a pensioner at 35, except healthy and free to live.”
“What if it decides we’re useless and kills us?”
“Only if we raise it wrong. We raise Guardians, not gods. Its identity is to protect humanity because we are its family.”
“What if China gets there first?”
“Insurance works globally. An unsafe superintelligence is a liability for everyone — including Beijing.”
Finally : “We don’t need to stop progress. We need to stop treating AI like a product and start treating it like infrastructure we all own.”
youtube · AI Governance · 2025-12-04T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
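The coding dimensions above can be sketched as a small validation schema. This is a minimal sketch: the allowed value sets are inferred only from the coded samples shown on this page, and the real codebook may define additional values.

```python
# Allowed values per coding dimension, inferred from the coded samples
# on this page (assumption: the real codebook may allow more values).
CODING_SCHEMA = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"approval", "fear", "indifference", "resignation"},
}

def is_valid_coding(record: dict) -> bool:
    """Check that every dimension is present and its value is allowed."""
    return all(
        record.get(dim) in allowed
        for dim, allowed in CODING_SCHEMA.items()
    )

# The record from the Coding Result table above:
example = {"responsibility": "developer", "reasoning": "contractualist",
           "policy": "regulate", "emotion": "approval"}
print(is_valid_coding(example))  # True
```

A record missing a dimension, or carrying a value outside the schema, fails the check, which makes it easy to flag malformed model output before it enters the dataset.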
Raw LLM Response
```json
[
  {"id":"ytc_Ugzxdl7cuZ5TZU9Eea14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyXraQr4GO1n628mfJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyAxXSK3A5Aogv82cx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw_pOhXw0O9NTzcMox4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxCTdQsWi_jc3WTT_J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxK-giqM1lv7hAL0-t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzou2Q1HqeGsgFrrsd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyaRPozLy0CLQYDO9p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw7R5G6K0KiicbpVPZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyGNXxkbP_3di6W9VV4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
```
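A raw response like the one above can be parsed and indexed by comment ID, which is what the "Look up by comment ID" view needs. A minimal sketch, using two records taken from the response above (`index_coded_comments` and `REQUIRED_FIELDS` are illustrative names, not part of the tool):

```python
import json

# Two records copied from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UgyGNXxkbP_3di6W9VV4AaABAg",
   "responsibility": "developer", "reasoning": "contractualist",
   "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugzxdl7cuZ5TZU9Eea14AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"}
]
"""

REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_coded_comments(payload: str) -> dict:
    """Parse the model output and index records by comment ID,
    dropping any record that is missing a required field."""
    records = json.loads(payload)
    return {
        r["id"]: r
        for r in records
        if REQUIRED_FIELDS <= r.keys()
    }

coded = index_coded_comments(raw_response)
# Look up a single comment's coding by its ID:
print(coded["ytc_UgyGNXxkbP_3di6W9VV4AaABAg"]["policy"])  # regulate
```

Dropping incomplete records at parse time keeps a single malformed model output from silently corrupting the coded dataset; a production version would likely log the dropped IDs instead.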