Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Muh copyright, waaah, waaah. Your shitty art (and the ones of anyone reading thi…" (ytc_UgyoCXSal…)
- "That one went over ur heads ..🔥ur people are the evil ancestors that wrote the c…" (ytc_Ugz0j_Y6D…)
- "This makes me actually nauseous. If you read the first few paragraphs, they're n…" (rdc_lz6159a)
- "I don’t support AI art, but I just don’t think we should mock anyone. I know a l…" (ytc_UgyYHTfTs…)
- "AI “artists” are possibly the most useless contributors to human culture. Give m…" (ytc_UgwrAA3L6…)
- "Doesn't really matter if AI ends most jobs. Most people took the 2021 j@b# so th…" (ytc_UgxldTKoP…)
- "The worst thing is i never want the ai to do the work for me so the result is th…" (ytc_Ugxo4dI7O…)
- "Thank you Bernie! I strongly support your proposal of a robot tax on corporation…" (ytc_Ugy0ftgus…)
Comment
"I've never had a corporation come before congress and ask for regulation - why might that be?" Well... When you live in a fascist state/world where the government and corporate powers have merged, proved during the covid pandemic, then "corporations coming before congress and asking for regulation" isn't a well intended responsible desire to limit their own power or ability to own the AI space. On the contrary. Regulating AI means regulating truth/information. Something the government and corporate fascist co-op has shown it will do in it's own interest at the expense of the public. Big "covid"data is big data. Microsoft, google and other megalomaniac driven corporates own the politicians they are asking to regulate them.
Platform: youtube | Topic: AI Governance | Posted: 2023-06-30T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzlQ-8ISBuTZB33rUp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxyQGBAGkq-a6lhYrF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyblq85LzkurgUOm7F4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxHSZcJKq4KTWqvgyV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxKeV9AoJyYpt6vR-F4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw_A73kOLZR9auoNUN4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz9AfXvZaGsC6udOfZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwfDFDJ3I_stPLuDEl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy7Uxs8cW1Cb6z6rXx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwqIuj8knud18TveWt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
```
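A response like the one above has to be parsed and sanity-checked before the codes can be trusted, since the model occasionally emits labels outside the codebook. The sketch below shows one way to do that; the `ALLOWED` label sets are hypothetical, inferred only from the values visible in this sample, so the project's real codebook may differ.

```python
import json

# Hypothetical codebook, inferred from the sample response above;
# the actual label set used by the pipeline may include values not shown here.
ALLOWED = {
    "responsibility": {"government", "company", "developer", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "approval", "indifference", "outrage", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only rows that have a
    string id and whose values all fall inside the (assumed) codebook."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if isinstance(row.get("id"), str)
        and all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]
```

Rows that fail validation are dropped here; a production pipeline would more likely log them for re-coding, since a silently dropped row means an uncoded comment.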