Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- `ytc_UgzZSvUGS…` — "It makes sense to move beyond the traditional working mindset and instead focus …"
- `ytc_UgxOa8ObJ…` — "This doesn't make me happy. Oh, human emotions? AI seems very angry at humans i…"
- `ytc_Ugze8T8RX…` — "I suppose shows by PBS are not spreading misinformation...meanwhile YouTube algo…"
- `ytr_UgyYH4TEm…` — "Musk has no moral compass. He bought the US election, gained access to confide…"
- `ytc_UgwuA6lfm…` — "To be an artist could be someone's fate that can't be escaped. Some ppl just don…"
- `ytc_UgytOPqyp…` — "I get it 100% now I'm not here for people who don't use AI properly like going a…"
- `ytc_Ugz_SBhLY…` — "Robot: You were a good human servant but I'm afraid your usefulness is at an end…"
- `ytc_UgyK04ssS…` — "I've been a professional software engineer for over two decades now. Everyone's …"
Comment

> So, the companies that openly support bringing the population down to a manageable 500 million are telling the public that AI could kill all humans in the next 10 yrs but have no plans on stopping their work. Not surprised.

youtube · AI Governance · 2023-07-09T01:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzBMXuztCde9MeRhC94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzAOBtgMNhqzWRBedJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugyf_j3R9d_iIRvJSep4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwcNATqbCeWDNyT74Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyTA_5ZGTzTlO68VQN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxkS26kqClIMlG7-Kh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwRb4Fp_E32ZQyAoop4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyDn06KdVEQ5DtTLY94AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx0gbxHrkOUBpCFoB54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx-FerZhmrK6_Ea5Vt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
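The raw response is a JSON array of per-comment codes keyed by comment ID. A minimal sketch of how such a batch might be parsed and validated before storage (the `parse_codes` helper is hypothetical, and the allowed values per dimension are inferred only from the examples shown above, so the real codebook may define more categories):

```python
import json

# Allowed values per dimension, inferred from the sample output above;
# the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"company", "developer", "government", "user", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "approval", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codes by comment ID.

    Raises ValueError on malformed records or unknown category values,
    so bad model output is caught before it reaches the database.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
codes = parse_codes(raw)
print(codes["ytc_example"]["policy"])  # regulate
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: each coded record can be fetched directly rather than re-scanning the batch.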