Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I watched LR Time and he used GB Ai and it Failed Time and Time again so he call… (ytc_UgykRrkuQ…)
- "Musk has no moral compass"
  > Does Sam (Altman) has a moral compass?
  "I don't kn… (ytc_Ugz5OAH1s…)
- I feel like world governments need to force all of these companies on the forefr… (ytc_UgzQk-Twi…)
- The niece outsourced her entire thought process🧠 specifically her executive func… (ytc_UgwGm72ew…)
- This is brilliant content. Very enjoyable and highly relevant. My new favourite … (ytc_UgxI4k6pj…)
- Imagine an automated fuel truck that goes off course due to some sensor going ou… (ytc_Ugx_fECPh…)
- What a good civics sense, tolerance inside the hall during her protest.! 👍....
  H… (ytc_Ugyi1lnUW…)
- unfortunately, this makes even worse. Some youtubers, especially making authenti… (ytc_UgyB5lmmz…)
Comment
If the algorithm is 'predicting' black people over white people more, it's obviously the white people who programmed it to begin with that didn't have a clue. OR did it on purpose to further persecute black people, just because they can. Sure, pretty much all algorithm's need time to adjust, but THAT far outta reach. NOPE. That's definitely been programmed that way.
youtube · 2022-07-26T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzG42WDC-z6EpnwyYB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyN0MORpZaY1v49Lfl4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw-51QLPjQ6yxdapfx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxwxbiKbar9ktWP3iZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwqf155AC7MjMLIQyt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyNT-lVYcDsR1Z66KR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzDSexj0m_MZQrtp894AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz5xL4nSIRF2WVX5pB4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw1i4_BYgtF5y6aGpN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyeKeAQU0xLkw3RKP54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
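The raw response is a JSON array of per-comment codings, which makes "look up by comment ID" a simple parse-and-index step. Below is a minimal sketch of how that lookup could work; the `ALLOWED` value sets are inferred only from the values visible in this sample output (the full codebook may contain more), and `index_codings` is a hypothetical helper, not part of the actual pipeline.

```python
import json

# Excerpt of a raw batch response in the format shown above.
raw = '''[
  {"id":"ytc_UgzG42WDC-z6EpnwyYB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwqf155AC7MjMLIQyt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''

# Value sets observed in this sample; assumed, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"company", "distributed", "ai_itself", "government", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "indifference", "resignation", "approval", "fear"},
}

def index_codings(raw_json: str) -> dict:
    """Parse a batch response and index valid rows by comment ID."""
    by_id = {}
    for row in json.loads(raw_json):
        # Keep only rows where every dimension has a recognized value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[row["id"]] = row
    return by_id

codings = index_codings(raw)
print(codings["ytc_Ugwqf155AC7MjMLIQyt4AaABAg"]["policy"])  # liability
```

Validating against an allowed-value set before indexing is a cheap guard against the model emitting an off-codebook label, which would otherwise silently pollute downstream counts.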