Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Elon seems really concernd in this video. I am so glad he is so forwardthinking …
ytc_Ugyx_f4_R…
This i not a movie. This is not AM. This is not ultron. This is not the matrix. …
ytc_UgxmAbi14…
AI has certainly belittled many under graduate degrees but be under no illusion,…
ytc_UgwOpnLBL…
AI and automation will eliminate jobs with no replacement work for the broad mas…
ytc_UgyFaBwmv…
That’s an interesting thought! It does raise questions about the balance between…
ytr_UgzVBWyhD…
The worst part about all of this is that it was entirely predictable. Large lang…
ytc_Ugwtq_2xt…
AI artists suck, they take art from hardworking AI and claim it as their own…
ytc_UgzcRB4BP…
We are NOT going to get Super-intelligent AI or even human-level AI anytime soon…
ytc_UgxUpwisc…
Comment
Is AI learning from copyrighted material so wrong? Isn't just about every official article written copyrighted? This seems like a very minor thing to have an issue with. He also does not have the look of someone who is about to kill themselves. He looks in good spirits. back door? windows unopened? if someone in the tech sector organized a murder i'd assume they'd be extremely careful about everything.
Platform: youtube · Posted: 2025-03-18T00:1… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzt2VJDX89-5OCUsct4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyMi4wJTK_UWQpc9AF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyZetTb6f1V5YyFZ9R4AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzutm1XDSNSvlbMXjB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgweZmRxDLPdgXkvdkl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzM-tCHD9aGIq-aQnJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwvy_a7lQHB8LEQmAl4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwrmHAHcekTdE9UdJt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwjPkxEDUeuj_pXr6p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz2WLScBLRGbt1Yyo54AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"fear"}
]
```
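The lookup-by-comment-ID workflow this page describes can be sketched in Python: parse the raw LLM response (a JSON array of per-comment codes like the one above) and index it by `id`. This is a minimal sketch, not the tool's actual implementation; the two entries are copied from the response above, and the variable names are illustrative.

```python
import json

# Raw LLM response: a JSON array of per-comment codes.
# Two entries copied from the dump above, shown here for brevity.
raw_response = """[
  {"id": "ytc_UgwrmHAHcekTdE9UdJt4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "industry_self",
   "emotion": "indifference"},
  {"id": "ytc_UgzM-tCHD9aGIq-aQnJ4AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "unclear", "emotion": "fear"}
]"""

# Index the batch by comment ID for constant-time lookup.
codes_by_id = {entry["id"]: entry for entry in json.loads(raw_response)}

# Look up the coding result for a specific comment ID.
code = codes_by_id["ytc_UgwrmHAHcekTdE9UdJt4AaABAg"]
print(code["emotion"])  # -> indifference
```

Indexing once and looking up by ID avoids rescanning the array for each inspected comment, which matters when a batch response contains many coded comments.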