Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Thank you, Bernie. I am a victim of AI. I lost my editing job in July and can't …" (ytc_UgwB33j3L…)
- "@Yaromir2008 in this video looks like no one except the CEO really tried to adap…" (ytr_Ugw9NbZe2…)
- "Using artificial intelligence to learn machine is not a correct way, because ML …" (ytc_UgwI7Xptf…)
- "The words and logo RISE seem very foreboding... and the fact that these A.I's ar…" (ytc_Ugxmhiy26…)
- "Yeah this was my first take when ai suddenly became huge... I thought companies …" (ytc_Ugw3jU4FS…)
- "Here is your argument regarding the AI training - For decades, artists have bee…" (ytc_UgyhWG-jh…)
- "Replying to the top comment to put some perspective on this. H1B has a cap, we …" (rdc_l3o1mwq)
- "I like to think of it this way: AI can't make new things, and I kinda assume thi…" (ytr_UgyABA5q-…)
Comment
I don't understand why tech companies would want to destroy the world along with trying to make everything worse for the job market. But when it comes down to it, I would say the optimistic job set would be essentially where people would be allowed to do whatever they want without any need for jobs. And where, when it comes down to it, I would argue that when it comes down to something like AI, there should be more safety regulations put onto it. And it's kind of like renewable energies versus fossil fuels. Fossil fuels are becoming increasingly more and more obsolete. while the renewable energy sources are becoming better and better. Not just in terms of safety, but also in terms of all that really is needed. And also, I just want to know why these AI companies say their product is dangerous, and flat out fight tooth and nail against regulation. Right. Because regulation would be the good thing for it. Not the bad thing.
youtube · 2025-11-23T04:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzYto5M1VIm3t30Hvx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwfQJQJHzCeLg0kxrV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxA9i7w0DFe6AA-njx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw9GOh7H4f4RccCaB54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy8S58RylMl9JfQE2Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyZP8IBhS88xkscegt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyUdLgu_exlaKekhIZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzAtADkZU8pGpZd2NZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxmJlu8CnDZ8oRotUp4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzqzp0wYtzrwuj5a_l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
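The raw response is a JSON array with one object per comment, keyed by comment ID across the four coding dimensions. A minimal sketch of how such a batch could be parsed and indexed for lookup by comment ID follows; the `ALLOWED` vocabulary is inferred only from the values visible in this batch, and the helper name is illustrative, not part of the actual tool:

```python
import json

# Dimension values observed in the batch above, treated here as an
# illustrative closed vocabulary (assumption, not the tool's real codebook).
ALLOWED = {
    "responsibility": {"none", "government", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"outrage", "fear", "approval", "resignation", "mixed"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM batch response and index rows by comment id.

    Rows containing an out-of-vocabulary value are skipped, so one
    malformed row does not poison the whole batch.
    """
    by_id = {}
    for row in json.loads(raw):
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[row["id"]] = row
    return by_id

# Hypothetical usage with a one-row batch:
raw = '[{"id":"ytc_X","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"}]'
codes = index_codes(raw)
print(codes["ytc_X"]["responsibility"])  # company
```

Validating against a closed vocabulary before indexing mirrors the lookup-by-ID view above: only rows that code cleanly on every dimension become retrievable.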