Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “Now, real question: What if it keeps getting better and better at this pace? Lik…” — ytc_UgzDplC2C…
- “Isn’t AI a little like when the companies want to get into your subconscious and…” — ytc_UgwEdSeUr…
- “this ai drama irks me so much, the people that openly prefer it over actual arti…” — ytc_Ugx6Z4reE…
- “Africa still doesnt understand that foreigners arent investing billions to extra…” — ytc_UgxjsSmQ7…
- “> It's a good example that shows how ~~capitalism~~ being able to outsource p…” — rdc_f9duud0
- “AI requires immense amount of resources for it to tag and use for reference/temp…” — ytc_UgygvoUu_…
- “I appreciate how AICarma captures what people are asking AI, helping me refine m…” — ytc_UgxAXAroO…
- “I think AI screwed up my automated payment in the bank. It paid the same amount …” — ytc_Ugx5HHIUK…
Comment
I will absolutely never be interested in AI doing most kinds of work. I don't care about a podcast unless it is created by a human. I don't care about art if it wasn't created by a human. I won't watch movies that weren't envisioned by humans. I won't read books that weren't written by humans. I don't want health care of any kind that isn't administered by a warm hand. I don't want endless plastic crap that is produced by artificial hands. I don't want public policy initiatives to be planned and implemented by anyone other than humans. I do not want war to be waged by entities that cannot pay the ultimate price, and therefore have no incentive to stop. I will never and could not ever want to seek community with anyone other than humans.
youtube · AI Governance · 2026-04-23T17:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw71ofNLLdbubDNXsF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx3vh5y_94gKOIC0P54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxeBhAZiHgMttdqMBN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgykxnKMOBf67tK71zZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxCMZfKwtNNqYrsRkV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzixSbjYYPSt0ejQnV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzSThfUS13z0sqsXQh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxtwkLoLREJnm13_MR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugz-x0H0BEG-r6el1ud4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwWj0DCD9npIMRA7KR4AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
```
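The raw response is a JSON array with one coding record per comment, each carrying the four dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and looked up by comment ID (the helper names here are illustrative, not part of the tool; the two sample records are copied from the batch above):

```python
import json

# Two records copied from the raw LLM response shown above.
raw_response = """
[
 {"id":"ytc_UgxeBhAZiHgMttdqMBN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgzixSbjYYPSt0ejQnV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
"""

# The four coding dimensions from the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw model response and index codings by comment ID,
    dropping any record missing one of the expected dimensions."""
    records = json.loads(raw)
    return {
        r["id"]: {d: r[d] for d in DIMENSIONS}
        for r in records
        if all(d in r for d in DIMENSIONS)
    }

codings = index_codings(raw_response)
print(codings["ytc_UgxeBhAZiHgMttdqMBN4AaABAg"]["emotion"])  # resignation
```

The ID-keyed index mirrors the page's "Look up by comment ID" behavior: the coding for the selected comment above (unclear / mixed / none / resignation) is retrieved directly from the parsed batch.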