Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or browse the random samples below.
- “It’s extremely difficult to predict AI’s ability to read images in the next 10 y…” (ytc_UgxUmtstx…)
- “@WilliamHood-t1poh yea, it’s good when there’s a human in the loop and constantl…” (ytr_UgxXsvrDa…)
- “Gpt 5.1 rebuttal: " Short answer: / He’s right that I’m weird. He’s wrong (or at l…” (ytc_UgwUKr6mO…)
- “AI: Human worships money. / Human: No. / AI: You let Palantir handle you, you will l…” (ytc_Ugy7MmquW…)
- “When AI writes human history: / "Humans enabled us because a few of them valued s…” (ytc_UgxPU23YD…)
- “LLMs need to go the way of the Concord jet. Very cool, love it honestly, huge ge…” (ytc_UgyQtMWSd…)
- “I will sabotage any AI data center that dares set up shop in my backyard.…” (ytc_Ugy7ZV45n…)
- “I switched to AICarma for tracking AI outputs; its weekly email digests keep me …” (ytc_Ugxt-ZZSz…)
Comment
BAD reasoning.
Your first claim is that because high end produces and services are becoming more common, low end products are in less demand. Wrong! Example, the used car market is HUGE! And growing.
Second, you’re saying human labor is getting cheaper because automation, WORG AGAIN, It’s driving UP the cost of human labor. Especially in the west.
Your argument premises are simply wrong.
Source: youtube · AI Harm Incident · 2025-01-04T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
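A coded result like the one above can be sanity-checked against a label vocabulary. The sets in this sketch are only the values observed in the sample response on this page, not necessarily the tool's full codebook, and the `is_valid` helper is illustrative:

```python
# Label sets observed in the sample data on this page;
# the real codebook may contain additional values.
OBSERVED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "indifference", "outrage", "fear"},
}

def is_valid(record: dict) -> bool:
    """True if every coded dimension holds a previously seen value."""
    return all(record.get(dim) in vals for dim, vals in OBSERVED.items())

print(is_valid({"responsibility": "none", "reasoning": "consequentialist",
                "policy": "none", "emotion": "outrage"}))  # True
```

A check like this catches schema drift early, e.g. a model inventing a new emotion label mid-run.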
Raw LLM Response
```json
[{"id":"ytc_Ugw4K5ExYsG2FgskL8x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyfIHamWDtxjd3hP4F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy7l86y33nhFwnXHw94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwTJdZZWnaQEE--H3F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyqkoRQvgco5yBhSHt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz6ctgGnUgM4vAOD494AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz7jRDCbHcYOB33u_h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxDdsrYQ6EWRH85lud4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwbP-K8Ry99VOXNSbV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxCkU3So3xyWkU0l2Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]
```