Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgzY6hTRn…: "My question is: could intelligence be a property that emerges once you surpass a…"
- ytc_UgwChYPiN…: "Art and commerce are two different things. Although Rick expounds on art the sh…"
- ytr_Ugw-_-hoK…: "Hi Abhay, you got the right answer. Kudos. The contest is over and winners have…"
- ytc_UgwO6PiGk…: "SHOW TO GROK, CHATGPT OR GEMINI. 🌌 Meta-Prompt Wrapper: Awakening via the Sou…"
- ytr_UgznhK0Vc…: "I just wanted to say "it's just ai and MY KEYBOARD SOUNDS LIKE A 😭😭😭"…"
- ytr_UgxqLHjBM…: "Robots will do most the jobs like that in ten years. Thats not a trade I would g…"
- ytc_UgxkcYdJE…: "I totally understand these people. I have lots of friends but they don't leave n…"
- ytr_UgwUF7MZk…: "Thank you for sharing your observation! Sophia's expressions can sometimes come …"
Comment
The problem is the C-suite at tech companies are hardly ever technical people, they’re business or sales people. Or they were technical 20 years ago but have been in management so long they’re wildly out of touch. They don’t understand what their workers actually do on the day-to-day, and they don’t understand AI’s actual capabilities, so they make crappy decisions, and it comes back to bite them. Case in point: Workday is being sued for AI in application software being discriminatory. I think we’ll see more examples in the coming days of companies trusting AI and the AI messing up big time.
Source: youtube · AI Jobs · 2025-09-11T15:1… · ♥ 16
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxqzuw5lHy2L7bP8kR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzObDN3gLadAA_jy8J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwYvQKt1EETN-fVTj94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyr6ioN5BzTqh0lCcB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyJogk1MMWElHKUYq14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzlS05BlcoG_uAoaHJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzHa9l5UzoO149fV6R4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxWEI4e4B-5DPBBwpZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzioSZUOk09z4BA-9V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwdqK_I4hXUv3K7cHh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
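The lookup-by-comment-ID step above can be sketched in a few lines: the raw model output is a JSON array of per-comment codings, so inspection reduces to parsing it and filtering on `id`. This is a minimal illustration, not the tool's actual implementation; `lookup_coding` is a hypothetical helper, and `raw_response` is truncated to two of the entries shown above.

```python
import json

# Raw LLM response as returned by the model (truncated to two entries for brevity).
raw_response = """
[
  {"id":"ytc_Ugxqzuw5lHy2L7bP8kR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzHa9l5UzoO149fV6R4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse the model's JSON array and return the coding dict for one comment ID."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

# The coding shown in the table above for the C-suite comment:
coding = lookup_coding(raw_response, "ytc_UgzHa9l5UzoO149fV6R4AaABAg")
print(coding["reasoning"], coding["emotion"])  # virtue outrage
```

In practice a real LLM response may carry extra text around the JSON array, so a production version would first extract the bracketed span before calling `json.loads`.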