Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I'm a bit of a fatalist when it comes to these kind of things but I'll parallel …" (ytc_Ugy69TkwG…)
- "At 58:00 you're talking about how we're on such a dangerous course, how irrespon…" (ytc_Ugz3ZD8TX…)
- "I am not worried about human extinction. I worry about human greed. I worry abou…" (ytc_UgwPHbcKg…)
- "Absolutely interesting. I would suggest interviewing a Sociologist/Anthropologis…" (ytc_UgzmxfGcQ…)
- "how can i do this AI voice cloning? I think it would make for a funny way to tro…" (ytc_UgxD5bEhP…)
- "Alex, your implicit definition of a lie is incomplete. A lie is not only false b…" (ytc_UgzDNaeTB…)
- "I have a question/am seeking advice. I want to start by saying, I now know AI is…" (ytc_UgxS_Jsre…)
- "I feel like some ai can be helpful and pretty AS LONG AS YOU SAY ITS AI AND DONT…" (ytc_UgyJWsrVI…)
Comment

"I am an AI trainer that teaches bots to code because doing a Java bootcamp wasn't enough to get a developer job. If I wasn't being paid to talk to bots I wouldn't use AI at all, let alone for coding."

Source: youtube · Posted: 2025-04-16T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwCdo-QHcaw84Ws6D54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyz8WOhFURDjUfKxr14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy7MV1QKUFnQmBIwiV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyh1dLIfZzVokeHD-N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyHpzGvBrye71VUrZp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzUroKRAmqJqJjX-Zh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwzOgFfJ6bPM9LDB9x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxMRcJa67Cj25IVOYN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx2bcXS54QHRZE-97J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwwK8ueXS3THttWzih4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
```
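The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a batch might be parsed and validated before storage — note that the allowed label sets below are inferred only from the values visible in this section, not from an authoritative codebook:

```python
import json

# Label sets per dimension, inferred from the example output above
# (assumption: the real codebook may include additional labels).
SCHEMA = {
    "responsibility": {"none", "user", "company", "developer", "ai_itself"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments),
    keeping only records with a plausible ID and in-schema labels."""
    records = json.loads(raw)
    cleaned = []
    for rec in records:
        # Comment IDs in this dataset start with the "ytc_" prefix.
        if not str(rec.get("id", "")).startswith("ytc_"):
            continue
        # Drop any record where a dimension is missing or out of schema.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            cleaned.append(rec)
    return cleaned
```

A record with an unknown label (say, an emotion the codebook does not define) is silently dropped here; in practice one might instead log it for manual review, since rejected records can reveal model drift in the coding prompt.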