Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "That is a dumb question to ask a robot, considering she cannot give birth to a c…" (ytc_UgwBbKHkV…)
- "@noob_in_youtube842 Everything has exactly the worth you give it. Do you think y…" (ytr_UgzW-7Ul2…)
- "AI is just sitting there like damn i'm going to consume all the free art…" (ytc_UgyP9ddZL…)
- "Most doctors are dumber than a sack of rocks, so it's obvious that an AI will be…" (ytc_UgzNmwQ-E…)
- "Are you genuinely so unhappy with yourself that you get a soulless computer to p…" (ytr_Ugzfu2fsv…)
- "There is no world where everyday people sit back and let AI take jobs away from …" (ytc_UgxRYwYUs…)
- "AI art inspired me to work more on my comic. It's going to be a long time until …" (ytc_UgyUfyn7x…)
- "The alignment problem in reality isn't really something that rational needs to b…" (ytc_UgyKNrXMO…)
Comment
As a programmer, I think the voice system will try to engage the moment you paused, not because of the actual LLMs eagerness to speak. So, one potential improvement for gpt is to behave like agent, It'll decide if it should engage or wait and should have understanding of how many agents are interacting with it, and we as a user will be participating as a user agent and able to have a group conversation with more than enough AI agents in a single chat where the interruption is reduced.
Source: youtube · Video: AI Moral Status · Posted: 2024-10-07T09:0… · ♥ 60
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
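One coding result like the table above can be held in a small record type. A minimal Python sketch; the class name `CodingResult` is hypothetical, while the field names follow the table's dimensions:

```python
from dataclasses import dataclass


# Hypothetical record type mirroring the Coding Result table;
# each field holds one coding dimension as a plain string label.
@dataclass(frozen=True)
class CodingResult:
    responsibility: str  # e.g. "developer"
    reasoning: str       # e.g. "consequentialist"
    policy: str          # e.g. "industry_self"
    emotion: str         # e.g. "indifference"
    coded_at: str        # ISO 8601 timestamp of when the code was assigned

# The values shown in the table above:
result = CodingResult(
    responsibility="developer",
    reasoning="consequentialist",
    policy="industry_self",
    emotion="indifference",
    coded_at="2026-04-26T23:09:12.988011",
)
print(result.emotion)  # indifference
```

Freezing the dataclass keeps coded records immutable once assigned, which makes them safe to cache and index.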
Raw LLM Response
```json
[
{"id":"ytc_Ugz-8_O_JEoA7y9BhQ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz13KEygoEEjbiZkQV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwS4q0iqPETPuKKuP14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxWeWOpHS5IhxqWk5l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyYunj-P4QA8zW67Ph4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxW5dqsQNwHd05AKpZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw_kj28LPG9boZ-JGt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy5LPy1xakmaYHBD3t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxcJ4Lc4yEzn71IlvN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwBlaasdKj1FAtHH3B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
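A raw batch response like this can be parsed and indexed by comment ID before use. A minimal Python sketch; the allowed values below are only those observed on this page (the full codebook may define more), and `parse_codes` is a hypothetical helper, not part of the tool:

```python
import json

# One record taken verbatim from the batch response above.
RAW_RESPONSE = '''[
{"id":"ytc_UgxcJ4Lc4yEzn71IlvN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]'''

# Values observed in this page's examples; the real codebook may include others.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "mixed"},
    "policy": {"unclear", "industry_self"},
    "emotion": {"indifference", "fear", "approval", "mixed"},
}


def parse_codes(raw: str) -> dict:
    """Parse a batch coding response, index it by comment ID, and reject
    any record carrying an unknown dimension or an unlisted value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, value in rec.items():
            if dim == "id":
                continue
            if dim not in ALLOWED or value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = {k: v for k, v in rec.items() if k != "id"}
    return coded


codes = parse_codes(RAW_RESPONSE)
print(codes["ytc_UgxcJ4Lc4yEzn71IlvN4AaABAg"]["emotion"])  # indifference
```

Validating at parse time means a drifting model output (a new label, a misspelled dimension) fails loudly here rather than silently polluting the coded dataset.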