Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples
- "@EALoArt I feel your pain, I just spent the past year learning to design meshes …" (`ytr_UgzaJn1vL…`)
- "the only regulation that will make sense for ai, if manufacturing plants replace…" (`ytc_Ugxw02dGH…`)
- "That's ridiculous. Self-driving cars are not going to be risk free. They'll be…" (`rdc_dmsdwq8`)
- "I imagine that most problems that may arise from or for AI will likely be easier…" (`ytc_UgjQVxHcT…`)
- "Thank you, for another great show. As long as we keep a human in the loop with A…" (`ytc_Ugx1I_da4…`)
- "People talking so much 💩 have no idea what they are talking about. Don't fall fo…" (`ytc_Ugxx2qZxL…`)
- "No one is talking about AI vs Divine Intelligence. The secret sauce to make AI s…" (`ytc_UgyFdPWNf…`)
- "I mean, I respect people who make the software, but I don't think that's who thi…" (`ytr_UgypwK9Cf…`)
Comment
> 16:00 however, if people did routinely write "I don't know" to questions posted online, it would probably help solve the hallucination problem, but it would put a cap on how intelligent the LLMs responses could be. If there is only one person who gives a good answer to a question, and thousands of others responding with "I don't know", then the LLM will choose the response "I don't know" because it's way more common.

youtube · AI Moral Status · 2025-10-31T09:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
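Each coded comment carries four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of validating one coding result before display; the value sets below are inferred only from the rows visible in this dump, not an exhaustive codebook, and the `validate_coding` helper is hypothetical:

```python
# Value sets inferred from the responses shown in this dump;
# the real codebook may allow additional values.
OBSERVED_VALUES = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "tragic", "mixed", "fear", "outrage", "approval"},
}

def validate_coding(row: dict) -> list:
    """Return a list of problems with a coded row (empty if it looks valid)."""
    problems = []
    for dim, allowed in OBSERVED_VALUES.items():
        value = row.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

row = {"responsibility": "none", "reasoning": "consequentialist",
       "policy": "none", "emotion": "indifference"}
print(validate_coding(row))  # prints []
```

A row that passes returns an empty list; anything else names the offending dimension, which makes it easy to flag malformed model output in the UI.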
Raw LLM Response
[
{"id":"ytc_UgxJe79ZRUS_9eOtP1J4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"tragic"},
{"id":"ytc_UgwIwA9d-TJy_ELMYyh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxOoxXxs2Faj-YeX7t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyUQ0LirlntAuUax754AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyP8A0kx6ACM3bg6154AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxHfjosJT57L4hmvkR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzubdM5fyd2xONljAN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzTErbDjoVi_FI1WVd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx59i9jiCN5KkOb2ll4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzp8ZqUv3SNaAnW38d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
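The "look up by comment ID" behavior above amounts to parsing a raw response like this one and indexing the rows by `id`. A minimal sketch, assuming the model returns a JSON array with the field names shown (the `index_by_comment_id` helper is hypothetical, and the sample is truncated to two rows for brevity):

```python
import json

# Two rows copied from the raw LLM response shown above.
raw_response = '''
[
 {"id":"ytc_UgxJe79ZRUS_9eOtP1J4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"tragic"},
 {"id":"ytc_UgwIwA9d-TJy_ELMYyh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
'''

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM response and index the coded rows by comment ID.

    Raises ValueError on malformed JSON (via json.loads) or duplicate IDs,
    both of which are possible failure modes of raw model output.
    """
    index = {}
    for row in json.loads(raw):
        cid = row["id"]
        if cid in index:
            raise ValueError(f"duplicate comment id: {cid}")
        index[cid] = row
    return index

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgxJe79ZRUS_9eOtP1J4AaABAg"]["reasoning"])  # prints virtue
```

Indexing once and serving lookups from the dict keeps the "inspect any coded comment" page O(1) per request instead of re-scanning the array.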