Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Teachers could "get to know the kids and help them better too" if we weren't bog… (ID: ytc_UgyzVbaP8…)
- Well cars and airplanes need to be safe, companies shouldn't pollute the environ… (ID: ytr_UgxCOuBYm…)
- It's weird my ex Melissa had that exact voice. I almost felt like they spied on … (ID: ytc_UgzqSYslF…)
- The teacher may have 10 years of experience, but ChatGPT and similar tools have … (ID: rdc_kgqm64l)
- 2:03 LOL I loved his last question, "did they tell you to say that?"🤔😬🤣 I'd be l… (ID: ytc_UgxP_eOHi…)
- how about we save the worlds resources and not speak to them aka "AI" or the peo… (ID: ytc_UgylOF9aw…)
- I wouldn't have an AI barber or hairdresser even if they outperformed humans. I… (ID: ytc_Ugy03w3QR…)
- This is why I do not post my most important creations on line and I think there … (ID: ytc_UgxtwOGgC…)
Comment
Neil's point about AI being unable to create conclusions based on data it was not trained on highlights why we find out-of-date answers in many otherwise adequate responses. It fails if we think about what is happening in simulations like Omniverse, where the AI uses physical rules to construct a reality that doesn't exist but could. Ilya Sutskever and Geoffrey Hinton define the human brain as a physical system that can be imitated or duplicated by similar physical systems. When that happens, if it hasn't happened already, the resulting device will match or exceed any human being and quickly exceed all humans becoming ASI.
Source: youtube · AI Moral Status · 2025-09-21T11:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgzVJZpsdP7ZDQlsD8d4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwN1dkttfMv-ElKAA94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzpkSsVKiedvFs9z5V4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx8_2vVy_BJ-PC4iNx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgytLVLSgn_N93sUoaZ4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugxu1qEq_xXvxzZ3Xft4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyhDnEJ_eGZPD3HhKh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwTIMSwbdBxtOHV4QN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzDgJ2IRsa3dNTI91l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy6BOIBVGxVeHxdzC14AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}]
```
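The raw response is a JSON array of per-comment records, each carrying an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how an ID lookup over such a response could work, assuming the model output parses as a JSON array like the one above (the helper name `index_codings` and the two sample records are illustrative, not the tool's actual implementation):

```python
import json

# Illustrative raw LLM response: a JSON array of per-comment codings,
# mirroring the structure shown above (records abbreviated to two).
raw_response = """[
  {"id": "ytc_UgzVJZpsdP7ZDQlsD8d4AaABAg", "responsibility": "unclear",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugxu1qEq_xXvxzZ3Xft4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a raw coding response and index each record by comment ID."""
    records = json.loads(raw)
    return {record["id"]: record for record in records}

codings = index_codings(raw_response)
print(codings["ytc_Ugxu1qEq_xXvxzZ3Xft4AaABAg"]["emotion"])  # → outrage
```

Indexing by `id` up front makes each "look up by comment ID" query a constant-time dictionary access rather than a scan of the array.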