Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

"Yeah, you forgot one thing. Humans have weapons and in this situation, I could t…" (ytc_UgzEqfkez…)
"So.... I had to listen to this twice. This guy has contracted himself at every t…" (ytc_Ugyka5bFP…)
"Don't stock with technologies but learn them until they are irrelevant or comple…" (ytc_UgxkDTzA2…)
"Depends which religious system controls the ai programming doesnt it. Then th…" (ytc_Ugwe2dNic…)
"Now what could a robot gain from starting a business? How could a robot relate …" (ytc_UgzlSYXOH…)
"The future you're describing? That's Star Trek. Complete automation of all manuf…" (ytc_UgwU4e4ZK…)
"Pretty words, I truly enjoyed this video your formating and style is captivating…" (ytc_UgywYORfx…)
"I gave AI all the relevant information on the characters of my novel, then I tal…" (ytc_Ugx_Cruod…)
Comment
Coming off of the videos where you implore us to not indulge in sensational beliefs about aliens just because we want to not be alone in the universe to talking about LLM's have preferences, agency, and intelligence just feels wildly dissonant. Just because we don't know exactly what's going doesn't mean text LLMs with billions of dollars of processing are getting us closer to AGI. It's inserting the most interesting/sensational possibility that gets us further away from talking about Conversational LLMs like ChatGPT in a more accurate way.
Source: youtube | Video: AI Moral Status | Posted: 2025-10-30T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxgrK6C2Uao6798G7R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzrYwQ_ZYtGkegqHtV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw-_boNT2UHH-KKDep4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxtpzWAN0_e8eE9p-F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx4EJsMOUikWacNTml4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxF4bXUctfpg4nSK9h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyN9kO7i9XbC_VyJI14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzaOS5tyiTeC6YSXLd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz9FH0P2EV96FON3Yx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyIkkQde0j9HOJ2gU94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"})
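Note that the raw response above ends with a stray `)` where the closing `]` of the JSON array belongs, which would make the batch unparseable and is consistent with every dimension in the Coding Result table reading "unclear". As a minimal sketch of how such a batch might be turned into per-comment codes (the function names `parse_batch` and `codes_for` are hypothetical illustrations, not the pipeline's actual API), assuming the pipeline falls back to "unclear" whenever the model output fails to parse:

```python
import json

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response into {comment_id: codes}.

    Returns an empty dict when the output is not valid JSON (e.g. a
    stray ')' in place of the closing ']'), so every comment in that
    batch falls back to "unclear" on lookup.
    """
    try:
        entries = json.loads(raw)
    except json.JSONDecodeError:
        return {}
    return {
        e["id"]: {dim: e.get(dim, "unclear") for dim in DIMENSIONS}
        for e in entries
    }


def codes_for(comment_id: str, batch: dict) -> dict:
    """Look up the coded dimensions for one comment ID."""
    unclear = {dim: "unclear" for dim in DIMENSIONS}
    return batch.get(comment_id, unclear)
```

With a well-formed batch, `codes_for` returns the four coded values; with the truncated or malformed output shown above, `parse_batch` yields an empty mapping and every lookup degrades to all-"unclear", matching the table.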