Raw LLM Responses
Inspect the exact model output for any coded comment: look one up by comment ID, or pick from the random samples below.

Random samples
- ytc_UgzjV37ln…: "I’d definitely want a robot nurse once they can do a better job. Less infection,…"
- ytc_Ugwpoi8eE…: "if you used ai for art and calling yourself a "real artist" then might well micr…"
- ytc_UgyM9fWdT…: "I think it’s funny that a grand majority of those negative comments seem written…"
- ytc_Ugz2BAurs…: "Ai "artists" are so dumb that they dont even know what "improvise" is Like one t…"
- ytc_Ugx5MsVAW…: "AI has already proven to be nothing more than propagandists for the left. Consid…"
- ytc_UgwCvwMCo…: "Seriouly, anyone who thinks ai will soon replace humans never used ai more then …"
- ytc_UgwYWeVJn…: "i start arguments with ai bots for no reason and i always get furious at them be…"
- ytc_UgzP9mr51…: "Mathematics say and Yan LeCun, AKA Godfather of AI, AGI is not achievable in the…"
Comment
> It's embarrassing how little Neil understands about AI. He is essentially Hassan answering questions without an adequate knowledge in the topic. This was made clear in a conversation between him and Sam Harris
>
> His definition of AGI is sloppy and inaccurate and he clearly doesn't understand or believe what is AGI and it's implications. He provides no logical rationale for his belief that it will all be fine. It'll be fine because the people working on it are into it? WTF. This is the same view espoused by experts that are wanting to push to AGI without adding any guardrails.
>
> And we won't be interested in AGI or find it useful?? How does a smart guy say such ignorant statements that highlight a total dearth of understanding.
youtube · AI Moral Status · 2025-10-21T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyJc2GePOl38RIb7Ld4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugxc2b_JzO5DHFmUxBp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugwgz4eBSvmV63WgaO14AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzkY8IRhJzC7jeQyOF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzquBvwMbxPXxBTS0R4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxR8LDXdBZbfIO2M654AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwKQ_aArcDcKiiwmHB4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzrOMDH0oCCWsV6xAF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgyOGa8EmrGHXOFCyx94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw8GvlSf50F_muJN554AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
```
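The raw response is a JSON array of per-comment codes. A minimal sketch of how such a batch could be parsed and indexed to support lookup by comment ID, as described at the top of the page (the `index_codes` helper is illustrative, not part of the tool; the two sample entries are copied from the batch above):

```python
import json

# Two real entries copied from the batch above; the full response has ten.
raw = '''[
  {"id": "ytc_UgyJc2GePOl38RIb7Ld4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw8GvlSf50F_muJN554AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]'''

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_json):
    """Parse a batch coding response and key it by comment ID."""
    by_id = {}
    for rec in json.loads(raw_json):
        # Keep only well-formed rows that carry an ID and all four dimensions.
        if "id" in rec and all(d in rec for d in DIMENSIONS):
            by_id[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return by_id

codes = index_codes(raw)
print(codes["ytc_Ugw8GvlSf50F_muJN554AaABAg"]["emotion"])  # prints "outrage"
```

Keying by ID also makes it easy to join the model's codes back onto the original comment metadata (platform, video, timestamp) shown above.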