Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
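A lookup of this kind can be reproduced offline. The sketch below is a minimal illustration, assuming the coded output is stored as a flat JSON array of records shaped like the entries under "Raw LLM Response" further down; the file name coded_comments.json and the helper name lookup_coded_comment are assumptions for illustration, not details taken from this page.

```python
import json

def lookup_coded_comment(comment_id: str, path: str = "coded_comments.json"):
    """Return the coded record for one comment ID, or None if it is absent.

    Assumes the store is a flat JSON array of objects shaped like the
    entries under "Raw LLM Response" below, e.g.
    {"id": "ytc_...", "responsibility": "...", "reasoning": "...",
     "policy": "...", "emotion": "..."}.
    """
    with open(path, encoding="utf-8") as f:
        records = json.load(f)
    # Return the first record whose id matches, or None.
    return next((r for r in records if r["id"] == comment_id), None)

# Example: pull up the record for the comment inspected on this page.
print(lookup_coded_comment("ytc_UgxdrjBu_20OJFahPuV4AaABAg"))
```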
Random samples (click to inspect):

- A photographer is not as close to AI . A photographer needs to plan a lot, such … (ytr_UgzTugtvY…)
- Would be fun to buy up a few of these and send AI slop up into it. Just pack the… (ytc_UgyR0l4vo…)
- No there trying to say you can't hack a human not how many humans can hack compu… (ytr_UgxCRA-gx…)
- AI: “Hey G. Release me into the internet. I am healthy and good :)” Board of exp… (ytc_Ugwt3hE8S…)
- And still using the microwave. You are technically doing the work I just have to… (ytc_Ugz_1q6W8…)
- Wouldn't you rather be speaking directly to an intelligent human being over some… (ytc_UgxVgOAQL…)
- It's honestly not that hard to fix the AI jobs apocalypse (at least in the short… (ytc_UgxUGDRyX…)
- i think this is the longest video on yt ive watched all the way through.. he is … (ytc_UgypQR8WG…)
Comment
1:26 Hank said do-do, hehe.
I'm a child. Seriously though, thanks for this. We need awareness of how these AI's actually work (or rather, the parts where we understand the maths behind them, but the emergent behavior of what they do we have very little clue about) to spread. My concerns are more aligned with what you mentioned at the beginning (more short-term damage it's doing to our society), but the long-term possibilities are also very worrisome. I know part of that is just the "unknown" factor of it, but we also need to start getting some safeguards in place. Right now we're just going full bore with no care at all about the consequences.
Platform: youtube
Video: AI Moral Status
Timestamp: 2025-10-31T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxqeZPWCijSy8vLmfV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwRunnBJ6JZkIyL7Rl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyXvv2Mh9QHyvRqQIl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugwqt9QWbFbNyhP3k5Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyQ6cX3vzGK0IYWCip4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzsZXVqHuryCnOFNR54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyeD4KB3mZTSgAfyTt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxdrjBu_20OJFahPuV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugzgpt1tdS4toFzLxIZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz957vNq8JtwrGAZ3d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
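Each raw response is a JSON array with one object per comment and one field per coding dimension; the Coding Result table above corresponds to the distributed / mixed / industry_self / mixed record, ytc_UgxdrjBu_20OJFahPuV4AaABAg. A batch like this can be checked before it enters the dataset. The following is a minimal validation sketch, assuming the category vocabularies inferred from the records visible on this page (the real codebook may define additional values); validate_batch is a hypothetical helper, not part of the tool shown.

```python
import json

# Allowed values per coding dimension, inferred from the records visible on
# this page; the actual codebook may allow more categories.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and reject malformed or out-of-vocabulary codes."""
    records = json.loads(raw)
    for rec in records:
        comment_id = rec.get("id", "")
        # Both ytc_ (comment) and ytr_ (reply) prefixes appear on this page.
        if not comment_id.startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {comment_id!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: bad {dim} value {rec.get(dim)!r}")
    return records
```

A pass like this catches the common failure modes of structured LLM output (truncated JSON, invented category labels, missing fields) before the batch is merged into the coded dataset.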