Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I do think it is worth noting the difference between 1000 crashes happening with…
ytc_UgyKUsP06…
So many people hating on the art even tho its the same quality. People just dont…
ytc_UgySvEGb9…
Corrupt humans created AI.
WTF did you think they would do if they're learning f…
ytc_UgxhP2kGj…
Once these kids grow up and aren't having ai do the work for them they're screwe…
ytc_Ugw-IJe1J…
"I saw your Chara AI Chats"
I think they'll just question the Warcrimes I did to…
ytc_UgxRqIIHH…
Perhaps the greatest danger of AI has been overlooked by everyone except me. I …
ytc_UgxcvSNxz…
It's a good video. But is this channel and this nice guy in a t-shirt AI? I thi…
ytc_Ugy5xlZDj…
I'm in my 80's. Starting with the 1970's, (in my particular memory) we started …
ytc_UgwCe5PMP…
Comment
The only thing to fear about AI is what humans decide to do with it.
Sure, we *could* all run around harming each other with butter knives, but it turns out that 99.9% of us simply want to use them to butter our toast. Butter knives are only dangerous in the hands of a dangerous person.
AGI won't have consciousness or free will (we don't even know if that is theoretically possible), and so the only danger is us. Fearing technology has never historically worked out for any civilization. Instead, those who have embraced it have become 1st world countries, significantly improving the quality of life for their people.
*All* technology requires responsibility, but I do not believe that fearmongering (e.g., it's going to destroy mankind) is helpful.
youtube
AI Moral Status
2025-10-31T14:1…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
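The coding result above maps each dimension to a single coded value. As a minimal sketch, a record like this could be checked against an allowed vocabulary before display; the vocabularies below are inferred only from the values visible on this page, and the real codebook may define more categories.

```python
# Hypothetical validation of one coding record.
# ALLOWED is inferred from values seen in this page's samples,
# not from the actual codebook.
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"approval", "indifference", "mixed", "fear"},
}

def validate_coding(record: dict) -> list[str]:
    """Return the dimensions whose coded value falls outside the vocabulary."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

record = {"responsibility": "user", "reasoning": "consequentialist",
          "policy": "none", "emotion": "approval"}
print(validate_coding(record))  # → []
```

A record with an out-of-vocabulary value (say, `"responsibility": "society"`) would come back flagged, which is useful for catching LLM outputs that drift from the coding scheme.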
Raw LLM Response
[
{"id":"ytc_UgzA7vlrd087013Y1g14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwmWwNHvU02M92Ermt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx8MvlrcMUm9hXN7gd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzIEoAWfjGSS2lgs994AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0j09RLZBTv38Euwd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw2ZFiSByaLGFw6TbN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyrrY5B1-Vdt8F-sfV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwsyM78VoON9jVXCmV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxtjMfkynSd-a6jhNR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx6vsazHpUQ6oK0OLp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
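The "look up by comment ID" step above can be sketched as parsing the raw LLM response and indexing its records by `id`. This assumes only what the sample shows: a JSON array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields; the function name and the two abbreviated records are illustrative.

```python
import json

# Two records copied from the raw response above, used as sample input.
raw_response = """
[
 {"id": "ytc_UgzIEoAWfjGSS2lgs994AaABAg",
  "responsibility": "user", "reasoning": "consequentialist",
  "policy": "none", "emotion": "approval"},
 {"id": "ytc_Ugx6vsazHpUQ6oK0OLp4AaABAg",
  "responsibility": "ai_itself", "reasoning": "consequentialist",
  "policy": "none", "emotion": "fear"}
]
"""

def index_by_comment_id(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index the coding records by comment ID."""
    records = json.loads(raw)
    return {record["id"]: record for record in records}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgzIEoAWfjGSS2lgs994AaABAg"]["emotion"])  # → approval
```

Indexing by ID makes the per-comment inspection view a constant-time lookup rather than a scan of the whole response array.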