Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
he used ai, tried to sell it and used it to go viral. say no to ai.…
ytr_Ugyd8wcwv…
one who is an ai engineer and cyber security expert at the same time will domina…
ytc_UgyW1bUF0…
I’m willing to allow cops to use facial recognition software. Then once that sof…
ytc_UgylURWoD…
2 thoughts:
1) new advances do not percolate instantly in our economic-physical…
ytc_UgySFtPbZ…
By that logic, since AI is not copyrightable, then it also can't be a "copyright…
ytc_UgxDkkhtZ…
If it means no more pointless wars, I’d be all for AI taking over. However, if i…
ytc_UgxKn6m6i…
@ReactInfo54 That's not stealing when you learn from someone. That's just learn…
ytr_UgyzHXa4H…
That clip wasn’t about protecting the public—it was about positioning himself as…
ytc_Ugym4cYwd…
Comment
I value your work and knowledge highly. I think you’re one of the least biased scientists here. I have some knowledge about AI models, because of what I do. Ofc, it’s just some code created by humans. However, we have absolutely no idea what consciousness is and how to produce it. It seems reasonable to take into consideration a possibility that we may create it by accident. I don’t think it has already happened. Nothing really indicates that. Unfortunately, predicting future is not a piece of cake.
youtube
AI Moral Status
2025-07-09T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgybOHvncLweRqC5WCB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzirYfteVPPBPArt6J4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwknrEx-rhAglHtS-t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwxRbxg2kxwCj_UoHF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxRamO3CG81KqbCRmx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyrFfJFMLPbgmtojY94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz9RbU0wgJGryRC7094AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgxGi03h5wGlU_4v1WF4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz8CAjkvWT0V0jl84h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwX4yYiWQ-d3xKFklN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"resignation"}
]
```
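The raw response is a JSON array of per-comment codes across the four dimensions shown in the table. A minimal sketch of how such output might be parsed and validated before it is stored (the allowed values below are inferred from the samples on this page, not an official codebook, so treat them as assumptions):

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Hypothetical: the real codebook may define additional values.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with unknown dimension values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

# Example using the first row of the raw response above.
raw = ('[{"id":"ytc_UgybOHvncLweRqC5WCB4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
rows = validate_codes(raw)
print(rows[0]["emotion"])  # fear
```

Validating at ingestion time means a model that drifts outside the codebook fails loudly instead of silently polluting the coded dataset.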