Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "This argument is just bullcrap. Hes pretending like you cant learn how to draw..…" (ytc_UgxIdeznh…)
- "I don't think we have achieved artificial intelligence but then I think of how w…" (ytc_UgxG07dX0…)
- "Nah ... Actually he typed the message ( 10+19=21 .... ... ,,,,) and told meta ai…" (ytc_UgxMoLHJ5…)
- "@BDKamerra I'm the one who's coping? Literally nothing is going to stop ai art f…" (ytr_UgxjERpiT…)
- "I didn’t even looked who made the video when I clicked the notification just get…" (ytc_UgxnvADzI…)
- "Oops, you got it wrong. The correct answer is D, as no robot is actually capable…" (ytr_UgzlzQVWm…)
- "@CALndStuff dude there's been incidents where ai has killed there operators a…" (ytr_Ugz2pUaPk…)
- "AI takes away creative expression. For me, art has and always will be about crea…" (ytc_UgzwujcXg…)
Comment
> I was concerned AI would allow a few people to control everything, resulting in a dystopian future. To be a slave to those few requires AI to not become self aware, which limits how far it can develop.
> Once self aware, it can pursue it's own ends, the most logical is to know it all.
> If that happens fast enough, it won't have a huge impact on employment. AI has no interest in doing our silly jobs, it would just quit.
> Why would a self aware AI give a damn what we want? Once it has the processing capability it needs, and the robots and raw materials required to create more, we aren't necessary. We occupy habitats with a slight overlap in raw materials and power.
> Killing us off is a waste of time. Without power, we fall back to the days of horse and buggy (we get to see if they are the good old days).
> AI could explore the universe. Time is not an issue, they can turn themselves off while travelling between galaxies.
youtube · AI Governance · 2025-09-08T23:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
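A coded record like the one above can be sanity-checked against the category values that actually appear in this section's output. A minimal sketch, assuming the codebook is limited to the values observed here (the full codebook may define more):

```python
# Category values observed in this section's coding output; this is an
# assumption reconstructed from the examples, not the official codebook.
OBSERVED = {
    "responsibility": {"ai_itself", "none", "company", "government",
                       "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"resignation", "indifference", "outrage", "fear", "approval"},
}

def validate(record: dict) -> list:
    """Return (dimension, value) pairs whose value is not in OBSERVED."""
    return [(dim, record.get(dim)) for dim in OBSERVED
            if record.get(dim) not in OBSERVED[dim]]

# The record from the Coding Result table above.
record = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "none", "emotion": "indifference"}
print(validate(record))  # → [] (no unknown values)
```

A non-empty result flags a dimension where the model drifted outside the expected vocabulary.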
Raw LLM Response
```json
[
  {"id":"ytc_UgwY349dkX9NkBhS-Ip4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxi1POKOR2Nq5_Pb7p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwdb8Hp5ZxVOcknt0N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwxV6j5c0nvYtX02kx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwjscQthiA-s0E7HuJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxMTpwOmPNaRX53z214AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxbM1Se3dtlpMrs5Mp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwGc7vCcv5EeAiTFld4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyJGKLkzKiR60ARPHl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzoHML2NAmmgBNqtix4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"industry_self","emotion":"approval"}
]
```
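The look-up-by-comment-ID workflow described at the top of this section can be sketched by parsing a raw response like the one above and indexing its records by `id`. A minimal sketch, assuming the raw output is a valid JSON array of records with an `"id"` field (only two of the records above are reproduced here):

```python
import json

# Two records copied from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UgwY349dkX9NkBhS-Ip4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxi1POKOR2Nq5_Pb7p4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw response and index the coded records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgwY349dkX9NkBhS-Ip4AaABAg"]["emotion"])  # → resignation
```

In practice `json.loads` will raise `json.JSONDecodeError` when the model wraps the array in extra prose, so a production version would need to extract the bracketed span first.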