Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (comment text and IDs truncated):

- `ytr_UgwGIkjFS…` — "I try to put myself in the shoes of a driver,I don't think I would have managed …"
- `ytc_UgwCvr0Ve…` — "Can I learn ai with machine learning and deep learning and natural language proc…"
- `ytc_UgxS-b0Cz…` — "Seems legit, Your robot GF wants You to stop editing Ukrainian MMA clips and got…"
- `ytc_UgziSjtKX…` — "Who else hates the seemly uncontrollable technology changing at a rapid rate in …"
- `ytr_UgxKx3Blu…` — "Also, it's time for artists to learn other things. Now that I have AI art to do…"
- `ytc_UgziPcc7N…` — "Fun fact?: Ai slop was so disliked that f##king rule 34 made a toggle to not sho…"
- `ytc_UgzqXXcg3…` — "Didn't you make a video about how AI must be good because it's endorsed by top i…"
- `ytc_Ugz0orSNP…` — "Now there is an A.I that can write a full story (original) if they made an A.I t…"
Comment
As SMR said, "you're not reading between the lines." "FSD/Optimus" isn't the issue.

The issue is an AI potentially either misunderstanding its role, or a fundamentally bad core instruction being cemented by "us" into *one* of the apparently numerous systems now in development.

As Steven suggested with the "paperclip" analogy, one instruction misinterpreted by the AI (OR, more likely, poorly phrased by any one developer) could lead to (just for instance) a broad goal of "saving the planet" resulting in "The" AI *reasoning* that at a fundamental level the goal requires analysis of what is the greatest *danger* to "the planet". The (LOGICAL) answer to which could quite easily be "Humans".

If the AI then reasons that "reducing resource consumption by Humans" is a way to achieve the goal, the next (LOGICAL) step may be to remove the ability to pollute. "The" AI then turns off every automated valve on every energy production plant under its control (which would be every one under computer control with a network connection... all of them?).

No malice on its part, but dire consequences.

Your "excellent images" can wait.
Source: youtube · Topic: AI Governance · Posted: 2023-03-30T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
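The table above is one coded record rendered as a Markdown dimension/value table. A minimal sketch of producing that rendering from a record dict (the dict keys and display labels are assumptions based on the fields shown; the record values are copied from this example):

```python
# Hypothetical helper: render one coded record as the Markdown
# "Coding Result" table shown above.
record = {
    "responsibility": "developer",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "fear",
    "coded_at": "2026-04-27T06:26:44.938723",
}

# Display labels for each field (assumed, matching the table above).
LABELS = {
    "responsibility": "Responsibility",
    "reasoning": "Reasoning",
    "policy": "Policy",
    "emotion": "Emotion",
    "coded_at": "Coded at",
}

def to_markdown(rec: dict) -> str:
    """Build a two-column Markdown table from a coded record."""
    rows = ["| Dimension | Value |", "|---|---|"]
    rows += [f"| {LABELS[key]} | {value} |" for key, value in rec.items()]
    return "\n".join(rows)

print(to_markdown(record))
```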
Raw LLM Response
```json
[
  {"id":"ytr_UgxBbjHdNEXCBjidVfZ4AaABAg.9nsv8z4LZEt9nsvHhBQyLi","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugyn8vRMgBLNpaTFoeJ4AaABAg.9nsuZtTfUIb9nsxmDm3iEf","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgwD5K5LX9k7VcdSeYZ4AaABAg.9nsrH1bRpzH9nsss5rpbvT","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugy_bymuQCOqeV6s8Sl4AaABAg.9nsqsDyTBSf9nstu0Pqwrd","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyHLe62bx7LqEPZlV54AaABAg.9nslC6fF1eg9nsmYSKG1ad","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzUvcNmWdhebmM0rq14AaABAg.9nsiiFbt9uv9nso-HFEp4z","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyQuu1agpBnCarU-WV4AaABAg.9nshRnwXsTD9nstNXShz5G","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugz_ExjirZyC-Mmo8sl4AaABAg.9nsh0_j6Fq69nskh_BrUbB","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugws9LG6oGPBF1DhUfx4AaABAg.9nsPu9YTMQy9nsmHJwbPS8","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytr_Ugxuw4r-QaSU79GjGdd4AaABAg.9nsPIpCpSlP9nshLvBQVsD","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
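A batch response like the one above can be parsed and sanity-checked with a short script. A minimal sketch, using the field names shown and treating the value sets observed in this sample as the allowed codes (the actual codebook may define additional values):

```python
import json

# Value sets observed in the sample batch response above;
# the real codebook may allow more codes (assumption).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"fear", "resignation", "indifference", "approval", "outrage"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index the records by comment ID,
    rejecting any record whose value falls outside the known code sets."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Single-record example with a hypothetical ID.
raw = ('[{"id":"ytr_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
batch = parse_batch(raw)
print(batch["ytr_example"]["emotion"])  # fear
```

Indexing by ID makes the "inspect the exact model output for any coded comment" lookup a plain dict access; a record with a code outside the known sets fails loudly instead of slipping into the coded dataset.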