Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.

| Sample comment (truncated) | Comment ID |
|---|---|
| “Sounds like the Microsoft exec’s didn’t think too deeply about what the implicat…” | rdc_oh94u2d |
| “This comment section is probably full of artists who are already aware of what A…” | ytc_UgxMMxyTY… |
| “Thanks! I linked to here [from Twitter](https://twitter.com/prototyperspect/stat…” | rdc_f59em15 |
| “What bullshit. They will always need truck drivers who's going to strap or chai…” | ytc_Ugzr_dGza… |
| “i guess the only way how not keep the AI extinct the human and all other spices …” | ytc_UgyTNfynm… |
| “Ah, and I still remember when some people are saying "AI won't steal your job"…” | ytc_UgxYEH9zk… |
| “AI seems like it will infect everything and take over the world now… Real talen…” | ytc_UgxppGN4U… |
| “AI is collectively at 155 IQ. It will be 1500 by 2027. So they say.…” | ytc_Ugx1nSmp0… |
Comment
@NotTheEnd7766 right, but then the argument Dave should have made needed to be different. The example shows why you don’t put an LLM in a role like that, but it says NOTHING about what actual AGI would do. It’s like saying “hey we wrote books and movies about evil AI that kills people, therefore AI could be evil and kill people”.
youtube · AI Governance · 2025-08-26T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugx5YCRGCoCkjdOM2m14AaABAg.AMId3fhlf7CAMO6veh1ih0","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugx62XveXjdXxjqCsVp4AaABAg.AMIcmnCICcDAMMdvHjlh23","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwkpDcIxweX1zW-J2h4AaABAg.AMIbIuqtBGuAMIdcDqIvkS","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxPJj4BpTqnkrE_nO54AaABAg.AMIZAzLJO7LAMN_UFOapNE","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgwbER0RFf0wFJX3rAR4AaABAg.AMIXJ4MQKW8AMIYUakQJOq","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgwbER0RFf0wFJX3rAR4AaABAg.AMIXJ4MQKW8AMIqA8QeEHG","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgwbER0RFf0wFJX3rAR4AaABAg.AMIXJ4MQKW8AMIxtkGAAk4","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgwbER0RFf0wFJX3rAR4AaABAg.AMIXJ4MQKW8AMIyqskZDn4","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwbER0RFf0wFJX3rAR4AaABAg.AMIXJ4MQKW8AMKRlvcv37B","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgwOGSW5jAljACTEphh4AaABAg.AMIUm4ROxlIAMK7iytDiE1","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
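The raw response is a JSON array with one object per coded comment, each carrying the four dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how a pipeline might parse and sanity-check such a response before storing it — note that the sets of allowed values below are inferred from the rows visible on this page, not from a documented schema, and `validate_rows` is a hypothetical helper:

```python
import json

# Allowed values per dimension, inferred from the coded rows above
# (the real codebook may define more categories).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "indifference", "resignation"},
}

def validate_rows(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows where every
    dimension takes a known value; flag the rest instead of
    silently accepting them."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        bad = [dim for dim in ALLOWED if row.get(dim) not in ALLOWED[dim]]
        if bad:
            print(f"skipping {row.get('id', '?')}: invalid {bad}")
        else:
            valid.append(row)
    return valid

sample = ('[{"id":"ytr_example","responsibility":"developer",'
          '"reasoning":"mixed","policy":"none","emotion":"fear"}]')
print(len(validate_rows(sample)))  # 1
```

Validating against a closed set of labels like this catches the most common failure mode of LLM coders — inventing a category that is not in the codebook — before it contaminates downstream counts.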