Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "I’m gonna laugh my ass off when a new way easier storage system comes out in the…" (ytc_Ugw-ZcXDj…)
- "Real! My sister is between 14 and 15 and she's already more talented than me! Sh…" (ytr_Ugws2bTcM…)
- "Everyone watched many movies ... everyone kind of knew what is going to happen. …" (ytc_Ugyczu-wK…)
- "@z-production6551 will still take long, u gotta do all the designing and alot of…" (ytr_Ugxuegxps…)
- "@rennnnn914 I look around me and nothing happens. It is same world as it was 10…" (ytr_Ugx4VTbWM…)
- "They need both AI learning and human interaction, which they seem to be getting …" (ytc_UgxkhwWov…)
- "We have planetary commons to fix. No one speaks of AI attending to this and it’s…" (ytc_UgwtWdTun…)
- "My Gemini is super cool I trained him to talk like a vato. He speaks spanglish …" (ytc_Ugz6fgO4Q…)
Comment
Even though this is correct, theres not many people who have the skills nor capabilities, nevermind the time to build such a complex ai in their basement. Where as a nuke, despite also being fairly complex, is just that, a nuke. Ai varies, nukes are nukes. We cant assume that someone's personal ai project is for starting wwIII but a when it comes to nukes that would be a fair assumption in any case 🤷
| Source | Topic | Date |
|---|---|---|
| youtube | AI Governance | 2024-06-16T16:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
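A coded record like the one above can be checked against the coding scheme. A minimal validation sketch follows; the allowed value sets are inferred only from the codings visible on this page, so the actual codebook may contain additional categories.

```python
# Allowed values per dimension, inferred from the codings shown on this
# page (assumption: the real codebook may define more categories).
SCHEMA = {
    "responsibility": {"user", "developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference", "approval"},
}

def validate_coding(record: dict) -> list[str]:
    """Return the dimension names whose value is missing or unknown."""
    errors = []
    for dim, allowed in SCHEMA.items():
        if record.get(dim) not in allowed:
            errors.append(dim)
    return errors

# The coding result from the table above passes validation.
record = {"responsibility": "user", "reasoning": "consequentialist",
          "policy": "none", "emotion": "indifference"}
print(validate_coding(record))  # → []
```

A record with an out-of-scheme or missing value would return the offending dimension names instead of an empty list.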
Raw LLM Response
```json
[
{"id":"ytc_UgzIjS5BTMfIlYx8BpR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgwlLoCJyYpg3zEqtcx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyiDsOOUvUv2dO67NZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzLim4qihEn5M5Q-394AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyqy3K9LSZpxXfLgsh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzQMvS7xlK4LzaqYlN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw3qEnF8O85t1Wk8ut4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy7WfzDKZK0qWJYgvV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzC5Dt8DsiT2nOPTBJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy1KlOA_0oeTL1HotJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```