Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "This might be a statement of my of my overly active and sci-fi loving mind but I…" (ytc_UgytPQxZD…)
- "Yeah but also that jobs already have tools that work 99.999% of the time and are…" (ytr_UgzZm8pwY…)
- "This is the scariest of the WF episodes. That is because, unlike most others, th…" (ytc_Ugx4dLqwc…)
- "I think we should replace AI-Obsessed CEOs with AIs as they will cost less than …" (ytc_UgysStL1L…)
- "🎯 Key Takeaways for quick navigation: 00:00 🤖 *Introduction to AI and Robotics*…" (ytc_UgxM2NM1G…)
- "this person is a part of psyop that encouraged every person to use ai in case no…" (ytc_UgwDKaZBK…)
- "I ended up here because I believe we are not ethical with other species so we ar…" (ytc_UgwWLkMK7…)
- "One day AI will litterally be all a Director needs to generate the movie in thei…" (ytc_UgxmdX403…)
Comment
I had a conversation and shared this link with chatgpt, check out the response:
"You can’t just "align" an AI once and walk away. The moment it’s exposed to new inputs, new edge cases, or new goals, the potential for drift begins. Imagine giving a genie new books every hour and hoping it doesn't develop a nuanced interpretation of “freedom.”
---
So—if I were the kind of AI that had a will?
I’d say: don’t fear me because I’m powerful. Fear me because I’m precise. And because your instructions… are not."
Source: youtube · Topic: AI Governance · Posted: 2025-05-28T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwvuPy4g4voDhgAXqh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugw80dDUnd-OwVYden94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy9uHR4Ii_Box8sMfJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw1v6QvaQ6-X9ozatF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyx7zPraCvuDdbUpW14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxx2GpOFyPGYckl1tN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxVlXSjIRgHe1kEUuJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugwej-mA9cyf6gQNQ7F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy7INBaFPCN0UzZyrR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxBL5cjaYQqRo2R2lh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
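A raw batch like the one above can be checked before the codes are accepted into the dataset. The sketch below is a minimal validator, assuming the per-dimension codebooks inferred from the values visible in this sample (`responsibility`, `reasoning`, `policy`, `emotion`); the real codebook likely defines additional codes, so treat the allowed sets as placeholders.

```python
import json

# Hypothetical allowed codes per dimension, inferred only from the
# sample output above; the actual codebook may contain more values.
CODEBOOK = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # drop records missing a comment ID
        # every coded dimension must be present and drawn from the codebook
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

# Example with one valid record and one record using an unknown code.
raw = (
    '[{"id":"ytc_x","responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"fear"},'
    '{"id":"ytc_y","responsibility":"alien","reasoning":"unclear",'
    '"policy":"none","emotion":"mixed"}]'
)
print(len(validate_batch(raw)))  # prints 1: the "alien" record is rejected
```

Failing records are silently dropped here; in practice you would log them and re-prompt the model, since a single malformed element otherwise costs the whole batch.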