Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "That would be quite an interesting world! Having a robot friend could definitely…" (ytr_UgyHxsu2s…)
- "Mark my words....this is going to be a reality soon. AI and Robotics will gradua…" (ytc_UgzFUPLsL…)
- "Not surprised that the government wants to deregulate AI, especially the parts t…" (ytc_UgzBa_MVS…)
- "this is the worst argument EVER and people make it all the time. you do not have…" (ytr_Ugz-gUdGa…)
- "hey bro. You can do the weirdest shit with your art. Ai can only do so much 🙏…" (ytc_Ugyl07J_N…)
- "The Plagiarism AI is not the most accurate. It will claim you plagiarized your W…" (ytc_UgxdU1wvt…)
- "i went eren on chatgpt and tried to convince him to take over the world…" (ytc_Ugxjt5vPT…)
- "the believer ai is like she has some feeling itself its not logical she just mak…" (ytc_UgyGs2bAR…)
Comment
If he says we have about 5 years to put some sort of containment in place; that’s a very short timeframe. We are already beyond our capabilities and will be seduced into deeper water by exactly what he says — the benefits and conveniences.
No one forced us to allow Alexa into every facet of our lives. Alexa knows everything about your personal preferences, habits, timing, and it’s not AI yet. But it has all the data that AI needs to control our activities. People choose to give over their autonomy for convenience.
I am not a religious person, but this sounds like the apple in Eden. We are not forced out of our current safety and security; we are lured by promises of something better.
youtube · AI Governance · 2023-05-25T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyjAeXNnDhoJZZPeB94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyPAJXCWT6w_1PmC_R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwAwIcPqQAp1RGKe1l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwuR-BHkyoNjHg1I454AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxMalVyFophqIKxhgN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyExllfujrnBuY6iG94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxILtbbV8XLZmbHzxZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzTb1BfjunWxzj2jXl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzlC702j4PudGHjlwF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgybpSZE4Uf8KcxpHpx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
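The raw response is a JSON array keyed by comment ID, so look-up by ID amounts to parsing the array and indexing it. A minimal sketch of that step, assuming only the field names visible in the response above (the variable names are illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment,
# using the schema shown above (id, responsibility, reasoning, policy, emotion).
raw = """
[
  {"id": "ytc_UgwAwIcPqQAp1RGKe1l4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

# Index the codings by comment ID so any single comment can be inspected.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up one comment's coded dimensions by its ID.
record = codings["ytc_UgwAwIcPqQAp1RGKe1l4AaABAg"]
print(record["policy"])   # liability
print(record["emotion"])  # fear
```

The record retrieved here matches the Coding Result table above (user / consequentialist / liability / fear); the same index would serve both the ID look-up and the random-sample inspection views.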