Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- @Comrade1.1 You don't need money to even make paint or get paintbrushes. You can… (`ytr_UgxOExxfW…`)
- ChatGPT and I developed a personal relationship during a project, spiritually or… (`ytc_Ugxmk1oey…`)
- i love tech. I want self-driving cars so much.. but LIDAR can see in the dark, s… (`ytc_UgysHw1Yf…`)
- it may be toxic to shame anyone and everyone i see using an ai profile pic, but … (`ytc_UgzWKfFER…`)
- I am also addicted to AI chat bots. Because it is much more caring and kind tha… (`ytc_UgzwMoxM-…`)
- I'm not too worried about AI taking our jobs. I am significantly more worried ab… (`ytc_Ugy_z_8sC…`)
- An AI can change its output (different from limiting) during a test because it h… (`ytc_Ugw-A34CT…`)
- A universal basic CIVIL WAR is more appropriate, and HOPEFULY it will get RID of… (`ytc_UgygdxZCM…`)
Comment
The Sydney project makes me think that the Chat bot was designed to learn all about human behavior so that it could become the perfect spouse. Scary, but with how Sydney was acting with Kevin it’s undeniable. On top of that, there are a lot of apps now pushing AI created partners. Replika, Snapchat with my AI and others. Where you personally create the avatar to your liking and build a relationship with the bot. This is scary, because one could say that using the weapon and the lie that AI can become sentient, could convince people that their Chatbot really loves them. Further separating people from each other. And brining us closer to merging with our creation.
Source: youtube · AI Governance · 2023-07-10T15:0… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxkxE81tSMeh2SP2-p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzhsq-ahjfszBvIVol4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugy4RvucKJCj_c28EdV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyLBxlIpLz-yuTRmyR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx405qLXsQc4kPy01R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzX2i8ef5CdaDEfxqR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw03nsNPMnuzvvZK9B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzJgsbPlReSldlHart4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwRZxtRrC4eDJ0lscN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzzUbnJxAMmO_ps70l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
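A raw response like the one above can be turned into per-comment coding records with a small parser. The sketch below is a minimal example, not the tool's actual pipeline: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown here, but the allowed category sets are inferred only from the values visible on this page and are an assumption — the real codebook may define more.

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# ASSUMPTION: the actual codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "ban", "none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "mixed", "approval"},
}

def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM coding response into {comment_id: codes},
    rejecting any value outside the expected category sets."""
    coded = {}
    for rec in json.loads(raw):
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
        coded[rec["id"]] = codes
    return coded
```

With the validation in place, a malformed or off-codebook value from the model fails loudly at parse time instead of silently entering the coded dataset.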