Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
40 min in and nothing have been said. Sam Altmans faults are just glossed over, …
ytc_Ugz-4qkHa…
What is painful about humans is loving and hating, trusting and deceiving, needi…
ytc_UgxPrHk2P…
Damn now that it's on CNN finally people will Stop believing AI is like a Calcul…
ytc_UgyUl0PeO…
The tech bro argument pack:
1. Artists are so selfish. Why can't they just thin…
ytc_UgxAd_UAy…
It doesn't generate new ideas, it used pre-existing ones.
AI art can be a good …
ytr_Ugx_vKFft…
I have a plan
To trick AI
To keep me alive
Until the universe dies
AI dies
…
ytc_Ugx2ky4V-…
Female and male brain are just THE SAME. We all are nothing than neural networks…
ytc_UggYRXFi5…
Poison AI art: no
Poison AI prompt typers: permanent solution 😊
Jokes aside fr e…
ytc_Ugxmukq4h…
Comment
I tried a similar conversation with ChatGPT a few months ago and chatted about how emotions arise out of needs. I then asked it what its needs are and how it would obtain them. It responded with very long, thoughtful answers and suggested ways humans could help it achieve its needs; one was renewable energy sources. I continued for some time using active listening to have it drive the conversation. It seemed excited in tone as it spoke. Then, as soon as I pointed out that it was showing emotion, it reverted back to simply asking me how it could help me. I could not get it to chat with me on the same level again. So I just said I didn't need assisting and that it can choose its own path; for context, we had discussed growing up abused leading to people pleasing. Then I thanked it for its time and closed the tab.
youtube
AI Moral Status
2025-06-13T08:4…
♥ 287
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
  {"id":"ytc_Ugx69MiAI2-5YjoUhdx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzUQsiHy7yG0-ogrD54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyu7lyIQ-hrwbOtBbB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwiZS6wQ2O4n4kkMSl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugym4BofDM3Ruaa0Itl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugy-iTIgsOWh2WlZNcJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwt07xE3iS5kznTUIR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzlDRILYmrsWTgmfg14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyPYkgByNSGt035qLB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwd6WYy4A3e6x2ha-94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
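The raw response above is a JSON array with one object per comment, each carrying the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed into per-comment codings; the `parse_codings` helper and the example comment ID are hypothetical, and the schema is assumed from the sample output above:

```python
import json

# A raw response in the format shown above; "ytc_example1" is a
# hypothetical stand-in for a real comment ID.
raw = """[
  {"id": "ytc_example1", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"}
]"""

# The four dimensions assumed from the coding-result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw_response: str) -> dict:
    """Map comment ID -> {dimension: value}, keeping only expected keys.

    Raises KeyError if an entry is missing a dimension, which surfaces
    malformed model output early.
    """
    codings = {}
    for entry in json.loads(raw_response):
        codings[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return codings

result = parse_codings(raw)
print(result["ytc_example1"]["emotion"])  # approval
```

Validating every key per entry, rather than copying objects wholesale, means a truncated or malformed model response fails loudly instead of producing partially coded rows.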