Raw LLM Responses

Inspect the exact model output for any coded comment. Look a comment up by its ID, or browse the random samples below.

Random samples
- ytc_UgyqT-OSY… — "i love how even with experts warning us about AI and robots we just continue to …"
- ytc_UgyVBFEH_… — "I like most your stuff Ken but this video feels half baked. It really needs an e…"
- ytc_UgwYIo3zR… — "Look im not hating but face recognition mixing faces up is bound to happen. Many…"
- ytr_UgwCsmPMK… — "@RezzyFlintboyOfficialMohawkRap This was a reply for specifically for Black Amer…"
- ytr_UgwG7bo8v… — "fanman421 The advancement of communications systems is precisely due to the adva…"
- ytc_UgxTYGrWZ… — "The Internet doesn't care about laws and never has. If you don't want it copied,…"
- ytc_UgxPO0nqo… — "I hate AI so this is funny to me. Also it isn't racist because robots don't have…"
- ytc_UgxdyE3D3… — "it's the same damn thing when people say \"it's only sexist when men do it\" it's …"
Comment
@Treeoflife77777 you do realize AI will not have emotions, unless we give it to them? Emotions, drives, instincts were evolved, because they benefitted us in passing down our genetics, and are not emergent from complexity? We have regions which are specialized towards emotions.
So even if sentient AI ever comes, it will still, fundamentally, be a tool subservient to its creator.
Also define AGI. As per the definition, it does not even have to be in any way concious, sentient. Just good, powerful enough to do almost all cognitive tasks as good as humans, including planning, being creative, adapting skills from other specialties.
The biggest risk, stemming from to our limited understanding, is a risk of hallucinations and mimicking stuff like sci-fi rogue AI's could still emerge and cause some damage.
Source: youtube · 2025-11-13T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgzkygVUItRREoX5aTJ4AaABAg.ASR8SHq32kXAU53rRJArcX","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzWCL3QRI_XkJR1u9p4AaABAg.AQDKbcYSNDzAQDvS4B2yyn","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzWCL3QRI_XkJR1u9p4AaABAg.AQDKbcYSNDzAQFUvJtIaCt","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwIB5bfZSWUdTI46KF4AaABAg.AQ9we4pHizoAQCsdVUxDp8","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwIB5bfZSWUdTI46KF4AaABAg.AQ9we4pHizoAQDvBWqJC4B","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytr_UgyNETD-LVZNzEN1g8N4AaABAg.AQ8LM-jh3PVAQCJu7UNrUj","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgyrzMCzqOmxxK7MBRd4AaABAg.APy9wK4r5ExAPyAGAV7AtV","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_Ugy_Y_2rU3MAA4BrgHJ4AaABAg.AOnmOQStzKqAP7R6WtdVXk","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugy_Y_2rU3MAA4BrgHJ4AaABAg.AOnmOQStzKqAPU0xZ9kvvT","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxZaMiJUjW03FvD6R14AaABAg.AOeezqCZ1gbAOeftBeXrsN","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
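The raw response above is a JSON array with one record per comment, carrying the same four dimensions shown in the coding-result table. A minimal sketch of parsing and sanity-checking such a response (the allowed value sets below are assumptions drawn only from the values visible in this sample, not the full codebook):

```python
import json

# Dimensions as they appear in the coding output above. The allowed values
# are illustrative, taken from this one sample -- NOT an exhaustive codebook.
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"fear", "indifference", "resignation", "outrage", "approval"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against SCHEMA."""
    records = json.loads(raw)
    for i, rec in enumerate(records):
        if "id" not in rec:
            raise ValueError(f"record {i} is missing an id")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"record {rec['id']}: bad {dim}={value!r}")
    return records

# Hypothetical single-record response in the same shape as the output above.
raw = ('[{"id":"ytr_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
print(len(validate_coding(raw)))  # → 1
```

A check like this catches the common failure mode where the model emits a label outside the codebook, before the record is written to the coding table.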