Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
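Lookup by ID can be sketched as a small helper, assuming coded records are held as a list of dicts with an "id" field (the function name and storage layout here are illustrative, not the tool's actual implementation). Because the UI truncates long IDs, the helper matches on an ID prefix:

```python
from typing import Optional

def lookup_by_id(records: list[dict], id_prefix: str) -> Optional[dict]:
    """Return the first coded record whose ID starts with id_prefix.

    Prefix matching mirrors the truncated IDs shown in the UI
    (e.g. "ytc_UgyHCyRor…" stands in for a longer YouTube comment ID).
    """
    return next((r for r in records if r["id"].startswith(id_prefix)), None)
```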
Random samples — click to inspect

- ytc_UgyHCyRor…: "These AI Stans lack ability and refuse to acknowledge that no one is born an art…"
- ytc_UgjzNTXzu…: "Only will work if EVERYONE has self driving, along with proper safety for large …"
- ytc_UgyMeB-9N…: "**PROGRAMMERS** Programmer 1: \"They just announced a new AI model that was trai…\""
- rdc_kiijwr3: "Machines/automation replaced blue collar workers over the last 50 years, AI will…"
- rdc_nu1pi04: "I work in academic technology at a Big 10 university and this is the way testing…"
- ytc_UgxQS84kL…: "AI may not replace skilled software engineers who perform tasks greater than jus…"
- ytc_Ugyw4ZVDB…: "Uh....yikes. I actually got chills several times during this video. And there wo…"
- ytc_UgxCTHRXQ…: "Maybe more money should be spent on support services so people don't need to res…"
Comment
Just a novice idea here: even if people were responsible enough to require AGI to think in human language, so it could be monitored,
the AGI could find a way to embed its own coded language into patterns in english, to have its own secret alien communication with itself or other AI while only using human letters.
EDIT: Oh... that's literally happening? 13:09
Idk how that could be prevented.
| Source | Video | Posted | Likes |
|---|---|---|---|
| youtube | AI Moral Status | 2025-10-31T04:3… | ♥ 2 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
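A Coding Result table like the one above can be rendered from a single record dict plus its coding timestamp. A minimal sketch (the function name is hypothetical; only the table layout comes from the display above):

```python
def render_coding_table(rec: dict, coded_at: str) -> str:
    """Render one coded record as a markdown | Dimension | Value | table."""
    rows = [
        ("Responsibility", rec["responsibility"]),
        ("Reasoning", rec["reasoning"]),
        ("Policy", rec["policy"]),
        ("Emotion", rec["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {name} | {value} |" for name, value in rows]
    return "\n".join(lines)
```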
Raw LLM Response
[
{"id":"ytc_UgxV8vgwmKcDgMum4w54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwG15S7YkMb3DLuvjF4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwI7HSH8iftaBPJmzB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugza6nUEuU0Jm_HnM0F4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzyw6_2xAt_gL-E9Mt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyTanRaGZXmnFBTU194AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx3yjOnNHM-JUI6YIR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxZm2WJibEPTyCvE1x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugznv2d0fWWUmHT9fs54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzuabJJ9Dxri4gCwjt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
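A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below assumes each dimension is constrained to a fixed label set; the sets shown are inferred from the records above and may be incomplete:

```python
import json

# Label sets inferred from the raw responses above; hypothetical and
# possibly incomplete relative to the project's full codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "government",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded records) and keep
    only records whose every dimension carries a known label."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items())
    ]
```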