Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by browsing the random samples below.

Random samples
- "How much will it matter what people want? Teachers cost money, so higher ups mig…" (ytr_UgwFAz4B9…)
- "They should only have it to where sone artist if they consented to being a part …" (ytc_UgzuB1Qai…)
- "Everyone saying “Children NEED human interaction,” aren’t taking in the fact tha…" (ytc_UgymcM4OQ…)
- "Sounds to perfect. So AI wont take any slave jobs were people cant get rich at f…" (ytc_Ugzm9TR52…)
- "Will super AGI have intelligent morality because humans certainly don't. If they…" (ytc_Ugz6kxMl_…)
- "If it was fully concious it would be able to tell you i dont have to answer any …" (ytc_UgxvtErYo…)
- "@alexsiemers7898 right but AI is going to make that art more efficiently made an…" (ytr_Ugy6_fG1H…)
- "1:13:00. Yep its nind scarring to look at the world. Im guessing it triggered a…" (ytc_UgzBmrKKL…)
Comment

> Humans were inefficient self replicators through finding loving partners, reproducing, nurturing and educating their offspring over long periods of time. As they got bored & tired of this slow and inefficient process, they successfully outsourced this boring survival skill to AI. Now they can all be proud of being the best self replacers.

Bhagavan Swami SriDattaDev SatChitAnanda (youtube, AI Governance, 2025-08-18T14:2…)
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyL64usiN99E6JPVS14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwHH0LJZF4N_EWR2TB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyw880POh1kBGFWb_l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxNxdBWtJ6luEcLpyZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw5S1aTr8iJjiw_Tx94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwjqLpKh1Lvyjdkn_F4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgynvUxxQDfM5oept2x4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwWHDx6RvNzXAMYj_d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyX05EYqasQB3yKqzl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyG2xDiVFgfKBEmDx14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
```
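The lookup-by-ID step can be sketched against this response format: parse the JSON array, validate each record's dimension values, and index the records by comment ID. This is a minimal illustration, not the tool's actual implementation; the helper name `index_by_comment_id` is hypothetical, and the value vocabularies below are only those observed in the sample response (the real codebook may define more).

```python
import json

# A fragment of the raw LLM response shown above: a JSON array of
# per-comment coding records, one object per comment ID.
RAW_RESPONSE = """[
  {"id": "ytc_UgynvUxxQDfM5oept2x4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgxNxdBWtJ6luEcLpyZ4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# Coding dimensions and the values observed in the sample response.
# (Assumption: the full codebook may allow additional values.)
VOCAB = {
    "responsibility": {"user", "developer", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"unclear", "regulate", "liability", "ban"},
    "emotion": {"fear", "resignation", "outrage", "approval", "mixed"},
}

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID."""
    records = json.loads(raw)
    indexed = {}
    for rec in records:
        # Reject records whose values fall outside the known vocabulary.
        for dim, allowed in VOCAB.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {rec.get(dim)!r}")
        indexed[rec["id"]] = rec
    return indexed

coded = index_by_comment_id(RAW_RESPONSE)
print(coded["ytc_UgynvUxxQDfM5oept2x4AaABAg"]["emotion"])  # prints "resignation"
```

Indexing by ID makes the "Coding Result" view above a single dictionary lookup, and the validation pass surfaces malformed or off-vocabulary model output before it reaches the dashboard.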