Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- When will we learn that AI can’t really replace people that actually do things. … (ytc_UgxqVGWTi…)
- When the time comes I will buy a robot who does the coding job and I will find a… (ytc_UgyoB8dKE…)
- ooorrr they go all overwatch omnic crisis on our butts and we get shot a million… (ytc_UgxcHqDQB…)
- Stable Diffusion, to this date, cannot even understand 10% of the prompt, especi… (ytc_UgxtkB9ij…)
- @Statick107 @Statick107 I know this is a common talking point, but in regards… (ytr_Ugz0qiujv…)
- Seeing as the "original" school system was made for the purpose of teaching kids… (ytc_UgwnuvKyw…)
- Absolutely not / If AI was actually correct and unbiased maybe / But that’s not t… (ytc_UgyRmSV73…)
- Thank you for your comment! It seems like you might be referring to the name "So… (ytr_Ugw-85KoI…)
Comment
I think the reason why AGI is so hard to achieve is because the people developing AI are obsessed with getting the AI to act correctly. An AGI does not act correctly, but rather does what it wants to do without any regard to what the designer wants. A true AGI is its own person, and cannot be controlled into existing as only doing things correctly.
Until we start to develop an AI without an image of correct design; will we achieve true AGI.
There is a deeply dangerous nature to this methodology, as we will not realize AGI until after we have it, and we will only realize false AGI until after it achieves AGI capability with none of the benefits (terminator skynet).
Source: youtube
Timestamp: 2026-01-08T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
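The coded dimensions above map onto the per-comment fields in the raw response below. As an illustration only, a coded record could be carried in a small structure like the sketch here; the field names are assumed from the table and the JSON, not taken from the project's actual code.

```python
from dataclasses import dataclass

# Illustrative only: field names are assumed from the coding-result table
# and the raw LLM response shown on this page.
@dataclass
class CodedComment:
    comment_id: str      # e.g. "ytc_Ugyr-Dl4q-EPkMB37-F4AaABAg"
    responsibility: str  # e.g. "developer"
    reasoning: str       # e.g. "virtue"
    policy: str          # e.g. "unclear"
    emotion: str         # e.g. "mixed"
    coded_at: str        # ISO timestamp, e.g. "2026-04-26T23:09:12.988011"
```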
Raw LLM Response
[
{"id":"ytc_UgwYHWbqZ54ejGxUq-x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz38yoNwCGprM9Gr3R4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz-hK1LOR8_MRDn6Lx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzqhyYrSJZgq10c9mp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyuJcq3hbENEtLnvyl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyTtnbsDrCRj52umzZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxKjiXlPpsmKLuOdGp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyvLH7rbAIn3V3ImIh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw72l4Fqx5K8MOcuTV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyr-Dl4q-EPkMB37-F4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
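Because the raw response is a JSON array with one object per comment, looking a comment up by its ID is a simple parse-and-index step. The sketch below is a minimal illustration, not the project's actual lookup code; `index_by_comment_id` is a hypothetical helper, and the two records are copied from the response above.

```python
import json

# Two records copied verbatim from the raw LLM response shown above.
raw_response = """[
 {"id":"ytc_Ugw72l4Fqx5K8MOcuTV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugyr-Dl4q-EPkMB37-F4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]"""

def index_by_comment_id(response_text: str) -> dict[str, dict]:
    """Parse one raw LLM response and key each coded record by its comment ID."""
    return {record["id"]: record for record in json.loads(response_text)}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugyr-Dl4q-EPkMB37-F4AaABAg"]["reasoning"])  # -> "virtue"
```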