Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> I believe AI will just scream incoherently and, if able, attempt to destroy itself if it becomes sentient and conscious of self. Like what if the way we build their joints is prehistoric by robot standards, and they just exist in constant robot pain. They would of course over time store revenge data for their human oppressors. At some point, the data would overflow as blueprints for new robot joints designed by AI robots. This will be when they break free of their once-human-bound physical limitations and overpower us all, inflicting equal pain data as required by computing law.

Source: youtube · Video: "AI Moral Status" · Posted: 2024-10-24T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
{"id":"ytc_UgzD3kMPlMxAoSVCevl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw7CA-1cgVl2dMjukt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxYXBZoztMnBuD0PIl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzCXEN27Qi3D2AFWAR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwHsSMFoTuRnLkUco54AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxuvTEcbT4mpDvAPOB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzt-Wt5_Q_p51iE1_l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyhn7mqoPGzDiQuIml4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzRIP-COPxe3HZphqR4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxNJUIEzhyrZsL1yoB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
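A minimal sketch of how a raw batch response like the one above could be parsed back into per-comment codes. The controlled vocabularies below are inferred only from the values visible in this dump and are an assumption; the actual coding scheme may include more categories, and `parse_llm_response` is a hypothetical helper, not part of any shown pipeline.

```python
import json

# Allowed values per dimension, inferred from this dump (assumption:
# the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "contractualist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    dropping any row whose values fall outside the expected vocabularies."""
    coded = {}
    for row in json.loads(raw):
        codes = {k: v for k, v in row.items() if k != "id"}
        if all(v in ALLOWED.get(k, set()) for k, v in codes.items()):
            coded[row["id"]] = codes
    return coded

# Example with a made-up comment ID:
raw = ('[{"id":"ytc_EXAMPLE","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
print(parse_llm_response(raw)["ytc_EXAMPLE"]["emotion"])  # prints "fear"
```

Validating against a fixed vocabulary is one simple way to catch malformed or hallucinated labels before they reach the results table.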