Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples
- "So i think all these writers should just start their own production companies an…" — `ytc_UgxqBVoNW…`
- "I feel having a treaty that is in place and enforced by ethical commanders. Nova…" — `ytc_UgyxconGM…`
- "Jack The cat same force that push the AI agenda, will give rights to it, in orde…" — `ytr_Ugh4jyeyI…`
- "I find it interesting that the theory of singularity is treated as an inevitable…" — `ytc_UgxXsxbaP…`
- "There is no way we have the resources to be able to completely replace workers, …" — `rdc_kyzh4eo`
- "I don't know AI does not answer that way, it's more verbose on its responses, no…" — `ytc_UgzFDU_pV…`
- "It's all fun and games until the robot refuses to give you back the gun 😐…" — `ytc_UgyclAK_6…`
- "So if Nightshade requires images I have an idea since Google is absolutely using…" — `ytc_UgwoUSgPk…`
Comment

> One reason why I will never get in a driverless car.
> Suppose the police was shooting at a suspect and/or both firing at each other & the driverless car w/ you in the backseat drives in the pathway of bullets being fired.
> There are so many other scenerio's where a car w/ no driver can't make a life saving decision & can lead to somebody's death.

Source: youtube · Posted: 2025-12-03T03:0… · ♥ 153
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx1cKu0OK4WIgfSNQt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzr4Cq_pkg545XGck14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwTuZ3s84cI8Nwx2e54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgygVjRrt3Ulm6zaBm94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwMH2rVzw_GGIS6hJx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwR9M0LMKKNDRbyoN54AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwPktm_6_AnFSBcnFh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx5rbt1mihNsKnmWEV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyiw6q758Gq4yanDKp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzxy8TJJajoCykUjgR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
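A raw batch response like the one above can be turned into per-comment coding results with a small parser. The sketch below is a minimal, hypothetical example: the allowed values per dimension are inferred only from the sample records shown here (the real codebook may define more categories), and the function names are illustrative, not part of any actual pipeline.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# NOTE: this is an assumption; the actual codebook may include other labels.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference",
                "resignation", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}.

    Records with a missing id or an unknown dimension value are
    dropped rather than guessed at, so downstream lookups by
    comment ID only ever see valid codes.
    """
    coded = {}
    for record in json.loads(raw):
        cid = record.get("id")
        codes = {dim: record.get(dim) for dim in ALLOWED}
        if cid and all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded

# Example lookup by comment ID (hypothetical ID, same shape as above):
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
result = parse_batch(raw)
print(result["ytc_example"]["emotion"])  # fear
```

Dropping malformed records instead of repairing them keeps the coded dataset auditable: every stored value traces back verbatim to a model response.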