Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "If I hard coded a program to have the exact same outputs as chatGPT would it hav…" (ytc_UgxydWeox…)
- "AI has not achieved consciousness. It lacks semantic understanding, which is a f…" (ytc_UgyFzCBkb…)
- "I find it somewhat ironic that: - jamming is used to disrupt unmanned systems -…" (ytc_UgxY4zM2e…)
- "The dark side will be all this stranded investment in AI when the next new tech …" (ytc_Ugzhp1iYe…)
- "this is granted that the customer is talking like a robot lmaooo no way they wou…" (ytc_Ugwkn9FLa…)
- "Tbh, I love that AI Art exists, as it is a cool and trippy way to potentially ge…" (ytc_UgxoCBsM-…)
- "Hopefully, there will be surplus wealth and resources created by/with these mach…" (ytc_Ugxp_kSVt…)
- "I think that self driving cars should protect the driver at all cost, because it…" (ytc_UgxU5tCZM…)
Comment

@TXA-TXAT I'm not coming up with any of this. The guys actually creating A.I. have been warning for years now that A.I. is a civilization crusher, an existential threat more dangerous than nuclear bombs, a possible extinction event, etc. That's not me, that's the people actually creating A.I.

youtube · AI Moral Status · 2025-12-12T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugy_WNIbBomIGKAcfI14AaABAg.AQ_iY6lZEwYAQ_ifftVS4n","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugw61iVoFLr5_5NgL6d4AaABAg.AQ_ahDUi0wMAQ_sTHmBDHJ","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugyi9260BQpZLn8Oyv54AaABAg.AQ_ZFGjR5ZuAQ_aKV2F70l","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugyi9260BQpZLn8Oyv54AaABAg.AQ_ZFGjR5ZuAQ_awivC_vN","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugyi9260BQpZLn8Oyv54AaABAg.AQ_ZFGjR5ZuAQ_c3WAhdQc","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwmhVYTzWvcn3kZPmF4AaABAg.AQ_UznOuOu0AQajUCWvfav","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_Ugz3aSVslnH5kTOyCc94AaABAg.AQ_UOqn_hYKAQ_i-Aon1fr","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugz3aSVslnH5kTOyCc94AaABAg.AQ_UOqn_hYKAQcvENmpLyd","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxPNcZga7pERXvs0pB4AaABAg.AQ_PfwLjPa6AQaXJcFgwa2","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxPNcZga7pERXvs0pB4AaABAg.AQ_PfwLjPa6AQiRDRTJ7tT","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
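A raw response in this shape can be parsed and validated before its codes are merged into the dataset. Below is a minimal sketch: the field names come from the JSON above, but the allowed-value sets are assumptions inferred from the values visible here, and the real codebook may define more categories.

```python
import json

# Allowed values per coding dimension (an assumption inferred from the
# examples above; the actual codebook may permit additional categories).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    coded = []
    for rec in records:
        if "id" not in rec:
            continue  # every record must reference a comment ID
        # Keep the record only if every dimension holds a known value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded.append(rec)
    return coded

raw = '[{"id":"ytc_example","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
print(len(parse_llm_response(raw)))  # 1 valid record
```

Dropping malformed records rather than raising keeps one bad line in a batch response from discarding the other nine codings.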