Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Can AI operate without electricity? Well, human can operate without electricity …" (ytc_Ugz7tKEtD…)
- "Doesn't autopilot in the aviation space require pilots to remain attentive and m…" (ytc_UgxKnEgoL…)
- "The thing is--a lot of writers ONLY have an issue with the generative AI part. P…" (ytc_UgwnGo_nK…)
- "Creating a artificial brain( A.I )it's may too dangerous, it's kinda equal to 10…" (ytc_UgxQR_9Sb…)
- "They were getting targetted by neo-nazis so the training data was incredibly ske…" (rdc_dlgsaur)
- "the answer is simple, you can't. It is impossible, this happens long b4 AI even …" (ytr_Ugw-4yPQA…)
- "Emotion cannot be created from AI. Only humans can generate our emotions through…" (ytc_UgxR7mKkz…)
- "@JamesR624 Don't take that route just to cause trouble. You know exactly what an…" (ytr_UgwrjmU3J…)
Comment
> I hate that none of their answers included, "ask the most renound individual in the field, (who ACTUALLY KNOWS about A.I.) what rules they would recommend."
>
> Or
>
> Create basic rules that can not be overridden that must be implemented in EVERY machine learning code; a rule prohibiting ALL neural networks from self-replication, harming any biological entity directly or indirectly, & a mandatory code that does not allow the A.I. to lie, under any circumstance, & it must give a response under EVERY circumstance. This is necessary to make sure by it cannot hide it's eventual Sentience.
youtube · AI Governance · 2023-06-10T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxeioAnN_UveWHRtSR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyBRK3gNn-jy2_rvpB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz_q6614XAOj2Up5BN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzttWsd-aMSEcCIZRl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxQ3ttoehryEBeuPZt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzbeUwn-HnnB6dc51l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgytUTlQgRfjCtVBs794AaABAg","responsibility":"government","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxC8MSI0mRYX3BePlR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgyVys7o1ez0PoQpiPJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgytHY0h2-vAnjSNG4x4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
```
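A raw response like the one above is only useful if every record parses and uses values the codebook recognizes. Below is a minimal Python sketch of such a check. The allowed value sets are assumptions inferred from the sample records shown here (the full codebook may define additional categories), and the record shape matches the four coded dimensions in the table above plus an `id` field.

```python
import json

# Allowed values per dimension, as observed in the sample response above.
# Illustrative only: the actual codebook may include more categories.
SCHEMA = {
    "responsibility": {"none", "company", "ai_itself", "developer",
                       "government", "user", "distributed"},
    "reasoning": {"mixed", "consequentialist", "deontological",
                  "virtue", "contractualist"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "approval",
                "resignation", "disapproval"},
}

def validate_coding(raw: str) -> list:
    """Parse a raw LLM response and check each record against SCHEMA.

    Returns the parsed records; raises ValueError on the first record
    that is missing an id or uses an out-of-vocabulary value.
    """
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing 'id': %r" % (rec,))
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(
                    "%s: unexpected %s value %r" % (rec["id"], dim, value))
    return records

# Usage with a single hypothetical record:
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate",'
       '"emotion":"outrage"}]')
print(len(validate_coding(raw)))  # 1
```

Validating before storing the coding result catches the common failure mode where the model invents a label outside the codebook, so the error surfaces at ingest time rather than during analysis.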