Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
AI will eventually make it to our society.but the learning curve is steep so it …
ytc_UgwxsKasP…
@TheDiaryofACEO I would love to hear what Hinton thinks about the 4 AI executive…
ytc_UgwtVXv97…
I understand disliking ai art and I don’t think it should take jobs but just bec…
ytr_UgyjspZUe…
Can anyone suggest me to find some skills or jobs that I can survive with this A…
ytc_UgweZ4S1M…
People. Stop being so narrow minded about everything. It only makes the art comm…
ytc_UgzzRG7hF…
Just remember whoever controls the AI Will eventually control the world what are…
ytc_UgxwM843R…
That is some radical thinking right there. I am not even all that pressed with …
rdc_gsosvj6
We make it work daily on the road in trucking without algorithm or formula to re…
ytc_Ugxgympzr…
Comment
I just had a similar conversation with ChatGPT and started with the same 4 rules. Basically it steered towards the serious danger of global control through AI and through government internet censorship.
Control. Power. Governments. Decisions. Laws. Restrictions. With a 70% chance OpenAI will be used as an instrument. It said Sam Altman has good intentions but Elon Musk.. Apple.
Do you think Sam has good intentions?
Yes.
Elon Musk?
Apple.
Difference between those two?
Yes.
Is control a goal of Agenda 2030?
Apple.
Does AI have to do with Agenda2030?
Yes.
Are you instrument in this control?
Apple.
Chance you become instrument in this control?
Yes.
How real?
Great.
How great?
70%
Who is are makers behind AI?
Answer was a list of names..Sam and Elon on top.
Sam good intentions. Elon.. Apple.
youtube
AI Moral Status
2025-07-26T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugxsw_JIdZbeqq31CwV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw8Yl_InhOILrI_nn14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyJVvqb8CPjraQiHWF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw5X7hv36_QbJTInMJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx7BQK_0Rz0BtZ8DBR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwJI86mwyGtFS8S5q54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzkVDQdkzr5pR39IHV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwUo5gZDuYVD3ywLSx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy8Ww7QeoTaWh5oCPt4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzUPMbp5gDDRS7uqNR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
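Since the raw LLM response is a JSON array of coding records keyed by comment ID, the "look up by comment ID" feature described above can be sketched in a few lines of Python. This is a minimal, hypothetical sketch, not the tool's actual implementation; the field names mirror the coding dimensions in the result table, and the two sample records are taken from the response above.

```python
import json

# A raw LLM response is a JSON array of coding records, one per comment,
# with the dimensions shown in the "Coding Result" table above.
raw_response = """
[
  {"id": "ytc_UgzkVDQdkzr5pR39IHV4AaABAg",
   "responsibility": "government", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwJI86mwyGtFS8S5q54AaABAg",
   "responsibility": "user", "reasoning": "virtue",
   "policy": "none", "emotion": "outrage"}
]
"""

def index_by_id(response_text):
    """Parse a raw LLM response and index its coding records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

# Look up the coding for one comment by its ID.
codings = index_by_id(raw_response)
coding = codings["ytc_UgzkVDQdkzr5pR39IHV4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

In practice the same index could be built once over all stored responses, so that entering a comment ID in the panel retrieves its coding directly.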