Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
At a certain point, society is going to have to restrict AI gen/usage or human m…
ytc_UgwG6tsm4…
Look - its never going to become "sentient" because sentience implies that it's …
ytc_Ugzlchf-c…
Blood, sweat and tears ♡
3 things the machines do not have!
i.e. no compassion,…
ytc_UgxYsase4…
I beleive this AI is more dangerous for humans. Ots already killing humans think…
ytc_UgxeUpDDo…
This is what I was thinking! Art doesn’t need to be optimized and removed from h…
ytr_Ugz562kCL…
Mahāvībhatsa
Those who know how to speak Sanskrit will understand how i feel abo…
ytc_UgzOdimPo…
Hi nice to meet you .. I am a student 👩🎓 part of Tesla humanoid family … I am s…
ytc_UgwXJSFd_…
The worst path AI can take is not any of those movies, but "I have no mouth but …
ytc_UgzYnnl0d…
Comment
it's crazy like 15 years ago or so we were crazy for siri on iphone the first virtual assistant that you can have conversation with despite limited and then Alexa a tube whom you can ask anything you want. Meanwhile, few days ago I just asked chatGPT 4 to help me preparing for scholarship interview and then continue with nice little chat with it.
youtube
AI Moral Status
2024-09-15T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgweRDgbprku7jG0f9d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzKyTfLcI1KSRuZBbd4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyII0dzH5PU6-vQ_lB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwfJwS2iBY6-m5EE454AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxdVGaT7zaS6ZXYGat4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz1_iGio_zC0yX876R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwMXteBUTw4eKtKGat4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzxGZvaHSq078X0Lg94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw9e_IiMSYl_qcANdB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwc2_C3bsxQJPmu5rB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
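The raw response above is a JSON array with one row per comment, keyed by comment ID, with the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed for the "look up by comment ID" view — the field names come from the output shown here, while the `index_codings` helper and its validation rule are illustrative assumptions, not the tool's actual implementation:

```python
import json

# Two rows copied verbatim from the raw response above; a real payload
# would contain the full batch.
raw = '''[
{"id":"ytc_UgweRDgbprku7jG0f9d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwfJwS2iBY6-m5EE454AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# Every row is expected to carry all four dimensions plus its ID.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(payload: str) -> dict:
    """Parse the model's JSON array and index rows by comment ID,
    dropping any row missing a coding dimension (hypothetical rule)."""
    rows = json.loads(payload)
    return {row["id"]: row for row in rows if REQUIRED <= row.keys()}

by_id = index_codings(raw)
print(by_id["ytc_UgwfJwS2iBY6-m5EE454AaABAg"]["policy"])  # regulate
```

Indexing by ID this way makes the per-comment detail view (like the "Coding Result" table above) a single dictionary lookup rather than a scan of the batch.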