Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:
- ytr_Ugy48Z2z2…: I don't know about manual labor, but food service jobs are already being automat…
- ytr_UgxIrMQwN…: Stop using AI. It’s taking way more jobs than immigrants. Amazon just fired 30,0…
- ytc_UgyrlwNBY…: Its not going horribly wrong, most people said having AI forced into every aspec…
- ytc_UgytlOUcd…: yes this is true my friend still struggles to make AI write code for him kyuki u…
- ytc_UgwZjlJal…: i don't know how i look up to Korea how i used to not this but there is a lot t…
- ytc_Ugx7mLly0…: AI is already controlling the way people think about each other and society in g…
- ytr_UgxySAQLo…: @LonelySoul-000 "AI" and "artist" is a contradiction. And just bc newer ai slo…
- ytc_Ugwb-8-hz…: So many people suddenly have aphantasia and “need” AI otherwise they could never…
Comment
As things are right now I dont think AIs would ever be sentient. The base assumption we train models on nowadays is "we'll have shit ton of data and based on that the AI can predict whats appropriate for a given situation", *so essentially what we've built is a glorified database*.
Sentience is complicated as it involves understanding information and knowing how that information would impact you.
If someone pulls a gun on your head you're fearful of the consequences because you know what death means and what guns can do as opposed for searching for an "appropriate reaction"
An AI would be sentient if it would be horrified at the thought of being shut down and would do anything to not let that happen. Sentience is parsing of information, understanding what it entails for you and self preservation
youtube · AI Moral Status · 2024-05-22T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
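For reference, here is a minimal sketch of the record shape behind this table, written as a Python dataclass. The value sets are only those observed in the raw response below, not necessarily the full codebook.

```python
from dataclasses import dataclass

# Values observed in the sample response below; the real codebook may allow more.
RESPONSIBILITY = {"none", "government", "user", "developer", "ai_itself"}
REASONING = {"consequentialist", "deontological", "contractualist", "unclear"}
POLICY = {"none"}
EMOTION = {"indifference", "mixed", "fear", "outrage", "approval"}

@dataclass
class CodingResult:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_known_value_set(self) -> bool:
        """Check each dimension against the value sets seen in this sample."""
        return (
            self.responsibility in RESPONSIBILITY
            and self.reasoning in REASONING
            and self.policy in POLICY
            and self.emotion in EMOTION
        )
```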
Raw LLM Response
[{"id":"ytc_UgzspZBQVxM6aeYRR_p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgzLHGCFQC9maw84Hgp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},{"id":"ytc_UgxIMqsyFNru9YpTDcx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},{"id":"ytc_Ugy8hUDH5TIPr0hyZ_p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},{"id":"ytc_Ugx13Ud34wI6K5pVUwh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},{"id":"ytc_Ugz2hSxMFsU88OUDm2p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},{"id":"ytc_Ugy6GqO2jFrTv9T8Vkx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_Ugzn2saTEL5293uqjYB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},{"id":"ytc_UgzG4aTbI2Q9rI-K1AR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},{"id":"ytc_UgwuIW9tCD-IegdL-Zx4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"}]