Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “The Architect of GOOGLE remains apparent and somewhat transparent in the INTEL i…” (ytc_UgybJ4OQp…)
- “What a load of bull crap. OpenAI just lost $100B in investment promised, most of…” (ytc_UgynaQ7a_…)
- “@JohnSmith-j7n I disagree with this, since most of these art pieces were actual…” (ytr_UgwEQropW…)
- “Ima be real with you, none of these are entirely irreplaceable. Within 30 years …” (ytc_Ugw9Vk9O5…)
- “Hiding behind ridiculous excuses and going “you’re racist/ableist/classist if yo…” (ytc_UgzyunRd6…)
- “This is what ChatGPT an AI has to say about this : Not really. While AI can auto…” (ytc_UgwjqPGfK…)
- “Luckily China has many,many coal burning power plants and building more all the …” (ytc_UgzW29jIO…)
- “Using AI filter’s for selfies is whatever, I don’t give a shit and anyone who ha…” (ytc_UgzAgX2-M…)
Comment
So I work on machine learning stuff and have delved into NLP a bit. The way we train the models right now, it cannot be sentient. We are training them on prior data, like immense amounts of data. They essentially think of words as a list of numbers (oversimplified), and use that to generate text.
You should start worrying when AI is trained via Reinforcement learning or some sort of continuous training model. That’s when they could actually become sentient.
Source: youtube | Video: AI Moral Status | Posted: 2023-11-02T01:3… | ♥ 70
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz8LYD3A_2e4hJIWq54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzMPLaEcdtKgIQRdyF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugz3QiL-6Xj0FTSCePV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzktCcP2tymTWcSsyR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwFlOkndRtAeuUL7rB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzr4xxFLGihCzf3FS14AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxtLlqZtcqQFSDao794AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzeMTrOb2fOgYe2ojx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzjiGo95m9bbtPb_cd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugyf0IgGH2ND0ESexB14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
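The raw response is a JSON array of coded rows, one per comment, keyed by comment ID. A minimal sketch of the "look up by comment ID" step might parse that array and index it as a dictionary; the function name `index_by_id` and the two sample rows below are illustrative (the rows copy values from the raw response above), not the tool's actual implementation.

```python
import json

# Hypothetical sketch: parse a raw LLM coding response and index rows by
# comment ID. The row shape (id, responsibility, reasoning, policy, emotion)
# mirrors the raw response shown above.
raw_response = """
[
  {"id": "ytc_Ugz8LYD3A_2e4hJIWq54AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugyf0IgGH2ND0ESexB14AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Map comment ID -> coded dimensions, dropping the redundant 'id' key."""
    rows = json.loads(raw)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in rows}

codes = index_by_id(raw_response)
print(codes["ytc_Ugyf0IgGH2ND0ESexB14AaABAg"]["emotion"])  # fear
```

With an index like this, rendering a "Coding Result" panel for one comment is a single dictionary lookup rather than a scan over the full response array.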