Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- I dont know if this robot will be a mistakes or anything else that put humans in… (ytc_Ugyoaq3WC…)
- @jaca9566 - He literally stated the correlation that using AI means you're lazy… (ytr_Ugx7FeiEB…)
- It’s not AI , it’s DEI that killed million of new jobs, as smartest people for t… (ytc_Ugx-HhD_X…)
- I highly recommend everyone read Sam Altman’s (creator of OpenAi) blog specifica… (ytc_UgxIAD5gV…)
- @arha13 that’s just one opinion though. As it exists now, AI can’t replicate hum… (ytr_UgwqH-ji2…)
- I got an idea. Make up a story and say that the AI is eating all the eggs and we… (ytc_UgyF75hfJ…)
- How do you take an A.I vehicle to court when there is an accident? Its impossibl… (ytc_UgwEqzgs_…)
- When you start to build a robot, you want to replace the creation of god… (ytc_UgzUvikIb…)
Comment

> From my experiences I think we
> 1: Misunderstand AI's modelling capability as deep thought and thus
> 2: Vastly overestimate the actual intelligence of AI but
> 3: A sufficiently focused AI with the intelligence of a worm could obliterate anything connected to the internet. Stuxnet could defeat firewalls and air walls with a few megabytes. Now imagine something with that capability that makes its own decisions on targets on the fly.

youtube · AI Moral Status · 2026-02-26T21:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugwe0HxlroXScWsUkm14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwvqNlom6u4DB4JlFF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQlmmHIjG4vND3Bc94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzSW0v2YCPDMUjbL8d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwpgNNEY5lX2r6xbLV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxSrRL4C5ORf-37BTB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzMdWXXx5T_m_5Fq6d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLFd_x34RVyvJ-Lct4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyvJrUvmOZoViu2AJJ4AaABAg","responsibility":"government","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy8KhCdY-uLbwjHc_p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}]
```
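The raw response is a JSON array with one record per coded comment, so looking up a comment ID (as the views above do) amounts to parsing the array and indexing it by `id`. A minimal sketch, using two records excerpted from the response above; the variable names are illustrative, not part of the tool:

```python
import json

# Excerpt of the batch coding response shown above: one JSON object
# per comment, keyed by the comment's YouTube ID.
raw_response = """[
 {"id": "ytc_Ugwe0HxlroXScWsUkm14AaABAg", "responsibility": "none",
  "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
 {"id": "ytc_UgzMdWXXx5T_m_5Fq6d4AaABAg", "responsibility": "company",
  "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]"""

# Index the records by comment ID for direct lookup.
records = {rec["id"]: rec for rec in json.loads(raw_response)}

# Fetch the coding-result row for one comment, as in the table above.
row = records["ytc_Ugwe0HxlroXScWsUkm14AaABAg"]
print(row["reasoning"], row["emotion"])  # consequentialist fear
```

The dimension values in each record map one-to-one onto the rows of the Coding Result table (Responsibility, Reasoning, Policy, Emotion).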