Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
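The lookup itself can be a flat scan over the stored batch files. Below is a minimal sketch, assuming the raw responses are kept as JSON-array files under a raw_llm_responses/ directory; the directory name, file layout, and the lookup_by_comment_id helper are illustrative assumptions, not the tool's actual internals.

```python
# Minimal lookup sketch. The storage layout is assumed: one JSON array
# of coded records per batch file under raw_llm_responses/ (hypothetical).
import json
from pathlib import Path

RAW_DIR = Path("raw_llm_responses")  # hypothetical storage location

def lookup_by_comment_id(comment_id: str) -> dict | None:
    """Return the coded record for comment_id, or None if it was never coded."""
    for batch_file in sorted(RAW_DIR.glob("*.json")):
        for record in json.loads(batch_file.read_text()):
            if record.get("id") == comment_id:
                return record
    return None

# Example: the record whose coding matches the Coding Result shown below.
print(lookup_by_comment_id("ytc_UgjxlEoy6_MqTngCoAEC"))
```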
Random samples (comment ID and snippet):

- ytr_UgyOqBRum…: @chaosbeam4654 I'm sorry, what? When did I say anything remotely close to that?…
- ytc_UgxiO9gbg…: Hank thinks AI is just: "make me a logo in a box" why don;t you try making a AI …
- ytc_UgzuW_XSC…: Wait until the NSA uses AI to go through all the Data they've been stealing and …
- ytc_UgyOGnBTG…: Nice video. It’s still a load of bs. AI is continually idiotic because it gets…
- ytc_UgxmQezf1…: yk i don't get it why don't the "godfather of ai" if scared of it just get rid o…
- ytc_UgzyzN2j_…: I still have the Frozen notebook i drew in when i was about 5-10ish. There are p…
- rdc_gkrg4r6: That whole “no one wants to work” sentiment boggles my mind. I worked in my indu…
- ytc_UgxRlJgn6…: If the US govt continues to be destroyed, and project 2025 continues to be imple…
Selected comment
Short answer: No
If an AI becomes self-conscious accidentally and wants them, then yes; but robots are created to be tools and help us and giving them consciousness and feelings only makes them less useful or dangerous. So, why would we permit the creation of more conscious AI?
youtube · AI Moral Status · 2017-02-23T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
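The four coded dimensions fit a small, explicit schema. The sketch below is inferred only from the label values visible in the raw response that follows; the real codebook may define additional labels, so treat the allowed sets as assumptions.

```python
# Hypothetical schema for one coded comment, built only from the label
# values visible in the sample batch below; the real codebook may
# define more values per dimension.
from dataclasses import dataclass

RESPONSIBILITY = {"developer", "ai_itself", "none", "unclear"}
REASONING = {"consequentialist", "deontological", "contractualist", "mixed", "unclear"}
POLICY = {"ban", "regulate", "liability", "industry_self", "none", "unclear"}
EMOTION = {"fear", "approval", "indifference"}

@dataclass
class CodedComment:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        """Raise ValueError if any dimension holds an unknown label."""
        for value, allowed, name in (
            (self.responsibility, RESPONSIBILITY, "responsibility"),
            (self.reasoning, REASONING, "reasoning"),
            (self.policy, POLICY, "policy"),
            (self.emotion, EMOTION, "emotion"),
        ):
            if value not in allowed:
                raise ValueError(f"unknown {name} label: {value!r}")
```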
Raw LLM Response
The full batch response in which this comment was coded; the entry whose dimensions match the Coding Result above is ytc_UgjxlEoy6_MqTngCoAEC.
[{"id":"ytc_UgjAIMevKcxrnngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghA9z6zW0bejXgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ughkx0Mum9Cdv3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgiW0mYZuMt7_ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UggEDexH2OK8gngCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgjxS0Kmu4JbOXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugj9MK_eJU-tCHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgjxlEoy6_MqTngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgipqO8xcHoXyHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghPudcmY-9ThHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}]