Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
a.I. will kill everyone one day if humans continue to put faith in it...
A.I. i…
ytc_UgyV50G7A…
Everything starting from 8:21 is very relevant. I really doubt a chat bot can b…
ytc_UgzBXHLR8…
In RJ, the Cláudio Castro government is preparing a ROBOT MILITIAMAN for launch f…
ytr_UgyE5r1Tq…
I've used AI coding tools. Unless you're "developing" something that's been done…
rdc_obwc1xw
I honestly just wish AI was nuked from orbit at this point... With Sam Altman at…
ytc_UgzRVNaXC…
That's why I HATE AI!!!!
Fucking people, never held responsibility in their life…
ytc_UgyhI7Xid…
There's definitely a satanic component to ChatGPT. In another video it claimed t…
ytc_UgxhEAkuP…
Recently downloaded an AI chat app where creators can see chats, but lord it has…
ytc_Ugx0ObaYu…
Comment
Jokes aside, this can never be resolved, simply due to AI being software, which is where everything lies. The computer itself is just the hardware. Let's say we do develop software that is on our level of consciousness. It will still be software, which can be copied, shared and downloaded. Is anyone really under any belief that deleting software could ever be considered a crime against a sentient being? I doubt it, considering the sheer scale of the problem. A child could download the code, run it and then delete it, just as I delete code on a daily basis. I doubt any nation is going to start arresting people for deleting or modifying advanced algorithms; it would cause chaos on a level I cannot even comprehend.
youtube
AI Moral Status
2021-11-10T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyZkTaEq_Pno0fh6Id4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgycAfN3PLyw9nA6HRZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgycESuNz2x8aJjEg114AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzj2C9nlhjsfh6aAjp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzj-xCvFhTNnnU_xG94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwqew-0fzqEsaGXnsx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx4QQnzffIxLKkjxod4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzpGfsiU0qdQuMN3hR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxBZy_sifEDaIw71pl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxSBvlvWE_LwOiVAIx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
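The look-up-by-comment-ID workflow can be sketched in Python. This is a minimal illustration, not the tool's actual implementation; it assumes the raw model output is the JSON array format shown above, and the two sample records are copied from that response.

```python
import json

# Raw model output in the format shown above: a JSON array in which each
# record carries a comment ID plus the four coded dimensions. The two
# records here are sample data copied from the response above.
raw_response = """[
  {"id": "ytc_UgyZkTaEq_Pno0fh6Id4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwqew-0fzqEsaGXnsx4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugwqew-0fzqEsaGXnsx4AaABAg"]["policy"])  # prints: regulate
```

Indexing by ID makes the "look up by comment ID" inspection a constant-time dictionary access rather than a scan over the array.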