Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- rdc_n6rx97x: Fine. Our AI bots will answer the questions and send them a bill for training. A…
- ytr_Ugwryu_mE…: @hunterkauffman9400If they are ethically trained np. But remember, open source =…
- ytc_UgzMbaqPO…: You broke it down so well! It’s like how Troof analyzes our customer reviews aut…
- ytc_UgzPxLV_n…: Who noticed the very first part were he poured water was AI. THATS HOW GOOD IT I…
- ytc_Ugwm97pmd…: Imagine your parents were significant other found that you would be destroyed of…
- ytr_UgyhcOHM8…: @lolcatststs Why don't you go ahead and watch his other fabulous videos? I noti…
- ytc_UgyVzMlJC…: AI don't have such emotions so it can't do the art, just generate or form.…
- ytc_UgzV-AWSd…: What are your thoughts on Embedded Software/Systems Engineers and Robotics Engin…
Comment
With the concept of AI being the cause of our extinction or destruction …. Woildnt thst depend on what their intent or goals are….. I suppose if their main concern was preservation of the planet we would be fucked … but what would be the reason to actually go to the extreme and eliminate humans . I could see overpowering us but then why? Becuse they can do it better? What is “it” that they are concerned or focused on improving .. idk if thst makes sense im still contemplating all this
youtube
AI Governance
2024-05-05T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwN1-z0vAqgtezSTPl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyVPL8inu7CaY-UMTR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxMcyfnekA2hOPdT4x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwg3u9lUpXe0KyCgTh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyhFQFUYB9COapPjW94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzoxbak2bCZJm9YqIN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx40TDgjcORyEcs1654AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxF8vw6dbArl-oqrTt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxsvXovWuElrj6-B8J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwuDwHE-09C9dABMQN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
```
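The raw response is a JSON array with one coding record per comment. A minimal sketch of how such a payload could be parsed and indexed for lookup by comment ID (the `raw` string and the two sample records below are illustrative, taken from the batch above):

```python
import json

# Illustrative payload: a JSON array of per-comment coding records,
# in the same shape as the raw LLM response shown above.
raw = """[
  {"id": "ytc_UgwN1-z0vAqgtezSTPl4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxsvXovWuElrj6-B8J4AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"}
]"""

# Index records by comment id so any coded comment can be looked up directly.
codes = {rec["id"]: rec for rec in json.loads(raw)}

print(codes["ytc_UgxsvXovWuElrj6-B8J4AaABAg"]["policy"])  # regulate
```

This mirrors the "Look up by comment ID" control: once the array is keyed by `id`, retrieving the four coded dimensions for a given comment is a single dictionary access.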