Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Scary. This video was presented 4 years ago and now 2022 Elon Musk warned on AI … (ytc_Ugz55I0SV…)
- We'll reach a point when AI will become exponentially more complex than humans e… (ytc_UgwCFMbzT…)
- Robot: 'Human cant create smarter robot then human brain" / Human: "Why" / Robot: "I… (ytc_Ugweb0APd…)
- @thewannabecritic7490 You actually sound so corny right now, whats corny is the c… (ytr_UgwF5kvpo…)
- We are also robots and without skin & flesh humans are similar to robots interna… (ytc_UgziPdsu_…)
- What does "SI-AI could view humans as [...] raw materials" means ? / Raw materials… (ytc_UgwmZKLL4…)
- Moral to the story. Speaking to AI. Without a connective network/method of commu… (ytc_UgzgEpCJw…)
- @ImginaryHeroine yes, there's ai slop there's also just normal slip. There are p… (ytr_Ugx7UWCbw…)
Comment
@perrydimes6915 I watched the video and work on LLMs everyday
I run 4 personal ones at home for experimental work I do
have you worked on LLMs much? I don't mean using them, I mean developing them
youtube · AI Moral Status · 2025-10-30T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgwCYqWq4Qbl8PSoD514AaABAg.AOv8jxc80nxAOvJ4WdT3mU","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugz-7kyDNsbzE1_2scN4AaABAg.AOv8h1fhwUnAOvAazTjgAA","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugz-7kyDNsbzE1_2scN4AaABAg.AOv8h1fhwUnAOvEVh4mrm5","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwZodMs5G-ScGJk6NJ4AaABAg.AOv8B77ai73AOvAx5LxVRi","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugzy2yuIDLIM_CF3lWN4AaABAg.AOv85PiQaTMAOvIOFE6C6f","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugzy2yuIDLIM_CF3lWN4AaABAg.AOv85PiQaTMAOvImtPifam","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytr_Ugzy2yuIDLIM_CF3lWN4AaABAg.AOv85PiQaTMAOvIyv-UHR0","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugzy2yuIDLIM_CF3lWN4AaABAg.AOv85PiQaTMAOvJGavgURI","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugwz6ZtSJ4_bZauzZ6N4AaABAg.AOv8549-nIGAOvCSg2YlKp","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzU0EmuZ55E9T-ENeh4AaABAg.AOv80pWyMdBAOwGvZFHVRI","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
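Each record in the raw response carries the same four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of validating such a response before ingesting it, assuming the allowed value sets inferred from the samples above (the real codebook may define more categories, and `validate_coding` is a hypothetical helper, not part of this tool):

```python
import json

# Allowed values per dimension — inferred from the visible samples only;
# the full codebook may include additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "indifference", "approval", "outrage", "resignation"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records with an id and
    known values on every dimension; malformed records are dropped."""
    valid = []
    for rec in json.loads(raw):
        if "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"indifference"}]')
print(len(validate_coding(raw)))  # 1 valid record
```

Dropping rather than repairing off-schema records keeps the downstream counts honest; a stricter pipeline might instead re-prompt the model for any record that fails this check.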