Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- We were created, With the idea that We are God's Creation: "In His Own Image and…" (ytc_Ugyeb2kxM…)
- The insane destruction AI is causing to the environment, we won't be around to e… (ytc_UgzauTJRT…)
- AI companies have the right to use your material to make their own shit as long … (ytc_Ugw0InYNG…)
- It is hoped that one day unemployed programmers will take down top AI experts. A… (ytc_UgxvjRJRr…)
- @jorgeavelar98, I think you've misread what I wrote. I never said "don't build" … (ytr_Ugxh1439o…)
- I feel like the art industry is being short-sighted. AI taking over jobs is ine… (ytc_Ugx1Zjv47…)
- 16:18 This is just ignorance mascaraing as a point. "I don't understand the use … (ytc_UgxwYkR30…)
- When I find crazy is there's multiple other videos of this exact cop just like h… (ytc_UgxodTiuw…)
Comment

> Love your video's as always, you guys and gals do a great job.
>
> Well if the robots to somehow become living like us, then I say we should give them rights. But I rather we just keep any robots as unintelligent machines that only do work and that's it. Giving anything consciousness would be a bad idea, even more so we don't know exactly what it is. If we do create a true AI that is very self-aware it should just one and have limited control.

Source: youtube | Video: AI Moral Status | Posted: 2017-02-23T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
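The "Look up by comment ID" view above amounts to indexing coded rows by their ID. A minimal sketch of that lookup, assuming the row shape shown on this page (the values are copied from the coding table; the helper function itself is illustrative, not part of the actual tool):

```python
# Index coded rows by comment ID so one comment's coding can be fetched
# directly. The example row mirrors the "Coding Result" table above.
coded_rows = [
    {"id": "ytc_UgiBkJI_0TVCGHgCoAEC", "responsibility": "developer",
     "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
]

by_id = {row["id"]: row for row in coded_rows}

def lookup(comment_id):
    """Return the coded dimensions for a comment ID, or None if uncoded."""
    return by_id.get(comment_id)

print(lookup("ytc_UgiBkJI_0TVCGHgCoAEC")["emotion"])  # → fear
```

Unknown IDs fall through to `None` rather than raising, which keeps the lookup safe for comments the model has not coded yet.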
Raw LLM Response
```json
[
  {"id":"ytc_Ugi9oKcY5syPlngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugipm9QoHHtAZngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgjCG-Y5si0xkXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UggE2jjroha-C3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgipWevt7j_kCngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UghGeSiPL9jIjngCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugg4wUNlmwDRengCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiBkJI_0TVCGHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UggfHiAyN5W04HgCoAEC","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgjouNuW5UDvnHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"}
]
```
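Before a raw response like this is trusted, each row is worth validating against the coding schema. A minimal sketch, assuming the allowed values are exactly those observed on this page (a fuller codebook with additional categories is likely, and would need to be merged in):

```python
import json

# Dimension values observed in the raw responses above. Any additional
# categories in the real codebook are an assumption left out here.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "fear", "mixed"},
}

def validate_rows(raw):
    """Parse a raw LLM response and keep only rows with a plausible
    comment/reply ID and known values on every dimension."""
    valid = []
    for row in json.loads(raw):
        # IDs on this page start with ytc_ (comments) or ytr_ (replies).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_Ugi9oKcY5syPlngCoAEC","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
print(len(validate_rows(raw)))  # → 1
```

Silently dropping bad rows is one design choice; logging or re-prompting the model for the rejected IDs would be the stricter alternative.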