Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "When you look at a situation where we, the American people are seen as "customer…" (ytc_UgyxmQalx…)
- "I call BS real quick, flag this comment, Politeness in AI interactions, such as …" (ytc_Ugxv0O5Co…)
- "Love this. AI slop is plagiarism garbage and needs to be stopped. I want AI to w…" (ytc_UgyJEyaUJ…)
- "I say if you know you have a friend or family member that does deepfakes...rat t…" (ytc_UgwIls7IA…)
- "Satan his real name archangel Samuel God's venom he will be using AI 2027. Cern …" (ytc_UgylwJ-cr…)
- "What artists DO need from A.I is an A.I that would do in-betweens and line-art a…" (ytc_UgyL2krhJ…)
- "No it happened 2 years ago and the worker stoodt infront of one of those heavy d…" (ytr_UgyXdOIUP…)
- "There have been SO MANY statements only 14-15 minutes in that I feel you should …" (ytc_UgyGzV4p_…)
Comment
Robots will be superior to humans.
1. Our biological bodies are only about 20% energy efficient! Humans can only turn 20% of the food we eat into mechanical energy, and advanced robots would have much higher efficiency.
2. Humans need to sleep and can't work 24/7. A worker doing 12-hour shifts for 7 days still sleeps the other 12 hours; a robot could work 24 hours a day, 100% more than the worker.
3. They are superior in space travel. Robots won't be damaged by radiation.
4. Mass reproduction
And the list goes on.
We are inferior to future robots; if we can't control them, surely they will kill us all, because we are a waste of resources and energy.
Source: youtube · Video: AI Moral Status · Published: 2017-02-23T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgjpHbD1cb_bGHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgiwpEgnkVIjz3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugj3v0gqenbpS3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgiMAV2WUQbo3HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgjPO1aWk3kRM3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgiaWn-BMIFxdHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugjesjn2d2Is3XgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjGnJ_vguQsu3gCoAEC","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"unclear"},
{"id":"ytc_Uggz-8DSC64i2XgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UghST1ICt0Ozk3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}]
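Looking up one coded comment in a raw batch response like the one above can be sketched in a few lines of Python. This is a minimal illustration, not the tool's actual implementation: the two records and their IDs below are invented stand-ins, and it assumes the raw response parses as a JSON array of objects keyed by `id`.

```python
import json

# Abridged stand-in for a raw batch response (IDs and codes are
# hypothetical, chosen only to illustrate the lookup).
raw = '''[
  {"id": "ytc_AAA", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate",
   "emotion": "approval"},
  {"id": "ytc_BBB", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "unclear",
   "emotion": "resignation"}
]'''

# Parse the batch and index every record by its comment ID.
records = json.loads(raw)
by_id = {rec["id"]: rec for rec in records}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for a single comment ID."""
    return by_id[comment_id]

print(lookup("ytc_BBB")["emotion"])  # -> resignation
```

A dict keyed by `id` makes repeated lookups O(1), which matters once a run contains thousands of coded comments rather than a batch of ten.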