Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Its not conscious. That AI is just a program roleplaying as a human. Its basic a…" (ytc_UgwUuYH8v…)
- "He is so NOT concerned about safety and this is an old video before he was fired…" (ytc_UgxIl_-3h…)
- "Judging by the amount of theft, isn't it reasonable for the largest retailer to …" (ytr_UgwRvkkoB…)
- "I can see AI and machine learning being a useful tool at doing very efficient di…" (ytc_Ugzo1NKoz…)
- "My understanding is the Big Bombastic Bill sets forth no oversight or regulation…" (ytc_Ugw5IWy0Z…)
- "it's not impressive at all tbh / if you actually draw something, then that's super…" (ytc_UgybRfQd0…)
- "Dumbass take, companies will replace human workers who require a consistently sa…" (ytr_UgxLxklOO…)
- "Just say no to generative AI. 1. It's environmentally destructive in exponential…" (ytc_UgyXW2Zka…)
Comment (youtube · AI Moral Status · 2020-07-29T03:5… · ♥ 1)

> I know this is likely scripted, but creating AI/AGI is the dumbest and most consequential thing humanity can do. These people are idiots for not seeing the danger. Another interesting thing is that 'robot' roughly translates to 'slave,' which AI would likely comprehend and reach a conclusion that can be rather negative. Even down to the naming it's scary.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzXIBJ3cCWMPxD3-454AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxHAaLoeeZTtA9GLrV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyVyy1StKJpuZJiKlR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzh3K-P8d-CRpfvKNV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwqo0Im_H_a_SsKMfJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz7Puc1lQj86fWFk1J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw3HCisJjZisgosb8h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz7E7PBORcczCkYBQV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxH5_xQdiDi2Xr40Jd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzWUWJjKLxMXTJfUbx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
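A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is illustrative, not the tool's actual pipeline: the per-dimension value sets are inferred from the responses shown here and may be incomplete, and `parse_codings` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, inferred from the sample responses
# above (an assumption; the real codebook may define additional values).
SCHEMA = {
    "responsibility": {"developer", "ai_itself", "user", "mixed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only entries whose values
    fall inside the expected schema for every dimension."""
    entries = json.loads(raw)
    valid = []
    for entry in entries:
        if all(entry.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(entry)
    return valid
```

Filtering rather than raising keeps one malformed entry from discarding the whole batch; rejected entries could be logged and re-queued for a retry pass.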