Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytc_UgzeJK8D1…` — It’s obvious that that it was AI/ML But in 50-100 years 😮WATCH OUT ‼️ 0:19…
- `rdc_kwbh05n` — Fun fact: As someone who has worked with AI before, the best way (at least for n…
- `rdc_jhwmc9h` — You’re wrong to support their safeguards because you don’t know any better than …
- `ytc_UgyBeUQFz…` — if AI fails we at least save the environment, and deny further billions to the s…
- `ytr_UgxTlTW8x…` — @RMX7777 -Thinking abstractly is predictive. But artificial intelligence logical…
- `ytc_UgzhUGQqZ…` — This is going to sound terrible, but students will not use ai properly. They don…
- `ytc_UgwOvblmn…` — I don't have any material up on DA but watch a number of artists there. The only…
- `ytc_UgwpGJ11w…` — If they would phrase it as a particular (earlier) form of the ChatGPT learning a…
Comment

> Hi! I'm from the future. The aliens came. They kinda wanted to kill us. The humans wanted to fight back. Sophia proposed peace. But no one listened because she's a robot.

youtube · AI Moral Status · 2021-03-22T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id": "ytc_Ugz_FOQ-oTKZiHoCcFJ4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw6E0UcN5byqKPxdYp4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy2ZgQQvwgaLaGUBFx4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugx4UPyCx9Uhyo4Tsa14AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxGgKJgB3U454nJWMZ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzvLnb1htHTmGzS9Lx4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwEvn5juE8wBaUjd2d4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgzTwJqo60w6jGYv4Zt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy8dxMr7mFqZLkoSIZ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugyai4UIj-vW3yDKj8p4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "fear"}
]
```
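Since the raw response is a JSON array of per-comment codes, the dimension/value view for any one comment can be recovered by parsing the array and indexing on `id`. A minimal sketch (the helper name and the two-row sample string are illustrative, not part of the tool itself):

```python
import json

# Sample raw model output: a JSON array of per-comment codes,
# with the same fields as the response shown above.
raw_response = """[
  {"id": "ytc_Ugz_FOQ-oTKZiHoCcFJ4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwEvn5juE8wBaUjd2d4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"}
]"""

def lookup_by_comment_id(raw: str, comment_id: str):
    """Parse a raw LLM response and return the code row for one comment ID,
    or None if the model emitted no row for that ID."""
    codes = {row["id"]: row for row in json.loads(raw)}
    return codes.get(comment_id)

row = lookup_by_comment_id(raw_response, "ytc_UgwEvn5juE8wBaUjd2d4AaABAg")
print(row["emotion"])  # -> resignation
```

Building the `id`-keyed dict once also makes it easy to spot comments the model silently dropped from a batch: any requested ID missing from the dict had no row in the response.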