Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a record by its comment ID, or browse the random samples below.

Random samples
- "I hope humans get the karma they deserve. 800+ species extinct at their hands. M…" (ytc_UgyaJm_os…)
- "Well it’s not this realistic. This isn’t a robot. It’s a human and the video is …" (ytr_Ugyz0bZzx…)
- "Maintain control over AI - an illusion. We will hand it control on a plate.…" (ytc_UgzvyLENJ…)
- "Mid-level slop layer… I guess the next big model will be needed to clean up the …" (ytc_Ugy8axyL2…)
- "If AI worked as advertised, we'd have something to worry about. Spoiler- it does…" (ytc_UgyWPzIdu…)
- "I'm curious -- even if AI is heavily regulated, wouldn't there still be an under…" (rdc_jkfwiuk)
- "It will be another tool. Not a replacement for radiologists. We should welcome t…" (rdc_fcsxc05)
- "AI doesn’t share with us food mortality or reproduction but needs our creativity…" (ytc_UgwyIAxOT…)
Comment
This isn't science fiction, so any comparison to "The Terminator" or any other film where technology strikes back isn't really appropriate. Most scientists accept that AI is only a matter of time, and also see it as a huge threat to humanity unless we take literally every precaution. Among those precautions, the very first has to be the ability to turn it off, and the second should be the ability to change the programming. If you are creating autonomous killer robots, and hoping to make the best approximation of AI that we currently possess in them, then you are literally a danger to the human species. You're not a mere hawk, but the king of all mad scientists.
Source: youtube
Date: 2015-07-30T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
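
For reference, a minimal sketch of how a coded record like the one above might be represented downstream, assuming a Python pipeline. The class name, field names, and label sets are illustrative assumptions; the label sets list only the values visible in this sample, not necessarily the full codebook.

```python
from dataclasses import dataclass
from datetime import datetime

# Label sets observed in this sample only; the full codebook may define more values.
RESPONSIBILITY = {"distributed", "developer", "ai_itself", "none"}
REASONING = {"consequentialist", "mixed", "unclear"}
POLICY = {"regulate", "ban", "industry_self", "none", "unclear"}
EMOTION = {"fear", "outrage", "mixed", "approval", "indifference"}

@dataclass
class CodedComment:
    """One coded comment, mirroring the Coding Result table above."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime

    def is_valid(self) -> bool:
        """Check that each dimension uses a known label."""
        return (
            self.responsibility in RESPONSIBILITY
            and self.reasoning in REASONING
            and self.policy in POLICY
            and self.emotion in EMOTION
        )
```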
Raw LLM Response
[
{"id":"ytc_Ugi2MCQ9VO6IRXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggDWsBkRxKisXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UghTZ_lDly-zXHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UggExqvaAIk2EngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UghNor-Hb1pOQ3gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugh2_dcFzT9rr3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UghjtX71T5z-o3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UghcHbUQhrHIoXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugj-HimAHpeHvXgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiwZyGY4EMIIXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]
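
The raw response is a JSON array of per-comment records keyed by `id`, so the lookup-by-ID step described at the top of this page amounts to parsing the array and filtering on that field. A minimal sketch, assuming the response text is available as a string; `lookup_coding` is an illustrative name, not part of the tool.

```python
import json

def lookup_coding(raw_response: str, comment_id: str) -> dict | None:
    """Parse a raw LLM response (a JSON array of coded records) and
    return the record whose "id" matches comment_id, if any."""
    records = json.loads(raw_response)
    return next((r for r in records if r.get("id") == comment_id), None)

# Example, using the response above loaded into `raw`:
# lookup_coding(raw, "ytc_Ugi2MCQ9VO6IRXgCoAEC")
# -> {"id": "ytc_Ugi2MCQ9VO6IRXgCoAEC", "responsibility": "distributed", ...}
```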