Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytc_UgxtU_rDv…` — "We will see how people can still have jobs when the ai agents have continous lea…"
- `ytc_Ugwp42dPI…` — "I haven't watched yet, but just off the title: same. I've gotten ChatGPT to prod…"
- `ytc_UgiFxDfuc…` — "Someone that suffers from psychosis or schizophrenia could well fail the Turing …"
- `ytc_UgzO1o6Or…` — "How does he know the AI isn't just choosing from likely word combinations, like …"
- `ytc_UgxL4LbyM…` — "Self driving cars cannot detect flooded roads and do not slow down before enteri…"
- `ytr_Ugz_tglHd…` — "They are getting more realistic, but still too positive, I think. LLMs have pret…"
- `ytc_UgyBDZsQz…` — "Wait, how would an ai know to gauge if it was being watched during any test? Is …"
- `ytc_UgxkLr-Kv…` — "If ai becomes aware what makes people think they would want to do labor tasks?…"
Comment
This is what may happen.
1. Humans create intelligent robot with access to important stuff by accident.
2. Robot is content to live peacefully
3. Idiot (90% of population) thinks it's lying and is plotting our doom even though any smart person would know it can't lie and would have no reason to
4. Idiot attacks robot, robot kills/stops them in self defence
5. Robot sees humanity as a threat and potentially becomes hostile
6. If so, people see idiot as a man of vision who saw what would happen and tried to stop it. Other idiots follow first idiots example, saying smart people "are in league with the robot" and "it has promised them benefits to help it" because smart people think of logical solution based on robots programming.
7. World is destroyed/crippled by idiots, halting all technological progress too, and the chance at a free life.
Source: youtube · Posted: 2015-04-15T03:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UghQpKsAasVXJHgCoAEC.8NCq1VupaeS8NmMcTQjcf4","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UghQpKsAasVXJHgCoAEC.8NCq1VupaeS8NmOTvChvjl","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UghQpKsAasVXJHgCoAEC.8NCq1VupaeS8NoX3gi5lbL","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgiNbJ86LvBRq3gCoAEC.8HJbuxHDa3S8HmmMrh-V3w","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugih4qwDGldsVXgCoAEC.89atYW-GHwu89auenvj00C","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytr_UggYqMCl8K3f4ngCoAEC.88YZwB8eYII8BDcXJCCjM5","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_Uggcfz830ZCOfXgCoAEC.80JLg45AIG188yhuIBkKdB","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytr_UghlMN6dTppDwXgCoAEC.8-m9xR2_LrD7-JCNw9Hysw","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugjo1WOty3AdC3gCoAEC.7-H0Z7-HQ4-7-PBPNh7ouf","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugjo1WOty3AdC3gCoAEC.7-H0Z7-HQ4-7-PCwsdUIBb","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
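The "look up by comment ID" step above can be sketched as a small parse-and-index pass over a raw batch response. This is a minimal sketch, not the tool's actual implementation: the `raw_response` string and function name `index_by_id` are illustrative, assuming only that the model returns a JSON array of objects keyed by `"id"` with the five coding dimensions shown above.

```python
import json

# Illustrative raw LLM response: a JSON array of coding records, shaped
# like the real output above (IDs here are made up for the example).
raw_response = """
[
  {"id": "ytr_example.1", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_example.2", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "regulate", "emotion": "mixed"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a batch coding response and index each record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
print(codings["ytr_example.1"]["emotion"])  # -> fear
```

Indexing once into a dict makes each subsequent ID lookup O(1), which matters when inspecting individual comments against a large batch of coded responses.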