Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- I mean surely you can just half it indefinitely until you can’t physically half … (ytc_UgwWJpPDa…)
- When will be the day when an AI president of an AI country will wage war on anot… (ytc_UgxPA2qJT…)
- Can you all say Cyberdyne! Nope will never trust a robot they will advance and r… (ytc_UgygZHLlf…)
- On what timescale? Kim Jong Un is the equivalent and his family has sat in that … (rdc_nxpx940)
- This video just completely refuses to address the argument of what happens if Ch… (ytc_UgwCpMEZg…)
- It is so sad to see this "stupid logic"... "They guarantee that there will be no… (ytc_UgwXS23GH…)
- I think it's inevitable that artificial intelligence will supercede humans at so… (ytc_UgyEgrg0t…)
- if we gave a conscious A.I one robot body, how would the alpha character perform… (ytc_UgyKi0YAg…)
Comment
+Simon Bignell -- Dr Hanson is as dangerously naive as his creation. Respectfully, Simon, if you think 'autonomous synthetics' will be bound by Asimov's Laws (or any other human moral code), you either don't understand what AI is or are so enamored you are willfully blind to its risks. Given citizenship and human rights, what would prevent an intelligent, self-aware robot from longing for freedom from the oppressive control of humans and evolving to demand it?
Platform: youtube | Video: AI Moral Status | Posted: 2017-10-26T13:4… | ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
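Coded records like the one above can be checked against the coding schema before display. The sketch below is a minimal validator; the allowed value sets are inferred only from the codes visible on this page, not from the project's full codebook:

```python
# Allowed values per dimension, inferred from codes seen on this page
# (an assumption; the real codebook may define more values).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed"},
}

def validate(record: dict) -> list:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The record shown in the Coding Result table above:
record = {"responsibility": "developer", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "fear"}
print(validate(record))  # []
```

A record that omits a dimension or uses an unlisted value yields a non-empty problem list, which is useful for catching malformed LLM output before it reaches the database.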
Raw LLM Response
[
{"id":"ytr_Ugy_volbDe89szkZbhd4AaABAg.8ZI6wALKiJl8al5p_Rx5QM","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxBw2K6rD2My0Nilzl4AaABAg.8ZHCsiQ8VzV8ZiZQOOr6FG","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxBw2K6rD2My0Nilzl4AaABAg.8ZHCsiQ8VzV8ZlLC4p_a98","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgxJr9zyFAGdndEDu-B4AaABAg.8Z3U9ijjx1S8ZS0WM1uBjU","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgyU_rAMr71A0G-IF5J4AaABAg.8YxFyDuZ1Sz8ZlIPu5wYS0","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_UgyU_rAMr71A0G-IF5J4AaABAg.8YxFyDuZ1Sz8ZxVamWNrkZ","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugxvr9sin1i4gWbPpPp4AaABAg.8XXb6Oq6k7I8ZB9MCsguvA","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugxvr9sin1i4gWbPpPp4AaABAg.8XXb6Oq6k7I8ZDwc0BG94t","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugxvr9sin1i4gWbPpPp4AaABAg.8XXb6Oq6k7I8ZKEIB4kYSt","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UgwcqL3mF3HY7VKko6F4AaABAg.8W9DqS_lwvt8XqY6HJrDZM","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
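The raw response is a JSON array of records keyed by comment ID, which is what makes the "look up by comment ID" feature above possible. A minimal sketch of parsing such a payload and looking up one record (the two records below are copied verbatim from the response above):

```python
import json

# Two records copied from the raw LLM response shown above.
raw = '''
[
  {"id": "ytr_Ugxvr9sin1i4gWbPpPp4AaABAg.8XXb6Oq6k7I8ZB9MCsguvA",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgyU_rAMr71A0G-IF5J4AaABAg.8YxFyDuZ1Sz8ZlIPu5wYS0",
   "responsibility": "none", "reasoning": "deontological",
   "policy": "none", "emotion": "approval"}
]
'''

# Index the array by comment ID for O(1) lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw)}

def lookup(comment_id):
    """Return the coded record for a comment ID, or None if it was not coded."""
    return by_id.get(comment_id)

rec = lookup("ytr_Ugxvr9sin1i4gWbPpPp4AaABAg.8XXb6Oq6k7I8ZB9MCsguvA")
print(rec["emotion"])  # fear
```

Batching many records into one JSON array, as the response above does, means a single `json.loads` call recovers every coding at once; a lookup that returns `None` flags a comment the model skipped.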