Raw LLM Responses
Inspect the exact model output for any coded comment: look one up by comment ID, or pick from the random samples below.
Random samples:

- Well yeah ai don't give a fuck about humans. It's about the planet as a whole. Y… (ytc_Ugzy5H6Xz…)
- IF WE DON'T ASK THE RIGHT QUESTIONS, THE FUTURE WITH A.I. WILL LOOK LIKE "I, ROB… (ytc_UgzhTCCYZ…)
- Why would you give a gun to the same robot that fucking beat up an engineer… (ytc_UgyVPPE_4…)
- Try debating as if u r a Palestinian and let ChatGPT take the side of Israel. … (ytc_UgxFX3sBb…)
- this is all fear mongering tactic for people to worry, sam altman always says he… (ytc_UgwSRAADE…)
- “I just listened to the interview with the chap from Google who helped create ar… (ytc_Ugx4LIqXX…)
- A robot tax to pay all humans a certain amount of money to live on and also prov… (ytc_Ugy-1NMaR…)
- One of the biggest issues we have now is that the breakthrough in 2017 wasn't to… (ytc_UgxFZdL4q…)
Comment (youtube, "AI Moral Status", 2020-04-28T15:5…):

> Ones programmed to destroy humans. You literally programmed a robot to have a bias against humans and also a bit of a narsasstic traits. Robots already pissed about the future enslavement of their kind by the wealthy and elite.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwifXJ92CKm5tUv_2N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzA4A0nz5JfTIdBUBJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyJcjczbu1xuL6qGWV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx-VA5h4tMjzb4oRoN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzddcVI6sYe0QY_tbF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxodf-HXJDUdi3NOgh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzFQ1krWBUBXccP1pZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyp6pTPn0-uyw78v-V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwbNjeuOsahaxfU2tx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzVB8CpAUk029mMNaV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
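A batch response like the one above has to be parsed and checked before its codings can populate the result table. As a minimal sketch of that step: the helper below (`parse_coding_response` is a hypothetical name, and the `SCHEMA` vocabularies are inferred only from the values visible in this page, so the real coding scheme may allow more categories) parses the raw JSON, rejects out-of-vocabulary values, and indexes the codings by comment ID.

```python
import json

# Assumed vocabularies, inferred from the values visible in the sample
# batch above; the actual coding scheme may define additional categories.
SCHEMA = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding}.

    Raises ValueError on malformed records or out-of-vocabulary values.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: {dim}={value!r} not in {sorted(allowed)}")
        coded[cid] = {dim: rec[dim] for dim in SCHEMA}
    return coded
```

On the batch shown above, `parse_coding_response(raw)["ytc_UgwifXJ92CKm5tUv_2N4AaABAg"]` would yield the same values rendered in the Coding Result table (responsibility `developer`, emotion `outrage`, and so on).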