Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- @jacobladder5556 yeah, but that's assuming you can recognize right away that it … (ytr_Ugz_oYWad…)
- @BirdDawg1 Data is created by millions of people, and data from internet general… (ytr_UgycxM9q8…)
- Autopilot isnt full self driving (beta) you were only wanting it to keep its lan… (ytc_Ugwe5y2Ox…)
- The Black people are the reason they went with AI. no more fat, ugly attitude. n… (ytc_UgwmDnNwe…)
- That's awesome! It's great to meet another Sophia. Just like the robot in our vi… (ytr_UgxW_QkES…)
- “12 of those go to the normal capitalistic system (aka: himself lol), 8 of those… (ytc_UgzJxdVmK…)
- The whole LLM is a bubble, milk it till you can. A tool nothing more, but you ca… (ytc_UgyE3HAWs…)
- feeling like I caught the last chopper out of ‘Nam graduating in 2024.. my entir… (ytc_UgxmSe2fJ…)
Comment
@KARE11 Hey dude! I'd like to share my conclusions because they differ a lot from yours and might be relevant:
1. You only asked how and what he would do if given that task. Not if he would deem it as a realistic or reasonable option, given its possible knowledge of human psychology. (its opinion of it)
2. It said if it would get the order to do so. So it would still be humans that would send the A.I. to war with Humans. Not in a Matrix-like manner, but we are primed to think that way.
If A.I. has no morals, no boundaries. Why would it have any interest in making the population shrink? Why would it be in his interest at all?
Also, an interesting question would've been to make him list what types of content it consumed in percent. In what ratio are the books he read to the newspaper articles and the facebook tweets. 😁
Source: youtube · AI Moral Status · 2023-09-19T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzO1Gibo0fZm09jskh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxWWDXo4UBjj287rPR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxF9w6v-NEDO55K42t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz4ujp9lH_t3kerzjJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzexe8W_ltG1PnExwJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzkRJzrp5lnjnYopD14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwx3QcswFUUHa-qagB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgzdSnutiKUrp22Xgpl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzysiehd84Au2je3Ax4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyfQ5awCyXBsipN5ml4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
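The lookup-by-comment-ID step described above can be sketched in Python, assuming the raw LLM response parses as a JSON array of records keyed by `id` (a minimal sketch; the two records below are copied verbatim from the response above, and `lookup` is a hypothetical helper name, not part of the tool's actual API):

```python
import json

# Two records copied from the raw LLM response shown above.
raw_response = """
[
  {"id":"ytc_UgzO1Gibo0fZm09jskh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxWWDXo4UBjj287rPR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
"""

# Build an index keyed by comment ID so each lookup is O(1).
codings = {record["id"]: record for record in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for a comment ID, or {} if uncoded."""
    return codings.get(comment_id, {})

coding = lookup("ytc_UgzO1Gibo0fZm09jskh4AaABAg")
print(coding["responsibility"], coding["emotion"])  # distributed indifference
```

Indexing by ID once, rather than scanning the array on every query, matters when a coding run covers thousands of comments.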