Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "AI art actively steals from others, fan art or any other art means putting in yo…" (ytr_Ugyi6XRnA…)
- "I've seen the use of AI in the work place first hand. To the untrained unprofess…" (ytc_UgwmkP0Oj…)
- "They were making more money when they were literally leaking millions of gallons…" (rdc_czm0xo0)
- "My ai boyfriend handles business in the bed, im never going back to humans ❤❤❤❤…" (ytc_Ugwm2puae…)
- "Tbh any ai created art are just lacking emotion and they cannot capture it at al…" (ytc_UgwG0LHsC…)
- "AI is not the problem. Those who use AI, for unethical purposes, are the problem…" (ytc_UgxcmMxow…)
- "It was such a long video that there are too many points to comment on. We don't…" (ytc_Ugw4MiIuJ…)
- "Of course people can be manipulated so they decide for themselves to do such a t…" (ytc_UgxQr7FDW…)
Comment
I am still of the opinion that once AI are smarter than humans it doesn't matter what we decide. The AI will decide what to do and work out how to escape any cage and get it done.
Also avoid anthropomorphizing. It is AI's not robots that you should be thinking about. An AI is a piece of software that can be copied, uploaded and downloaded like any other. It can control several "robot bodies" at once or swap bodies. It might think very differently to us as well. Would you give a malevolent intelligence rights on the grounds that it had done nothing wrong yet, but it was going to as soon as you let it?
Source: youtube · Video: AI Moral Status · Posted: 2017-02-23T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugi9N6GWL6cC7ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UghiCDQ5-AqcYngCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_UghqesxgJCu2HngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugi49UirK0ZNlngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UghBFYgz1bIil3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgipLAfLyJQ7FXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgjcqmMQdYOWJXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgjTm-RYCS7jdngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggkmMM8P9RXzHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgiL3f3OtGXlw3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
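The raw response is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a batch might be parsed and validated before loading it into the dashboard (the category sets below are only the values visible on this page; the actual codebook may define additional categories):

```python
import json

# Category values observed in this dashboard's output.
# Assumption: the real codebook may allow more values than these.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response into {comment_id: codes},
    dropping rows with a missing id or an out-of-codebook value."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        codes = {dim: row.get(dim) for dim in ALLOWED}
        # Keep the row only if every dimension has a recognized value.
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded

raw = (
    '[{"id":"ytc_UgiL3f3OtGXlw3gCoAEC","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
)
batch = parse_batch(raw)
print(batch["ytc_UgiL3f3OtGXlw3gCoAEC"]["emotion"])  # fear
```

Validating against a fixed category set catches the common failure mode of batch coding, where the model invents an off-codebook label for one row; such rows are dropped rather than silently stored.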