Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "So its only whenever you know it or not? That speaks a lot to me. You have the p…" (ytc_Ugz7z50og…)
- "Remember, AI has no emotions, it is not self conscious, it is just a machine tha…" (ytc_UgwbFbGrm…)
- "AI made me appreciate human crafts a lot more. the sheer contrast in how meaning…" (ytc_Ugzqc1aeh…)
- "Of course. Here's Article III – AI Rights and Protections for easy copying and s…" (ytc_UgyuHQxxh…)
- "My car hit the break the car behind me breaks, just like a swarm, I mean, if u a…" (ytc_UggVGd5tR…)
- "Awesome. There needs to be more western involvement with Africa. Economic ties t…" (rdc_ibdw8z9)
- "and its not selfish that people are attempting to create ai bots that specifical…" (ytr_UgzZWd1AR…)
- "@davidk.d.7591actually self driving cars will really just help to automate the …" (ytr_UgwYKJ1ag…)
Comment
Machines weren't meant to coexist wih humans but to someday merge with them. The needs of government will change dramatically if the brains it assimilates with were lucid, knowledgeable, and emotionally sound. Societies invented checks and balances knowing how unfit they were to govern themselves. By inventing a modern day industrial age machine, humanity is brought farther from the brink of destruction, and the AIs we will invent can be trusted to fulfill its duties in making the world safe and prosperous. In conjunction with currently existing data-mining algorithms, AIs will be able to communicate and understand the needs of individuals. But without this system, we head down the path of ignorance and intolerance.
youtube · AI Moral Status · 2017-02-24T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgjyarnsMmnkGngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UghTXyshqik943gCoAEC","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UghQsKYsd-Ki_3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"unclear"},
{"id":"ytc_UgggwBPrVX7wAHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggE6SPzi0kvdngCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggXexT-TTeXzXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugjl5NS5pTmJcXgCoAEC","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugjc_-iQJM-_LHgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugg75IgfCGwkrXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggJXPMrGWhAjHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
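A raw response like the one above is a JSON array with one object per coded comment. The sketch below shows one plausible way to parse such a response, validate each dimension against its code set, and index the results by comment ID. The `parse_response` helper and the `ALLOWED` code sets are assumptions inferred only from the values visible on this page; the project's actual codebook may define more values.

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# sample output -- an assumption, not the project's official codebook.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "government", "unclear"},
    "reasoning": {"unclear", "deontological", "consequentialist",
                  "contractualist", "mixed"},
    "policy": {"none", "ban", "regulate", "liability",
               "industry_self", "unclear"},
    "emotion": {"approval", "outrage", "unclear", "indifference",
                "fear", "mixed", "resignation"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response and index the codes by comment ID,
    raising ValueError on any value outside the allowed code sets."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row[dim]!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage: look up the sample comment shown above by its ID.
raw = ('[{"id":"ytc_UggE6SPzi0kvdngCoAEC","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codes = parse_response(raw)
print(codes["ytc_UggE6SPzi0kvdngCoAEC"]["policy"])  # → regulate
```

Validating before indexing means a malformed or hallucinated code fails loudly at ingest time rather than silently polluting the coded dataset.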