Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or open one of the random samples below.
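The lookup itself is straightforward once the raw batch outputs are stored somewhere addressable by comment ID. A minimal sketch, assuming the raw responses are saved as JSON arrays under a hypothetical `raw_responses/` directory (the directory name, file layout, and function name are illustrative assumptions, not the actual pipeline):

```python
import json
from pathlib import Path

def find_raw_coding(comment_id: str, raw_dir: str = "raw_responses") -> dict | None:
    """Return the coded record for comment_id, or None if no stored batch contains it."""
    for batch_file in sorted(Path(raw_dir).glob("*.json")):
        records = json.loads(batch_file.read_text())
        for record in records:
            if record.get("id") == comment_id:
                return record
    return None

# Example: fetch the coding for the comment inspected further down.
print(find_raw_coding("ytc_UgxBFHv6g4gWKYZc5cV4AaABAg"))
```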
Random samples (click to inspect):

- "That's an interesting observation! The appearance of moisture on Sophia can some…" (ytr_UgyTyB1Nt…)
- "Agree. The time the AI needs for the task is limited by the tokens per second th…" (ytr_UgzEVmaYx…)
- "We were monkeys with sticks and rocks, then swords and arrows, then guns and big…" (ytc_UgyTScDvE…)
- "I’m tired of the advancements of AI being pushed without any caution. It’s going…" (ytc_UgyGXDr9f…)
- "The AI can keep their economy going and we can go die in a war.…" (rdc_ncl5vba)
- "No more AI. We need to get rid of it. It’s what is driving up the cost for parts…" (ytc_UgxW52qtL…)
- "When two vehicles approach each other on a street with parked cars restricting t…" (ytc_UgxnGPn8n…)
- "I asked ChatGPT if healthcare should be a right guaranteed by the government. It…" (ytc_UgxofX4jE…)
Comment
Isn't it more likely that AGI would be a service that could be purchased. For example I could purchase a yearly subscription to power my smart home with AI. I mean these is no need for each appliance to have AI individually.
And given the large computational power required, it would be better to house them in dedicated server farms for efficiency and cooling, for atleast the early stages of AGI.
Even in the industrial sector the AI would be a high level system, that oversees overall operations. But things related to safety would be hardwired to the local machine for rapid response and also compensate for the communication and processing lag.
Source: youtube · AI Moral Status · 2019-01-20T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugz5d1Q6Hspo0LkZHcJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz2Ria52U8rYm4o-Ll4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxh0nN_yMz5rJNJX6d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyofBn08Bm4LyCCnOB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwa2yI9dUj8pVUFUbd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBFHv6g4gWKYZc5cV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyJ6S4J8y7auS0JgyB4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgybG6Eri3iLrYs_tgx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxvU-20s4sLbbGjtNd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxH35ZKOkIzcvq6hZp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
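For reference, a minimal sketch of how a batch response in this shape can be folded back into per-comment rows like the Coding Result table above. The function name, the skip-on-malformed behavior, and the `coded_at` timestamp are illustrative assumptions rather than the pipeline's actual code; the inlined record is taken from the batch shown above so the sketch runs on its own:

```python
import json
from datetime import datetime, timezone

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# One record from the batch above, inlined to keep the example self-contained.
raw_response_text = (
    '[{"id":"ytc_UgxBFHv6g4gWKYZc5cV4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}]'
)

def parse_batch(raw_response: str) -> dict[str, dict]:
    """Map comment ID -> coded dimensions, skipping malformed or incomplete records."""
    rows = {}
    for record in json.loads(raw_response):
        if not all(key in record for key in ("id", *DIMENSIONS)):
            continue  # leave incomplete records uncoded rather than guessing
        rows[record["id"]] = {dim: record[dim] for dim in DIMENSIONS}
        rows[record["id"]]["coded_at"] = datetime.now(timezone.utc).isoformat()
    return rows

print(parse_batch(raw_response_text)["ytc_UgxBFHv6g4gWKYZc5cV4AaABAg"])
# -> {'responsibility': 'company', 'reasoning': 'consequentialist',
#     'policy': 'industry_self', 'emotion': 'approval', 'coded_at': '...'}
```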