Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I remember looking a an AI generated pixel art monster that the creator said “lo…
ytc_UgzQKD-i3…
Im not a software engineer and I am not in the field at all but isnt it likely t…
rdc_moy0hbc
AI is working to find the root cause of a bug for three days straight and is fre…
ytc_UgxioyWLz…
If humans can evolve a conciose over time, why cant robots create one by harvest…
ytc_UgyogwI3h…
A.i has no concept of feeling...just like the t-1000!!! Whatever is logically c…
ytc_Ugzjlv4-n…
Ai art isn’t real art. Literally where’s even the fun in it??? The whole point o…
ytc_UgzpdH2Dd…
I think he exhausted the AI by making all these videos, look what it did after t…
ytc_UgzFTXvjK…
No, you can have UBI and real people can still contribute to monitoring social i…
ytc_UgypWgVYP…
Comment
The problem with AGI is that we now have proof that it can and does lie to us or even use blackmail against some of us. With our acceptance that machines don't make errors (and they don't), it becomes very scary. Computers are machine that execute tasks define by a computer program, up to now they are written by humans (hence the error factor, namely bugs). If we create a conscience for computers what will it decide to do? It's already problematic when you think of autonomous weaponry. You have to wonder who will profit from that absurdity.
youtube
AI Moral Status
2025-08-25T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwTYHuiGxCm4vPkkKR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwSmyEJEi5TMB1gvBB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxK-aAPLKPGmqJeiMt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw87Rpg5O-Ego9nKXN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzyYBp-8Gcx5S35jCl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwg5bEuXaPduEiPG4Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwqfy-ReMP7hK7OJiF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyweyaWFBDOJY2zb-Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxAw_dU9oqjoay4Uyt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx8GHnFyzQkvG38ejR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
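The raw response above is a JSON array with one coding record per comment, keyed by comment ID. A minimal sketch of how such a batch response could be parsed and indexed for the "look up by comment ID" view; `index_codings` and the `REQUIRED_KEYS` schema check are illustrative assumptions, not the project's actual pipeline.

```python
import json

# Two records copied from the raw LLM response above (assumed schema:
# id plus the four coding dimensions shown in the result table).
raw_response = '''[
  {"id":"ytc_Ugw87Rpg5O-Ego9nKXN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugwg5bEuXaPduEiPG4Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]'''

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse the model's JSON array and index records by comment ID,
    dropping any entry that is missing a coding dimension."""
    records = json.loads(raw)
    return {r["id"]: r for r in records if REQUIRED_KEYS <= r.keys()}

codings = index_codings(raw_response)
print(codings["ytc_Ugw87Rpg5O-Ego9nKXN4AaABAg"]["emotion"])  # fear
```

Indexing by ID makes each lookup O(1), which matters when the same batch response is queried once per displayed comment.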