Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> I disagree with many of these statements. I follow many smart people and have found as knowledge is acquired there is a tendency towards peace because the problem can be viewed from more perspectives. Also if a deviant AI is created I would imagine good AI’s would work towards resolving the deviant AI. Essentially, I am hearing a ying but there is always a yang or some sort of balance. Biology major, and engineer technologist perspective speaking. I believe we will work with the tools and just have to be skilled at doing our job with an AI. In skilled professions many things cannot be coded for. I believe the AI will need parameters and a natural limiting factor will arise. We will see but I think the world tends towards a harmony of sorts.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Timestamp | 2026-02-11T04:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugy5OBm6DvpVluaJTq14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyDOcVGgNMzZlQVijV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzmAbZN_JPiKLgFWEp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwTj7KoONg9e0HuC-t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwVvyuCXeiC8yMUjUp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzxXfVtunf6UPdQUoF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgylG_bvLyqjIhcsQPl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyowMRnQWLfDTGZpqx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-BVxG5IEbNY-koAB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyWx0DFXWtxaYjKMwN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]
```
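As a minimal sketch of how a raw batch response like the one above can be parsed and looked up by comment ID: the code below assumes only the JSON shape shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion` per record); the variable names are illustrative and not part of any real API. The sample data is taken directly from the response above.

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = """[
  {"id":"ytc_UgylG_bvLyqjIhcsQPl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyDOcVGgNMzZlQVijV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]"""

# Parse the batch and index it by comment ID for O(1) lookup,
# mirroring the page's "inspect by comment ID" behavior.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

coding = by_id["ytc_UgylG_bvLyqjIhcsQPl4AaABAg"]
print(coding["responsibility"])  # ai_itself
print(coding["emotion"])         # approval
```

Indexing the whole batch once by ID is what makes per-comment inspection cheap: the LLM returns codings in batch order, not lookup order, so a dict keyed on `id` avoids scanning the list for every request.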