Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- rdc_oi1kx1d: "If you know it's a null waiting to be found then yes it's easy. But more often t…"
- ytr_UgxDjU45l…: "i think AI art is cool and all, but the process of putting prompts into an AI ar…"
- ytc_Ugw6KKTx6…: "Imagine having a long heartfelt conversation with the customer service guy becau…"
- ytc_UgxXhIhcU…: "Saw a recent new story about current college graduates having high unemployment …"
- ytc_UgxcAqQOb…: "I would say the Uber tech failed terriable but as with most accidents several pa…"
- ytr_UgzsbpQH5…: "@2nd3rd1st Well openai is founded on ideas pioneered by Eliezer Yudkowsky, found…"
- ytc_UgxmRnfGw…: "All you need is the Arnold face on each robot to make it officially intimidating…"
- ytc_Ugw4f_-nG…: "What's next? Remember that the Oligarchy despises you. They still need many of …"
Comment
> While the interview is interesting, this man is obviously not a believer in the spiritual workings of God the Father, Jesus Christ, and the Holy Spirit. God said we were made in His image and nothing else is. AI will never have the “humanness” that humans have (consciousness, emotions, feelings…). They can be intelligent and act like humans, but never to the extent of an actual human because of their lack of being created in the image of God. I say to all worried after this podcast, read Revelations in the Bible, we know the end, and AI isn’t it. Repent and be saved and do what you’ve been called to do on this earth: Love God, then love yourself so you can love others well! ❤

youtube · AI Governance · 2025-08-03T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzAPqwGQeR6uYPLql14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwAz7vgqpc49iOyeZN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwFx-H-pJihj2fi1R94AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw62bnUxrkxkZRBbtx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwTpZ0vGP6TgIB5VRl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugwq4Ms5JU9DpYdOy2h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugyb99UcUlG-sU9Ff6J4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzUxo1x1Xsr4b09dJJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwRhHDc45fjnYPO_bN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx6Hk0entIAZTbt5Od4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
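A raw response like the one above is a JSON array of per-comment codings keyed by comment ID. The sketch below shows one way such a payload could be parsed and indexed for lookup. The allowed values per dimension are inferred from the table and responses shown here, not from a full codebook, so treat `ALLOWED`, the helper name, and the example IDs as assumptions.

```python
import json

# Allowed values per coding dimension, inferred from the examples above.
# Assumption: the real codebook may define more values than appear here.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "industry_self", "ban"},
    "emotion": {"indifference", "approval", "resignation", "outrage",
                "fear", "mixed"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response array and index codings by comment ID,
    dropping rows whose values fall outside the inferred codebook."""
    out = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if cid and all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            out[cid] = {dim: row[dim] for dim in ALLOWED}
    return out

# First entry is copied from the raw response above; the second uses a
# hypothetical ID and an out-of-codebook emotion to show filtering.
raw = """[
  {"id":"ytc_UgzAPqwGQeR6uYPLql14AaABAg","responsibility":"ai_itself",
   "reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_hypothetical_bad_row","responsibility":"none",
   "reasoning":"mixed","policy":"none","emotion":"joy"}
]"""
codings = index_codings(raw)
```

Indexing by ID is what makes the "inspect the exact model output for any coded comment" lookup cheap: one parse, then constant-time retrieval per comment.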