Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Part of me feels like this AI doomsday stuff is a bit of a psyop… you already ha…
ytc_UgxrqyFEq…
Eben referring to the tech business of AI as an ECOSYSTEM is an abduction and im…
ytc_UgzudY6CF…
We wont. Not with the models being developed at the moment. Various researches h…
ytc_UgzYNXmT-…
thanks for bringing this to people's attention. But i dont think you can say 'I …
ytc_Ugw4P2zCZ…
We have been through this so many times in the past. AI is not going to replace…
ytc_UgyowrN0J…
That "new artwork" of Justitia with the cones isn't "inspired". So yes.... they'…
ytc_Ugxth93w-…
I know how dumb this question is but is this shit real or some AI program?…
ytc_UgwlpOWXr…
Google:
"We have a policy against creating sentient AI"
Also Google:
"We code …
ytc_Ugw98TeXC…
Comment
Something that can be used for good within medicine, teaching, etc. can’t be trusted in the hands of the general population or companies for profit. Through greed and desire for control and domination, it can easily become a weapon of mass destruction.
There’s a passage in the bible about increased knowledge in the world…
~ Daniel 12:4 “But you, Daniel, roll up and seal the words of the scroll until the time of the end. Many will go here and there to increase knowledge.”
Coincidentally there was a city by the name of Ai. The story of Ai in Joshua is about a literal city destroyed by the Israelites as part of their conquest of Canaan. The city was destroyed because of a prior sin, the disobedience of Achan, who took spoils from Jericho in violation of God's command.
youtube
AI Responsibility
2025-07-27T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugw7ufMHhDiLv6rtUxJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyogR4X-AE2C0zV2OV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwWwnu1Z2gHb8M06Jl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwYIJYwsiTI7PoZNcF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxjpPgtnYX39-oZ3X14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzecMHGJmfm5Z3BQjl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxzvlQnw8jf5_7Q-f14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugysjz5QJXWYa02cbLB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxB6IiYGI-ikm2CV2J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzN_9iUc9khM72byV94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}
]
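The raw response above is a JSON array with one object per comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of parsing and validating such a batch is below; the allowed value sets are inferred only from the values visible in this sample output, so the real codebook may define additional categories, and `parse_coded_batch` is a hypothetical helper, not part of any existing tool.

```python
import json

# Value sets inferred from the sample response above;
# the actual codebook may include more categories.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "unclear"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting
    rows with a missing id, a missing dimension, or an
    out-of-vocabulary value."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing comment id: {row}")
        codes = {}
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {value!r}")
            codes[dim] = value
        coded[cid] = codes
    return coded

# Example with one row from the response above:
demo = ('[{"id":"ytc_UgxjpPgtnYX39-oZ3X14AaABAg",'
        '"responsibility":"company","reasoning":"consequentialist",'
        '"policy":"regulate","emotion":"fear"}]')
codes = parse_coded_batch(demo)
```

Failing loudly on unknown values is deliberate: a silently-accepted typo in a code (e.g. `"consequentalist"`) would skew any downstream tallies across the corpus.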