Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The first cold war was one mutual destruction with nuclear weapons. This new cold war will be mutual destruction with technology. Question is...who fires the first shot? Which country will be first to release bug size drones loaded with lethal viruses? Who, or what will make that decision? If we humans were perfect AI would be the best thing since sliced bread. We're not perfect, therefore AI may not be either.
Source: youtube | AI Moral Status | 2025-04-27T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzjWC2Veskr2865N_h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw9UeTs2XkF_2QNvJB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx9V0FKgQ43Id0ed8h4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx0yNgCrO9y9aie-7t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyvVM5YJVbD6wWSXO54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx_lSAkXTDCVa6rTW14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxzQ-lX40fJISgrspl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzwpmobhasP02tPdyt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy0T3yxLvTydudGOEh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyw15U-TdEgM2A0sih4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
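The raw response above is a JSON array in which each record carries the same four coding dimensions shown in the result table (`responsibility`, `reasoning`, `policy`, `emotion`) plus the comment `id`. A minimal sketch of how such a response could be parsed and sanity-checked is below; the required key set is taken from the JSON itself, while the single inline record is a trimmed illustration, not the full batch.

```python
import json

# Trimmed example record, copied from the raw response above.
raw = '''
[
  {"id": "ytc_UgzjWC2Veskr2865N_h4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]
'''

# The four coding dimensions plus the comment ID, as seen in the response.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

records = json.loads(raw)
for rec in records:
    missing = REQUIRED_KEYS - rec.keys()
    if missing:
        raise ValueError(f"{rec.get('id', '?')}: missing keys {sorted(missing)}")

print(f"validated {len(records)} coded comment(s)")
```

Checking for missing keys before storing results catches truncated or malformed model output early, instead of surfacing as gaps in the coded dataset later.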