## Raw LLM Responses

Inspect the exact model output for any coded comment. Entries can be looked up by comment ID; the list below shows a set of random samples.
- "Right! A utility! OWNERS of AI COULD make it beneficial to humanity; “we the cre…" (ytc_Ugwxzp648…)
- "I don’t like using this kind of AI in the classroom but it is just a experiment …" (ytc_UgwbIH30e…)
- "Well how foolish this people are they don't use a.i to improve their drawing but…" (ytr_UgzNUcJIM…)
- "No one's life goal is to work in a call center. If menial work is automated and…" (ytc_UgzHt7X_0…)
- "The sad ass part is that some people use AI for good instead of evil but they ar…" (ytc_Ugz7Vu2sR…)
- "Depending ho image was made you still can think why artist made that character c…" (ytc_UgymuC2dx…)
- "Undetectable AI is perfect for anyone who wants AI-generated text that sounds na…" (ytc_UgzEe0LYA…)
- "And the self driving car example was from "Westworld" (the recent tv series not …" (ytr_Ugzp5vPlp…)
**Comment** (youtube · AI Moral Status · 2025-12-16T09:5…)

> I think it's more important to use AI as a function, not a being. I think AI's final form should simply be a translator between current tech and quantum tech, and also used to manipulate the pathways of electrons and nothing else. Just a translator making superpositions functional with 1s and 0s, and utilizing hardware on a nanoscale. I don't want it to talk back or think.
**Coding Result**
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
**Raw LLM Response**

```json
[
  {"id":"ytc_UgwnshwQ7aHs0DgDhMl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugzz5MVjWj-8gJIy8hV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy0fUH7nX-47eW523N4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyfkEIbHcrIpyZHI9x4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxwyO9H_9it8hGozAd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwnueM9xA3Rc0KrtLZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwFj-FmCw7WfttXkh14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxM5e25bs0z-04Y4Cp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugyrlm0rmKugab4czlV4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgyxRBNZIYvWdKozUKV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
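A response in this shape can be parsed and indexed by comment ID with a few lines of Python. The sketch below is illustrative, not the tool's actual implementation: the allowed category values are inferred only from the sample above (the real codebook may define more), and the `parse_codings` helper name is an invention for this example.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"none", "company", "user", "developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if any dimension holds a value outside the schema,
    which catches the most common LLM failure mode: off-codebook labels.
    """
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Look up one coded comment by ID (hypothetical ID for illustration).
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
coded = parse_codings(raw)
print(coded["ytc_example"]["policy"])  # regulate
```

Validating against an explicit value set is what makes the raw-response view useful: when a record fails, the exact model output can be inspected for the offending comment ID.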