Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The AI by itself will do jack shit. It doesn't have agrncy, purpose, free will, …" (ytc_Ugwozc-ir…)
- "This is wrong. People need to stop using AT&T. We all hate automated systems and…" (ytc_UgxGTdNxA…)
- "Ngl its kinda hard to not hate or dislike someone who not only actively thinks t…" (ytr_UgxjgNKHx…)
- "Mr. O’Brian’s personal perspective adds favorable value and understanding to me.…" (ytc_Ugwzuy6-1…)
- "U either not doing that as a career . Should be worrying if Ai takes off. Or if …" (ytr_UgzN2Q3FW…)
- "One thing that I see may help in this transitional period is to make all these A…" (ytc_Ugw2zGcMY…)
- "Maybe we're looking it it wrong... Maybe AI's most effectiveness could be in f…" (ytc_Ugy8xzTXb…)
- "I agree with your anti AI arguments, but I'm going to be real here. Your argumen…" (ytc_UgzlyGSMe…)
Comment
How long did we take domesticating dogs, a deca-millennia? Cats are a work in-progress. When you flip an Ai off+on and it gushes about near-death-experiences, THEN I’ll give Turing the nod.
youtube · AI Moral Status · 2023-09-04T08:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwuvPMvW1Yd5WAkGBd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy3gnCoP-xn7-S94MF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyCEPREMTYhaJWhC5J4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwk6wTlq55JmN6yt8t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxn1WdFZU03E9Dt6NN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwBaL9lMw5ttOIWXfJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz2p_H50QBytY42knB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwq61-e9i9c3mZ6Hy54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx_tLfNlBoYL7u6GuB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzshgq8A2AcZcL7Ftt4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
```
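The raw response is a JSON array of per-comment codings, so looking up a coding by comment ID reduces to parsing the array and indexing it. A minimal sketch of that step — the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response above, while the `index_codings` helper and the two sample records reused here are illustrative, not part of the tool:

```python
import json

# Two records reused from the raw response above, as an illustrative payload.
RAW_RESPONSE = """
[
  {"id":"ytc_UgwuvPMvW1Yd5WAkGBd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz2p_H50QBytY42knB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index each coding record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(RAW_RESPONSE)
# Each dimension of a coded comment is then one dictionary access away:
print(codings["ytc_Ugz2p_H50QBytY42knB4AaABAg"]["responsibility"])  # ai_itself
```

Indexing by ID rather than scanning the list each time keeps lookups O(1), which matters when one batch response codes many comments.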