Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "5:33 here go the bullshits from a master bullshiter. - are you willing to bet …" (ytc_UgzGa3jF2…)
- "Right, but again, it’s open source … so if there are subtlety engineered biases,…" (rdc_m9ha3gi)
- "The only thing that robot would be managing is all the testosterone in the offic…" (ytc_Ugx0th6am…)
- "Baloney, no AI will be able to turn wrenches to repair cars and trucks which yep…" (ytc_UgxmBz8i-…)
- "Don’t focus on the tech. For musk orbital data centers are the ultimate carrot! …" (rdc_ohh9k02)
- "Art is the proces how you make something unique, something that means to you and…" (ytc_UgzrNj7h6…)
- "one example of the ai out of control is in the art community . after the ai lear…" (ytc_Ugy2fanzz…)
- "Who allowed essentially industrial plants to be zoned right next to residential?…" (ytc_UgxYx_E3-…)
Comment
things are happening so fast that there seems to be no time to think about right or wrong… I think we need to slow things down. I know I don’t know how to do that but some of AI leaders do I think they have to get together and talk and reason with each other and come up with a solution on how to slow things down and how to be able to sync to each other’s ideas… It is my belief that it’s time to give up ego, and time to share mindsets
People and scientist know as much as they can know about AI, which isn’t everything obviously should discuss together what to do that will put human beings into a safety net or a safety space and honestly think of ways to protect the human being and all those in it!
Platform: youtube
Posted: 2026-04-03T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyM4FigipMimg5ifFZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxiPx_xxzkflOZDUUJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyLlyHP9cRVmBRL8ql4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw8L8M988pbI3LhhBB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx235M1a87sTzqDfRl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyk6TlI61fjrL8HBZN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzqr8J3iy5XScz5aTJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgynNzGh1nVZKwzG_K94AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzlQW2VAnB1jqPePTd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw3vnqTmVNaaxsR5Dx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
```
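A raw response in this shape can be machine-checked before its labels are stored. Below is a minimal sketch of such a check; the allowed values per dimension are inferred from the samples above (the real coding scheme may include more categories), and the `validate_response` helper is hypothetical, not part of the pipeline shown here:

```python
import json

# Allowed labels per dimension, inferred from the sample response above.
# Assumption: the real coding scheme may define additional categories.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "distributed", "government", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "approval", "mixed", "outrage"},
}

def validate_response(raw: str) -> list:
    """Parse a raw LLM response and reject rows with missing ids or unknown labels."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing comment id: {row}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} = {row.get(dim)!r}")
    return rows

raw = '[{"id":"ytc_example","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}]'
print(len(validate_response(raw)))  # prints 1
```

Rejecting unknown labels at ingest keeps a hallucinated category (or a malformed row) from silently entering the coded dataset.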