Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Humans will be driven to extinction along with them if we keep doing what we're …" (rdc_fwivr5k)
- "This is why I hate the concept of AI "girlfriends/boyfriends." It doesn't love y…" (ytc_Ugwrd-8iM…)
- "Damn... for some reason you just gave me hope. A mother doesn't learn those inst…" (ytc_UgxO09W_c…)
- "AI prompters are artists in the same way somebody ordering a special Item not o…" (ytc_Ugz9ZuptW…)
- "Ensuring factual accuracy in AI is critical. However, data mining often processe…" (ytc_Ugw-neYK4…)
- "The difference between a work robot and a sex robot will be silicone skin and pr…" (ytc_UgzTNVSR1…)
- "AI is too stupid to do my job so I feel safe, I use ai everyday for help to do m…" (ytc_UgwePZ3kv…)
- "We should be worried about A.I. and kamizee drones. Its watching and learning fr…" (ytc_Ugx0D_Oue…)
Comment

> Great show as usual. I had a conversation with let's say, a top scientist to do with AI, and this person told me that it will be AI's fighting each other when the end comes - good versus bad. He said some will be programmed to protect us while others will be programmed to destroy us. 😱

youtube · AI Governance · 2023-07-07T06:3… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxGGEAKQNNdTYvXkoJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwfDvnIVaJlNVXUnKh4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzwgpbzA87Oo5YBoiZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgylGkpxw_7bAB7X3Td4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxgqZObDnutqxgQ9DR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugy1GaTg6mnUUStSSyp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyfsSAZw6h3iGJNGOR4AaABAg", "responsibility": "government", "reasoning": "contractualist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxm23pc7rPGiIpuVyJ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyooFoavHf-YYxj5DB4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgytJM4rhafqo_LBeVt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
```
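The raw response above is a JSON array in which each record carries a comment `id` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). As a minimal sketch, assuming only that array structure, looking up the coding for a single comment ID could be done like this (the `lookup` helper and the two-record sample are illustrative, not part of the tool):

```python
import json

# A shortened stand-in for the raw model output shown above:
# a JSON array of coded-comment records.
raw_response = """[
  {"id": "ytc_UgxGGEAKQNNdTYvXkoJ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgytJM4rhafqo_LBeVt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

def lookup(raw: str, comment_id: str):
    """Return the coding record for comment_id, or None if it is absent."""
    for record in json.loads(raw):
        if record["id"] == comment_id:
            return record
    return None

coding = lookup(raw_response, "ytc_UgytJM4rhafqo_LBeVt4AaABAg")
print(coding["emotion"])  # fear
```

Parsing the exact model output this way (rather than trusting the rendered table) is what "inspect the exact model output" amounts to: any ID that fails to parse or is missing from the array surfaces immediately.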