Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Can anyone list the 5 jobs they discussed that wont be replaced by AI ?…" (ytc_Ugx0g9vr6…)
- "So ...this video is about radar, isn't it? It isn't about the misunderstanding o…" (ytc_Ugz8fTLqF…)
- "AI imaging is a lot like buying a trophy for yourself to congratulate yourself f…" (ytc_Ugyn8sbaf…)
- "Literally caught a doctor prescribing me the incorrect medication due to AI. Wen…" (ytc_UgzKr-ahi…)
- "Instead of speaking about how Ai is replacing because youre short sited.. people…" (ytc_UgxuZNjUq…)
- "Ngl Chat GBT is a better therapist than my therpist🤷🏽♀️ I had a break through w…" (ytc_Ugx3egpHJ…)
- "Reminds me of Jared’s ride in Peter Gregory’s self-driving car in Season 1 Ep 6 …" (ytc_UgykGizRP…)
- "I hate AI and all the people developing it and all the people using it. I hope d…" (ytc_Ugw5-UEgQ…)
Comment
We have already got a big taste of this AI tech. The public seems to always see new technology 10-20 years later. The scary part is not some robot we can see. Its the social manipulation we cant see that is scary which turns people against each other and influences public opinion. Have you noticed that all the hell fire omg isnt in your real life day to day operations?
Thats the scary part
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2023-04-18T23:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzcShx882zGZN9X7WN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwXQ-aAN_yINWMCwnt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz2YYEghygIvxXYYRx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxpCzcwEFEjn26cud94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwy1f9PF37mMYopH3t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxiEQvyoUeVKotCbG94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgweXKTuoDmhoXNLf0p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxuWxoVAOVFsqeL4IF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxMXwC3BoT42juee2x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyq6gNj_0Zl1hidWml4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
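A response like the one above can be checked before its codes are stored. The following is a minimal sketch of that step; the dimension names come from the Coding Result table, but the allowed value sets are assumed from the values visible here and may be a subset of the real codebook, and `validate_response` is a hypothetical helper, not part of any named library.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the values
# seen in this page; the actual codebook may define more categories.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "government", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none", "unclear"},
    "emotion": {"fear", "approval", "outrage", "resignation", "indifference", "unclear"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Drop anything that is not a dict carrying a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Keep the record only if every dimension holds a known value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Records that fail validation can then be routed back for re-coding rather than silently entering the dataset, which is one reason to keep the raw responses inspectable as this page does.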