Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- I feel like someone who is uncertain if humanity should continue existing is a t… (ytc_UgzRpptbv…)
- Thank you so much for showing us the bias in these AI systems. Racism is still a… (ytc_UgxMzfl2J…)
- The moment they start to use AI more in autonomous war machines, we will enter t… (ytc_Ugy8AZWjC…)
- "How can you complain about automated machinery taking jobs when you use a screw… (ytc_UgxLiPepS…)
- Dude we need a talk why are you blowing up anything you see in this ai app… (ytc_UgyrGMAmr…)
- This is the exact reason we should do dih biometric data verification instead of… (ytc_UgxcZQ4K8…)
- Given the huge impact, I would say a 24-hour work week with no loss in pay makes… (ytc_UgzUsrRoO…)
- I think the self-driving car always should favour the lives of the people outsi… (ytc_Ugx-IgbdL…)
Comment

> Even if we were to allow the government to regulate AI who's to say other governments won't use it to try and surpass us and try to put us in a bind

youtube · AI Responsibility · 2023-07-16T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
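Each coding result is a small record over four fixed dimensions. A lightweight sanity check before storing a record is to confirm every dimension holds a known value. The sketch below uses only the values that actually appear on this page (`ALLOWED` is assembled from the samples shown here and is likely not the full codebook), and the `validate` helper is illustrative, not part of the tool:

```python
# Dimension value sets observed in this batch. Assumption: these are
# only the values visible on this page, not the complete codebook.
ALLOWED = {
    "responsibility": {"government", "company", "developer",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological",
                  "contractualist", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "approval", "mixed"},
}

def validate(record: dict) -> list:
    """Return the dimensions whose value falls outside the observed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The record from the Coding Result table above.
coded = {"responsibility": "government", "reasoning": "contractualist",
         "policy": "regulate", "emotion": "fear"}
print(validate(coded))  # → []
```

An empty list means the record is consistent with the observed vocabulary; anything else flags a dimension worth re-inspecting in the raw model output.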
Raw LLM Response
```json
[
  {"id":"ytc_UgwmNJ_snnbUT6iArX54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwStWdzogcjetMcTgt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwy-ds5OFprbbr1poZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx906ahqhQZ5yPqM_h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxRcZrFPKywChdjt4l4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyomQeBIP40yhc8x1p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyrCCdhBE-WkkOA8UJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx7dIYS-p0MipO3QTZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw_0HZ0bzj2P4YmYuB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgySDYrGMxVO42ccFGV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
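The "look up by comment ID" flow above amounts to parsing the model's batch output and indexing the records by `id`. A minimal sketch, assuming the response is valid JSON in exactly the array-of-objects shape shown (the `index_by_id` helper name is illustrative; the two sample records are copied from the batch above):

```python
import json

# Two records copied verbatim from the raw batch response above.
raw_response = """
[
  {"id": "ytc_UgxRcZrFPKywChdjt4l4AaABAg", "responsibility": "government",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugx7dIYS-p0MipO3QTZ4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a batch coding response and index each record by comment id."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
print(codings["ytc_UgxRcZrFPKywChdjt4l4AaABAg"]["policy"])  # → regulate
```

In practice the raw model output may also need light cleanup (stray text around the JSON, markdown fences) before `json.loads` succeeds; this sketch skips that step.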