Raw LLM Responses
Inspect the exact model output behind any coded comment. Enter a comment ID to look it up directly, or click one of the random samples below to open it; a minimal lookup sketch in code follows the sample list.

Random samples (click to inspect):
- Is AI dangerous to humans? Depends on who is using it. Nuclear weapons are dan… (ytc_UgxSwuFHV…)
- my most normal AI chat basically goes like "A 3 CHERRY CARD A DAY MAKES THE 403 … (ytc_UgwTCOOrG…)
- I would simply not allow AI and, if need be, computers i the classroom. I would … (ytc_UgymsAWGD…)
- If 2 driverless cars get into a wreck in the woods, did it really happen?… (ytc_Ugxs7qw-h…)
- If you are a construction worker you will survive this for awhile anyway till th… (ytc_UgwqlOGGx…)
- Or maybe the ai is 100 percent right because these data sets contributed the mos… (ytc_UgyYvSe4W…)
- So if a mad scientist produces an army of robot assassins and unleashed them, th… (rdc_dy4e6f2)
- The real reason for AI is one day to get rid of people... not just jobs… (ytc_UgwatLl89…)
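Programmatically, the same lookup is just a keyed read over the coded records. A minimal sketch, assuming the records are stored one JSON object per line in a hypothetical `coded_comments.jsonl` file with `id` and `raw_response` fields (the file name, storage layout, and field names are assumptions, not confirmed by the tool):

```python
import json

def load_index(path: str) -> dict[str, dict]:
    """Map each comment ID to its coded record.

    Assumes one JSON object per line, each with an "id" field
    (the storage layout is an assumption for this sketch).
    """
    index = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            index[record["id"]] = record
    return index

# Hypothetical usage: fetch one record by ID. This ID appears in the
# raw batch response shown further down in this section.
index = load_index("coded_comments.jsonl")
record = index.get("ytc_UgwDdhs5mnavyANgxjt4AaABAg")
if record is not None:
    print(record["raw_response"])
```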
Comment

> I'm all for having Robots for the dangerous jobs initially to make sure the scene is secure and safe like police traffic, hostage, domestic, etc. fire fighters, doing roofing high rise buildings, etc.
> Once the scenes are secure, then bring in the humans and allow them to do the interactions.
> I Always bi pass AI and speak to a representative.
> The question is how can We The People stop Ai from taking Our World Over. Taking our jobs.

Source: youtube · 2026-04-21T08:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | mixed |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
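A coded record is only valid if every dimension takes a value from the codebook. A small validation sketch, with category sets inferred from the values visible in this section (the full codebook may define more categories than these):

```python
# Allowed values per dimension, inferred from the sample output in this
# section; the real codebook may include additional categories.
CODEBOOK = {
    "responsibility": {"government", "company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none"},
    "emotion": {"fear", "outrage", "resignation", "approval", "unclear"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dimension, allowed in CODEBOOK.items():
        value = record.get(dimension)
        if value not in allowed:
            problems.append(f"{dimension}: unexpected value {value!r}")
    return problems

# The record shown in the table above passes cleanly.
print(validate({"responsibility": "user", "reasoning": "mixed",
                "policy": "liability", "emotion": "approval"}))  # []
```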
Raw LLM Response
[
{"id":"ytc_Ugzf2HigqT111CeGW_N4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwWYYrsfUJGU_OltgR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzZkxaWTLEdLtbIfxh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"unclear"},
{"id":"ytc_UgwDdhs5mnavyANgxjt4AaABAg","responsibility":"user","reasoning":"mixed","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugxts2689sa-Jyz9ZdN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzB3tysmNOqPaDLlHN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzgedI0JoOL4oSIlRF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw6KtI4NJYr2v4gHtp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwKm-ioPIWe_4BAnL94AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyzzWfPxrmDvSLPZIV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"}
]
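Because comments are coded in batches, the raw response is a JSON array, and the row for a particular comment has to be picked out by its `id` field. A minimal parsing sketch (the single-row payload here is illustrative; in practice you would pass the full array shown above):

```python
import json

# Illustrative single-row payload; in practice this is the full batch
# array returned by the model, as shown above.
raw_response = """[
  {"id": "ytc_UgwDdhs5mnavyANgxjt4AaABAg", "responsibility": "user",
   "reasoning": "mixed", "policy": "liability", "emotion": "approval"}
]"""

# Index the batch by comment ID so any coded comment can be pulled out.
rows = {row["id"]: row for row in json.loads(raw_response)}
print(rows["ytc_UgwDdhs5mnavyANgxjt4AaABAg"]["policy"])  # liability
```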