Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Okay, so while I won’t argue.. I can confirm I’m much faster with AI. Yes, if I …" — `ytc_UgzrZJeFt…`
- "i did not expect a comment about gooning to be one of the best ones going agains…" — `ytr_Ugyk-8b5F…`
- "AI companies should stay the fuck away from the arts. Music, movies, photography…" — `ytc_UgyBZN_nf…`
- "I think the scary part is not how safe the cars are. But the fact they face almo…" — `ytc_UgyOqFbRE…`
- "Although many people say Waymo is safer because it uses lidar, I'm not sold... …" — `ytc_Ugx5nTOBX…`
- "All stuff that been made with ai, should have a mark! So people with self-respec…" — `ytc_Ugy_hgrlc…`
- "You work in automation, do you? That's interesting, since the problem with auto…" — `ytr_Ugxkd7NV2…`
- "@EVILFREAKINGCAT ai is a tool for making art. That's like saying "art needs int…" — `ytr_Ugw33ljPr…`
Comment
I wouldn’t mind a world where humans were scared of AI and if you did what they wanted, you’d be good and all they wanted was a utopia… no hate, no harm. So if you decide to life a live of peace in your community you would all flourish with the help of AI…
… but humans created AI and humans get angry when we don’t get what we want so of course AI is smarter and better than us, but they learned from us, so they will probably be just as destructive as we are but maybe only destructive towards humans? Maybe towards the whole planet? I have no idea lol
youtube · AI Governance · 2023-07-07T19:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgwU8kcBHycmSXkE0OB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzy8Cv6-hz8DOEb7014AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwQQonpNMAV-8Pfc-t4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJhgwfaD5CDa-d82x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzGlzslGDAghcWfjFB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzgp0aGFakMYES1h7h4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw-Y2neFcxb0LSjlFF4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugweg_5Mzbcmav2CpkV4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwGGTmopjfVtI6bMR94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzBUqoX6YWTIl7t-w94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]
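The look-up-by-comment-ID step can be sketched in a few lines of Python: parse the batch response as a JSON array and index the records by `id`. This is a minimal illustration, not the tool's actual implementation; the two records are copied from the response above, and the variable names are illustrative.

```python
import json

# Two records copied from the raw LLM response above; in practice this
# string would be the full model output for the batch.
raw_response = """[
 {"id": "ytc_UgwU8kcBHycmSXkE0OB4AaABAg", "responsibility": "none",
  "reasoning": "unclear", "policy": "none", "emotion": "approval"},
 {"id": "ytc_Ugw-Y2neFcxb0LSjlFF4AaABAg", "responsibility": "government",
  "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"}
]"""

records = json.loads(raw_response)

# Index the batch by comment ID so any coded comment can be inspected directly.
by_id = {rec["id"]: rec for rec in records}

hit = by_id.get("ytc_Ugw-Y2neFcxb0LSjlFF4AaABAg")
print(hit["policy"])  # → regulate
```

If a comment's ID is missing from the parsed batch (as when the model's output fails to parse), `by_id.get(...)` returns `None`, which would explain a coding-result table falling back to "unclear" for every dimension.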