Raw LLM Responses
Inspect the exact model output for any coded comment; comments can be looked up by comment ID.
Comment
It's funny that they say "we have paused other technologies".... Could you name me one ?
How about Gain-of-Function research, which gave us the Wuhan Virus...? Nope...
How about Chemical weapons development..? Nope..
How about Nuclear weapons development..? Nope..
Even if a framework of safety is put in place, the governments will simply make themselves exempt, and become humanity's worst enemy...
My prediction is the security agencies like the CIA/NSA or military intelligence will be the biggest proponents of AI, and will eventually give us Skynet.... Hmm..
Platform: youtube · Topic: AI Governance · Posted: 2023-03-30T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
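The coded dimensions in the table are taken from the batched JSON the model returns (shown under "Raw LLM Response"): the response is parsed and indexed by comment ID so a single comment's codes can be looked up directly. A minimal Python sketch, assuming the field names shown in the response (the `raw` string here is a one-record excerpt of the real batch):

```python
import json

# One record excerpted from the raw batch response below.
raw = '''[
 {"id": "ytc_UgwpIwmMStqU8q1B78x4AaABAg",
  "responsibility": "government", "reasoning": "deontological",
  "policy": "none", "emotion": "outrage"}
]'''

# Index the parsed records by comment ID for direct lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw)}

codes = by_id["ytc_UgwpIwmMStqU8q1B78x4AaABAg"]
print(codes["responsibility"], codes["emotion"])  # government outrage
```

The same dictionary lookup backs the "look up by comment ID" feature: one parse of the batch, then O(1) access per comment.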
Raw LLM Response
[
{"id":"ytc_Ugw2fk2Kp2ZS__IEk3d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwklHhmSyIKki-dGxd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgzFl5Qnvd4D2e-9RJV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxBbjHdNEXCBjidVfZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzOEgFpGb1O8aFy9l14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyn8vRMgBLNpaTFoeJ4AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxcQDx4AdRgBphhK2V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwpIwmMStqU8q1B78x4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwKdCGtVWu6CoEWmrl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzWqGaASr6elrTgeeR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
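Because the model returns free-form JSON, each record should be checked against the coding vocabulary before it is stored. A hedged sketch of such a check: the `ALLOWED` sets below list only the values observed in this particular response (the actual codebook may define more), and `invalid_records` is a hypothetical helper name.

```python
import json

# Category values observed in this response; the real codebook
# (an assumption here) may include additional values.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "government"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate"},
    "emotion": {"mixed", "indifference", "fear", "approval", "outrage"},
}

def invalid_records(raw: str) -> list:
    """Return (id, field, value) triples that fall outside the vocabulary."""
    errors = []
    for rec in json.loads(raw):
        for field, allowed in ALLOWED.items():
            if rec.get(field) not in allowed:
                errors.append((rec.get("id"), field, rec.get(field)))
    return errors

# A record with a made-up "karmic" reasoning value fails the check.
sample = ('[{"id": "x1", "responsibility": "government", '
          '"reasoning": "karmic", "policy": "none", "emotion": "fear"}]')
print(invalid_records(sample))  # [('x1', 'reasoning', 'karmic')]
```

Records that fail the check can be re-queued for recoding rather than written to the results table with an out-of-vocabulary value.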