Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up by comment ID.
Random samples

- "I just want to point out that the luddites AIbros like to refer too as a shortha…" (ytc_UgxIJuxFD…)
- "The part that these people who are obsessed with AI don’t understand, is most pe…" (ytc_Ugyawh26V…)
- "I hate AI an no. I feel safer with AI than government. AI hasn't tried to forc…" (ytc_UgyJRR5CY…)
- "This seems a bit unprecedented. If a gun manufacturer isn't responsible for a gu…" (rdc_ljom9kh)
- "make any attempt to say chatgpt isnt that great or makes tons of errors and peop…" (ytc_UgwOdc489…)
- "2:23 I'd argue that NOBODY has any idea how LLMs work - they are indeed black bo…" (ytc_UgzGLFB8c…)
- "Who will buy the stuff the companies create with AI when no one has an income? A…" (ytc_UgxTX0Txl…)
- "If AI runs everything ... where is the money for products come from. If people…" (ytc_Ugzq13SUy…)
Comment
I fear that as long as companies motivated by pure profit and rapid growth are driving the development of A./I, then it's more likely that A.I will become the threat so many are concerned over. Remember, human greed and ambition knows no bounds and has caused countless tragedies over the years. As long as these people are in control, we're in trouble.
Add to that that A.I development is the new space race, a race that mega companies are desperate to win, safety will always come second. History is replete with examples of corporate greed and I fear the A.I race will only magnify this likelihood.
Source: youtube · Video: AI Governance · Posted: 2023-12-31T06:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx2migULC-5BjP3ckp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw3eGz_yCn6_SvrDLd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxmNJfkEEVp3MyyfrR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx9NdwnKBTAE4fodqp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwgoUv1K4BzqdYdyCZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxmUvs3Dr__L_nCNrZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy_X3EKVIq0VONYgmF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwNw6vgBTOHkJrOBap4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzyMIzkp6SIJmPGaOZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz5crK7eG4I1CPjbsJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]
```
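Each row of the raw response codes one comment on the same four dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of validating such a response before storing it, assuming the allowed value sets are exactly those observed in the sample above (an actual codebook may define more categories):

```python
import json

# Allowed values per dimension, inferred from the sample responses above;
# a real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"approval", "outrage", "indifference", "fear", "mixed"},
}

def validate_coding(raw):
    """Parse a raw LLM response and check every row against the schema."""
    rows = json.loads(raw)
    for i, row in enumerate(rows):
        if "id" not in row:
            raise ValueError(f"row {i}: missing comment id")
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(
                    f"row {i} ({row['id']}): {dim}={value!r} not in codebook"
                )
    return rows

# Example: one row from the response above passes validation.
raw = ('[{"id":"ytc_Ugx9NdwnKBTAE4fodqp4AaABAg","responsibility":"company",'
       '"reasoning":"virtue","policy":"regulate","emotion":"fear"}]')
rows = validate_coding(raw)
print(rows[0]["emotion"])  # fear
```

Validating against a fixed value set catches the common failure mode where the model invents an off-schema label (e.g. "anger" instead of "outrage") that would otherwise silently pollute downstream counts.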