Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "See, here's my take on what's going to happen: Eventually, when AI takes everyo…" (ytc_Ugwkbxr8t…)
- "AI will increase humanity's productive capacity, cure formally incurable disease…" (ytc_Ugyl2UOxF…)
- "👏👏👏👏👏 As an artist I heard it all, even before AI: It is not fair coz you have 3…" (ytc_Ugwq1u8Fr…)
- "as someone who used to draw, ai art lacks the criativity and the imperfection of…" (ytc_UgztJq3tf…)
- "well i was wondering when the movie was gunna become irl lets get ready to get s…" (ytc_UgxnsNQic…)
- "With the new artificial skin.. that will be 20 years improved? 🤔. Probably be a …" (ytc_UgxON4z76…)
- "Remember you draw with your heart you put your hard work into the art but the ai…" (ytc_UgwjzC22L…)
- "AI only can do limited things, i am currently working with solid.js which still …" (ytc_UgysmISQX…)
Comment
The problem is not in the promised AI intellectual capabilities, it is in its ability for "it" to fool us into thinking that they're already here. The danger lies in human gullibility, which will inevitably result in pseudo-AI being employed/deployed prematurely, before it is up to the demands placed on it. The best example, though I am sure there are countless others, might be the Prince of F*ckstickery's flailing attempts to convince the world that Tesla's "self-driving" technology is even extant, let alone ready. So far, he's done neither. The fact that there's a body count should justify my trepidation. Call me when "AI" stands for something like Actual Intelligence. No, scratch that. Call me when El Musko exhibits something that approximates actual intelligence. That should keep my phone quiet for the remainder of my stay on this dog-forsaken planet.
youtube · AI Governance · 2023-07-07T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugw3tDMmZ2aGt3qT47R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxJJUI43x7hEmGdjOR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzsejNl9DDCoJ3VQ5J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwmkYXHbu59hiYo_N14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgwDj1Sr31YPA4M9nkt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyi-3uqdv0kolKaHm14AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyDoLLCPfCQEkq-yNx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzqakZICgOSCatNR394AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyhvZGKSsvJSovG27x4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxUvuJ0WF6Zc_snAKx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
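A batch response like the one above is only useful if every record parses and every dimension carries a value the codebook recognizes. A minimal validation sketch, assuming the allowed value sets are exactly those observed in this section (the real codebook may define more codes):

```python
import json

# Allowed values per coding dimension, inferred from the codes visible
# in this section; treat these sets as an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"fear", "outrage", "resignation", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM response string and return a list of problems found.

    An empty list means every record parsed and every dimension held an
    allowed value.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]

    problems = []
    for rec in records:
        rid = rec.get("id", "<missing id>")
        # Comment IDs in this dataset appear to use a "ytc_" prefix.
        if not str(rid).startswith("ytc_"):
            problems.append(f"{rid}: unexpected id prefix")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append(f"{rid}: {dim}={value!r} not in codebook")
    return problems
```

For example, `validate_batch('[{"id":"ytc_x","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"}]')` returns `[]`, while a record coded `"policy":"tax"` would be flagged. Running an LLM batch through a check like this before ingesting it catches hallucinated codes and truncated JSON early.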