Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- "The logic is so loaded in this one. He is ASSUMING self driving cars would kill …" (ytc_Ugw6bpKGX…)
- "I didn't listen to all of this, but it seems she is mainly complaining about cap…" (ytc_Ugx9G7zK4…)
- "@bullettime1116 Then you have ignored over 30+ years of the gaming industry in i…" (ytr_Ugzk9Dgl5…)
- "how much would you bet, that the people calling anti AI folk "luddites" needed t…" (ytc_UgwozPTgn…)
- "Ehh, there are no enforcement mechanisms in place at all, so it’s a stretch to c…" (rdc_gtd8qn8)
- "This is like those science fiction Issac Asimov books about the recurring robot …" (ytc_UgxFVtP-k…)
- "So many of these AI startups will go broke from Triple A buyouts, we are an “AI”…" (rdc_o46dleo)
- "This given terminator and a iRobot vibes if a AI is controlling them and have se…" (ytc_UgwYDkRdB…)
Comment

> I always thought it would create a bigger societal gap because some are very smart and is compared to the AI because many use AI instead of their own minds and we see a significant decrease in creativity in humans. We see less cognitive activity because the brain is a muscle and the less you use it the less capable it is.

youtube · AI Governance · 2024-11-18T19:3… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzviWcWDT9w1pAh9iN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxUgpEm4DOIO7renDR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwKLHbo3Al_KIIFejd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzEqulA5JdSd3MlAJF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwVaftWSecpWod8MNF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugwy4JirtmX8oCacpi94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwomdOYCEwjAYICwqF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwZjmiNEa2A9yXlaLd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwxfdX661-R0INkL-F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyGu9Rz0j07ilxwBzF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
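The raw response is a flat JSON array with one object per comment ID and four coded dimensions. A minimal Python sketch of how a downstream step might parse and sanity-check such a response; the function name `validate_codes` and the allowed-value sets are assumptions inferred from the sample output above, not the pipeline's actual code:

```python
import json

# Two example records: the first is copied from the raw response above;
# the second is a hypothetical malformed record (the value "joy" does not
# appear in the observed emotion set) to show the filter rejecting it.
RAW = """[
  {"id": "ytc_UgyGu9Rz0j07ilxwBzF4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "example_bad_record", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "joy"}
]"""

# Allowed values per dimension, as observed in the sample response above;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"unclear", "regulate", "none", "liability"},
    "emotion": {"indifference", "outrage", "fear", "resignation"},
}

def validate_codes(records):
    """Keep only records where every coded dimension has an allowed value."""
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

kept = validate_codes(json.loads(RAW))
print([rec["id"] for rec in kept])  # only the first record survives
```

Filtering rather than raising keeps a single bad record from discarding a whole batch; rejected IDs could be logged and re-queued for recoding.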