Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Its nonsense to sit here and listen to a robot tell us that humans could be bett…" (ytc_UgyVmejoG…)
- "The never ending battle between grabbing attention and addressing issues people …" (ytr_UgwdRD2Kb…)
- "Sub text “we’d like to drag you down to our level from a food safety and pharmac…" (ytc_UgytBqTfE…)
- "Basically photos/videos can no longer be treated as something absolute. Society …" (rdc_izkuu2e)
- "When fighting a robot attack the legs get it to the ground then disconnect the m…" (ytc_UgzIToq73…)
- "Does this only make it hard for the ai to copy or does it like make it literally…" (ytc_UgytXpgoQ…)
- "Are governments asleep. If millions are going to be unemployed then millions of …" (ytc_UgzR3ysrX…)
- "What if you don't have the rmoney or right to a lawyer and you're defending your…" (ytc_UgxfLfzbc…)
Comment
This is very sobering and, even though I embrace technology (I'm a digital immigrant), AI is something that gives me pause. At some point we need to ask, "We are so pre-occupied with whether we can, we need to stop and ask if we should." After listening to this podcast, it might be too late. Will the world change this drastically this quickly? I don't think so. I am hoping that before we hand over all intelligence to the 'bots,' we will step back and move in another direction. If people aren't earning/receiving an income, what is the purpose of having robots manufacturing products or providing a service if there isn't anyone to pay for that? An idle, non-productive population will not last, and when the work base is no longer there, everything will collapse.
youtube
AI Governance
2025-09-04T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz3a47Q2o3jZ4ZKVeh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyTNov40IYNhUhULMB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxjjplBQ1ilwAy4WBV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx1g8rFTxmfdUaDZ_h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzV3MbFohPzOyY-wl54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzvZdzDfRoDnXVwYVl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzkZHRHVkT117W9u9Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzsyk89KjYwb4gPwjJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzwEWiWxfvKNyUGb9J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyj4gsfVUxJLiWDupB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
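A minimal sketch of how a raw response like the one above can be parsed and indexed by comment ID before the coded results are stored. The allowed vocabularies are inferred from the values visible in this section; the real code book may differ, and the comment IDs in the example are hypothetical placeholders.

```python
import json

# Allowed values per dimension. Inferred from the samples shown above;
# the actual code book may include more categories (an assumption).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"indifference", "mixed", "fear", "resignation",
                "approval", "outrage"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response and index validated codings by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        comment_id = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{comment_id}: bad {dim!r} value {row.get(dim)!r}")
        coded[comment_id] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical IDs, for illustration only.
raw = '''[
  {"id":"ytc_example1","responsibility":"none","reasoning":"unclear",
   "policy":"unclear","emotion":"indifference"},
  {"id":"ytc_example2","responsibility":"developer",
   "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

coded = parse_codings(raw)
print(coded["ytc_example2"]["policy"])  # regulate
```

Rejecting out-of-vocabulary values at parse time keeps malformed or hallucinated codings out of the results table rather than surfacing later as unexplained dimension values.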