Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "When you *call in to make a payment,* they (AT&T/Verizon/TMobile) *recommend tha…" (`ytc_UgyrxRa_h…`)
- "So, firstly, he suggests humans will have no purpose if they don’t work, yet whe…" (`ytc_Ugxq6tV1b…`)
- "I always thought that AI would be exceptional at replacing CSuite level executiv…" (`rdc_m6xrkfs`)
- "You're very perceptive. All jokes aside this is an astute observation. AI will…" (`ytr_Ugxjua-r7…`)
- "I hate that AI suddenly is synonymous with Large Language Models (LLM). There ar…" (`ytc_UgxK-aAPL…`)
- "And this is why self driving cars will never work in any of our life times…" (`ytc_Ugw9T9gbl…`)
- "I love the Lord Jesus may he continue to raise us in his light who knows what he…" (`rdc_kvu1xip`)
- "What jobs will exist when no products can be purchased by humans who do not have…" (`ytc_UgyZ9txXS…`)
Comment

> If AI can do everything for us, what will be the point of us learning anything. AI will make us dumber , not smarter. Lack of purpose will lead to our extinction if it AI doesn't kill us first.

youtube · AI Governance · 2026-03-23T06:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzS7fsh-Ec4BCD5t0d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxGiPJ8xVUsgR8PxEx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzBFP5uPi4A2q9JDr54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw1xbXHHipJ0SR2C4F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzz8Tntn2azqgB-Rkx4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzwWfP8Hcvn4BWCgN94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzlK400lPyNR5b_hMB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgziufZeSDhhTEwYuRV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwmU98LGz6963ElgG14AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgytFlf8KB__oc0tWK94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
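The raw response above is a plain JSON array, so the "look up by comment ID" behavior this page describes reduces to parsing the array and indexing it on the `id` field. A minimal sketch in Python, using a subset of the records shown above (the indexing helper is an illustration, not this tool's actual implementation):

```python
import json

# A subset of the raw LLM response shown above: a plain JSON array of
# per-comment coding records.
raw = '''[
  {"id": "ytc_UgzS7fsh-Ec4BCD5t0d4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzlK400lPyNR5b_hMB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwmU98LGz6963ElgG14AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "industry_self", "emotion": "indifference"}
]'''

records = json.loads(raw)

# Index the records by comment ID so a lookup is a single dict access.
by_id = {r["id"]: r for r in records}

coding = by_id["ytc_UgzlK400lPyNR5b_hMB4AaABAg"]
print(coding["policy"], coding["emotion"])  # → ban fear
```

The `ytc_` / `rdc_` / `ytr_` prefixes on the IDs appear to distinguish comment sources; the lookup itself treats the ID as an opaque string key.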