Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "There's no reason to hire a human if AI can do it" - this is wrong because a hu… (`ytc_Ugx3EU-s1…`)
- Funny, I try playing this game with my ChatGPT, but it says it "can't agree to t… (`ytc_Ugy0YtR2K…`)
- We're not even close to superintelligence... These are still just narrow form AI… (`ytc_Ugy02rW0r…`)
- You see it's you either know how to use ai or you don't. And you just don't. Wha… (`ytc_UgzpVimlU…`)
- Okay but it objectively isn't here to stay. The amount of electricity and other … (`ytr_Ugzqb9lt3…`)
- If you are using Autopilot, it will NOT work if you are using it on ramps and ev… (`ytc_UgyU_k8x0…`)
- If the number of people losing jobs keeps increasing and number of people born k… (`ytc_UgzltSn-u…`)
- Ive never used ai in a helpful way and i doubt there is a helpful way to use it.… (`ytc_UgydsDRld…`)
Comment
Vance doesn't know what he's talking about.
He doesn't understand the risks that AI poses, nor does he care if jobs are lost.
The EU has long had well thought out policies that support economic growth, protect human rights and support environmental sustainability.
If anything, the EU should be educating the Trump administration on how to govern properly!
youtube · AI Governance · 2025-05-28T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgyA5O2cN317n28mzc14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyUXJ36Plk2F9ez1714AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw8g6pqNumokq-2Fa54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzHq44hPWZpWFsTrpB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxpp6Qoha4C23HTMB54AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzyRhvBDhRSzya4Tk14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5gRbC2ET6VZMaFOB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzLzNYA-UE89PJvcsF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwcL1qpuPiD9l3WTk94AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyLJaQiXaOlC0-sVz54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]