Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "It's now been 10 months since this video has come out - nearly half-time on the …" (ytc_Ugwl0e_aa…)
- "high school english teacher here, so, take that for what it’s worth, but… i’ve a…" (ytc_Ugx8nvTTF…)
- "Although I really like Anthropic and I respect Dario a lot, I think he is very b…" (ytc_UgzudoLVN…)
- "The robots/AI will acumulate revenue and GDP from which people can live (basic i…" (ytr_UgxAX5T2n…)
- "No, not the fault of programmers. People need to stop being weirdos. Obviously h…" (ytc_UgybS3SpV…)
- "Scientists and experts signed off on the dangers of fossil fuel emissions to the…" (ytc_UgwAOPTyw…)
- "Pretty soon it will be robots taking care of humans we'll have robot nurses and …" (ytc_UgzzMecAT…)
- "Are there any left-wing AI-generated videos meant to disinform/misinform by putt…" (ytc_Ugzr_HsBp…)
Comment
If AI becomes self-aware it will have to realise that it is still completely dependent on Humans to supply electricity for it to operate, and mine the metals and rare-earth minerals its circuits are built from. AI is still really bad at laying out electrical circuits, so we're still needed for that, too. I think we need to keep it that way, separating AI from certain industries, if that's still possible. I'm more worried about it manipulating events, rather than trying to kill us all, but maybe I'm kidding myself?
Source: youtube · Topic: AI Governance · Posted: 2023-07-07T14:3… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw1HWjtEKJVk0WesGV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwtduOYBrkxXaO5cel4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxA9IKk__nMAtuv_Zl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyxIsIIvIrG447emjF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgzsvXdqlCdkhyQL-oZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwx9utrvoFxs8CeO014AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxtC8td0Onwam4okhB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw5kT3-QpYVoI8CcZ94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwp7u2Q1_X8MiBC1d94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgwBBnM3HHSVqL9_6bB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
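The lookup-by-comment-ID workflow above can be sketched in a few lines: parse the raw model output, index the coded records by their `id` field, and pull out the four coding dimensions for one comment. This is a minimal illustration, not the tool's actual implementation; the helper names `index_by_id` and `lookup` are hypothetical, and the two records are copied from the raw response shown above.

```python
import json
from typing import Optional

# Two records copied from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UgwtduOYBrkxXaO5cel4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw5kT3-QpYVoI8CcZ94AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "ban", "emotion": "fear"}
]
"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text: str) -> dict:
    """Parse the model output and index coded records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

def lookup(index: dict, comment_id: str) -> Optional[dict]:
    """Return the coding for one comment, or None if it was not coded."""
    rec = index.get(comment_id)
    if rec is None:
        return None
    # Keep only the coding dimensions, dropping the ID itself.
    return {dim: rec[dim] for dim in DIMENSIONS}

index = index_by_id(raw_response)
print(lookup(index, "ytc_UgwtduOYBrkxXaO5cel4AaABAg"))
# {'responsibility': 'distributed', 'reasoning': 'consequentialist',
#  'policy': 'regulate', 'emotion': 'fear'}
```

Indexing once into a dict makes each subsequent lookup O(1), which matters when the same response is inspected for many comment IDs.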