Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "@swamp-k6w sorry you got strawmanned, but your comment is defending the people t…" (ytr_UgwKQRIgF…)
- "From a moral perspective, I don't see it as very different. AI isn't psychic. …" (rdc_lgn9xhu)
- "Imagen some ai pirat watch you so he know what is he supouse to dtay avay…" (ytc_UgzZ4f0PR…)
- "How long before we give enough training data to ChatGPT to give it depression? S…" (ytc_UgyHMRf3w…)
- "This is why all I use it for is learning concepts. For example, I've been using U…" (ytc_UgxpoKwQD…)
- "Thank you so mich for this video. Hearing this support from big artists is very …" (ytc_UgzLo3r4B…)
- "I hope the AI bubble bursts and these companies laying off employees realize it …" (ytc_UgzrE-rn6…)
- "think this through. they are telling us that AI is dangerous, that it will kill …" (ytc_UgwzVonEx…)
Comment
By 2027, there won’t be any AI security specialists left, and they’ll be out of work too...
But seriously: there are many types of AI that require vast databases and extensive training to function properly; on top of that, they consume a lot of energy; and they need to generate a financial return.
If AI destroys so many jobs (it’s already doing so...), this will further undermine human labor, lead people to elect “crazy” figures like Trump, and, worse, it will also destroy markets, since the underemployed don’t consume and don’t generate profits for large corporations.
It’s not quite that simple or immediate, but it’s worth issuing a red alert!
Source: youtube · Topic: AI Governance · Posted: 2026-04-24T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx3S9OkIiOvWENKG_t4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyRODo4_sFEAnQIaoh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwW7KoYWAbeOmv7-zp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwLko385PPmc_LzZaR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxJ0tWzXsdm3PtIN3J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw3nWCN5p38r8dc31N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzor8-mzVd-DdVM-pF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz3gRtrS0u-34BIJ754AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwI0pNobEJofYgRQQ54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzugsn5OFjSKQBWXoR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
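The raw response is a JSON array with one record per comment, each carrying the four coded dimensions. Before trusting such output, a pipeline would typically parse it and drop malformed records. Below is a minimal Python sketch of that step; the allowed value sets are inferred only from the values visible above (the real codebook may define more categories), and the `validate_codes` helper and sample input are hypothetical.

```python
import json

# Allowed values per dimension, inferred from the responses shown above;
# the actual codebook may permit additional categories.
ALLOWED = {
    "responsibility": {"company", "government", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip records missing the comment ID
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical input: one well-formed record and one with an
# out-of-codebook value, which gets dropped.
raw = '''[
  {"id":"ytc_UgwLko385PPmc_LzZaR4AaABAg","responsibility":"company",
   "reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"bad_record","responsibility":"aliens",
   "reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''
print(len(validate_codes(raw)))  # only the first record survives
```

Validating against a closed value set like this catches the common failure mode where the model invents a category outside the codebook.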