Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytr_UgyGgONUz…`: "Ai use final work at material not just learning you just skip and don't know abo…"
- `ytr_Ugyles236…`: "Anyone can draw and if you say you can’t you just haven’t done the work drawing …"
- `rdc_e7jcw1i`: "As you seem to understand some of this, I see another inherent problem: In a ca…"
- `ytc_Ugz_1v9V-…`: "I have neverrrr been able to learn online. This is something that would work aga…"
- `ytc_UgyhBGCAo…`: "The disaster with using AI to operate is no accountability. No learning and no …"
- `ytc_UgyQgvAj2…`: "Superintelligence AI won't have the motive to take over control of humanity. Be…"
- `rdc_mye6o7l`: "This is why I'm leery about opening myself to ChatGPT. I don't mind using it for…"
- `ytc_Ugiu3igcs…`: "They are ignoring something important here. If we purposefully develop sapient A…"
Comment

> As the risk of AI increases, the adoption of AI slows. The same thing has already happened with self-driving cars (the tech was created a decade ago, yet humans have chosen not to allow them on our streets). AI will be no different - when it becomes dangerous, it will be regulated and/or banned.

Platform: youtube
Topic: AI Governance
Posted: 2025-06-16T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyJv7o5dpFRjhfqKOp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw6-ctKmxFl2vgITzN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwhaRTotNhi-72fhh54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx8wXvQcU-fm_03Jjh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxxsKI_VgmLKV4OsrR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzmVMUMyyM7gZzWzal4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzMQD-z5M-bPvr6LpN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxDlTeWZ0fXSjqCKhh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxAhdK1KgnZbXjW_0F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyokSLoknCeP2M0lFR4AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"mixed"}
]
```
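A batch response like the one above can be checked programmatically before the codes are stored. The sketch below parses the raw model output and verifies that every row carries a comment `id` and that each dimension takes one of the expected values. The allowed sets are inferred only from the values visible on this page (hypothetical: the actual codebook may define additional categories).

```python
import json

# Allowed values per coding dimension, inferred from the values seen in this
# dashboard. This is an assumption: the real codebook may include more labels.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check every coded row."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing comment id: {row}")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim!r} value {value!r}")
    return rows
```

Rejecting the whole batch on the first bad value is a deliberately strict choice; a pipeline could instead log offending rows and re-code only those comments.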