Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Yeah. I think we're very close to end times. Many of us feel it. AI is not sitti…
ytc_Ugzu5APrX…
If an ai is smart enough to have feelings and make its own decisions it’s smart …
ytc_UgwDbpO3j…
0:45 As much as I hate generative ai videos and images. This at least has its ow…
ytc_UgwHZe_ut…
Long story short: AI is gonna be very awesome or very terrible! World will chang…
ytc_UgxLug3Cz…
Great video, I enjoyed the discussion. It could have done without the political …
ytc_Ugxvg52CT…
Just my immediate and initial 2 cents in the matter:
On the down side...
When …
ytc_UgwIZ9u9j…
Learn to use the new tools, like everyone in tech should be doing if they want t…
rdc_ohmzwjy
How do you send probes into space without ai, how are autonomous cars meant to w…
ytc_UgykgIJh7…
Comment
In my opinion we need to stop AI right now no way because then China will still keep developing theirs and could possibly take over the technology landscape so what do we do we have to do something something has to be done it's only going to get better and it's only a matter of time before it gets in the wrong person's hands how are we going to navigate this unpredictable situation
youtube
2025-07-31T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugwb_5qQo5CH6Ty38HZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzaMbLtBFcZw4DM_J14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxO70uZagwMPEVsiPl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzaPFwEtq8s86J-T9Z4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxp_GCbfQUdcTz1UrB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwGMAPbRtEZAp3J3EV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzXzMcjvWvRKbgcX654AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxGTCJ1iWNcoB8gAaV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgygPUOOwxyJhX_pPE94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxfK3Vyo7hVv2VU-1t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"regulate","emotion":"mixed"}]
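The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and sanity-checked follows; the allowed label sets are inferred only from the values visible in this sample, so the real codebook may include categories not listed here.

```python
import json

# Label sets per dimension, inferred from the sample records above.
# ASSUMPTION: the full codebook may define additional categories.
ALLOWED = {
    "responsibility": {"distributed", "company", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "indifference", "resignation", "approval", "outrage", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of coded comments)
    and reject any record whose value falls outside the expected labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Hypothetical one-record response in the same shape as the sample above.
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
records = parse_coding_response(raw)
print(records[0]["policy"])  # -> regulate
```

Validating against a closed label set at parse time catches the common failure mode where the model invents an off-codebook category, before the record reaches the results table.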