# Raw LLM Responses

Inspect the exact model output behind any coded comment. A comment can be looked up directly by its comment ID, or picked from the random samples below.

## Random samples
- "I disagree about Google not needing regulations. They definitely play the algori…" (`ytc_UgxRBUKo6…`)
- "Most people don't understand computing power doubles every 18 months. That means…" (`ytc_UgxNhQfo1…`)
- "It is nefarious. Google (Youtube) is all in on the surveillance state agenda. Th…" (`ytc_UgzvU20ta…`)
- "Oh please don’t give up on your art. AI images may look good, but they have no s…" (`ytr_UgzMp5Wo4…`)
- "I actually did the saturation method on the ai piece and a BUNCH of pixels on th…" (`ytc_UgzDwgpQQ…`)
- "First of all, AI should be morally human, based on the 10 commandments. Second, …" (`ytc_Ugwbksmqb…`)
- "An AI cannot lie, lying requires 3 things 1. Knowledge and awareness of the tru…" (`ytc_UgxHoU4Pl…`)
- "I mean...sure 100 years. But how short is 100 years in AI learning time? Id wag…" (`ytc_UgyViUKH0…`)
## Comment

> Think of AI as the digital equivalent of Wuhan gain of function. Right now, it seems that every nation is competing to make the most virulent AI and no one is researching how to make the equivalent of an MRNA vaccine to stop AI when (not if) it gets out of control.

youtube · AI Governance · 2025-08-13T18:0…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
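Each coded comment carries four dimensions with a small closed set of values. A minimal validation sketch follows, assuming a code book inferred only from the values visible on this page (the real code book may contain additional categories):

```python
# Allowed values per dimension, inferred from the samples on this page.
# ASSUMPTION: the actual code book may define more categories than these.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "mixed",
                "indifference", "approval"},
}

def invalid_dimensions(record: dict) -> list:
    """Return the dimension names whose value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above.
sample = {"responsibility": "distributed", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "fear"}
print(invalid_dimensions(sample))  # []
```

A check like this is useful because the codes come straight from model output, which can drift outside the expected vocabulary.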
## Raw LLM Response

```json
[
  {"id":"ytc_UgwiiAT3w22tKQxrSWR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw-51nB-WYUhAFEG-F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzJVv-BDmSBHH-wzcl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyjxucyba9vV9bIjKB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxYuqjHdXZ46TMRbHB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzZ_JSUhjdREhewDE54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxKpVtuTsFzuvyanvx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzIUsoqmR2G_BbCTIZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxpIf_yr33iQkzEF0l4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxl3320fGFeLEQICed4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
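A minimal sketch of how a raw response in this shape can be parsed and indexed for lookup by comment ID. The two records below are a trimmed excerpt of the response above; the parsing itself assumes only that the model returns a JSON array of objects with an `id` field:

```python
import json

# Raw LLM response: a JSON array of per-comment coding records
# (excerpt of the response shown above).
raw_response = """
[
  {"id": "ytc_UgwiiAT3w22tKQxrSWR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzJVv-BDmSBHH-wzcl4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

records = json.loads(raw_response)

# Index the records by comment ID so any single coding can be looked up directly.
by_id = {rec["id"]: rec for rec in records}

code = by_id["ytc_UgzJVv-BDmSBHH-wzcl4AaABAg"]
print(code["responsibility"], code["emotion"])  # distributed fear
```

Indexing once into a dict keeps each subsequent ID lookup O(1), which matters when cross-referencing many coded comments against their source threads.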