Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click one to inspect the full comment and its coding):

- "they have to be unique enough to not get flagged and ai use is okay…" (ytr_UgynXfRpc…)
- "Nah, I’d say this is a lack of AI - more specifically, the “intelligence” part. …" (ytr_UgyLDO7gm…)
- "humans dont have consciousness from birth. ai can totally gain consciousness imo…" (ytc_Ugyd7AgLn…)
- "First the praise...with your current work, your depth, and breadth of knowledge …" (ytc_UgwsBpIOE…)
- "I wonder if their store in Illinois use that same tech and if they are properly …" (rdc_jckjpts)
- "Thankfully the AI didnt mistake him for a black person or else the police wouldv…" (ytc_Ugwak70_a…)
- "So, it says agencies are watching us, but it can’t tell us it can actively see w…" (ytc_UgyJdrDbn…)
- "This was two years ago, way before ai video generation was released to the publi…" (ytr_Ugx1MSXjH…)
Comment (shown verbatim as submitted):

> AI ACHILLES' HEEL :
> Serious AI requires massive POWER and COMMUNICATIONS infrastructure . AIs worst nightmare is an EMP blast . . . Also It would need much more than cell phones and computers to "hide" This architecture might work for sensors is too inefficient for AGI. 💩
> WE will always be our worst enemy . . . not AGI.

Source: youtube · AI Governance · 2023-07-07T23:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyO13mF0mx1G1LmyY54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxW35pMlkLfTEtfR4Z4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxVRy0E7puJ4yLvRo54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz3PBjEy7YAxRt49Pd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwgVyF4Gj9s2-qY9ux4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx2NN8ZAwgb8uL-gV54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx6IZFc1W6Jr759ikZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzEkc8rH2jmnbLTlBh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx-QsGvAz0c2MhtgSJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwgvZAX3XbPEo2DPd14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]