Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "1:00:19 this is the timeline where ‘future’ AI is running the simulation to see …" (ytc_UgwOFAFs1…)
- "The AI predicted a black man to get shot based on a number of factors, including…" (ytc_Ugwel8CG8…)
- "Hand drawn art: Using fire to cook a meal / Cgi art: Using a stove to cook a meal …" (ytr_Ugx2IUdC9…)
- "The strategy is, encourage rage and fear by creating racist propaganda in an att…" (ytc_UgzjKHrsk…)
- "The phobia of AI makers stems from things like that they never consider how smar…" (ytc_Ugwtf2Wls…)
- "They really need to start taxing AI companies heavily, cause this can get out of…" (ytc_UgwMNEDLf…)
- "@MichaelAI-i6f That's interesting. What do you think would be the standard or t…" (ytr_UgzKRUUfn…)
- "@ 15:35 i HATE waze and self driving cars in a large city cannot compare to som…" (ytc_Ugxb0_zUJ…)
Comment
> I agree with the Doctor when he says we "don't need super intelligence." I personally am uncomfortable with AI developing its own code, and I'm not thrilled with the way things are going now concerning AI, because in the end I know that human nature dictates that someone out there is hell-bent on getting to super intelligence and I do believe that will be our end.

youtube · AI Governance · 2025-09-07T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyJXuuz0ztFVESToHB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyMkAR_6iiRde0dafB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz219IR8buATg-TOMN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxpZ-wUFlVBxH724RJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx0G-jUP5OPnctHWUp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgwFDNFF3HC5xxHnw9V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy6SAoCU5EnnJXPbbl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwNGGJUIb2ABPnoNSh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgznlOnMI4F6_Hb5QeR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy92OLbQTAbjypnI6p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
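The lookup-by-comment-ID view described above presumably parses the model's raw JSON response and indexes it by the `id` field. A minimal sketch of that step, assuming the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion); the `index_codings` helper and its malformed-entry handling are illustrative assumptions, not the tool's actual code:

```python
import json

# Two entries copied from the raw LLM response above, for illustration.
raw_response = """
[
  {"id": "ytc_UgznlOnMI4F6_Hb5QeR4AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyJXuuz0ztFVESToHB4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "ban", "emotion": "fear"}
]
"""

# The coding dimensions used in this project (from the Coding Result table).
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID."""
    indexed = {}
    for entry in json.loads(raw):
        # Skip malformed entries rather than failing the whole batch
        # (an assumption about how a tool like this might handle errors).
        if "id" not in entry or not all(d in entry for d in DIMENSIONS):
            continue
        indexed[entry["id"]] = {d: entry[d] for d in DIMENSIONS}
    return indexed

by_id = index_codings(raw_response)
print(by_id["ytc_UgznlOnMI4F6_Hb5QeR4AaABAg"]["policy"])  # regulate
```

With such an index, "Look up by comment ID" is a single dictionary access, and the Coding Result table is just a rendering of one indexed entry.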