Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
China is doing too much in AI technology but people still have job. This AI batt…
ytc_Ugw_lTjyX…
@illuminate8979 I did something similar, except it was a reference to make my ow…
ytr_Ugx3Tm9bL…
This is so sad and heartbreaking and I never knew anything about this AI my hear…
ytc_UgwjBHBti…
These are the dumbest of the "AI bros", these clowns do not represent us in the …
ytc_UgyI3s6p6…
AI just predicts things based on data of what's already real.
So... Yeah, this …
ytc_UgyRu_HQG…
I asked Gemini to write a joke in the style of Hasan Minhaj, and I gotta say, I …
ytc_UgwcrcdaC…
This video is a joke. Really, you say smart, creative people will be ok. As a sm…
ytc_Ugy1wX4la…
hahha you use bing now and try and talk and ask questions the chat bot will end…
ytc_UgxqRQXKt…
Comment
The problem with inventors is that they are quite bright and quite stupid at the same time. I think when you are really smart and work with other smart people all the time you forget that people, as a whole, are stupid and easy to scare. You don't think that the tool you are inventing will be turned into a weapon, by power hungry and irresponsible leaders. These inventors create things with the best intentions, not thinking it could be used for evil. Geoffrey Hinton seems like an absolutely lovely man, but it's an odd coincidence that he had a family member who worked on the Manhattan project and he himself worked on AI. I assume they both had the best intentions....
youtube
AI Governance
2025-08-24T11:5…
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugwm56BkQeIoRaXKoRl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxOo0Jn0avp2Iezocd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxvxcJS4dwln23_cRJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz5tqHfq3qlV5t44fl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzm45vrjZDDPOiNPIN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyYdTyn2Xa45K8Cg0t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyWO973ewhu86E5oRx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwhVPxtSeXnILhE2v94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwCK9geNm0sFsHuZat4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz1oFqQpclGDvKQIgd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
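The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions shown in the result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of parsing and validating such a response is below; the allowed-value sets are assumptions inferred only from the records visible here, and the real codebook may include other values.

```python
import json

# Assumed vocabulary for each coding dimension, inferred from the
# responses shown above; the actual codebook may define more values.
VOCAB = {
    "responsibility": {"none", "developer", "government", "user", "ai_itself", "unclear"},
    "reasoning": {"unclear", "mixed", "virtue", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"approval", "mixed", "fear", "resignation", "outrage", "indifference"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only records whose
    dimension values fall inside the assumed vocabulary."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if all(rec.get(dim) in allowed for dim, allowed in VOCAB.items()):
            valid.append(rec)
    return valid
```

For example, a record coded `{"responsibility": "developer", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"}` passes validation, while a record with an out-of-vocabulary value (say `"responsibility": "media"`) would be dropped for manual review.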