Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Also face recognition are not as accurate as we think for brown and black people…
ytc_UgwKplNmy…
AI art is boring because they don't use it as what it is, a tool, it's like buyi…
ytc_UgwncymAb…
Lame CC and commenters. It's just a picture for a job recruitment post. If they …
ytc_Ugygc3RQI…
This video was made 2 years ago trump just sign an executive order for ai to the…
ytc_UgwtK4YZi…
Do you suggest AI to be blind and not see the world because i dont see why its d…
ytc_Ugz9OEzUY…
Honestly it hurts a lot hearing you guys talk about history. First of all you ta…
ytc_UgypFsDIH…
I think that is a wonderful statement about all the AI is taking over Programmer…
ytc_Ugx1igR_e…
Budget is for 10 years.
Ban laws restricting AI for ten years.
WHAT THE HELL …
ytc_UgyXXEj90…
Comment
The problem with we humans is we like to THINK that we are always in control, this won't be the case with A.I. A.I will be the end of humans, a "Terminator effect" If we don't put a pause to this for now. First step PAUSE A.I, The Second precautionary step that should be done immediately, right now, and with no delay would be to disconnect nuclear ICBM's computer control launch systems from any possible internet connection, but we know they won't do it because they are blind to the threat, it will be too late before they realize it. This is not a conspiracy theory, I believe they have already begun to loose control over A.I, so many people are naive to the danger of A.I.
youtube · AI Governance · 2023-03-30T05:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwnGb7ARF5fU5iZX5F4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxy4-SMjENRpv1ZBRh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyffh6HQOXSFG0kqiN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwINfpgQa9_dARlLAV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgyTB3WW0fTYKA-HiDV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzGmamRiBEZOxEgKjF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwprg8qtmJ--8LVHEx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgySXCvbDM5_ckFAq1p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzgJ0gtnD9pDdz8F1h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxGnePZa64AFsqJFaN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
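The raw response is a JSON array with one coding record per comment, so the look-up-by-comment-ID step can be sketched as a simple parse-and-index pass. This is a minimal Python sketch, not the tool's actual implementation: the `index_by_comment_id` helper is hypothetical, and the two sample records are copied from the response above with its field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`).

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# These two records are taken from the response shown above.
raw_response = """[
  {"id": "ytc_UgwnGb7ARF5fU5iZX5F4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxy4-SMjENRpv1ZBRh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw LLM response and index its coding records by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_UgwnGb7ARF5fU5iZX5F4AaABAg"]
print(coding["policy"], coding["emotion"])  # → regulate fear
```

Indexing once and looking up by ID afterwards keeps each inspection O(1), which matters when the same response is queried for many comment IDs.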