Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment
We already have a machine that kills humans, it's called a gun. Like a gun AI needs someone to point it and pull the trigger. Why? As the video points out a human must provide the motive. Machines simply don't care. What we need now is laws, like Canadian and European gun laws, to control people who would use AI as a tool to harm others. By the way, humans think better than AI and without the power drain. Time to start studying psychology.
youtube · AI Governance · 2025-08-17T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxdW7EAaybO9gnPjgV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyLeBOml9zNFnBw67B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy5aVKtGyMbPmKUMsh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx5MavApaO_cO_TUZx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxjxC6QiPqvW0fWOax4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzCp6_qY3JuA50RUDV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzyP88qrr--n7-G8op4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzUHx7tC-JKlTMd0F14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwvGR5UI4-QKIg_azd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugxknhuh6-mCeR1lQJB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```
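The raw response is a JSON array of per-comment codings, one object per comment ID with the four dimensions shown in the table above. A minimal sketch of how such a response could be parsed and indexed for ID lookup follows; the helper name `index_codings` and the validation logic are illustrative, not part of the actual pipeline, and the two records are copied from the sample above.

```python
import json

# Sample raw LLM response: a JSON array of per-comment codings
# (two records taken from the response shown above).
raw = """
[
 {"id":"ytc_UgzyP88qrr--n7-G8op4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgzCp6_qY3JuA50RUDV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
"""

# The four coding dimensions used in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_json: str) -> dict:
    """Parse a raw response and index codings by comment ID.

    Raises ValueError when a record lacks an ID or one of the
    expected dimensions, so malformed model output fails loudly.
    """
    by_id = {}
    for record in json.loads(raw_json):
        cid = record.get("id")
        if not cid:
            raise ValueError(f"record missing id: {record!r}")
        missing = [d for d in DIMENSIONS if d not in record]
        if missing:
            raise ValueError(f"{cid} missing dimensions: {missing}")
        by_id[cid] = {d: record[d] for d in DIMENSIONS}
    return by_id

codings = index_codings(raw)
print(codings["ytc_UgzyP88qrr--n7-G8op4AaABAg"]["policy"])  # -> regulate
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one parse, then constant-time lookups per comment.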