Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below:
- `rdc_ogtoxht`: "> High UBI pilot results after 3 months of data via open-banking, $5.1k distr…"
- `ytc_UgyIt4HuK…`: "Great to quote Missy, someone who has a financial interest in a lidar company, a…"
- `ytc_UgzgXJ633…`: "1:26:48 - I agree with everything there, except \"..the chat bot just HAD a subje…"
- `ytc_UgxA-hhEL…`: "3:20 this is nonsense with regards to LLMs. If you prompt an LLM to get it to r…"
- `ytc_Ugy_ARfOS…`: "6:10 Idiots like Sacks should open a history book. If we ever create a genuine A…"
- `rdc_nmb49dl`: "Ai gives ridicules advices on fixing things and mostly quoting online articles. …"
- `ytr_UgzwGkjOT…`: "She is smarter than you definitely and she does know what he is talking about. Y…"
- `ytc_UgyOPNdOB…`: "Paying artists or learning it is either too expensive or hard but ai does it for…"
Comment
AI is about power and control. The investors are tools that smart people are using to achieve power and control. Smart people are tools investors are using to make more money. Safety is pushed aside because it slows things down. The sooner we achieve super intelligent AGI the sooner all problems are solved. None of the smart people are in AI research because they care more about money than power. Who wouldn't want the power to cure cancer, to stop people from dying, or to solve global issues?
youtube · AI Governance · 2025-09-09T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugy5E3nQP9sxWd2iFl14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxOC_K0549C7ZEu34p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwyV5Q6H_R2vgXKUZt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgylN1pUp2rLoDUluv94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyrO1WS8HFVMhZ_6-B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwLxtLnYxv3MBMhTkt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzePez1bzKi5XWpD7h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxZF_slill-9NnSkiF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzCcvcMjCpxZ_v7WNl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwezpzYqHwBwo5Hpgl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"})