Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Karen Hao is wrong about tech companies not making their models more productive …" (`ytc_UgyGabqu1…`)
- "Give me a fucking break😂😂😂” Ima No ExPaRT oN QuAnTuMs CoMpUtiNg!!”. This billion…" (`ytc_UgxParNt2…`)
- "This guy doe have no idea what this robot. Beware of technology that in the Bibl…" (`ytc_UgzERx4Yk…`)
- "It's very difficult to win a legal case against a law firm, and if you do, you'l…" (`rdc_imfxl7x`)
- "gpt-4 10 billion dollars? iirc the training cost was more than 100 million but n…" (`ytc_Ugw1P-oDU…`)
- ""AI CEO will never layoff 20% of the workforce right before the holidays in orde…" (`ytc_Ugx2q9EXm…`)
- "If we create a Super Powerful AI that vastly eclipses our capabilities, then may…" (`ytc_UgzGqNeYT…`)
- "She says we should take steps and measures to ensure that humans are using AI an…" (`ytc_Ugxx2uDpz…`)
Comment

> Because it would be impossible to do so. Addressing the root cause is better and a more realistic option. Just like it is more logical to ban access to weapons than prosecute mass shooters as and when they commit crimes. The public deserves safe AI tools.

- Source: youtube
- Video: Viral AI Reaction
- Posted: 2026-02-12T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgzYXxtT-bf6H63zX-F4AaABAg.ARuWDIemsQDARuWu6Up5iK","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgzYXxtT-bf6H63zX-F4AaABAg.ARuWDIemsQDARugKzcxjE1","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytr_UgyxmsSM3q3rBwe16px4AaABAg.ARtkBY4zp17AT8L2MhglWZ","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxFQr-apKPEEePrE694AaABAg.ARt_h75JB28ARuT5FXk2AQ","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgxU0GWMmcwdG06OLTF4AaABAg.ARtUyludPG_ARtW2kfzpH4","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgxU0GWMmcwdG06OLTF4AaABAg.ARtUyludPG_ARtaOln97oa","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgwxUwZ1H_6IotTi1wF4AaABAg.ARtO4SxqhAyARtbXpYnMba","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgwQtC7kfS1l5leINyh4AaABAg.ARtO236ZiNgAS-FpKj-8to","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgysaEgSXozbpYTsh9p4AaABAg.ARtNuoihABgARtWwtS6IiI","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_UgzatdHfKQb_kh211Gp4AaABAg.ARtM2ZnJvCMARtOlqDw2tA","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
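A raw response like the one above needs validation before its codes are trusted: the model may return malformed records or values outside the codebook. The sketch below shows one way to parse and filter such a response in Python. The `CODEBOOK` values are inferred only from the samples shown here and are an assumption; the real codebook may define more values per dimension.

```python
import json

# Allowed codes per dimension, inferred from the samples above
# (an assumption -- the actual codebook may include additional values).
CODEBOOK = {
    "responsibility": {"government", "company", "user", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self"},
    "emotion": {"outrage", "fear", "resignation", "indifference"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose codes
    all fall inside the codebook; malformed records are dropped."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip records missing the comment ID
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

# Hypothetical example: the second record uses an out-of-codebook value
# ("nobody") and is dropped.
raw = (
    '[{"id":"ytr_x","responsibility":"distributed","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"fear"},'
    '{"id":"ytr_y","responsibility":"nobody","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"fear"}]'
)
print([r["id"] for r in validate_coding(raw)])
```

Dropping (rather than repairing) invalid records keeps the downstream counts honest: any record whose ID survives validation carried exactly the four expected dimensions with in-codebook values.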