Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Easy answer to stop AI: get enough ppl to refuse to use it, assign a stigma to it, akin to the Scarlet Letter and users of it, and the tech companies will most likely shrink away much like what happened to cigarette companies and from the stigma of smoking, presently relative to what it was like 50yrs ago.
Of course the probability of that actually occurring is near zero - let’s be serious here. But at least it’s still possible (not probable) regardless of how infinitesimally small said chance is lol
Platform: youtube · Topic: AI Governance · Posted: 2025-12-07T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyDXEgSpJstWwVlQb54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgybH8K8zW88UlnzmwZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzyKnoF91hKRI084Ut4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxh7QKDzdLUaFEDQ3J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzKZaN4Q0_tTRu12UN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz6qkPpR9DzOM6VmS14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx0WrBpVjMARChTgIV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzCchvF4PiN3gX2XYl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwCU8dhkEq_BHWXMzZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy9lAFmMVu74piZrWN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
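The raw response above is a JSON array of per-comment records, one per coded comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a response could be parsed and validated is below; the allowed values are inferred only from the examples on this page (the real codebook may define more categories), and the function name is hypothetical:

```python
import json

# Allowed values per dimension, inferred from the sample records above
# (assumption: the actual codebook may include additional categories).
SCHEMA = {
    "responsibility": {"government", "company", "developer", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only records whose
    dimension values are all in the schema (drops malformed rows
    rather than failing the whole batch)."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

# Example: the record matching the coded comment shown above.
raw = ('[{"id":"ytc_UgzyKnoF91hKRI084Ut4AaABAg",'
       '"responsibility":"user","reasoning":"consequentialist",'
       '"policy":"none","emotion":"resignation"}]')
print(len(parse_coding_response(raw)))
```

Dropping invalid rows (instead of raising) matches how a batch coder usually degrades: one bad record should not discard the other nine codings in the array.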