Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Still dont get the point why its important to be polite to AI, its not like it c…" (ytc_UgzPXLc5_…)
- "Couldn't you also use Nightshade on existing AI images to get around the issue o…" (ytc_UgzPu9IFH…)
- "I know what industry will not die: Adult movies! People wont go off to a Robotic…" (ytc_UgxohSrXH…)
- "from this logic, it would also be valid to say that the human is not conscious j…" (ytr_Ugz-YGRN3…)
- "Then shareholders would benefit since AI CEO will make the same decision as CEO …" (rdc_jswmund)
- "1:27 and then the ChatGPT I am using can’t even pull/quote any song lyrics (said…" (ytc_UgyZL1Bk_…)
- "i prefer using github copilot free, also uses gpt-4 or claude 3.5 and can do mor…" (ytc_Ugy2dKF2i…)
- "Im by no means a writer, but one day i put a joke prompt into an ai for fun, and…" (ytc_Ugxiql0uK…)
Comment
I disagree with Elon Musk’s seatbelt argument. Okay, fine. You want to force automobile manufacturers to install seatbelts to satisfy some bubble wrap mentality, I get it. But when you take the next step and make it a punishable offense to not wear the belt, then I say enough. Who is harmed if someone chooses not to wear the belt? No one, except possibly the guy who chooses not to ‘click it’.
Laugh, but that particular federal law is the thin edge of the totalitarian wedge.
As to regulating A.I., someone once said, “Now I am become Death, the Destroyer of worlds.”
youtube · AI Governance · 2020-10-03T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
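The four coded dimensions can be checked programmatically. The sketch below is hypothetical: the allowed value sets are inferred from the sample responses shown on this page (e.g. `responsibility` values like `government`, `ai_itself`, `distributed`) and may not cover the full codebook.

```python
# Hypothetical validation schema for one coded comment.
# Allowed values are inferred from sample responses, not from the codebook itself.
ALLOWED = {
    "responsibility": {"government", "company", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "resignation", "approval", "mixed"},
}

def validate(coding: dict) -> list[str]:
    """Return the names of dimensions whose value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if coding.get(dim) not in allowed]
```

A coding that matches the table above validates cleanly; an out-of-vocabulary value is flagged by dimension name.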
Raw LLM Response
[
{"id":"ytc_UgwMZwDrqAWQX3WeKG14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxmWj0-Duk-E_v8mbN4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxAsrL7W2vogmdeBMh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugym8HtVa_l-1WNTuUh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzhlq6YBPZ7Yra7w_54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxut42Cf1sSUURfIuN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyV4aWdLq_4BFCJRH94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyg70GkSx8Md4fdF7l4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyK18H-5UAQqnWOen54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwn1p--fu5hYSrwjMF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
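A raw batch response like the one above can be parsed into an ID-keyed index, which is what the "look up by comment ID" view needs. A minimal sketch, assuming the model output is a valid JSON array of objects with an `id` field:

```python
import json

# A trimmed stand-in for the raw LLM response above (one entry shown).
raw_response = """[
  {"id": "ytc_Ugwn1p--fu5hYSrwjMF4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

# Index codings by comment ID for O(1) lookup of any single comment's coding.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugwn1p--fu5hYSrwjMF4AaABAg"]
print(coding["emotion"])  # -> outrage
```

In practice the parse step should be wrapped in error handling, since a model can return truncated or non-JSON output.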