Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Ai doesn't exist.
"I'm not against tech...."
All these people are their own wor…
ytc_UgzatnAte…
AI will 100% be used for mass population control and manipulation.
Never trust i…
ytc_UgyZTRhN6…
@AngryClippy1No, desire is a human emotion, it doesn't have any human emotions …
ytr_Ugx8yAeap…
Using a pencil is just like using ai art because its a tool to easily make art, …
ytc_Ugy8Qjpyt…
One of the biggest problems throughout human history has been coordination. Our …
ytr_Ugxi5ce8E…
MushuaThePotato What is considered a lot of effort can very greatly from person …
ytr_UgwJDlRO1…
😂😂😂 hopefully it goes after it’s creator😜
If they had sense AI should have been …
ytc_UgyCcKFg-…
buddy you're gonna have to spend a LOT of effort to make a conscious being out o…
ytr_Ugy5jw4NX…
Comment
I've been using every AI I can find to help me with some high school math upgrades. They can do simple stuff ok, but when I ask them to do complicated operations it falls down. For example they can factor a polynomial by grouping when the numbers work for that method, but when the numbers don't work they won't try a different method. They will still attempt to factor by grouping and just throw in some made up numbers. Chat GPT, Copilot, and Perplexity all make the same mistakes the same way
youtube
AI Governance
2024-02-27T20:4…
♥ 42
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyxjuwntgkJVz6HBnx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyob0qN0MExtDlFLQV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgyeQrGmeuxp-s3WarR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx8cs1q3IrwJ7iofRV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzoaP5Sqmhrd7j72lx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyFu_KTtrGsSsMevuh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw0cBH1YLZtiY0IJkN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxdfvEKjTDpe1aaac14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwsf2qYAoJ7YUCGEO94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxyEuMJICgE46Uwz1l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
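A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the four dimensions shown and a category vocabulary inferred only from the values visible in this response (the real codebook may define more categories); records with an unknown value or a missing `id` are dropped rather than stored.

```python
import json

# Allowed values per dimension, inferred from the response above.
# Assumption: the actual codebook may permit additional categories.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only valid records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # every record must carry a comment ID
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Usage: the second record uses an out-of-vocabulary value and is discarded.
raw = '''[
  {"id":"ytc_UgyxjuwntgkJVz6HBnx4AaABAg","responsibility":"distributed",
   "reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_example","responsibility":"robots","reasoning":"unclear",
   "policy":"none","emotion":"fear"}
]'''
coded = parse_coding_response(raw)
print(len(coded))  # → 1
```

Validating against a closed vocabulary like this catches the most common failure mode of JSON-mode coding runs: the model inventing a plausible-sounding but undefined category label.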