Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_UgzM47ZFF…`: *AI*. is the "Babylon Tower". plus the "Waters of Noah's Flood". Jesus…
- `ytc_Ugw9dav_l…`: What happens if Russia or China or North Korea gets to Super AI first? Do we all…
- `ytc_Ugyn9R98f…`: First they will use a.i. to enforce one world government then A.I. will turn on …
- `ytr_Ugz26ZNzz…`: @buffmonsterenjoyer3959 More like this. Someone on your team has started usi…
- `ytc_UgxdL5GBM…`: Those who said no 😂 are either not using copilot or they will soon loose there j…
- `ytc_UgxYao5cU…`: People will find ways to help AI to make a living. AI will never be human but it…
- `ytc_UgwSKuEun…`: Elon cut the wires on others AI. With Peter Thiel. Planitir. Then Karp spoke out…
- `ytc_Ugz4Je_0x…`: Long live the coal fired boiler. And god bless the Union Boilermakers and other …
Comment
BLIND TRUST is exactly the worst problem with AI!!! I'm a comp sci professor and I tell freshmen all the time not to trust AI. Most students use AI anyway but I feel obligated to tell them "well if you use a tool that can be wrong sometimes, you need to be able to tell whether the answer you get is good enough. If you asked it to write an email for you, you should at least check for errors yourself before blindly sending the email to your boss. It's the same with code. If you don't know how to fix broken code, you shouldn't use AI for coding at all"
Platform: youtube · Topic: AI Jobs · Posted: 2026-03-18T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgwUfs2JVQXz6_eT_Dp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxnhQ1jqNHVzHXa_yp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxYKJpZhZA2NyDJs654AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwEdGljEssG3ptBBYx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"fear"},
{"id":"ytc_Ugzd06y2PTExOj9Apkt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw3ZONu-InUDq2zwzR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzE7Ylbb21u5lXRSJZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzjPQiqs_2VQE8RhkV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzzCaYUb0FIX0vefGx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxJBNx58SI-2Qemwe14AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"}]
```
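The dimension table shown for a coded comment is just the JSON row whose `id` matches that comment (here, the fourth row carries `responsibility: user`, `reasoning: virtue`, `policy: industry_self`, `emotion: fear`). As a minimal sketch, such a raw response can be parsed, validated against the codebook, and indexed by comment ID. The `ALLOWED` sets below are only the values visible in this dump; the actual codebook may include more categories.

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# dump (assumption: the full codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"none", "liability", "industry_self", "unclear"},
    "emotion": {"approval", "indifference", "outrage", "fear", "resignation"},
}

def validate_codes(raw: str) -> dict:
    """Parse a raw LLM response and index valid code rows by comment ID.

    Rows missing an id, or carrying a value outside ALLOWED for any
    dimension, are silently skipped so a single bad row does not sink
    the whole batch.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        bad = [dim for dim in ALLOWED if row.get(dim) not in ALLOWED[dim]]
        if cid and not bad:
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with one well-formed row (hypothetical ID "ytc_X"):
sample = ('[{"id":"ytc_X","responsibility":"user","reasoning":"virtue",'
          '"policy":"industry_self","emotion":"fear"}]')
codes = validate_codes(sample)
assert codes["ytc_X"]["emotion"] == "fear"
```

Indexing by ID is what makes the "inspect the exact model output for any coded comment" lookup cheap: once validated, rendering the dimension table for a comment is a single dictionary access.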