Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples:

- ytc_Ugz26V4bQ…: "AI is currently the most dangerous technology in the world. We are accepting it …"
- ytc_Ugwr3dFfF…: "I'm not afraid of the AI we can create. I'm afraid of the AI our AI might cr…"
- ytc_Ugzc-9KlZ…: "Fuck questioning the consciousness of AI till we figure out how the fuck WE are …"
- ytr_UgyaWJAjV…: "Putting aside the copyright issue of stealing millions of lines of code, this ju…"
- ytr_Ugw8wbEfR…: "@MaakaSakuranbo please, walk through a museum and pick out the finest artwork…"
- ytc_Ugxb2QB3x…: "Government always finds a way to make it a problem for us recording them, but th…"
- ytc_UgxkT5fMY…: "Don't be fooled, you actually have to study to learn. This type education lies t…"
- ytc_UgzBTl4y3…: "What stage can AI make a cup of coffee without a human providing it supplies..?…"
Comment

> The Microsoft AI Copilot got the missile math problem correct. But I don't know if it did the math or copied from the article it found in Scientific American by Martin Gardner. In asking it if it did the math, Microsoft's AI Copilot claims that it did.

Source: youtube | AI Governance | 2024-01-16T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyR-gMBvt1z0HjKUuB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"urgency"},
  {"id":"ytc_UgztHZWSnR3PcVmz4Cl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx9f01lBDzYg2AWZXR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyTQSQkoMqGKOVSWuZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyI-myVXIBnnhm3Wcl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxkAlOVgHP-66Bf2o94AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzZngtzSfeFtjtoLq54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgymkLH6szzXt7LahQB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzzh7aqDjUpTo_EQWB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzYFRhoJy9-uojNroB4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
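A raw response like the one above can be parsed and sanity-checked before its rows are stored as coding results. The sketch below is a minimal example, assuming the dimension values are limited to those that appear in this document; the actual coding scheme may define additional categories, and any row with an unknown value is skipped rather than failing the whole batch.

```python
import json

# Dimension values observed in the raw response above.
# Assumption: the real coding scheme may include more categories.
ALLOWED = {
    "responsibility": {"government", "company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"urgency", "fear", "outrage", "indifference", "approval", "resignation"},
}

def parse_and_validate(raw: str) -> dict:
    """Parse a raw LLM response and index valid rows by comment ID.

    Rows missing an "id" or carrying a value outside ALLOWED are
    dropped, so one malformed row does not discard the batch.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_UgzYFRhoJy9-uojNroB4AaABAg","responsibility":"company",'
       '"reasoning":"unclear","policy":"none","emotion":"indifference"}]')
coded = parse_and_validate(raw)
print(coded["ytc_UgzYFRhoJy9-uojNroB4AaABAg"]["emotion"])  # indifference
```

Indexing by comment ID is what makes the "look up by comment ID" view cheap: each coded record can be fetched directly without rescanning the raw responses.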