Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
lol what a misleading video , the facial recognition devices (depending on the s…
ytc_Ugx5aSx2A…
Another example of someone who misunderstands both AI and has missed the classic…
ytc_Ugx45ziBI…
Now claude can detect threats & do a lof of cybersecurity stuff. It will progres…
ytr_UgzjLaxFX…
God, I hate ChatGPT. Not AI in general, just ChatGPT. It’s way of talking just s…
ytc_Ugw9ml2Dz…
Please dear Claus Schwab & Elon Musk, can you please not connect humanity to AI?…
ytc_UgwnBL-iS…
What if one of these driverless trucks have an accident because of a glitch in …
ytc_UgyF-n20i…
You’re not thinking it through. In 20 years Advanced robotics, combined with qua…
ytr_UgyYmAQkw…
@lewislovelord8977 I agree AI is not some magic. I merely suggest using it as t…
ytr_UgxV4xRAn…
Comment
We don't even know if AI is hallucinating deliberately or giving wrong answers deliberately, all just so we can keep our guard down and keep feeding it enormous amounts of data, people have accepted hallucinations as feature of AI when it could be deliberately deceiving us. When sht hits the fan, the AI experts and specialists, machine learning engineers and data engineers will be just as dumbfounded as the rest of us. The fate of humanity rests in the hands of the few who think they can control something billion times more intelligent than them somehow.
youtube · AI Governance · 2023-08-21T17:1… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzLyPQvZ8eINQc2Aax4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgykZAvfmbUs1gLeqsd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwRF5shM6ukw44PZN14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyUSapW0VZtmzxlzb54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgylNLwQjR7II5ywQSl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyTHIppfCpycRvPuJB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw--y3blVPaFtd82jh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwoq8MTBvUnf_1MRLd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzW7ghBeoI6G43i4Id4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx19IMfIOtJmv5jBDt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
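The "look up by comment ID" step above can be sketched in a few lines: the raw LLM response is a JSON array of coded records, one object per comment, so indexing it by the `id` field gives constant-time lookup. This is a minimal illustration using two records copied from the response above; the variable names (`raw_response`, `codes_by_id`) are hypothetical, not part of the tool.

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per
# comment. Two records are copied from the sample response above.
raw_response = """
[
  {"id": "ytc_UgzLyPQvZ8eINQc2Aax4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgykZAvfmbUs1gLeqsd4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
"""

# Index the records by comment ID for O(1) lookup.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

record = codes_by_id["ytc_UgzLyPQvZ8eINQc2Aax4AaABAg"]
print(record["responsibility"], record["emotion"])  # developer fear
```

The same dictionary pattern scales to a full batch of responses: parse each batch, update one shared index, and any coded comment can then be retrieved from its `ytc_…`/`ytr_…` ID.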