Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated comment previews with their IDs):

- "I read an article about a lady who was ill with strange symptoms and she went fr…" (ytc_UgybOB9RR…)
- "Every heard of a CIWS? there is only a on or off button the AI in the GUN decide…" (ytc_UgyPa6MKD…)
- "The bald fool and the aussi continue to blabber on with what seems to be complet…" (ytc_UgxM69zYO…)
- "It’s interesting to see how expressions can convey emotions, even in a robot lik…" (ytr_UgxA8cpKB…)
- "Having knowledge doesn't automatically make a person smart. Newton, Tesla, and E…" (ytc_UgzpdFFUk…)
- "13:17 The issue with teaching AI taboos like this is that they were always obser…" (ytc_UgzPKSFDM…)
- "Here's what I have learned from AI, for instance, Gemini, which i was paying for…" (ytc_UgztRC18F…)
- "That is not how it works. You made it, it is yours, and nobody has any right to …" (ytr_Ugxupl7TJ…)
Comment
There is something that he's saying that just rubs me the wrong way. Can AI eventually get to the point where it is mimicking emotion? Absolutely. Can it get to the point where it appears to be thinking? Absolutely. However, something that is intrinsic to human beings is that reasoning ability mixed with the decision-making based on feelings and emotions. Try as it might, AI will always at its core be an empty entity that, although can be extremely adept at appearing to be thinking and emotional, can never do such.... It is simply processing information whereas humans not only process information but add their personal feelings and judgments which cannot be quantified or taught. This is the key difference.
youtube · AI Governance · 2025-07-02T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugw9M1uYr5Vfe7yaIiV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzqG2IisacfWMVZkF14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwXc4i6XoCyE_QUIGR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyaUucS5_IRHZtix5Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxKxB7QADoJygHuSp14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxtorJLYsgZrFbEEjV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzYki8llSlV0vuh0gx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxdu1TKkvnn4d7TzER4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxiZZz7TaFmVGJy79F4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwY_Zg97bucRB9A7Fh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]
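A response in this shape can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal validator, assuming allowed value sets inferred only from the values visible in this response and in the coding-result table above (they are not an official codebook), and assuming comment IDs begin with `ytc_` or `ytr_` as seen in the samples:

```python
import json

# Assumed value sets, inferred from the codes observed on this page —
# not the project's authoritative codebook.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "unclear"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array) and keep only records
    whose ID looks like a comment ID and whose four dimensions all
    take values from the assumed allowed sets."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        ok = rec.get("id", "").startswith(("ytc_", "ytr_")) and all(
            rec.get(dim) in vals for dim, vals in ALLOWED.items()
        )
        if ok:
            valid.append(rec)
    return valid

sample = (
    '[{"id":"ytc_Ugw9M1uYr5Vfe7yaIiV4AaABAg","responsibility":"unclear",'
    '"reasoning":"unclear","policy":"unclear","emotion":"mixed"}]'
)
print(len(validate_records(sample)))  # 1 record passes the check
```

Rejecting malformed records at this stage keeps downstream tallies (like the dimension table above) from silently counting out-of-schema values.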