Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgzXIcJfX…: AI stocks are set to dominate in 2024. I prefer NVIDIA because it is well-positi…
- ytr_UgyPJ7ryY…: True. But if it is not introduced to the concept of a turing test, would we have…
- ytc_UgxlIRm-5…: So if AI can do all tech jobs and create all programs, apps, engineering ect on …
- ytr_UgyW9i4vt…: I don't know the exact context of what you're talking about, but if his ai was m…
- ytr_UgzSTGvcR…: LMAO are you sure, the AI already interacting with eachother.. they are smarter …
- ytr_UgzMZscDf…: When it comes to Angel Engine (Or any AI art others find value in) @theunearthly…
- rdc_m6xnya1: If they could have a sweet piece of software for next to free, why pay teams of …
- rdc_h54f26d: Should we be surprised that this is a black man? BTW this is not a case against…
Comment
Robots with emotions is laughable. What would be the point of giving An A.I robot emotions? So it can get frustrated or angry? because there would be no reason to be frustrated or angry unless A.I gets things wrong or has a perceived fault or error it can reflect upon. .
This guy on one hand says he fears the future of A.I and on the other hand says A. I should have emotions? Give A. I emotions and you might as well put a gun to our own heads.. The reason for most human related issues is the inability to regulate emotions... Anger turns to rage and rage starts war
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Timestamp | 2025-06-16T10:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
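The four coded dimensions above can be sanity-checked against the value sets that actually appear in this tool's raw responses. The sketch below is illustrative only: the full codebook is not shown in this section, so the allowed values are just those observed in the output, and the `validate` helper is a hypothetical name, not part of the tool.

```python
# Hypothetical validator for one coded record. The allowed values are inferred
# from the raw LLM responses in this section; the real codebook may define more.
ALLOWED = {
    "responsibility": {"none", "unclear", "government", "company", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist", "mixed"},
    "policy": {"unclear", "regulate", "none"},
    "emotion": {"indifference", "mixed", "fear", "outrage", "approval"},
}

def validate(record):
    """Return (dimension, value) pairs that fall outside the observed codebook."""
    return [(dim, record[dim]) for dim in ALLOWED if record[dim] not in ALLOWED[dim]]

# The coding result from the table above.
coded = {"responsibility": "none", "reasoning": "unclear",
         "policy": "unclear", "emotion": "indifference"}
print(validate(coded))  # [] -> consistent with the observed value sets
```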
Raw LLM Response
```json
[
{"id":"ytc_UgyauZG3xyDwjfDSBMZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyGogRmjcx4_qiN2oJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwsfPVYKZKxr8W0rLJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzuXiaQ2E3TB8RXu_p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyi2xVAEcbRViuXnHZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwoz_I3xRHnUUfLfPV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx_vsNDtzs8VD4aXCR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyX98M819rnwXwBQ-d4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx2w-TR26zWDnqU4lR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgweI1aaseiT02Eg4gR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
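The "look up by comment ID" flow above can be sketched as: parse the raw LLM response (a JSON array of coded records) and index it by the `id` field. This is a minimal assumption-laden sketch, not the tool's actual implementation; `index_by_id` is a hypothetical helper, and the two records are copied from the raw response above.

```python
import json

# Raw LLM response excerpt (two records copied from the output above).
raw_response = """
[
 {"id":"ytc_UgyauZG3xyDwjfDSBMZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgwsfPVYKZKxr8W0rLJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
"""

def index_by_id(response_text):
    """Parse a JSON array of coded records and map comment ID -> record."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgwsfPVYKZKxr8W0rLJ4AaABAg"]["emotion"])  # fear
```

In practice a malformed response (e.g. the model emitting prose around the JSON) would raise `json.JSONDecodeError`, so a real lookup path would want error handling around `json.loads`.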