Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- "I'm sorry, but does Ai have emotions? Does AI have to feed it's 3 month old baby…" (ytc_UgyDvYV5J…)
- "I was a software tester for a Fortune 500 company for 15 yrs. In 2017 my dept wa…" (ytc_UgxBd-zgR…)
- "Couldn't you also use Nightshade on existing AI images to get around the issue o…" (ytc_UgzPu9IFH…)
- "Hard to tell these days. With CGI and so forth there are so many movies out wher…" (ytr_UgxNkNmYO…)
- "Better yet, their corporate overlords record keystrokes and mannerisms of every …" (rdc_jtb9dlb)
- "@seriouscat2231true but Not true at the Same time im Talking about Code here no…" (ytr_UgxJkieEa…)
- "Ai art stans are the first ones to complain when people repost their trash. The …" (ytc_UgzcCqdeB…)
- "This has the stink of propaganda, of an industry-paid-for-study all over it. Whe…" (rdc_e445idw)
Comment
Would like to see a program that considers the possibility of enhancing humans with AI technology. A machine might desire something we have. Empathy, creativity, etc just like we desire great intelligence. In Matrix. The human learns a skill in one minute through the use of advanced technology. In Star Trex VGer wants to unite with the human to achieve godlike existence. Can we move toward this future? Right now the AI experts are keeping humans and machines separate. Some scientists are studying the mitochondria to see how life can be extended to the thousands of years. Some say we only use 10% of our brain (Lucy). Could AI help us increase our IQ into superhuman intelligence?Question. How can we use AI to always stay superior to the machine?
youtube · AI Governance · 2026-02-08T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzNJ-dpyioLVKXeYpF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugypf3CJ0MX8TlkNxwF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxFbizYgNEl8vSS3ot4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgztkyFGYSeD2tZE1BV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwdcBSdbw3rl42kiot4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwFATDixk2gyMDBbt94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwmWIwb0cMZz2mUyw14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgypbgX1T37Ccn1gMc14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugzne043iRrQfLTC8HJ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyfyZ2-z5WzhDd5bfB4AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
```
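A response like the one above can be parsed and validated before it is stored. The sketch below is a minimal example, assuming the label sets visible in this page (the full codebook may define more values than appear here); the function name `parse_coded_batch` is hypothetical.

```python
import json

# Allowed labels per dimension, inferred from the values visible in this
# dashboard. Assumption: the real codebook may include additional labels.
ALLOWED = {
    "responsibility": {"none", "company", "government", "ai_itself", "distributed"},
    "reasoning": {"mixed", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"approval", "fear", "outrage", "indifference"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response and index the coded rows by comment ID.

    Raises ValueError when a row is missing a dimension or uses a label
    outside the allowed set, so malformed model output fails loudly
    instead of silently entering the dataset.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {value!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_UgzNJ-dpyioLVKXeYpF4AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"approval"}]')
coded = parse_coded_batch(raw)
print(coded["ytc_UgzNJ-dpyioLVKXeYpF4AaABAg"]["emotion"])  # approval
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one dictionary access per inspected comment, and a batch that drifts from the codebook is rejected at ingest time.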