# Raw LLM Responses

Inspect the exact model output for any coded comment.
## Random samples
- `ytr_Ugy8fQDWM…`: @trybunt There are a few bright lights at the end of the tunnel ... maybe. Like …
- `ytr_UgysmdyIy…`: That may happen, but it's not exactly "realizing" that its answers were wrong. R…
- `ytc_Ugwqkf2Nh…`: Twenty years the man robot says I told MFer, i was gonna take over; mean while …
- `ytr_Ugzq5z66M…`: @misticair i did. clearly not ai because if you did read it you can see for part…
- `ytr_Ugwh0S6Bc…`: AI learned from humans... you aren't a trillionaire... so why exactly would it d…
- `ytc_Ugw0vGZQ1…`: Stanford lab just gives you the comparison and the decision is on the management…
- `ytc_UgxJHlVfw…`: I’m ok with AI taking over, I live on earth around humans and honestly I’m all f…
- `ytc_UgwSHfDe9…`: Much More than Quite Fooled & Still Are.. Now AI, Definitely on Steroids+ a Bonu…
## Comment

> if human mind is so slow and limited by speach why we dont use ai to create software what makes our mouth to move faster when we talk :D (thats a joke). but on serious side i think people will use the chips to integrate into human brains, link them all together in 1 network. 1st it find all those missing neurological pathes what this scientist is talking about. 2nd extra super fast information sharing and analizing. 3rd it will make humans still think they are more powerfull than computer as they turn into one themselfs. Then human will have them super ai sitting in the heads doing all the magick. just need exo skelet to makes muscles stronger and thats it. super human revelation.

youtube · AI Governance · 2025-08-19T13:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response

```json
[
  {"id":"ytc_UgwhOMGHO1Oiug6O5nJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx9hw0L7tbnKaTrwo14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxlmjKGOXB-A63y05p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgyEXgoCwhKC31o59gZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugy9UGOmRzJ1Hv4MXV54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzY0NdWtuwJiserE614AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzK39l4NZUBovUz7bR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx4Tt6GTJ6D4cU2uKh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzdt1v8Th8lh_nab5t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxXy0ghLN7krfEqFDV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
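The raw response is a JSON array of per-comment codes keyed by comment ID. A minimal sketch of how such a response might be parsed and validated, assuming the field names shown in the JSON above; the allowed-value sets and the `parse_coding_response` helper are illustrative, inferred from the values visible here rather than from a documented schema:

```python
import json

# Allowed values per coding dimension. These sets are assumptions inferred
# from the values visible in this response, not an exhaustive schema.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "unclear"},
    "emotion": {"resignation", "approval", "fear", "indifference", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    coercing any unexpected value to "unclear" instead of failing."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                codes[dim] = "unclear"  # tolerate label drift in model output
        coded[rec["id"]] = codes
    return coded

# Hypothetical one-record response for illustration
raw = '[{"id":"ytc_abc","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]'
print(parse_coding_response(raw))
```

Coercing out-of-set labels to `"unclear"` keeps one malformed record from aborting a whole batch; a stricter pipeline could raise instead.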