Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "This is a spiritual battle wrapped in technology. Choose Jesus and you're free a…" (ytc_UgxbwloAB…)
- "Junk real junk , musk going to have a million robot army china building a robot…" (ytc_UgzmP67By…)
- "I'm sure there's a scene in a science fiction movie just like this a robot using…" (ytc_Ugxjnvku_…)
- "chatgpt encouraged me to lie to return a damaged product lol (i didn’t, it was …" (ytc_Ugzlssdni…)
- "First of all, WE AIN’T DA WINDSOR FAMILY MY GUYS. Second, everyone can become go…" (ytc_Ugx5LkC_4…)
- "Is it possible all the talk of Trump being like Hitler that AI decided Hitler mu…" (ytc_UgzdSPUS4…)
- "This is the usual elite bs. Notice quinn michael's exit videos is telling. Tyler…" (ytc_Ugz9kljQ3…)
- "Surely - as with everything else - AI will do a better job at being conscious. M…" (ytc_UgxyP6yoS…)
Comment
There is one interesting point to expand upon.
If you train ai to mimick human intelligence the final result may not be much smarter than humans just more knowledgeable. Like a pixie fairy that follows you around and guides you for the best possible path.
I think one area that ai is severely limited in is having experience. Even if you gave ai sensory input it would be just that. Input.
But then again maybe that is us trying to limit ai with humanistic impediments like touch. Where ai could potentially touch anything hot or cold humans are actually very limited in this area. And interestingly this particular area is actually a great example for what machines are very useful to humans for. People dont just stick their hands in dry ice or molten lava. They use tools. And ai is just another variation of the tools we have at our disposal.
Honestly i think the endeavor to attempt to make ai sentient and conscious is really rather stupid. And in my opinion a waste of time.
youtube
Cross-Cultural
2025-10-27T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwCdZFlZ9RuD4HrbYN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx_B7ctZLLTaAMKhe14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwvKTIHlrF7FAJ4RhF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxWADSzf_wxjmiIPoZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzunSlmKQw-kalfdUJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzME8H-OTsy_8IurLZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyDS2RSCum51DrBBGB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzK5jRMUFbsWTcnD3J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwAeJvtBr9AuO9wyzN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugwk2JQmttvSOhhXz1J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]
```
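The raw response above is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID (the field names match the response shown; the `index_by_id` helper is illustrative, not part of the tool):

```python
import json

# A raw LLM coding response: a JSON array of per-comment records,
# mirroring the structure shown above (two records kept for brevity).
raw_response = """[
  {"id": "ytc_UgwCdZFlZ9RuD4HrbYN4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxWADSzf_wxjmiIPoZ4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

codings = index_by_id(raw_response)
coding = codings["ytc_UgwCdZFlZ9RuD4HrbYN4AaABAg"]
print(coding["emotion"])  # indifference
```

Keying the records by `id` makes the "look up by comment ID" view a single dictionary access rather than a scan of the array.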