Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples from the batch:

- `ytc_Ugxg9CS76…`: "Every new tool gets mocked first. None of them killed art. Not paint. Not camera…"
- `ytc_Ugy9N2Isj…`: "Me: how does ai know this? Ai: be raise in going to make it happen💀…"
- `ytc_Ugzgp0aGF…`: "Sydney is not AIG. its but a mere chatbot. the simple question And then what? a…"
- `ytr_Ugy3h9NsC…`: ">be ai >recognize factual patterns >make predictions according to patterns >huma…"
- `ytc_UgwZxNwtJ…`: "I bleed vary Red! & I often disagree with the channel but this video was a serve…"
- `ytr_UgwAk112y…`: "@josiahwakefield3185No it can't. I've used Ai to program aurdino boards, and it…"
- `ytc_UgzTnJOa7…`: "Just because you think someone will commit a crime way before it happens doesn't…"
- `ytc_UgxZ81B88…`: "Good actors??? Artificial Intelligence its too late to stop it now. it has got a…"
Comment
I think once the robot war starts, the only way to survive, will be to own a couple of offline gardian robots for protection, but these will only be able to protect us from the same generation of robot terminators and it probably won't be long before the 100th gen nanobot replicators take full control.
And once you get a thousand or so 99th gen scientific robots, all working on the same inventions, then anything any weapon, or any robot will be able to be upgraded.
oh and one other thing, how do we know that either future humans or future AI hasn't intvented time travel and has been here swapping out the future every other day???
iIF YOU WANT MY ADVICE, INVEST IN EMP DEVICES AND FACTOR 10K SUNSCRREN!
Source: youtube · AI Governance · 2025-07-14T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugx5DkA9nug5QehHEgF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwONZ0b6Z_SFGEgl4t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwa-KutYUhtacbOh114AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz7BhelbY5WokktX6N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwTEcJGLtP_yJ3qna94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy5ufe0QEob45Tq61x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxP73Eh8jv5mz6vK0l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzLrDLMTAsY2JlLsvp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugz3X2SwwhM2KGz4ARl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxly-XWOIq6yyOYQ5t4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"outrage"}
]
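The raw response above is a JSON array with one record per comment, so "look up by comment ID" reduces to parsing the array and indexing it by the `id` field. A minimal Python sketch of that lookup (the `index_codings` helper is hypothetical, and only two records from the batch are reproduced here for brevity):

```python
import json

# A truncated stand-in for the raw LLM response shown above;
# only two of the ten records are reproduced here.
RAW_RESPONSE = """
[
  {"id":"ytc_Ugz7BhelbY5WokktX6N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzLrDLMTAsY2JlLsvp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index each coding record by comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

codings = index_codings(RAW_RESPONSE)
coding = codings["ytc_Ugz7BhelbY5WokktX6N4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # -> ai_itself fear
```

Note that the dimensions in the record (`responsibility`, `reasoning`, `policy`, `emotion`) match the rows of the Coding Result table above, so a given comment's table is simply a rendering of its record.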