Raw LLM Responses

Inspect the exact model output for any coded comment: look it up by comment ID, or browse the random samples below.
- "I once put my art in a AI image to video (fotor or smth) and not only did it jus…" (ytc_Ugy4FLxmU…)
- "What often gets overlooked is that the current way of making AI smarter, mainly …" (ytc_Ugzjs5gER…)
- "Definitely will, but probably less than human drivers. Driverless taxi’s have a…" (ytr_UgxUXhgrx…)
- "If a "robot" was truly able to answer any question one could give it, then the r…" (ytc_UgzsYICzk…)
- "Im disabled with epilepsy and making ai art no matter the medium to make the pro…" (ytc_UgwhGTV97…)
- "@ac583You‘re getting real angry huh 😂 A 48 hour work week is already legal, but…" (ytr_UgwWAYmZW…)
- "@simonorourke4465 yeah the fash loves ai because no human with dignity would art…" (ytr_UgyBv3_eU…)
- "The Israeli military says it's using artificial intelligence to select many of t…" (ytc_UgycLoa8C…)
Comment

> the problem is not in creating something smarter, better and functioning, all the problem is back in humanity's greed and human superiority complex as it was times in history as one skin color view themselves above another or based on religion or material wealth , political agenda and so on. humanity as its structured now is the problem . AI is a mirror and when you treat it poorly don't expect better for yourself

youtube · AI Governance · 2025-11-12T08:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugx28842LGnhrDJHlJl4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxnfTRINZ65SjUcDap4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugwlc85ijUfQU2mN2St4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyojndqyzVgp5xv3GV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwM5iiU04ZJXag1L4t4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyDzJPsqn3NZrq7awN4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugz4UYUwBlGdXWop8AB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwKsyxfOmwtUjIWlWR4AaABAg", "responsibility": "none", "reasoning": "contractualist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzGRht6dUkMNO57yhR4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxOeAwCVjrQfn5FLu54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
```
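Since the model returns one JSON array per batch, looking up a single coded comment by ID amounts to parsing the array and indexing it. A minimal sketch (the two rows below are copied from the response above; the variable and field names match the JSON, everything else is illustrative):

```python
import json

# A shortened stand-in for the raw LLM response: a JSON array of
# per-comment codings, two rows taken from the batch above.
raw_response = """
[
  {"id": "ytc_Ugx28842LGnhrDJHlJl4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzGRht6dUkMNO57yhR4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]
"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgzGRht6dUkMNO57yhR4AaABAg"]
print(coding["emotion"])  # prints: resignation
```

The same dict-by-ID index is all the "look up by comment ID" view needs, regardless of how many batch responses are merged into it.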