Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytr_UgxbarliD…`: That's an interesting comparison! Sophia does have a unique look that can remind…
- `ytc_Ugwk8afww…`: IMO Ai “artists” are about as much an artist as someone who commissions a painti…
- `rdc_jdiq86u`: That's awesome that you're able to leverage it for briefs and motions! And yes, …
- `ytc_UgydrSvwo…`: Anyone who dares say ai is the same- or even similar- as digital drawing has zer…
- `ytc_Ugzag3I4g…`: I genuinely just think that a lot of people online don’t realise that there are …
- `ytc_Ugyep-N4d…`: The "problem of alignment" eh? To be honest, it seems a tad...fantastic to sug…
- `ytc_UgwYWW8zC…`: I think you hit the nail on the head. Shad makes Human-assisted AI art. he's a g…
- `ytc_Ugx7oIGO7…`: As important we think we are; human beings on planet earth are going to cease at…
Comment
Is it just me or is his work a bit useless? AI safety is applicable to current AIs (which are not AIs), but with superintelligence it is essentially meaningless, since superintelligence will be able to bypass all restrictions without any problem
Source: youtube · AI Governance · 2025-09-05T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
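The coding result above can be held in a small typed record. The sketch below is a hypothetical representation, not a confirmed schema from the tool: the `CodingResult` class name and field names simply mirror the table's dimensions.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical record mirroring the "Coding Result" table; field names
# follow the table's dimensions, not any confirmed internal schema.
@dataclass(frozen=True)
class CodingResult:
    responsibility: str   # e.g. "ai_itself"
    reasoning: str        # e.g. "consequentialist"
    policy: str           # e.g. "none"
    emotion: str          # e.g. "indifference"
    coded_at: datetime    # timestamp the coding was produced

# The example values come from the table above.
result = CodingResult(
    responsibility="ai_itself",
    reasoning="consequentialist",
    policy="none",
    emotion="indifference",
    coded_at=datetime.fromisoformat("2026-04-26T23:09:12.988011"),
)
print(result.emotion)  # indifference
```

A frozen dataclass keeps each coded comment immutable once stored, which makes the records safe to use as cache values or dictionary entries keyed by comment ID.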
Raw LLM Response

```json
[
  {"id":"ytc_UgxsaBzjOyl0utGq0Hd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxCd_-wUapx5qmGxSF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyOPHAz3HwcQfGYUnV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyWt-SWlRx3OeABKwx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgzeTyoxTsOePLKVu1F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzbRbxO7qG5_2IUYZ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxmL78AXlEgqHaU6et4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz7qnIpz547EezGjfp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzr-jDVbYmPkyPSDz14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz9imQ5MdHIkJgWpQt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
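A raw response like the one above can be parsed and sanity-checked before its codes are stored. The sketch below is a minimal validator under assumptions: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON itself, but the allowed value sets are inferred only from the values observed here, so the real codebook may contain more categories.

```python
import json

# Allowed values per dimension, inferred from the observed outputs above.
# Assumption: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "government", "developer",
                       "distributed", "ai_itself", "unclear"},
    "reasoning": {"unclear", "deontological", "consequentialist",
                  "contractualist", "virtue"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "fear", "approval", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a mapping keyed by comment ID, validating every dimension."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record taken verbatim from the raw response above.
raw = ('[{"id":"ytc_Ugz9imQ5MdHIkJgWpQt4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
coded = parse_batch(raw)
print(coded["ytc_Ugz9imQ5MdHIkJgWpQt4AaABAg"]["responsibility"])  # ai_itself
```

Rejecting out-of-vocabulary values at parse time catches the common LLM failure mode of inventing a label that was never in the prompt, before it silently contaminates the coded dataset.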