Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgwMfeAps…: On the subject of the "Accessibility" of art. Im going to be entirely honest, an…
- ytc_UgxC0tHHd…: We need to continue to smear the term "AI Artists", to make it as shameful as po…
- ytc_UgzDZntQd…: I'm curious what AI will do with all the suppressed technologies that are hidden…
- ytc_UgyhbTXBQ…: Seeing this a.i. doing the flims and cast look super weird and lame. I rather se…
- ytc_Ugzc87o2S…: How's the fearporn going? We’ve been pl promised AI would cure cancer by 2025 a…
- ytc_UgzrA4nGC…: The only thing ill buy from amazon is a robot so he can do my ebay shopping…
- ytc_UgwSOluD3…: He says he lost respect for real artists but doesnt show any to the people who p…
- ytc_UgxMyCDyD…: And who's gonna pay taxes? How will we fund public services like police, roads, …
Comment
Sounds a lot like the ethical dilemmas faced by the members of the Manhattan Project, and likely other similar nuclear weapon development programs. Those programs, devastating as they were, had centralised and international government oversight so as to be less devastating than they could have been. Meanwhile AI is being developed simultaneously by profit-driven entities and citizens in their bedrooms, so even at this very early stage there is a great capacity for nefarious actors to emerge. And you cannot expect governments to suddenly swoop in to establish any form of control - they barely understand how to legislate or enforce laws related to use of the internet, let alone AI. Not that they can do much anyway since the cat is very much out of the bag.
youtube · AI Governance · 2024-01-19T03:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgxZGUvJvu2RsKMZ0-F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxCRfvHalsjxXHQhgF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwTUn_pek1VlAJElhR4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwsUgGAQynl9WCuXuN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxprUFuPVJ2jxhuVlJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxYD4VrQ2otFId05BF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyy5M7_YHuPSTy4Lvt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzhvW4B3waBjgEZR0d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzW_uDyqRFef7pHp-x4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugy5xgOqcCRnrlQAVr14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}]
```
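The "look up by comment ID" step above can be sketched as follows: parse the raw model output (a JSON array of coding objects, with the field names shown in the dump) and index each coding by its `id`. This is a minimal illustration, not the tool's actual implementation; `raw_response` here is a two-item excerpt of the response above, and `index_by_id` is a hypothetical helper name.

```python
import json

# Excerpt of the raw LLM response shown above (two of the ten objects).
raw_response = '''[
{"id":"ytc_UgzW_uDyqRFef7pHp-x4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugy5xgOqcCRnrlQAVr14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]'''

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse the model output and key each coding object by its comment ID."""
    return {item["id"]: item for item in json.loads(raw)}

codings = index_by_id(raw_response)
# Looking up the fully displayed comment's ID recovers the values
# shown in the Coding Result table.
print(codings["ytc_UgzW_uDyqRFef7pHp-x4AaABAg"]["policy"])  # regulate
```

Note that the lookup reproduces the Coding Result table for the displayed comment (distributed / contractualist / regulate / mixed) directly from the raw response, which is exactly what this panel is for: verifying that the parsed dimensions match the model's verbatim output.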