Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Neil couldn’t be more accurate. It will really only do what we program it to do.…" (ytc_UgyrBzEAF…)
- "There’s need to be more dialogue on the subject of AI, robotic machinery (self t…" (ytc_Ugw_UZbcI…)
- "You miss the point completely! Do we need jobs? No we do not but we need educati…" (ytc_UgwgzcPih…)
- "Cooking instant noodles took more effort and time than generating an AI art. At …" (ytc_UgxuxKAVH…)
- "Okay Currently, there are around 35 million self-driving cars with about 6,000 f…" (ytc_UgzMa-OKb…)
- "The premise that massive unemployment will be passively accepted by millions of …" (ytc_UgyNbm13I…)
- "Man I really regret when I used an ai generator. It's been a long time since but…" (ytc_Ugz5PJKVw…)
- "This is happening in academia as well. I work behind the scenes in an academic l…" (ytc_Ugy_D-R3m…)
Comment

> I watched it with high interest, everybody is interested to this topic, which is quite new, even though as they well said, AI was around since many years before, and we were using it without even knowing but now it's blossming the whole potential and we still don't even know what's coming. Hoping for the better, beyond what has been said, the most important thing is that, there are hearings like this in order to regulate it, listing all the pros and cons to point out what's better for humanity, to ensure that it doesn't slip out of our hands.

Platform: youtube · Topic: AI Governance · Posted: 2023-05-27T13:3… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy8U5tPtekinl7zOwZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz0lL_km5c9ELAtq2p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugwa1HQcAHk9tw8cJWN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwxfvrKztazuC8dThV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxYAM_ZGh-wTBBCrqJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyn7QIChi8vwaBvlWd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzdNHizUSNmePRqDHl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxuv58D0IHiYikHYh94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw2yl9CVJ1FykqbSMd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwkWturiTw6NsXIsGJ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"mixed"}
]
```
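A raw response in this shape can be turned back into per-comment codes for the lookup view above. The sketch below is a minimal example, not the tool's actual implementation: it assumes the raw LLM response is a JSON array of objects, each with an `id` plus the four coding dimensions shown in the table (`responsibility`, `reasoning`, `policy`, `emotion`); the function name `parse_coding_response` is hypothetical.

```python
import json

# Hypothetical helper: parse a raw LLM coding response (a JSON array of
# objects) into a dict keyed by comment ID, so one comment's codes can be
# looked up for display. Missing dimensions default to "unclear".
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(text: str) -> dict[str, dict[str, str]]:
    records = json.loads(text)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

# Two records excerpted from the raw response above.
raw = '''[
  {"id":"ytc_Ugy8U5tPtekinl7zOwZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz0lL_km5c9ELAtq2p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"}
]'''

codes = parse_coding_response(raw)
print(codes["ytc_Ugy8U5tPtekinl7zOwZ4AaABAg"]["policy"])  # regulate
```

Defaulting absent keys to `"unclear"` mirrors how the coding table records dimensions the model could not determine.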