Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

| Comment (truncated) | Comment ID |
|---|---|
| I think there aren't enough comparisons to how AI art may resemble the effect AI… | ytc_Ugy0O9gE4… |
| Question? Do you think is it possible to imbue intelligence into AI like Google … | ytc_UgxMy89HR… |
| I seen Waymo, in dallas texas a wee ago. I haven’t seen the driverless big rigs,… | ytc_UgwhI_s1J… |
| This tech is already out of date. China announced some years ago they could pick… | ytc_Ugx3I_2D6… |
| Ban on AI technology is absolutely A MUST sooner the better. you created somethi… | ytc_UgwPBAN0o… |
| No wonder the matrix and Terminator both movie series had a common main antagoni… | ytc_UgzqukvIT… |
| "How soon we will get to advanced AI?" Kind of vague for a prediction market, … | ytc_UgyMqm9r3… |
| Its not a person, its not sentient, its a smart tool with access to a lot of dat… | ytc_UgyEftZwe… |
Comment
As of now the risk is not in the motivation of the AIs, because they are not sentient. We, as humans, read this into their expression. But right now Microsoft has too much power oven OpenAI, wich means they can gain control over ALL INFORMATION. Microsoft basically has all of humanity by the balls, and I personally don't want Microsoft anywhere near my balls! 😢
youtube · AI Governance · 2023-03-30T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgzUvcNmWdhebmM0rq14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxjJ21UIW-N-gWtIFZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzL-60u0YDiyy3C0Kp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyQuu1agpBnCarU-WV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyEr-3Ej1EhBGPV4dh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz_ExjirZyC-Mmo8sl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzEqeUGizeI9hgIB2Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyo0diA4kxO18Ucril4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy9UNNZe6ashXzPg114AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugy0t7u8ncodnNvS6hR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]
```
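A raw batch response like the one above can be parsed and checked before the codes are stored against comment IDs. The sketch below is a minimal example, assuming the four dimensions shown in the coding-result table; the allowed category values are inferred from the JSON examples here and may be a subset of the full codebook, and the `ytc_example123` ID is hypothetical.

```python
import json

# Allowed values per coding dimension (assumption: inferred from the
# example output above; the actual codebook may define more categories).
SCHEMA = {
    "responsibility": {"government", "company", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index the records by comment ID.

    Raises ValueError if a record is missing a dimension or uses an
    unknown category, so malformed model output fails loudly instead of
    silently entering the dataset.
    """
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: bad value for {dim!r}: {rec.get(dim)!r}")
        coded[comment_id] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Hypothetical comment ID for illustration only.
raw = '[{"id":"ytc_example123","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]'
coded = parse_coding_response(raw)
print(coded["ytc_example123"]["policy"])  # regulate
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: once the batch is parsed, each inspection is a single dictionary access.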