Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coding by its comment ID, or browse the random samples below.
- `ytc_UgxEHUIan…` — "Id like to disagree with one statement of the hypothetical. The cost of living i…"
- `rdc_d2xcid1` — "Land is not offered in Siberia actually, but in further eastern regions, like Ch…"
- `rdc_n3lj86j` — "Big tech have regulatory concerns, I work at a company that have a huge internal…"
- `ytc_UgyKBFmqN…` — "The inability of AI to get things right, the lack of Comprehension, the lack of …"
- `rdc_icg0n7o` — "I'm not sure it will ever be possible to prove that a machine is or isn't "consc…"
- `ytr_UgxW3EoZe…` — "@R.Domane A worldwide AI shutdown sounds simple — but it wouldn’t solve the core…"
- `ytc_Ugwf5dRBl…` — "This narrative of if we slow down, china won't slow down. That's why we can't sl…"
- `ytc_UgyCuW2Bl…` — "The AI part is really not her problem in that story. She could ask chat GPT if r…"
Comment
I've always said that if Sci fi writers and creators can think of it for movies and books then they're scientist somewhere trying to build it or some form of it if it's remotely possible even as a concept.. Star Trek had their flip style "communicators" and many years later when the technology was available we had flip phones. Futuristic movies and novels that depicts AI taking over the world at one point was ONLY Sci fi and fantasy, that is until the technology to do so became available which is now. Pay attention to the fast tracking of these technical developments with practically no oversite and you'll begin to see that we are on a AI path to destruction... Assuming a nuclear WW3 doesn't take us there first.
Source: youtube · Topic: AI Governance · Posted: 2023-04-19T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgybCRLuslUr-O7PqK94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwK7ESV0IIbRfMoHJp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyO2TCNw1VVn8By_794AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy0u9JUwkduphZaD-94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwWt-Lz7hC1PtFwQPh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx1tynQvgHuB6gTHep4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx6ZfCPmFVNT5MZO_h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw6MvxvLZgsT9GjZzh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzMl3ldqg3Zxme2Xwp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyhr_5736wVeRWAJCl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
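Since each raw response is a JSON array of objects keyed by comment ID, looking up the coding for a single comment is a parse-and-index step. A minimal sketch (the helper name `index_by_comment_id` is hypothetical; it assumes the response text is well-formed JSON with the field names shown above):

```python
import json

# A raw LLM response: a JSON array of coded comments, each carrying the
# comment ID plus the four coding dimensions.
raw_response = """
[
  {"id": "ytc_UgybCRLuslUr-O7PqK94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyO2TCNw1VVn8By_794AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse one raw response and key each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_UgyO2TCNw1VVn8By_794AaABAg"]
print(coding["responsibility"], coding["emotion"])  # prints: none indifference
```

In practice the same index lets you join a batch of responses back onto the comment table, since the IDs (`ytc_…`, `rdc_…`, `ytr_…`) are shared between the two.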