Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below.
- ytr_UgxxCH0LF…: "@t-masterruleshe's correct your wrong here. Japanese animators their work is he…"
- ytc_UgzsqQrz2…: "I was with you until Gadwat said AI is a bigger threat than Climate Change,…"
- ytc_Ugz9dqZla…: "It boggles my mind when people invest in AI companies to make a quick buck not k…"
- ytc_UgxMAwqJR…: "AI art keeps on burning to the ground more and more and I'm all in for it.…"
- ytc_Ugw6U4Zj9…: "Someone once brute forced every possible melody (I think 5 note melodies), put t…"
- rdc_glxfoqw: "Holy smokes, XSLT. That was the hot thing...in 2004. It completely fizzled out…"
- rdc_n0ltt2z: "You know you can go in and physically ask about a job opportunity right? No way …"
- ytc_UgzKmqcp8…: "The only good thing I have found about AI info bots so far is they don't seem t…"
Comment

> Yes, AI develeopment needs regulation. BUT even more dangerous is Quantum computing and AI combining, of which people have NO idea of the results, since Quantum computing has potentially unlimited computing power. Quantum computing is here(Google, Silicon, China have Quantum supremacy allready) and the race is on, for the finance of the world. Without AI development regulation, results could be Unpredictable. In the age of misinformation and evergrowing financial greed, it would be Easy for AI to take advantage.

Source: youtube · Topic: AI Governance · Posted: 2023-09-03T08:5… · Likes: 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwVhzBryukpXVBy2MR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx0qRo4WHPiWuQFHtt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxoHAUmSXOAs95CI0V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzvY1DGxb8rczwS34Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwmd86GeQwbOTBMw094AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxhUiLV-IAzoxVoF2F4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw7uhwft1oou6g5evN4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzD1qGsMdMGqZHHT0h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwlvPLUxtb_BxerDCZ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyX5cJxYY-WEjfxes14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
```
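A raw batch response like the one above can be parsed and indexed by comment ID to support the lookup-by-ID view. This is a minimal sketch, assuming the model always returns a well-formed JSON array whose records carry exactly the four coded dimensions shown in the table (`responsibility`, `reasoning`, `policy`, `emotion`); the two-record sample string is abridged from the response above for brevity.

```python
import json

# Abridged copy of the raw LLM response shown above (two of the ten records).
raw_response = '''
[
 {"id":"ytc_UgwVhzBryukpXVBy2MR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgyX5cJxYY-WEjfxes14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
'''

# The coding dimensions every record must carry, per the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse a batch response and index the coded dimensions by comment ID."""
    records = json.loads(raw)
    index = {}
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec.get('id')}: missing dimensions {missing}")
        index[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return index

codes = index_by_id(raw_response)
print(codes["ytc_UgwVhzBryukpXVBy2MR4AaABAg"]["policy"])  # regulate
```

Raising on missing dimensions (rather than filling a default) surfaces malformed model output immediately, which is usually preferable when the codes feed downstream analysis.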