Raw LLM Responses
Inspect the exact model output for any coded comment, or look it up by comment ID.
Random samples:

- "Alright so, I agree with your thesis of "Ai bad" in totality, small mistakes are…" (ytc_Ugw6pzNT6…)
- "Take a pencil away from pencil artist he will use pen take away ai from ai art …" (ytr_Ugy5UrPZg…)
- "If AI was out of control to the detriment of humans we would already know about …" (ytc_UgzjEUqcQ…)
- "ew... using Ai to draw? HAH! pathetic... dont be proud of being a human without …" (ytc_UgyO1-UeD…)
- "In order for ai to want to be selfish, it would have to have desire. Just becaus…" (ytc_UgzDn1Vjb…)
- "It's got some terrible rates of accuracy. Look up ai hallucination and read abou…" (ytr_UgzfY4XfJ…)
- "23:23 here is my problem. I don’t have many physical disabilities except some ne…" (ytc_UgymDyLnd…)
- "Charged crimes are still admitting into Algorithms.. Sad, bc cops can arrest a…" (ytc_UgzeuN_7H…)
Comment

> I worked with encryption software technology for several years. The same stuff used to create cryptocurrency and protect online assets from hackers. All of that security is dependent upon a certain incremental growth in computing power over time. As computers become smarter and faster, encryption algorithms are upgraded to stay ahead. The trouble with AI is instead of incremental increases in cracking these sorts of codes, it can make exponential leaps ahead. By the time human programmer realise what happened (assuming they do), pretty much every connected device in the world will already have its very own AI kernel, quietly waiting for their call to action.

youtube · AI Governance · 2023-07-07T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzWPW4_fF3IZXAUsp94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyfZOkvEg2qS1sLxHd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzqELDFg8HYxyUam1p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzi76BdzuNlAVMhukN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwduGbdv_vsyB1eIYR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw_2ctjVr2zHF53lCh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwa3FTV5er2DARUKON4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxI5X_txv8fRIUh4QF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzDEJoZflMQsnHAweB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzaqCkVbgDFSDMknLt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
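Looking up a coded comment from a raw response like the one above amounts to parsing the JSON array and indexing each record by its `id`. A minimal sketch, assuming only the flat schema visible in the response (the `index_by_id` and `DIMENSIONS` names are illustrative, and the raw string is abridged to two of the ten records):

```python
import json

# Abridged raw LLM response: a JSON array of per-comment coding records.
raw_response = """
[
 {"id":"ytc_UgzWPW4_fF3IZXAUsp94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyfZOkvEg2qS1sLxHd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse the model output and index each coding record by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        # Keep only records that carry every expected dimension.
        if all(dim in rec for dim in DIMENSIONS):
            coded[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded

coded = index_by_id(raw_response)
print(coded["ytc_UgyfZOkvEg2qS1sLxHd4AaABAg"]["emotion"])  # fear
```

Skipping records that lack a dimension keeps one malformed entry in the model output from breaking the whole lookup.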