Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Where we need to be worried about AI is the unintended consequences of well-intended pragmatic thinking turned horribly wrong. The Shirky principle states “institutions will try to preserve the problem to which they are the solution.” AI takes what information and misinformation it can find online and makes it credible. One question I asked it was how did the Industrial Revolution impact the American Revolution. What it came back with is as follows...."While the Industrial Revolution did not directly cause the American Revolution, it did have a significant impact on the latter. The Industrial Revolution led to the growth of industries in the United States, which in turn led to the growth of cities and towns..." How is it possible that an event that occurred a century later impacts an event that occurred a century before?
Source: youtube | Posted: 2024-09-02T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxKK5YotBktVlNtDVp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},{"id":"ytc_Ugwm23aCeFafiKqgWhh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},{"id":"ytc_UgyD6B58u_n-uAoib7J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},{"id":"ytc_UgyJuEl0gakIQRx0nxF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},{"id":"ytc_Ugzvk4ljBvFClh6HyN14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},{"id":"ytc_Ugw4o5NCerdctUihyjd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},{"id":"ytc_Ugzpzh8-Ab1btOQpGRR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgxsVnzn_5kcmWeNF3J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},{"id":"ytc_UgwZlA7uDX2cZBmR2cx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgyKuCClanocWSLslRp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}]
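The raw response is a JSON array with one object per comment, each carrying the four coding dimensions. A minimal sketch of how such a batch can be parsed and indexed by comment ID (the string below is a two-record excerpt of the raw response above; variable names are illustrative, not part of the tool):

```python
import json

# Two-record excerpt of the raw batch response shown above.
raw = (
    '[{"id":"ytc_UgxKK5YotBktVlNtDVp4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"fear"},'
    '{"id":"ytc_Ugwm23aCeFafiKqgWhh4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
)

records = json.loads(raw)
by_id = {r["id"]: r for r in records}  # index the batch by comment ID

# Look up one comment's coding by its ID.
coding = by_id["ytc_Ugwm23aCeFafiKqgWhh4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

Looking up the second record reproduces the Coding Result table above (responsibility `ai_itself`, reasoning `consequentialist`, policy `regulate`, emotion `fear`).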