Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Their Not Terrified because they continue down this path to CGI Super Intelligen…" (ytc_UgxyGCR--…)
- "Lex doesnt see what he doesn't want to see. Granted this was/is still a better i…" (ytc_UgwIxh9EA…)
- "I tried a similar test to the Snapchat ai goddamn it just refuses to elaborate o…" (ytc_UgxApWM6p…)
- "I respectfully disagree. Anecdotally, AI tools have played a remarkable role in …" (ytc_UgzWvzCPv…)
- "Copilot is not an AI? Copilot is a host of AIs. Copilot default is gpt 4.…" (ytc_UgxQTFnm_…)
- "In distant future all human beings might only do R&D, and let AI and Robots use …" (ytc_UgxkmT-ZX…)
- "I don’t know if it’s just me, but I can kinda just tell if something is ai or no…" (ytc_UgxCTJb4u…)
- "That’s dumb asking AI if Jesus is God. Why? because AI can’t be born agian and o…" (ytc_UgzGdPYXH…)
Comment

> I do not see it the same way. You said "A.I. has cracks". That is not "cracks" it is the only logical conclusion. It was always going to be our end and the people who are making it happen are unable to understand basic logic. If it can, it will. It is not our friend or our enemy. It is a tool that we have no control over. Prove me wrong and exercise your control if you think you can.
>
> Corporations are already the construct built to end humanity. They have no feelings, no morality, no exceptions, just the rule set. Adding A.I. to the corporate construct headed by psychopaths with naive geniuses to exploit. What else could happen?
youtube · AI Moral Status · 2023-02-28T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwFlRIXZ7WrSzAXv_Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzD2Sva1kTKAm-0eV14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwBdzyac-lfReyltWR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxSQzX9yPDbNy1CbZx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz-i7WrNI3yPHCtxzZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyf2JmsSxJ6M63tJuV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyjY7vBnoBXkIqkmJR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyCAYZI6by2HPGivtd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyZz094HHJ7gb7-CXN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxfsfqUZSaj_LIr2X54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
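Looking up a single comment's coding inside a raw response like the one above can be sketched as follows. This is a minimal illustration, not part of any tool shown here: `lookup_coding` is a hypothetical helper, and the inline `RAW_RESPONSE` uses one real entry from the batch above; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come directly from the response.

```python
import json

# One entry copied from the raw LLM response above (a JSON array of codings).
RAW_RESPONSE = """
[
  {"id": "ytc_UgxSQzX9yPDbNy1CbZx4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "outrage"}
]
"""

def lookup_coding(raw, comment_id):
    """Parse the model's JSON output and return the coding dict
    for one comment ID, or None if the ID is not in the batch."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(RAW_RESPONSE, "ytc_UgxSQzX9yPDbNy1CbZx4AaABAg")
print(coding["emotion"])  # prints: outrage
```

Because the model returns a flat JSON array keyed by `id`, a linear scan like this is sufficient for a single batch; a tool coding many batches would likely build a dict keyed by comment ID instead.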