Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_Ugx4GcPdK…: "Ai exist before this is only a relaunch, is the sum of all Egos humans (when lis…"
- ytc_UgxdvRcDa…: "The only kind of AI I really use is something similar to Character AI, it's just…"
- ytc_Ugz1YNjhb…: "Difference is, the 30,000 that died could have controlled the outcome. If your s…"
- ytc_Ugwhm0gYG…: "No don't slow it down, speed it up so it can blowup fast! It's realy a shoot on …"
- ytc_UgwrXkNZ7…: "I think that i don't give a fuck as long as it's good. Artists should learn how …"
- ytc_UgiiIRzPV…: "This has just reafirmed some concerns I have with self driving cars. For the pas…"
- ytc_Ugy73z0ON…: "44:49 Elon just bought Twitter so that he can Unban Trump :D we shpuld be worrie…"
- ytr_UgxCMZSxn…: "@Sharkcool-h2oboth are better. There are ways to trick basic computer monitorin…"
Comment
Oh wow so google wants this super AI to basically put them and keep them at the top. I really don't know what google wants and what their intentions are but I don't think it's "positive" for the people. Meta universe, the matrix, etc... And everytime they test these super AI they always have to unplug it. Facebook, Google, Even quantum Computers. I think I can get a good idea of what they are trying to do.
youtube · AI Governance · 2023-04-18T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxbKAHOwWMHYie6h7B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyzHbjn68cChDqNQ8h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyzedawhG8_lOgkb1l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxSXFXSlFlKPxGXCrx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw-nfbPQCIXek8wjnV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw36Ia2Z-e2wtdNNkN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxX1zwFnL4jW1ZBrZh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy1BlHzLf5RT3xhort4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx6qsnc_WWI9ec3CRZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgznJOuiw3qT3sm5fMJ4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"none","emotion":"approval"}]
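The look-up-by-comment-ID step above amounts to parsing this JSON array and indexing the records by their `id` field. Below is a minimal sketch of that step; the two records are copied from the response above (with the malformed trailing `]}` corrected to `}]`), and the `index_by_id` helper name is an illustration, not part of the tool.

```python
import json

# Excerpt of the raw LLM response shown above (two of the ten records),
# with the array terminator fixed to valid JSON.
raw = '''[
  {"id":"ytc_UgxbKAHOwWMHYie6h7B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgznJOuiw3qT3sm5fMJ4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"none","emotion":"approval"}
]'''

# The four coding dimensions used in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse the model output and key each code record by its comment ID."""
    records = json.loads(raw_json)
    return {rec["id"]: {d: rec[d] for d in DIMENSIONS} for rec in records}

codes = index_by_id(raw)
print(codes["ytc_UgznJOuiw3qT3sm5fMJ4AaABAg"]["emotion"])  # -> approval
```

A real pipeline would also validate each dimension against its allowed label set and fall back to "unclear" on parse failures, which is presumably how the all-"unclear" rows in the table above arise.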