Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The biggest problem facing humanity is not take over of AI. But the problem when humans know little and AI knows everything. When a new generation of children have no desire to learn because AI has learned everything and is doing everything. When human brain cells don't utilize to their max potential there is a downgrading effect taking place. Unless knowledge can actually be uploaded to the human brain? And problem solving abilities can be practiced by the human brain through virtual world effects (hallucination)? Which goes back to the concept of simulated reality (Matrix).
Source: youtube · AI Harm Incident · 2024-07-29T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzX5_w0VIziVjTUDJV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxxvn5dEWGH6YA-rlp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwk-cuVmzOCTVXnv3R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyMT0f7f6JiMM4xVct4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzzMSBd3luJNyVEueZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx_lHbpfVQ2IKY6mNx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyj_I-TpRoqB1aEayN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwt5qU4i-acfFWEThh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy0XAEG_3kPw7PJTZR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyCrySiNA9Rt7VYSXl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
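A raw response like the one above can be checked against the coding scheme before it is stored. The sketch below is a minimal validator, assuming the category vocabularies visible in this page's dimensions (responsibility, reasoning, policy, emotion); the real scheme may define additional categories, and the function name `validate_records` is illustrative, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the responses shown above
# (assumption: the full coding scheme may include more categories).
SCHEMA = {
    "responsibility": {"none", "company", "user", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "mixed", "outrage", "approval", "fear"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only in-scheme records with an id."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # every coded record must reference a comment ID
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Example: one well-formed record (hypothetical ID for illustration)
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(validate_records(raw))
```

Records with an out-of-scheme value are dropped rather than repaired, so a malformed model output surfaces as a shorter validated list instead of a silently miscoded comment.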