Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID or by browsing the random samples below.
Random samples:

- ytr_UgzGXRc6E…: "An AI can't steal anything. Human has created an AI and can switch it off. An AI…"
- ytc_UgyaXCSf3…: "I am a digital artist and I also will make AI art to use for bots. I think AI ar…"
- ytc_UgwgYNOKI…: "Chatgpt is my professor, my personal friend and the god of education system. Def…"
- ytc_UgynQBPXR…: "I'm not sure how this guy was able to have a 2 hour conversation with bing when …"
- ytc_UgxoZEhSg…: "Claudes response to one of the argument that I had with it: Imagine an AGI give…"
- ytc_Ugy_3hKls…: "Not if it takes a person's job through outright theft, it's not. True artists wi…"
- ytr_UgwsFJiBh…: "@goodthings5772 Maybe... I didn't say any of that though. I'm not interested in …"
- ytr_UgwZTecmq…: "Depends entirely on your definition of intelligence. Does artificial consciousne…"
Comment

> There is NO INTELLEGENCE in A.I. A.I. gets its smarts from programming; but the problem is that these programs "learn". The real trouble is when one of these geniuses perfects the semiconductor neuron; THEN humanity will pay the price. Like everything from the last 50,000 years, the threat will be put down by the old standby- violence. Blowing up servers, data centers, interconnections. The usual stuff. The "should we/could we" dilemma is never contemplated by the scientist who act like irresponsible children.

youtube · AI Harm Incident · 2025-09-19T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwOzhwJ_KqIQbYG-e14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxQnyczL3anHshR_w54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugzi9ZahtWLsbdZSX0l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxy3Glwcr1TMKi8mQx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgycFJgM6THLOZGzzu14AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwOw_CGIBtc7G0UDnB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx3KsHNhFNibsv6S8F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzDvORlzhrrLRQxNTt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyt9Hp0Q8fLl5Ngm6V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwMMaj7wpyTJphv1694AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
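A response in this shape can be parsed and validated before the records are stored. The following is a minimal sketch, assuming the allowed values per dimension are those visible in the responses shown here (the full coding scheme may define more categories), and that every record must carry an `id` plus all four dimensions:

```python
import json

# Allowed values per dimension, inferred from the sample responses above;
# the actual coding scheme may include additional categories (assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "liability", "regulate", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID.

    Raises ValueError if a record lacks an id or uses a value outside
    the allowed set for any dimension.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {value!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage with one record in the same shape as the response above
# (the id here is a hypothetical placeholder):
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_example"]["policy"])  # → ban
```

Validating at parse time catches malformed or off-scheme model output immediately, rather than letting an unexpected label silently enter the coded dataset.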