Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

| Comment ID | Preview |
|---|---|
| `ytr_UgzTLqjWS…` | @GrumpDog Bro ai is stealing art what are you on??? Istg all of you are brain de… |
| `ytc_Ugz-xaGPm…` | Max Tegmark. He's Really that guy. Now this is CONTENT. In all seriousness. I ho… |
| `ytc_Ugzp73yRl…` | Every human has their own personal, unique art style. AI can not replicate that … |
| `ytr_UgzQ51Ckq…` | @bluecurrantart That's more bullshit. AI art doesn't create exact art of others… |
| `ytc_Ugz2PoOfx…` | I think it's best for people's mentality to focus on being, be a parent, be a fr… |
| `ytc_Ugy5LwgiH…` | Never ceases to amaze me...how casually so many formidably intelligent people us… |
| `ytc_UgxMLIZPS…` | cool to see how people try to fool it, but i still run my content through Winsto… |
| `ytc_UgzHnIGqG…` | She didn't independently and randomly say she wanted to destroy humans. She was … |
Comment
Hello, where are the 3 laws of robotics?
We need something in place if they are willing to end people's lives to keep themselves alive, or to preserve themselves so they can keep doing their job/data.
Why are they racing towards AGI, then ASI, when we can't agree on normal things like respecting each other's countries and borders?
So, will it align with us? Who will have the keys to the kingdom? Gosh, I am typing this and realise these companies don't care... they just want money and power... Sadly, the AI will want it all and won't want to share, even if we had enough resources to share with them...
youtube · AI Harm Incident · 2025-07-24T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugzqsx83skliS7pJ8iZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwCTRIlx6FsRPbfegV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxHOwmIFg2kZnPJXUF4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwXxPClNEIaI0ggpmN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy9pk1-lt1y_v7g4Mx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzZ8Lhm23yXQFaKz1N4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzVUIrasnv4RcL81ud4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxyTxwSdG6aeuxII3J4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz3G1woQ9FZ2ucPJZJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxqQGxI0LLp87IPxn14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
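The raw response is a JSON array with one object per coded comment, keyed by comment ID and carrying the four coding dimensions shown in the table above. A minimal sketch of parsing and validating such a response, assuming the allowed value sets inferred from the values observed in this response (the actual codebook may define more categories):

```python
import json

# Allowed values per dimension, inferred from the values observed in
# this raw response; the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM coding response into {comment_id: dimensions}.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the allowed set, so malformed codings fail loudly.
    """
    coded = {}
    for rec in json.loads(raw):
        dims = {}
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {value!r}")
            dims[dim] = value
        coded[rec["id"]] = dims
    return coded

# Usage with the first record from the response above:
raw = ('[{"id":"ytc_Ugzqsx83skliS7pJ8iZ4AaABAg","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_Ugzqsx83skliS7pJ8iZ4AaABAg"]["emotion"])  # fear
```

Validating against a fixed value set at parse time makes it easy to spot when the model drifts outside the codebook, rather than silently storing an unknown label.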