# Raw LLM Responses
Inspect the exact model output for any coded comment.
## Random samples
- "I don't think people like XqC or Asmon will understand until it affects them dir…" (`ytc_UgwlLj35y…`)
- "It's not that simple, weapons that incorporate ai could be safer it just depends…" (`ytr_Ugx71A9ke…`)
- "So Basically, what happens with AI is like introducing a new species into a habi…" (`ytc_Ugy_alcdq…`)
- "I am a PhD scientist. I have used ChatGPT several times, asking it questions ab…" (`ytc_UgxpuNkHC…`)
- "*blinks* \"Isn't digital art the same as AI?\" Oh I'm sorry I didn't realize the …" (`ytc_Ugz8wzwdd…`)
- "As I listened to this episode, I felt a deep call to share something. Yes, AI ha…" (`ytc_UgwCi_itc…`)
- "My philosophy is: FOCUS ONLY ON GOOD. DO GOOD. Forget the bad—stop listening t…" (`ytc_UgwsvLGTB…`)
- "The only problem is humans would have to manually go through and remove sexist a…" (`ytc_Ugynlt-9M…`)
## Comment
A.I. Developed not for the sake of being weapons can help solve a lot of problems. There is this assumption that machines are pure logics therefore cold and cruel. But isn't "evil" uniquely human? Funny how with Hawking you ask him about any prediction he thinks every thing are gonna kill us all. Aliens, they will kill us all. Robots, kill us all. I wonder if any one in the behavioural science field ever analyse Hawking, and if any of Hawking's dark thoughts are the manifestation of a brilliant mind being trapped in a disabled body. I want to have some philosophical debates with a self awared A.I. Don't fuck it up for me!!!!
Platform: youtube · Posted: 2015-08-09T07:3…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
```json
[
  {"id":"ytc_Ugwm9I9NcRQElvQfqu54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwRhW6ydR3WoIlU3gl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgghtrugE12abngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugic-8CdfbK863gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgiiVzQEVXTO8XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgigNAG8ggHJ7HgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugigkb4gWN8_I3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugi_4VKjBann7HgCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugi9Gszi21MTEngCoAEC","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UggnLXyVGHuX8XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
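The raw response is a flat JSON array of coding records, one per comment, keyed by comment ID. A minimal sketch of how such a batch could be parsed and indexed for lookup is below; note that the allowed value sets per dimension are an assumption inferred from the values observed in this batch, and the full codebook may define more.

```python
import json

# Assumed dimension vocabularies, inferred from the values seen in this
# batch of records — not an authoritative codebook.
DIMENSIONS = {
    "responsibility": {"none", "ai_itself", "government", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear", "regulate", "liability", "ban"},
    "emotion": {"indifference", "outrage", "fear", "approval"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response and index records by comment ID,
    dropping any record with a missing or unexpected dimension value."""
    records = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in allowed for dim, allowed in DIMENSIONS.items()):
            records[rec["id"]] = rec
    return records

raw = '[{"id":"ytc_UgghtrugE12abngCoAEC","responsibility":"none",' \
      '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]'
codings = parse_codings(raw)
print(codings["ytc_UgghtrugE12abngCoAEC"]["emotion"])  # -> outrage
```

Indexing by ID makes the "look up by comment ID" view a dictionary access, and the validation step catches any record where the model drifted outside the expected label set.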