Raw LLM Responses
Inspect the exact model output for any coded comment; entries are looked up by comment ID.
Random samples
- I am not sure that is the case. Let's consider different example - hunger. Hunge… (ytr_Ughgv7iY0…)
- "The only real way to support human art and protest AI gen is to cut it out of t… (ytr_UgwjReEG0…)
- We are not mature enough to be doing this. AI will figure us out and correctly … (ytc_Ugy0Hzfc9…)
- A bigger concern I have us for architecture development. What's the point of bui… (ytc_UgzHuHs-l…)
- Basically the future: "hey dad what are you doing?" "art" "oh that's cool, ...he… (ytc_UgxvMaa1m…)
- Robot laws do not actually exist. The programmers can change how it will react. … (ytr_UgxO_Ujk5…)
- I’m surrounded by “digital nomads” from western countries all doing various visa… (ytc_Ugxk5k2sh…)
- Wall-e is becoming real. B.S. Get rid of driverless trucks. What's the point of … (ytc_Ugx_ZUgTX…)
Comment
I think that the next big step in AI is a multidimensional node network, wherein the nodes are concepts, phrases, etc. Essentially everything that humans think of would act as a node and then the use of a series of weights can simulate standard logic and, in addition, emotional logic. The biggest problem with this approach is the hardware cost of something like it. From there, the most likely way to ensure safety and proper understand between the AI and humans, we should look at functioning psychopaths. There are many in our world and most occupy high level positions in various companies and organizations. There will be those whose internal setup isn't useful and should be disregarded, but there are some who have constructed logical pathways to avoid amoral or outright immoral behaviors. Those few will be exceedingly useful for this industry.
youtube · AI Harm Incident · 2025-09-11T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzUL8aj7d7e1rFajZt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx4_MaJDAf-_-yYlfZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyuVdNO3EBMAdKSx5N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzdvCFDmtAReM2yRJR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwmgSTbPwicYjihwoh4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwOPQtyAE29tjumzVZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxctt7ATkO2lSd4o6l4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw7Z8oUIxBzVgZqPj94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzOW7YqqzrOhibLzEt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzrUUaRxXg9nr6phVl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
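The raw response above is plain JSON (a list of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields), so retrieving the coding for a single comment reduces to parsing the list and indexing it by ID. A minimal sketch, using two entries copied from the response above; the variable names are illustrative and not part of the tool:

```python
import json

# Two entries copied verbatim from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgzUL8aj7d7e1rFajZt4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwmgSTbPwicYjihwoh4AaABAg", "responsibility": "company",
   "reasoning": "mixed", "policy": "regulate", "emotion": "fear"}
]
"""

# Build an ID -> coding dict so any coded comment can be looked up directly.
codes_by_id = {entry["id"]: entry for entry in json.loads(raw_response)}

code = codes_by_id["ytc_UgwmgSTbPwicYjihwoh4AaABAg"]
print(code["policy"])   # regulate
print(code["emotion"])  # fear
```

In practice the full response would be parsed the same way; a `dict` keyed by ID makes the lookup O(1) per comment instead of a linear scan of the list.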