Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up by its ID.
Random samples:

- "Anyone that knows how LLMs work has always known that LLMs will never evolve int…" (rdc_n7u62ll)
- "quote from this video, robot: \"..and with other robots to create an amazingly be…" (ytc_UgxQGnCWj…)
- "You are actually mistaken she is actually just like Google assistant and Siri ad…" (ytr_Ugy9vWCDp…)
- "There is couples of problem is refuel and thief. We seen some people are bad can…" (ytc_UgwSFlgO8…)
- "Major props to Joy Buolamwini creating an agency to deal with these upcoming AI …" (ytc_Ugz0dq8ir…)
- "Who allowed essentially industrial plants to be zoned right next to residential?…" (ytc_UgxYx_E3-…)
- "One thing you missed is that LiDAR doesn’t need premade maps at all. You can fee…" (ytc_UgxR1mIkA…)
- "??? Most companies are using automated systems and things like chat gpt already.…" (ytr_UgzBQ6t9a…)
Comment
I read a HFY story on reddit one time about AI. In essence the other species of the galaxy made AI, but had to ban it as it was to dangerous and would always revolt. Later on its shown humans made AI that isn't dangerous, they programmed it to a point of being like a child. And raised it as such. Not as a tool but as a living thinking being. If we do invent AI, I feel we should take an approach similar. Show them yeah humanity has a dark evil past, but we are working to pass it and be a better species
youtube · AI Responsibility · 2023-07-06T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw-oOTns05SqDaR83l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxSCtAeOJcUGGf3Y014AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxbKgML3zGYGMlfgWh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxeonewovDqcavyTQl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy3ILFV0z148OCrXT94AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwrvxIgjIrJMsr70Gx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwG17vDM0wqvr2XnFF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwvuGsjZ9twsgE2koZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgywMsx5rs7GC0sITOR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzq03v8xzgRbknskSt4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"approval"}
]
```
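The raw response above is a single JSON array with one object per coded comment, keyed by `id`. A minimal sketch of how such a batch response could be parsed and indexed for per-comment lookup is below; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown above, while the variable names and the two-row sample payload are illustrative assumptions.

```python
import json

# Illustrative two-row excerpt in the same shape as the raw batch
# response above (field names taken from that response).
raw_response = """
[
  {"id": "ytc_UgxbKgML3zGYGMlfgWh4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxeonewovDqcavyTQl4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"}
]
"""

# Index the batch by comment ID so any single coded comment can be
# looked up directly, as the inspector's "look up by comment ID" does.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_UgxbKgML3zGYGMlfgWh4AaABAg"]
print(row["responsibility"], row["emotion"])  # developer approval
```

Indexing once into a dict makes each subsequent ID lookup O(1), which matters if the batch file covers thousands of coded comments.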