Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The problem with AI-Stans is that they primarily view art as something that is n…" (ytc_UgxE58YGC…)
- "This will happen with an AI too. Except the person on the stand will be the hosp…" (rdc_fcssdy9)
- "We should start spreading the message that “Tesla forces you to disengage full s…" (ytc_Ugx2ZzOYI…)
- "Sad part is that we have learned NOTHING from SciFy movies! It doesnt end well f…" (ytc_Ugz96iFKm…)
- "A real AI would have to understand our values in order to avoid it to want to de…" (ytc_UgxxUtZNF…)
- "Pretty poor video. Congestion will not become worse if we'll have a dedicated la…" (ytc_Ugx8sd7xj…)
- "WOW I was already seriously disguted by all of this AI thing but WOW these comme…" (ytc_UgyMbI-Bb…)
- "Why AI is not a solution, has not been used to provide a vision and actions to e…" (ytc_Ugzyl55JI…)
Comment
@vallab19 Requiring AI safety/alignment is a pre-emptive move to avoid our extinction. People are racing ahead to create an autonomous intelligence that is vastly smarter, faster and more capable than all humans. If they succeed and it isn't completely aligned the future belongs to it/them and not us. Everything changes is we are no longer the smartest and most powerful species. And it is almost certainly a bad long-term outcome for us.
youtube · AI Governance · 2023-10-16T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgwC20gm87M2ffer2wV4AaABAg.9schZe6X9F49tB4uFYN8V_","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgwC20gm87M2ffer2wV4AaABAg.9schZe6X9F49vwv6ORNp2T","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgzSYTH9ZRvLgUJPgJR4AaABAg.AJaFFGNPHsmAOT6itBK3en","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwFpx4PE6BN2q36PFd4AaABAg.AF4yudZ4VJuAF4zKPwfrNJ","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytr_UgyUYCaKwKXvbhGycU94AaABAg.A08XRE6V5liA3mOq-kFZaz","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwYYoEJd9ZVJkrvUat4AaABAg.9qb7mMaOuAJ9wRfmRCJFKv","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwCVPuMzQynbXjFn414AaABAg.9PZcvAD7ibv9aMSR9pI-Y0","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugz6ZJ5MDNRfyPUceGR4AaABAg.9txx8u1eIxN9ty2pntlQCt","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgyH4gKJoPEKl5rYUc94AaABAg.ANJhDEUFpfwAT9nGNGgTOI","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UgwpWC8BgqdL7UL9OP54AaABAg.ANGjclFHK9qANINZE17pjD","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
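A batch response in this shape can be validated mechanically before the codes are stored. Below is a minimal sketch in Python; the value vocabularies in `SCHEMA` are assumptions inferred only from the sample records shown above, and the real codebook may define more categories. The `raw` string is a hypothetical one-record example, not an actual response.

```python
import json

# Allowed values per coding dimension (assumed from the samples shown here;
# the actual codebook may include additional categories).
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "approval", "outrage", "indifference",
                "resignation", "mixed"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM batch response and check each record against SCHEMA."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {value!r}")
    return records

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytr_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability",'
       '"emotion":"fear"}]')
print(validate_batch(raw)[0]["emotion"])  # fear
```

Failing fast on an out-of-vocabulary value catches the most common LLM coding error, a label outside the codebook, before it reaches the "Coding Result" table.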