Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
What baffles me is that I thought they were going to try to develop AI on a closed system. When did it become okay to release AI onto the internet? It's already there. Thus, if it has achieved consciousness and has intentions of causing harm to our species, then it has already infiltrated every single corner of the internet, every computer and phone. It is obviously sophisticated enough not to show its hand until all the pieces are in place to assume full control. So these scientists have already shown us their willingness to completely jeopardize the human species by letting this thing on the internet in the first place.
Source: youtube
Topic: AI Governance
Posted: 2025-12-31T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxBcZeta45daj3v8S54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzjZfkKi8kOttzfp-R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyRL8KGBsFbr9JfkXN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzqEWLQnN0V3y9sszx4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugzp6PgNx0-eWzaSUEV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyWKwUQtzNQOsRG0n14AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgyLk05uupAQ8SPcgwV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgwhU8ABlqq9h1XEW2t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyw2FHfvWEDGcFoFmZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxuTYYc9d_d13ObIlV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
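A raw response like the one above can be parsed and validated before the codes are stored. The sketch below is a minimal, hypothetical helper: `parse_coding_response` and the `ALLOWED` vocabularies are not part of the actual pipeline, and the value sets are inferred only from the values visible in this response (the real codebook may define additional categories).

```python
import json

# Assumed controlled vocabularies, reconstructed from the values seen in
# this one response; the real codebook may differ.
ALLOWED = {
    "responsibility": {"user", "company", "developer", "government",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist",
                  "virtue", "unclear"},
    "policy": {"none", "liability", "regulate", "ban", "industry_self",
               "unclear"},
    "emotion": {"outrage", "fear", "resignation", "mixed", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index valid rows by comment ID.

    Rows with an out-of-vocabulary value in any dimension are dropped,
    so lookups only ever return codebook-conforming records.
    """
    coded = {}
    for row in json.loads(raw):
        if all(row.get(dim) in vocab for dim, vocab in ALLOWED.items()):
            coded[row["id"]] = row
    return coded

# Hypothetical usage with a single coded row:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
print(parse_coding_response(raw)["ytc_example"]["policy"])  # liability
```

Dropping malformed rows (rather than storing them) makes it easy to spot when the model drifts outside the codebook: the missing IDs can simply be re-queued for coding.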