Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
This is a weak response because even if it was determined 100% by talent that st…
ytc_Ugz2GSMYk…
AI and robots can't come fast enough. Hire those who come into the office and t…
ytc_UgzXJYk0d…
@Bloody_wisteria yeah right, you're acting like AI is perfect and all.
Let me te…
ytr_UgyF6UWa7…
AI isn't creative, it's just reposting ideas from creative humans who had their …
ytc_UgzF2ULMY…
This is pretty much the same as Skynet. The government and Microsot, OpenAI and…
ytc_Ugz1knuLb…
A huge part of the reason AI researchers describe AI as a threat to humanity is …
ytc_UgwRiUKA5…
This video unintentionally shows the connection between people who use AI and pe…
ytc_UgxwPy12C…
Humans brought AI to do what they leanrt to do in last 100 years and now humans …
ytc_UgylY-9GP…
Comment
I think the interesting thing to note here is, that without power - AI dies. At this current point in life, without humans, power would remain on for 1-2weeks. Even solar stations would need human maintaince to remain operating. AI needs power to survive, human's do not. AI needs human's to survive, we do not need AI to survive. We are the alpha (for now). As long as we put a big regulation on AI gaining nuclear access, and let's maybe stop with creating AI robots. We win
youtube
AI Governance
2023-07-08T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugx6isM_B8cyb_NYC_B4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxaHawdUVdc4BrGHY94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx8gEnsFkQbZh68Rnh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxxsP_J7R9-JxME_Ad4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyac_H4QpRIUrQ2yMN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzIP77QazsKjRfrn-Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxq73mWLm6h5JOVzxp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugye7dH9Qc8aCjL6-014AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzLkJrOMKG6I2M-i0h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwTP6jOZCyqJnBh6qB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
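The raw response above is a JSON array of per-comment codes. A minimal sketch of how such a response might be parsed and validated before loading into the coding table; the allowed value sets below are inferred from the samples shown on this page and are assumptions, not a full codebook:

```python
import json

# Allowed values per dimension -- inferred from the samples on this page
# (assumption: the real codebook may define additional values).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "user", "distributed"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "mixed", "approval", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        # Every record needs an id plus all four coding dimensions.
        missing = {"id", *ALLOWED} - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing fields {missing}")
        # Reject values outside the known code sets so bad LLM output
        # fails loudly instead of silently entering the dataset.
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec[dim]!r}")
    return records

# Example with a single (hypothetical) record:
raw = ('[{"id":"ytc_X","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"}]')
print(len(parse_coding_response(raw)))  # → 1
```

Validating against a closed value set is what lets a "Coded at" timestamp be trusted: a record only reaches the result table once every dimension holds a recognized code.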