Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- You are crazy and clearly shouldnt Drive a car. Even 4 years later there is stil… (ytr_Ugzfw1Jw6…)
- "ML start to think likes human using AI"😂 ML and DL is used to built AI not othe… (ytc_Ugxb-EevP…)
- All of the above can easily be replaced by AI. And far more creatively and effic… (ytr_Ugz7i1S3E…)
- So what happens if Ai threatens every job field and person working in the near f… (ytr_Ugw46Ckie…)
- if ai users are "artists" then I might as well be fcking da vinchee bro… (ytc_UgwHVMCAA…)
- AI may not be intelligent/conscious but it's damned accurate and fast. It can do… (ytc_UgzYkbVVv…)
- "Cut! Hey Dave, lets try that again. Remember, these little robots are the anti-… (ytc_UgyXEuPBf…)
- In my opinion, AI should be programmed to choose an outcome that results in the … (ytc_UgwtnHm6-…)
Comment

It’s naive to think AI will stop at a HAL 9000 or JARVIS. They will always improve to the point where humans are impediments to further development (using resources the AI could use like electricity or land) and must be eliminated.

Source: youtube · AI Governance · 2025-06-20T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxnMcF3tZxVnVukWJV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxL1HGcuhK1mRwfYnV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwuxNWjK_m_ju-QWzt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxH7rDZxJaUKal-plh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzAo5ZTwqESTOWk1a54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"indifference"},
  {"id":"ytc_UgwU-Fg69PdFa9AS7zx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxzSQXuoF9zBcI1WgV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgydE-4e2tdZeRpmdzV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyzDQzSwlJXEhvJD_N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzBZl4Z926rj-6EIEh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
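A response like the one above can be parsed and indexed by comment ID to recover the four coding dimensions (responsibility, reasoning, policy, emotion) shown in the Coding Result table. The sketch below is a minimal, hypothetical example of that step; the `index_codings` helper and the two-item sample payload are assumptions, not part of the actual pipeline.

```python
import json

# Hypothetical two-item sample in the same shape as the raw LLM response above.
raw_response = """
[
  {"id":"ytc_UgxnMcF3tZxVnVukWJV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxL1HGcuhK1mRwfYnV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
"""

# The four coding dimensions plus the comment ID, matching the table above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a batched coding response and map comment ID -> coded dimensions."""
    items = json.loads(raw)
    coded = {}
    for item in items:
        missing = REQUIRED_KEYS - item.keys()
        if missing:
            # Reject malformed entries rather than silently storing partial codings.
            raise ValueError(f"entry {item.get('id', '?')} is missing keys: {missing}")
        coded[item["id"]] = {k: item[k] for k in REQUIRED_KEYS - {"id"}}
    return coded

codings = index_codings(raw_response)
print(codings["ytc_UgxnMcF3tZxVnVukWJV4AaABAg"]["emotion"])  # fear
```

Indexing by ID also makes it easy to detect comments the model skipped or coded twice, which is a common failure mode for batched structured-output prompts.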