Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- I know that in the US one person doesn't have the authority to launch a nuke. It… (`rdc_k8wnm1t`)
- @sethtenrec The fact that they let this buggy software out into the wild, reckle… (`ytr_UgyECy1Jr…`)
- What would happen to people like me who research the creation of AI engines as a… (`ytc_UgxiVymUX…`)
- The problem i have with ai is that i don't really care by whom was the thing i e… (`ytc_Ugw7PkT2x…`)
- Somebody should explain that lazy git that AI is all Artificial and zero Intelli… (`ytc_UgzJjrolD…`)
- There are many things in motion to reduce the world population so that by the ti… (`ytc_Ugz6pRthT…`)
- see, this would all be avoided if the people in charge of the AI data sets remov… (`ytc_UgzRL83X-…`)
- Is this 11 hrs of course is enough for artificial intelligence or should I learn… (`ytc_Ugy1qkRox…`)
Comment
Think about it; currently, computers are exceptional and better at any task you give them, as long as you're specific enough in what it needs to do.
Now imagine that the task given is made so incredibly specific, by machine learning, that the AI will do it to perfection.
Let's say the given task is "Expand knowledge beyond human comprehension".
At this point, humans are in the way.
Source: youtube · AI Governance · 2023-04-18T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugzrtw-kz77sfqpj-JB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwicHro6f_aW96ycft4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxnEiG5HqBj01pViIB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyWvFYr7kULehHaPm14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz8M4dFx9_XyzhIYxx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwWZWaEV1-Hs32qfZ14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxFhM9-sZYB0lK6ACx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwfgDYe5iGVJChqibx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgylRRQfumZvR8tYvnR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxRQ2zebq6VYZK6IQ54AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"approval"}
]
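A response like the one above has to be parsed and checked before the codes are stored, since the model can return malformed JSON or out-of-vocabulary values. The sketch below shows one way to do that, assuming the category vocabularies inferred from the codes visible in this dump (the real codebook may define more values); the function name `validate_codes` is illustrative, not part of any actual pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the codes visible in
# this dump. ASSUMPTION: the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"none", "government", "developer", "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "mixed", "indifference", "outrage", "resignation", "fear"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows.

    A row is kept when it carries a comment ID and every coding
    dimension holds a value from the known vocabulary.
    """
    rows = json.loads(raw)  # raises ValueError on malformed JSON
    valid = []
    for row in rows:
        if not row.get("id"):
            continue  # every code must point back at a comment ID
        if all(row.get(dim) in allowed for dim, allowed in ALLOWED.items()):
            valid.append(row)
    return valid
```

Rows that fail validation would keep their comment ID unqueued, so the corresponding comments can be re-sent to the model in a later batch rather than silently dropped.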