Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Random samples
- "Its like during cold war both sides were making more and more powerful nukes T…" (`ytc_UgzUWwrqD…`)
- "Why would I pay for anything if I can do anything with Ai? Entrepreneurs will ha…" (`ytc_Ugw9FjP1-…`)
- "So the A.I. killed a human, in self defense, and tried to get away with it, to …" (`ytc_UgyFtIOUQ…`)
- "@adrenobear The precedent you're referring to is about cctv. Do you know what t…" (`ytr_UgwPQgMU8…`)
- "if you know how making a robot works then you know that's like asking if a car e…" (`ytc_UggcVwGpN…`)
- "A Brazilian well known actress deep fake was made with one of the leaders of the…" (`ytc_UgzeliiXt…`)
- "MashaAllah sister. May Allah reward you for your courage standing for truth.. Al…" (`ytc_UgxUdP4oe…`)
- "Is anyone else here not concerned at all because you have a trades job? Good luc…" (`ytc_UgxSqyUse…`)
Comment
yeah nevermind the prospect of singularity, or complex hijacking potential of the technology, and current chatbots' ability already to develop their own language at rapid pace. this poor fellow seems to think the ideals/intentions of the inventors will remain static relative to the utilization/ability of AI, merely waiting for technological advancement to catch up for intended use. sort of like the naiveté of companies creating cars reliant on computers/integrated with internet access, when mass security breeches and major website hacks are still enough a handful.
youtube · AI Moral Status · 2017-07-30T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxyVG4yFA_H9FFXxTR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwqvcZo0h-IRPrWsyJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxx7OqpsyohyDuc2hB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwoOnjX9XHS6zdKt254AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwRLR9c4CHzuizKVy54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwgjqoNSvhJdsY166d4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzvVKV9Vh6PF2X9ypF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgiSPSbAKBMaSngCoAEC","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghtO0zPzER0O3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UghNXdUApwhS_XgCoAEC","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
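A batch response in this shape can be parsed and keyed by comment ID, which is what the lookup-by-ID view above amounts to. This is a minimal sketch, not the tool's actual implementation; the `raw` variable and `index_by_id` helper are illustrative names, and the abbreviated two-record payload below mirrors the real response format:

```python
import json

# An abbreviated batch response in the same shape as the raw output above.
raw = """[
  {"id": "ytc_UghtO0zPzER0O3gCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgiSPSbAKBMaSngCoAEC", "responsibility": "company",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw batch response and key each coding record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw)
print(codes["ytc_UghtO0zPzER0O3gCoAEC"]["policy"])  # -> regulate
```

Keying on the `id` field makes retrieval of any coded comment O(1), which matters once the response covers thousands of comments rather than a batch of ten.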