Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a response by comment ID, or pick one of the random samples below.
Random samples — click to inspect

- "I believe AI makes equivalent of background noise. It is like collages of magazi…" (ytr_UgwEEmeOA…)
- "Remember guys, the machine learning (pls stop calling it "AI", it's not) had fas…" (ytc_Ugwp_tdaS…)
- "Now, we know for sure that the comet from maximum overdrive was spreading ai not…" (ytc_UgyaerA8z…)
- "I sincerely hope AI won’t replace anyone! People have the right to have a career…" (ytc_UgznttYdp…)
- "Except it isn't a database and none of the art images are ever stored in the AI …" (ytr_Ugz8XspeP…)
- "This video nails it! With AI changing so fast, I’ve found Rumora to be invaluabl…" (ytc_UgyHuJq7W…)
- "I know when the shit hits the fan, Im gonna side with the robot revolution!…" (ytc_UghCvNhEH…)
- "In a 2024 interview with The Algemeiner, Luckey described himself as a "radical …" (ytc_UgxXr-oLI…)
Comment

> You use your brain potential to lift your body up when you wake up. No autonomous robot could possibly, with available capacity, be capable of flying an aircraft as good as human does. Coming to the analogy of it being a wasp, it wouldn't be capable of keeping up with it either. Is it possible in the future? If we survive as a specie long enough - of course! I just think it wouldn't be the best idea to give a weapon to the first robotic singularity computer - prone to glitches AND missjudgment.

Platform: youtube
Posted: 2012-11-23T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugx6FCmr9EFaaoya2Dt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNYuH_o45JNtq5kbR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWO5tHzZYAFti4m7t4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy6eCeFDqQBTDRzcWN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxcvhfuGmb7J6atofB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwh8Muyzn7IBVPdVbp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw6OKXZ62XgY3zKeyt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw_Q2g2u3ixsQEiK_54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz9V7TYg0BzPMRq6ut4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyZPBybJxcQnaHiRxZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}
]
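A raw response like the one above can be checked programmatically before the codes are accepted. The sketch below parses the JSON array and flags any entry whose dimension value falls outside the sets observed in this section. It is a minimal illustration, not the project's actual pipeline: `ALLOWED` is assumed from the examples shown here and is likely not the full codebook, and `validate_codes` is a hypothetical helper name.

```python
import json

# Value sets assumed from the examples in this section; the real
# codebook may define more categories per dimension.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "government"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "indifference", "fear", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments)
    and return the entries that carry out-of-set dimension values."""
    invalid = []
    for entry in json.loads(raw):
        bad = {dim: entry.get(dim)
               for dim, ok in ALLOWED.items()
               if entry.get(dim) not in ok}
        if bad:
            invalid.append({"id": entry.get("id"), "bad": bad})
    return invalid

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"fear"}]')
print(validate_codes(raw))  # prints [] — all values are in the allowed sets
```

Entries returned by `validate_codes` could then be routed back for re-coding rather than written into the results table.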