Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "@sorituanasution1180 Sam is. Who else? He doesn't know how AI works and is tryin…" (ytr_UgyWKqHUz…)
- "I've seen a video on youtube about how to create a whole coloring book using can…" (ytc_Ugyh1_LX_…)
- "but would it be ethical to make a Substance P algorithm for AIs? probably not. …" (ytr_Ugx275bEq…)
- "the fact that AI is gonna effectively replace at least half of all jobs currentl…" (ytc_UgwCJITD6…)
- "girl but that's the whole point of automation ?? in every industry y who _wants_…" (ytc_Ugx_ejL7t…)
- "I believe mechanical and maintenance jobs that specialize in automation will thr…" (ytc_UgwszE2ld…)
- "There is alot of Confidence Conning going on with the Big 5 Tech companies and m…" (rdc_n5h5uv9)
- "@andrearamos8780 it's written in human DNA to be frightened of something that's…" (ytr_Ugy_Aur8m…)
Comment
You'll never convince me that AI will ever have any form of "Sentience." The real danger of AI is simply what Humans program it to do, Not AI itself.
Most of the examples given here is as you presented, that it uses human speech to garner "connections." It's just a program that uses this, nothing more.
youtube · AI Governance · 2023-07-07T22:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyZon6b-Q1NHCYcLPN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxyDEVVYS7ZtiTgeSF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz94wP5JGrChj8-IVF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzHCeBdpIebh4Gj_ax4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxHt1YvcljuNrMfbsx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzWpoHBZNGsfX6pMBJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyCdkBkoUy6qGCf55t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx7cl44rk2dykDc5F14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzhI6-qnbT5uhCXDUF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwvqezGa-lnuerAps94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
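The raw response above is a JSON array with one record per coded comment. A minimal sketch of parsing and validating such a response, assuming the field names from the table above; the allowed value sets are inferred only from the sample output shown here, not from an authoritative schema:

```python
import json

# Allowed values per dimension -- inferred from the sample records
# above, not an authoritative codebook.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "indifference", "mixed", "resignation"},
}

def validate_response(raw: str) -> list:
    """Parse a raw LLM response and check each record's dimensions."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError("record missing comment id")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {value!r}")
    return records

# Hypothetical record, for illustration only.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
print(len(validate_response(raw)))  # → 1
```

Rejecting any value outside the fixed sets catches the common failure mode where the model invents a new label instead of choosing from the codebook.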