Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgyGcQCpU…: "The communications they are releasing are made up rubbish, someone has had an is…"
- rdc_eueqoif: "In the US here. Sometimes I wish there was a solar roof over the parking lots at…"
- ytc_UgxoZfYo1…: "Autopilot is not self driving. Autopilot is the level 2 driving assistant softwa…"
- ytc_UgzjRjYVW…: "My problem with AI as a tool argument is that it's so corny / If you play any ra…"
- ytc_UgzbbxzGy…: "If anyone is telling you AI is conscious, they are either ignorant of how machin…"
- ytr_UgxCynoDO…: "What about using AI to create quality shoots? There's a tremendous gap. You sti…"
- ytc_UgztJc_BM…: "17:20 something me and my friend tended to do a lot, is if we had a conversatio…"
- ytc_UgxNoddeN…: "Ai does but it doesn't understand / It makes based on 1000 of pictures but doesn't…"
Comment
Some people I talk about this tell me to get my head out of sci-fi movies and back to reality. But really, those complacent people stuck in the 80's, 90's and even 2000's are the ones that need a reality check. AI is doing what was science fiction just 5 years ago. AI and deep fake alone could easily start problems that may even turn violent. You also forgot to mention NVIDIA, that presentation by Jensen alone will give you pause as they have advanced performance a million times in 5 years. Once AIs are powered by those things in data centers I really hope safety measures will have been in place already.
youtube · AI Governance · 2023-07-07T06:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxGGEAKQNNdTYvXkoJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwfDvnIVaJlNVXUnKh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzwgpbzA87Oo5YBoiZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgylGkpxw_7bAB7X3Td4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxgqZObDnutqxgQ9DR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy1GaTg6mnUUStSSyp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyfsSAZw6h3iGJNGOR4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxm23pc7rPGiIpuVyJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyooFoavHf-YYxj5DB4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgytJM4rhafqo_LBeVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
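The "look up by comment ID" step amounts to parsing the model's JSON array and indexing it by the `id` field. A minimal Python sketch, assuming only the response shape shown above (the variable names and the two sample rows are illustrative, not part of the tool):

```python
import json

# Hypothetical raw model output in the shape shown above:
# a JSON array of per-comment coding objects.
raw_response = """
[
  {"id": "ytc_UgylGkpxw_7bAB7X3Td4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugy1GaTg6mnUUStSSyp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

# Index the batch by comment ID so one coded comment can be retrieved directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_UgylGkpxw_7bAB7X3Td4AaABAg"]
print(row["emotion"])  # fear
print(row["policy"])   # unclear
```

Because a model may return malformed JSON, a real pipeline would wrap `json.loads` in a `try/except json.JSONDecodeError` and flag the batch for re-coding instead of crashing.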