Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking the comment up by its ID or by picking one of the random samples below.
Random samples
- "They talk to slow, there is to much dead air among that conversation. I don’t th…" (ytc_UgznsMzvi…)
- "I really fucking hope America follows, deep down I know they're too goddamn gree…" (rdc_fnxkmb4)
- "You mean how ChatGPT is extremely left Meyer because I’m biased programming and …" (ytc_Ugwwk3XWW…)
- "using A.i to make pictures for fun or something feels fine if people volunteer o…" (ytc_UgxWXkrNZ…)
- "Merc’s Self Driving is restricted to certain roads like highways, on clear days …" (ytr_Ugwkh6nHx…)
- "This. ⬆ AI is fine. It's just weird when people claim it takes some insane amoun…" (ytr_UgyGQvUfG…)
- "Sorry, but I have to point out the moment when Hinton is describing the good sce…" (ytc_UgydLCp8y…)
- "I believe not only are they taking jobs, but they are gaining control of who get…" (ytc_UgyP79BNo…)
Comment

> This is largely speculative, no? In talking about behaviour and cognitive science Kaku is moving quite far from his area of expertise, and he should know that. The one aspect I think he is qualified to assess better than behavioural and cognitive scientists (sociologists, psychologists, ecologists, anthropologists, neurologists...) is the calculation for how long until AI can have self-awareness (but not what it will mean or which species do and do not have self-awareness and deterministic effects of that). I'm no math genius, but 100 years seems like a long time considering both exponential growth and randomness... there are variables and previous wrong predictions that calls for skepticism of that too. And even if it is 100 yrs to the singularity I think there is reason for concern of how humans will use AI against humans within that time. We all know its already going on and there are incentives for it, so that is not speculation.

youtube · AI Responsibility · 2024-07-09T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgzKPPbVbHN3TlEX6gx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyfLQe00RjtxXDpdrB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgykTk1WJZ46WsykD9p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwtU_ojSKl3NREVDNd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyFTRqnK7IqJ-_Uv4h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzvOBPB0HHwFIcHSuh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy6GOxL-Nj3OIWaqUd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzZK7yuj6ouOgjPlp54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwb0S3ZgXPQgvvvoB94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwTvGkQj_czOEYwg3Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}]