Anything that is Turing-complete & has enough RAM can emulate x86, and an x86 emulator can boot Linux.
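The principle is that emulation is just interpretation: a fetch-decode-execute loop over another machine's state. A minimal sketch with a made-up toy register machine (not real x86, but the idea scales given enough memory and patience):

```python
def run(program, regs=None):
    """Interpret a toy ISA: each instruction is (opcode, a, b)."""
    regs = regs or [0, 0, 0, 0]
    pc = 0
    while pc < len(program):
        op, a, b = program[pc]
        if op == "mov":     # regs[a] = immediate b
            regs[a] = b
        elif op == "add":   # regs[a] += regs[b]
            regs[a] += regs[b]
        elif op == "jnz":   # jump to b if regs[a] != 0
            pc = b - 1 if regs[a] else pc
        pc += 1
    return regs

# compute 5 + 7 into register 0
print(run([("mov", 0, 5), ("mov", 1, 7), ("add", 0, 1)]))  # [12, 7, 0, 0]
```

A real x86 emulator is the same loop with a vastly bigger decoder, MMU, and device models bolted on, which is why it runs anywhere.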
This is just my personal opinion, but I don’t think words can be ‘owned’ like that. More than that, I don’t think ideas can be owned. Ownership as a concept is based on exclusivity, and words/ideas can be copied identically and infinitely.
Why do the leaders in AI know so little about it? Transformers are completely incapable of maintaining any internal state between calls, yet techbros somehow think one will magically develop it. Sometimes machine learning can be more of an art than a science, but they seem to think it’s alchemy. They think they’re drawing pentagrams out of acyclic graphs, but are really just summoning a mirror of their own stupidity.
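The statelessness point is easy to show concretely. A sketch in NumPy (toy weights, assumed dimensions, not any real model): a recurrent cell threads a hidden state through time, while self-attention is a pure function of its input window and carries nothing between calls.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # toy dimension for illustration
Wx, Wh = rng.standard_normal((d, d)), rng.standard_normal((d, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))

def rnn_step(x, h):
    # recurrent cell: the output depends on the carried state h
    return np.tanh(x @ Wx + h @ Wh)

def attention(X):
    # self-attention is a pure function of the input window X;
    # any "memory" must be re-fed as context on every call
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(d)
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    return (w / w.sum(axis=1, keepdims=True)) @ V

x = rng.standard_normal(d)
X = rng.standard_normal((3, d))

# Same RNN input, different carried state -> different output:
h1 = rnn_step(x, np.zeros(d))
h2 = rnn_step(x, np.ones(d))

# Same attention input -> identical output, every time:
y1, y2 = attention(X), attention(X)
```

KV caches don’t change this: they just memoize the pure function over a fixed context, they don’t give the model writable state.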
It’s really unfortunate, since they drown out all the news about novel and interesting methods of machine learning. KANs, DNCs, Mamba — they all show a lot of promise, but can’t get any recognition because transformers are the laziest and most dominant approach.
Honestly, I think we need another winter. All this hype is drowning out any decent research, so all we get are bogus tests and experiments that are irreproducible because they’re so expensive. It’s crazy how unscientific these ‘research’ organizations are. And OpenAI is being paid by Microsoft to basically jerk off Sam Altman. It’s plain shameful.
I refuse to believe the Python one ever happens. Unless you are importing libraries you don’t understand, and refuse to read the documentation for, I don’t see how a string could magically appear from numeric types.
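For what it’s worth, that matches how Python actually behaves: it raises rather than silently coercing between numbers and strings, so a stray string announces itself immediately. A quick sketch (the usual real-world source of stray strings is untyped I/O like `input()` or csv fields, not arithmetic):

```python
# Mixing int and str in arithmetic raises, it never coerces:
try:
    total = 1 + "2"
except TypeError as err:
    print(err)  # unsupported operand type(s) for +: 'int' and 'str'

# Strings that look numeric stay strings until you convert them:
row = ["widget", "3"]           # e.g. a field read from a csv file
assert row[1] * 2 == "33"       # sequence repetition, not math
assert int(row[1]) * 2 == 6     # explicit conversion does the math
```

So the bug class exists, but it comes from forgetting that I/O hands you strings, not from numeric types spontaneously turning into one.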