

I gotta ask: would you consider humans intelligent?
I mean, the entire scientific method depends on the deterministic nature of the universe, which arguably makes everything that has ever happened follow "manually coded" instructions.
Just break it down logically:
Would you consider something capable of playing Minesweeper intelligent? (i.e., do you think it shows a higher level of understanding than pure random chance?)
Do you consider software running on a manufactured silicon chip natural or artificial?
It was actually the 50s, commonly attributed to John McCarthy.
Crazy it's been around for 70-odd years at this point.
The term is probably older than you.
We could also use a model or two trained on ethically sourced data.
Until then, it's pretty easy to argue all AI is unethical.
SIMD might be the term you're looking for (Single Instruction, Multiple Data).
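Roughly, SIMD means one instruction operating on several data lanes at once. A minimal sketch of the idea in plain Rust (the fixed-size array stands in for a vector register; real SIMD would do this loop as a single hardware vector-add via intrinsics or `std::simd`):

```rust
// Conceptual SIMD: apply one operation across multiple data lanes.
// The 4-element array plays the role of a 128-bit vector register.
fn add_lanes(a: [f32; 4], b: [f32; 4]) -> [f32; 4] {
    let mut out = [0.0; 4];
    for i in 0..4 {
        out[i] = a[i] + b[i]; // conceptually ONE vector-add instruction
    }
    out
}

fn main() {
    let sum = add_lanes([1.0, 2.0, 3.0, 4.0], [10.0, 20.0, 30.0, 40.0]);
    println!("{sum:?}"); // [11.0, 22.0, 33.0, 44.0]
}
```

Compilers will often auto-vectorise exactly this kind of loop into real SIMD instructions (SSE/AVX/NEON) on their own.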
Source: 🤷‍♂️ trust me bro
Dunno what else to tell ya, 'cause they are moving to open source, but hey, Google's free if you want to find out for yourself.
4% of the US alone is 12 million people.
If even 25% of them decide hardware purchases based on driver support, 3 million sales isn’t ignorable.
(The number of PCs sold globally per year is also around 300,000,000, so even then they'd lose out on 12 million potential sales YEARLY.)
The market is also pretty shit post-COVID, so I'm sure every hardware company is dying for a way to boost sales metrics.
With the Linux server market share and the recent AI boom, they'd have to be more than just blind, deaf, and dumb not to release Linux drivers.
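For what it's worth, the arithmetic above checks out. A quick back-of-the-envelope sketch (the population and sales figures are rough ballpark assumptions, not sourced data):

```rust
// Back-of-the-envelope check of the market-size numbers.
// All inputs are rough assumptions for illustration only.
fn main() {
    let us_population = 300_000_000_f64;   // rough US population
    let global_pc_sales = 300_000_000_f64; // rough PCs sold worldwide per year
    let linux_share = 0.04;                // ~4% desktop Linux share

    let us_linux_users = us_population * linux_share;        // 12,000,000
    let driver_sensitive = us_linux_users * 0.25;            // 3,000,000
    let global_linux_buyers = global_pc_sales * linux_share; // 12,000,000 / year

    println!("{us_linux_users} {driver_sensitive} {global_linux_buyers}");
    // 12000000 3000000 12000000
}
```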
Maybe this was true back in, like, the early 2000s?
Probably should have elaborated more in the original comment, but essentially: I'm not a professional, so the freedom of creating custom UIs, plus having standard types for things like 2D and 3D transforms, is worth it.
It also has a Python-esque language, a good built-in IDE, solid documentation, generic GPU access, and, most importantly for me, it's extremely cross-platform.
Mostly visualisations though, with Rust doing the actual legwork.
Mostly for visualisations, but having a standardised reference for 2D and 3D transforms has come in handy too.
Admittedly, visuals aside, Rust does most of the mathematical heavy lifting.
Edit to note: I'm not employed in data science, so I have a lot more wiggle room for things to go wrong.
I've had surprising luck with Godot for basic things, complementing it with Rust or OpenGL for higher performance.
Can't say I've done the full thing myself, 'cause I couldn't find an easy way to mount network drives (there was a lot of jerry-rigging going on), but I've gotten to a web UI before.
Fuck it, throw a 512 GB SD card in an old phone and run a full Jellyfin server in Termux.
I hate to say it, but technically collecting statistics is non-anonymous, identifiable tracking, especially in this age where there are so many datasets companies can correlate them against.
Could use a Waydroid container, then VLAN/DMZ/VPN it to hell, while still getting the usual Android experience.
Try Nix with flakes and drown in tears of joy.
Papers are, most importantly, documentation of exactly what procedure was performed and how; slapping a vagueness filter over that only tanks their value.
The real question is why we're using generative AI at all (it gets money out of idiot rich people).
Yeah