Report it. (New account, blog spam, funky domain, poorly configured server, etc.)
Just call it X-Twitter. It ain’t twitter any more, and “X” is just dumb as fuck.
I remember they finally were able to make a ball point pen all by themselves in 2017.
When I actually start seeing products that aren’t contaminated with fake ICs or are actually grounded properly without hyper-strict foreign supervision, I’ll change my tune. Until then, there isn’t an article in the world that will convince me that China is actually innovating or taking steps to make quality products.
What the hell are you popping off about? All you need to do is go to Amazon or AliExpress to see the absolute plethora of Chinese product clones. Just search for 3D printers, as a good example.
Yeah, I guess I am a bit prejudiced against the bulk Chinese electronic garbage that is usually an extreme fire hazard. Maybe it’s all the fake or counterfeit chips I have had to replace that have pissed me off.
Look bub. I don’t give a flying fuck about China or the politics involved. Their manufacturing sucks ass and actual innovation is rare. If they copied products correctly, I would have a bit more respect for their business model.
If you want to go all-out tankie on people, maybe you should go back to your own instance.
That’s great and all, but you missed the theme of this thread.
China has been the king of bulk products for years. Saying that recent investments will alter world politics in 25 years is a bit strange. Saturating markets is what China does best.
There is a bit more history behind TSMC. You left out the bits where they partnered with other companies, like Philips, that gave them access to proprietary information. They continued building relationships with other large companies and investing back into their own business.
China isn’t doing that. China has had access to older fab equipment for years but still fails to truly innovate. If US companies could trust China enough not to steal modern tech, there could be some real benefits to having fabs in China. The world kinda figured out years ago never to send proprietary information to China. Companies still do, and it doesn’t take long for a thousand clones to pop up on AliExpress shortly after.
Trains are one thing, but modern chip fab is a completely different beast. Buying older equipment is not going to get them anywhere but into the production of chips that have been on the market for 10 years already.
This is one industry where each generation has hard limits for manufacturing.
But they are buying mature-node equipment, says the article. That doesn’t mean shit other than more cloning and counterfeiting.
Future chips not affected by THIS cpu bug yet.
It’s just getting filled up with love.
Same. I support AI completely as a tool to solve specific problems and that is about it. What is really cool is that AI libraries and such got a massive boost of needed development so plebs like me can code simple ANN apps in Python with little skill. Documentation has improved 100x and hardware support is fairly good.
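To show how little code a simple ANN takes these days, here’s a toy net in plain numpy learning XOR. It’s just a sketch; the layer size, learning rate, and iteration count are my own arbitrary picks, not from any particular library or tutorial.

```python
import numpy as np

# Toy 2-8-1 network learning XOR with plain numpy -- just to show how
# little code a simple ANN needs now. Hyperparameters are arbitrary picks.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)       # hidden layer
    return h, sigmoid(h @ W2 + b2)  # output layer

_, out = forward(X)
initial_loss = np.mean((out - y) ** 2)

lr = 0.5
for _ in range(5000):
    h, out = forward(X)
    # backprop through the MSE loss and both sigmoid layers
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

_, out = forward(X)
final_loss = np.mean((out - y) ** 2)
```

Obviously a real app would lean on one of the well-documented frameworks instead of hand-rolling backprop, but that’s kinda the point: even the from-scratch version fits in a comment.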
LinkedIn seems to be an interesting indicator of where tech is in its hype cycle. I guess LinkedIn went from 100% AI-awesome-everything about 2 months ago to almost zero posts and ads about it. I suppose most of the vaporware AI products are imploding now…
Of course, algorithmic feeds are a thing, so your experience might be different.
FYI, you can download your photos in bulk with Google Takeout, but you need to have enough space in Google Drive to do it. (Takeout zips up all your photos and will drop 10GB chunks in Drive.)
I was doing something similar to you recently. I downloaded all my photos and de-duped by generating MD5 hashes for all the pictures that were downloaded. (I was moving all of my photos to a local NAS, so it wasn’t quite what you are doing.)
If your dups have consistent MD5 hashes, that might work for you but it’s hard to say.
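For what it’s worth, the hash-based de-dupe I did boils down to something like this sketch: walk a directory, hash every file, and flag any hash that shows up more than once. (The `find_duplicates` name and the directory layout are just my illustration, not from any tool.)

```python
import hashlib
from pathlib import Path
from collections import defaultdict

def find_duplicates(photo_dir):
    """Group files under photo_dir by MD5 hash.

    Any group with more than one path is a set of byte-identical duplicates.
    """
    by_hash = defaultdict(list)
    for path in Path(photo_dir).rglob("*"):
        if path.is_file():
            digest = hashlib.md5(path.read_bytes()).hexdigest()
            by_hash[digest].append(path)
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}
```

The catch is the same one I hit: this only finds byte-identical copies. If a service re-compressed or stripped metadata from your photos on the way down, the hashes won’t match even though the pictures look the same.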
Experts say that is not possible.
Spoiler tags are broken for me. Strange.
I feel better knowing that Chinese military partners get to feel the joy as well. (Curbs can be a bitch sometimes…)
I am going to need your 50 point summary of those obvious points in the longest form possible by this afternoon so I can be completely convinced that I have already made up my mind in the correct way. Thanks.
Why not? Those CPUs got perfect scores on Red Star OS.
It was on old 3.5" drives a long time ago, before anything fancy was ever built into the drives. It was in a seriously rough working environment anyway, so we saw a lot of failed drives. If strange experiments (mainly for lulz) didn’t get the things working, the next option was to see if a sledgehammer would fix the problem. Funny thing… that never worked either.
I used to take failed drives while they were powered on and kinda snap them around with a really fast twisting motion in an attempt to get the arm to move or get the platters spinning.
It never worked.
I am curious as to why they would offload any AI tasks to another chip? I just did a super quick search for upscaling models on GitHub (https://github.com/marcan/cl-waifu2x/tree/master/models) and they are tiny as far as AI models go.
It’s the rendering bit that takes all the complex maths, and if that is reduced, that would leave plenty of room for running a baby AI. Granted, the method I linked to was only doing 29k pixels per second, but they said they weren’t GPU optimized. (FSR4 is going to be fully GPU optimized, I am sure of it.)
If the rendered image is only 85% of a 4K image, that leaves ~1.2 million pixels that need to be computed, and it still seems plausible to keep everything on the GPU.
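Quick sanity check on that arithmetic, assuming “85%” means 85% of the frame’s pixel count is rendered natively (my reading, not anything FSR4 has stated):

```python
# Back-of-envelope: if 85% of a 4K frame's pixels are rendered natively,
# how many are left for the upscaler to fill in?
width, height = 3840, 2160      # 4K UHD
total = width * height          # 8,294,400 pixels
rendered = int(total * 0.85)
remaining = total - rendered
print(remaining)                # 1244160 -- about 1.2 million pixels
```

Note that if “85%” instead meant 85% resolution per axis, the gap would be closer to 2.3 million pixels, but either way it’s a small fraction of the frame.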
With all of that blurted out, is FSR4 AI going to be offloaded to something else? It seems like there would be significant technical challenges in creating another data bus that would also have to sync with memory and the GPU for offloading AI compute at speeds that didn’t risk creating additional lag. (I am just hypothesizing, btw.)