That Asus Zenfone was actually an incredible phone. Its performance was good and the camera quality was actually good too. The biggest problem was that ASUS didn't do enough software support, and it got sluggish over time due to the heavy skin on it. I absolutely loved the customization that phone had, though. Thanks for reminding me about that phone lol
What I can say about Intel, and the way Samsung is struggling right now, is this: never put someone from an accounting or finance department in as CEO of a high-tech company. Put in someone from R&D and development positions, people who know what the product needs for the company to succeed.
He lied to us. For so, so long. Intel decided to hide a critical hardware issue caused by a bad oxidation coating, and then lied and blamed the motherboard companies and anyone else for their 13th and 14th gen chips failing. They only came clean after being fully exposed by Gamers Nexus. So although I believed in their mission, they destroyed consumer trust.
@@CitAllHearItAll It wasn't just the oxidation: they released the i9 chips with an already insane 1.5V core voltage, and a tendency toward voltage spikes ended up breaking many i9s over months.
0:10 That is such an irrelevant stat. Let's say Intel had a loss of only $1 billion but AMD was also struggling and just about broke even with a profit of only $1 million. Headlines: "Intel has 1,000 times as big a loss as AMD made in profits!"
Indeed. Also doesn’t account for big investment and big write-offs that take a bite out of profitability. Would be much more useful to look at profit margins or market share.
It was bean-counters that destroyed Intel's innovation culture, like they destroyed Boeing's engineering culture. That sort of thing doesn't seem easy to reverse.
This is what I'm sensing as well. Intel decades ago had weird hiring practices where you'd get a sudden hiring freeze. As soon as they started to dominate AMD, there wasn't as much focus on innovation in the CPUs. Lean Manufacturing went too hard, too fast. I'm thinking they also got hit by the interest rate hikes used to combat inflation.
Correlates? Apple told Intel to make chips that did not overheat in the thin laptops Apple was designing. Intel said sure, but nothing happened when Apple expected it, in 2014, so Apple went ahead with the ARM project. Intel knew what Apple was asking for and couldn't deliver, or didn't care to, or a mix of both. Intel refused to face ARM, so ARM could keep progressing like this nonstop.
@ I honestly don't know what you are trying to convey. You are just stating what happened, which is exactly my point. Apple was looking forward and Intel couldn't keep up, hence the worst Macs Apple has made are from 2015-2020. 2015 wasn't bad, but it for sure got worse with time.
@@jakejoyride Even though TSMC does have a big role when it comes to implementing new technologies, Apple also needs to have a great design. Saying that Apple has no part in the success of the M-series chips is absurd. It's like saying that Intel didn't make good chips, it was the machines provided by ASML. Intel's "innovation" is just giving a chip more cores at a higher clock speed with more power, instead of making a more efficient chip. You are also comparing Lunar Lake to Apple's worst M-series chips. There is a reason why M3 Macs didn't do so well and why they got discontinued so fast. Intel is a company that is stuck in its high-school-football glory days. My hope is that they recover, only so that AMD and Apple don't lose sight of innovation just like Intel did.
Intel has so much sway, even if they are not competitive, Uncle Sam needs fabs on home soil, so it will get fabs on home soil. We will figure it out, or spend more until we do. It's one of the non negotiables, like our nuclear triad. It's also why Taiwan is so contested right now, because we don't have that ability at home, and we desperately need it to stay alive, literally. USA needs fabs in USA, and Intel will deliver, eventually.
Also, there is a rumor saying Intel fired a bunch of the engineers in charge of the firmware, the microcode. Since the chips couldn't really evolve, it was 12th gen but with more watts being pushed, with poor implementation, and there you had the result. It took them 2 years to fix the problem, and some say it is not really fixed, just patched.
@@shobuddy Not in this case, where it has had to be patched 4 times so far. That is not a fix; it was a way to calm the masses who did not understand that they should have applied for warranty, RMA, or a refund, because Intel screwed them.
I remember reading some investing articles from a few years ago (2021/2022 I think) saying Intel was one of the 10 largest companies in the world, not in terms of market cap but in terms of cash reserves, because of how much money they had after years of profitability. If Intel went bankrupt or insolvent, they'd have wasted an insanely large amount of money.
If I remember correctly, for some time, while AMD was more consumer-friendly and cheaper, Intel was more prestigious. It was noticeable that they were kind of the same, but Intel clearly behaved like a monopolist, making consumers buy a new motherboard with each generation, while AMD was often compatible with older generations. So the same chip kind of lowered its own value: you knew a new chip would mean a new motherboard, so even more cost on an already more expensive product if you wanted to upgrade.
Who could have foreseen that it could have negative consequences for Intel to knowingly sell CPUs for years that could oxidize and thus be irreparably damaged, and to try to weasel out of it in response? Apart from that, I would like to point out that without the IBM PC, Intel would have long been a forgotten IT company that produced legendarily bad CPUs in the 1980s.
Beg to differ. As a former telecom/cellular engineer who had the opportunity to design Intel, Hitachi, Motorola, and ARM microcontrollers ranging from 8-bit to 32/64-bit (8080/8051/HC05/HC11 - 80386/MC680x0 - arm64) into things ranging from pagers to large central office switches, my experience was different. I still remember our attempts in the late 80s/early 90s to switch to AMD/OKI/TI parts, only to be bitten by silicon bugs and reliability issues! Intel might not always have been the best, but they were not far from it most of the time!
IBM still exists, but it is not what IBM was known for; right now IBM is nothing, and Intel can't be another nothing in the market. IBM wants to push quantum computers, but that is not happening, not at the speed they expected. What will Intel do, be a fab for IBM? I bet they will survive, but not as a company that makes products you want to buy, rather as a company doing projects for someone else, like the IBM you mention, so everyone forgets them. Kinda sad, but Intel had it coming.
PowerPC is still around. IBM slipped up though, and it can be easy to slip up. IBM had the ability to leapfrog Intel, but Intel eventually pulled ahead again after a few years. This made PowerPC really good for something like a game console. IBM screwed up with the Cell processor. The Cell was ahead of its time, but more importantly, it took a hit when it came to general-purpose processing. It's an amazing DSP-style processor that's probably still used today, but we still need general-purpose processing. IBM wanted the Cell CPU in the Mac, and Apple didn't like it, which is a major factor in why they went with Intel. Intel's Core CPUs then were incredible on top of that. If the Cell CPU had been more balanced, it might have been in the Mac. In the PlayStation 3, the Cell was kind of used like a second GPU, but the problem is that it had to compete with a dedicated GPU (the Nvidia-designed RSX).
They can license CPU designs from AMD to make Ryzen-compatible CPUs which fit in the same motherboards. Intel would have something to sell and AMD gets free money.
Keep this forever at the forefront of your mind, and I speak as someone who currently prefers AMD Ryzen. If Intel dies, AMD gets to price processors exactly like NVIDIA. Terrifying and it CANNOT be allowed to happen!
Intel needed to be humbled, and I hope this was a good wake-up call for them and that they have started serious reforms within the company. I hope in a couple of years we will see affordable and good Intel chips like we saw with the introduction of Ryzen. I am also glad Intel is pushing on the GPU front. We need competition, and you can hate Intel all you want, but consumers benefit from multiple companies competing on level ground.
Well-made vid! Intel basically got too comfortable being the kind-of-untouchable king of CPUs and woke up way too late, so it still has plenty of catching up to do. It won't be easy for them, as competing in chip manufacturing with TSMC is extremely hard. Let's see where it all goes and if there's light at the end of the tunnel...
19:56 I doubt focusing on AI would have helped, considering everything is written with Nvidia's CUDA programming platform and Intel's alternative is completely different. Even AMD's is closer to it. Until recently they even helped the ZLUDA project, which emulates CUDA on non-Nvidia GPUs. The funniest part is that the project was initially made by an Intel employee to target an Intel API.
@@timothygibney159 ...CUDA is a software layer for working with Nvidia hardware. "Cloud based" just means that you're using someone else's desktop - you still have to use CUDA to interface with your accelerator hardware, no matter where the computer is.
@ You are uninformed. It's an operating system complete with its own APIs, services, networks, daemons, quantum computing, and a complete datacenter. Python functions call Lambda in AWS via APIs, while Azure does so with Python functions. Completely abstracted from your hardware. 12 years ago you were correct. We use AWS for RDBMS, web hosting, and DNS services, while only hosting 5 virtual machines, which they call instances. SaaS and API calls to AI are the cloud now, with enterprise AI hardware.
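To make the lock-in point in this exchange concrete, here is a minimal sketch of a CUDA kernel written from Python, assuming the numba package; the kernel and all names are illustrative, not from the video. Whether this runs locally or on a rented cloud instance, it compiles to Nvidia's PTX and needs an Nvidia GPU and driver, which is why code bases built on CUDA don't move easily to Intel's or AMD's stacks.

```python
# Minimal CUDA-from-Python sketch (assumes: pip install numba, plus an
# Nvidia GPU and CUDA driver). Illustrative only.
import numpy as np
from numba import cuda

@cuda.jit
def scale(out, x, factor):
    i = cuda.grid(1)              # global thread index on the GPU
    if i < x.size:                # guard against out-of-range threads
        out[i] = x[i] * factor

x = np.arange(1024, dtype=np.float32)
out = np.zeros_like(x)
threads = 256
blocks = (x.size + threads - 1) // threads
scale[blocks, threads](out, x, 2.0)   # kernel launch: CUDA-specific syntax
print(out[:4])                        # [0. 2. 4. 6.]
```

Porting this to Intel hardware would mean rewriting it against a different programming model (SYCL/oneAPI), which is the barrier described above.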
That is actually a good strategy. By the looks of things the US will do everything in its power to cut China off from the semiconductor supply chain, and when they push too far, China, the largest consumer of these products, will push back and has the capability to take out TSMC. They may even be able to develop their own chips on par with TSMC in 10 years or less. So it's best to diversify and have the option to make things in-house, because the US is not the global hegemon anymore and it's no longer free rein for US companies.
Not gonna happen... chip making isn't like automotive, where it's primarily an integration problem... chip making is a technical problem that's constantly evolving. TSMC will tweak their internal fab process every few weeks to improve yield; sometimes the only way their customer finds out is through their own acceptance testing. Also, a company like TSMC is really, really compartmentalized to avoid copy-cats/poaching from other companies, so only very few people are aware of the entire process, and they lack the hands-on experience (more like they haven't done hands-on work for a decade or more) to know the nitty-gritty details. The only way I can see Nvidia pulling it off is to buy a company like GlobalFoundries (which famously failed at making EUV work) and invest hundreds of billions to catch up in 10-15 years' time.
@@wopmf4345FxFDxdGaa20 Everyone except China can buy EUV machines from ASML. It's not a matter of just buying a machine, else Intel wouldn't be so far behind. The entire manufacturing process is extremely complex and requires specialized knowledge to make things work.
The competition right now is AMD versus the entire ARM world. x86 can hold on for some time, then AMD will jump to ARM, I bet. Intel might survive, but only as a foundry, it seems. Time will tell; Qualcomm might buy just some fabs, others might remain under Intel's control. They are not saying much now, so...
Intel missed out on Atom. Intel missed out on the mobile market. Intel missed out on CUDA. Intel missed out on AI. Intel lost the server market. Intel refused to acknowledge AMD's Ryzen, quote: "Ryzen is just two dies glued together." Then Intel started losing the desktop market. Intel joined the GPU market, but Arc was a bomb at release. It's like they are purposefully missing every single market.
No matter whether you think Gelsinger was right or not, him leaving is almost guaranteed to be a bad move. Intel is already halfway through an incredibly large plan, and whoever runs the company next is almost guaranteed to back out of it and will probably not commit to anything else instead, further exacerbating this decline.
5:54 And this is where Intel messed up. Instead of killing off XScale for Atom they should have done both!!! Aka continued ARM CPU development for phones/PDAs (still relevant then)/etc... AND developed a small x86 core for more power-constrained devices. 🤷
I dunno how they survive, tbh. Their reputation in the CPU space was demolished by the i9 degradation issues and their denials of them; the Arc GPU line failed, and the announced successors are delayed or cancelled and either way do not challenge AMD or Nvidia. I think overstretching themselves is bad, and they really need to focus on simplifying and doing one thing, but better than anyone else.
AMD was in a worse place internally, but Intel is in a worse market (Qualcomm wants to buy them; Nvidia and AMD could also buy them) and we are going into a GPU world while Intel is still CPU-focused... And the worst part is that they seem to only be digging themselves further into a hole, whereas AMD actually had a sensible way out.
Let's not forget the foundations Intel helped create that propelled an entire industry. Intel chips have been the standard for a long time. Very reliable 99.9 percent of the time. I'm an Intel fan and will continue to support them.
They certainly did have very reliable chips and were known for it for a long time. Which makes it so much worse that they managed to fumble RPL (Raptor Lake) so badly now, right when they've already been on the struggle bus. CPUs just outright degrading themselves to death is essentially unheard of. That reputation for reliability was one of maybe two giant legs they still had to stand on, the other being the momentum of having been entrenched as the go-to standard for years in many areas like servers, OEMs, or simply as the platform software was written for. That other leg is being chipped away by AMD right now. Unless they really get something akin to a Ryzen moment soon, these things are not going to keep them above water for much longer.
While I'm a lifelong Team Red since my first K6 rig in the 90s, after having seen Intel biff it this hard, with how their CPUs have struggled to keep up with AMD's in performance-per-dollar ever since Ryzen caught them off guard at launch, I feel bad now and want Intel to succeed. That being said, let it be known that I have faith. These new fabs they're building stateside are the ticket. I just hope they can afford to hold out long enough to realize the gainz that having fabs at home will bring. If they can bust out a rad CPU in the next 5 years that competes better with AMD's offerings, I will go Team Blue just to show my support for a company that can bring itself back from the brink.
Important to remember that Intel still holds 2x the market share of AMD, and up until recently Intel was making multiple times AMD's revenue. Even today, Intel has outsold AMD handily in the desktop and laptop markets during 2024. They also didn't end up in this situation randomly: they made bad choices, and are being bankrolled by the American taxpayer.
@@EngineeringNibbles It's an enigma how they keep 2x or more of AMD's market share while making crappy products. People really don't understand what brand loyalty is.
My last Intel CPU was the i7-4770K. By the time I needed to upgrade, AMD had Ryzen, and the rest is history. AMD's stuff is so good I don't even bother with any CPU that consumes more than 65W. The stock cooler is fine. I'm done messing with liquid coolers or massive air coolers. Both AMD and Intel need to stop trying to make faster chips and instead focus on power efficiency. A Ryzen 5 5600G is more than powerful enough for 99.99% of users and will be for a decade to come. Give me a CPU with the same performance as a 5600G that consumes 5W, and then you can focus on making things faster again.
The answer is NO. The US "tech war" backfired as China went from being a huge chip buyer six years ago to becoming the world's biggest chip producer, with a 70% global market share today!
You pointed out how Intel stopped coming out with new nodes, but you didn't cover why. I mention this because I've yet to see a video covering that "why".
And the CEO steps down a couple of days after the video 😂. Whenever I see Intel struggling to keep up in the manufacturing business, I get more amazed at China's SMIC being able to produce the latest Huawei chips without the latest cutting-edge equipment.
9:27 This is just false. The advanced technology in Intel's 10nm, like using cobalt, wasn't the "wrong direction" at all. Basically every leading-edge fab has since adopted most of the things Intel tried with 10nm. They just tried to do too much all at once. Also, EUV wasn't going to be ready in time for 10nm's initially planned launch date!!! 🤦 THAT'S WHY IT DIDN'T USE IT!!! They only looked stupid for not using it because the process got delayed by so many years that EUV had not only been released but become commonplace.
2:39 ISA != microarchitecture. The ISA is nothing more than a list of instructions and their semantics. The µ-arch and the processor core are where the money (and effort) is, with implementations like the Cortex-A/R/M series, Apple's *storm cores (Firestorm, Icestorm, etc.), whatever Nuvia cooked up, and so on.
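To illustrate the distinction, here is a toy sketch in Python with an entirely hypothetical three-instruction ISA. The ISA is just the contract (which instructions exist and what they mean); the naive in-order interpreter below stands in for one possible implementation, and a pipelined or out-of-order core would be a different microarchitecture for the exact same ISA.

```python
# Toy illustration: the ISA is only the instruction list + semantics.
# This naive in-order interpreter is ONE "microarchitecture" for it; a
# pipelined, superscalar, or out-of-order core would be another.
# The three-instruction ISA here is entirely hypothetical.

def run(program):
    regs = [0] * 4                       # architectural state: 4 registers
    for op, *args in program:
        if op == "MOVI":                 # MOVI rd, imm   -> rd = imm
            rd, imm = args
            regs[rd] = imm
        elif op == "ADD":                # ADD rd, ra, rb -> rd = ra + rb
            rd, ra, rb = args
            regs[rd] = regs[ra] + regs[rb]
        elif op == "SUB":                # SUB rd, ra, rb -> rd = ra - rb
            rd, ra, rb = args
            regs[rd] = regs[ra] - regs[rb]
    return regs

print(run([("MOVI", 0, 5), ("MOVI", 1, 7), ("ADD", 2, 0, 1)]))  # [5, 7, 12, 0]
```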
The problem with everyone using TSMC isn't only that TSMC's capacity is limited; it's also a single point of failure for global high-end chip manufacturing, and it has a big red target painted on it. If one day a mentally challenged head of state decides that military support for Taiwan should be limited, it could doom all tech companies in the West. Intel is one of the few companies in the West which can take the pressure off this single point of failure. So it's in everyone's best interest that Intel's manufacturing capacity is the best it can be. IMHO.
That is true, but it does not apply only to Intel. Encouraging TSMC to build foundries outside of Taiwan decentralizes the risk as well. Keeping Samsung and maybe GloFo in a state where they can survive and compete at least in some segments of the market is just as important, which is exactly what things like the CHIPS Act are intended to do. Also important to remember is that bleeding-edge foundries aren't the only thing tech companies rely on. They are currently the most centralized link and probably the hardest to replicate somewhere else, but if said mentally challenged head of state decides to, say, really go into a trade war with countries like China, tech companies will have a pretty terrible time too. A lot of the components and production steps for their products, besides the silicon itself, rely on a functioning global market and are often done at scale in only a handful of countries.
Intel regaining manufacturing dominance is in the top 5 geopolitical crucial issues for the next 5-7 years. It's really hard to overstate how important this turnaround is if technology keeps its pace of progress. If computational power demands expand exponentially as they're predicted to, the control of semiconductor manufacturing will be of unfathomable importance.
Just goes to show how crazy the momentum of Intel being the entrenched de facto monopoly in x86 has been. It has kept Intel alive even as AMD has been beating them left and right with better-performing, more efficient, and cheaper-to-make products. OEMs, data centers, and enterprise customers stuck to what they knew, what their software was optimized for, and who they had contracts with for a long time, even if it meant getting a worse deal. But the fact that even one of the slowest-moving of those markets, servers, has been significantly shifting towards AMD for a while now indicates that this grace period for Intel to get their stuff together is coming to an end. They really needed some big win soon, or rather: yesterday. But alas, there's no clear indication that they've got any kind of Ryzen-moment-like thing happening any time soon.
@@GreyDeathVaccine Insanely ambitious (so good, I hope): it was a design for a super-core with insane IPC that could dynamically split into 2 or 4 cores to focus on multithreading depending on workload. The project was worked on by Jim Keller, one of the best CPU architects ever (formerly AMD and Apple, and now ex-Intel as well).
We criticise Google for having 2 products (or more) for every category but I think that's the key to innovation and success. If you're competing with yourself all the time it's hard to get comfortable with your own success.
Intel was the frog in the boiling pan of water, and it was the huge number of bugs in Skylake that finally shoved Apple off Intel; half of those bugs were discovered by Apple engineers.
Lunar Lake has been so good. It has the best performance-to-efficiency ratio on Windows when you consider compatibility. Just saying it's "fine" seems like an understatement.
Lunar Lake's design, while brilliant, is something Intel doesn't want to keep doing long-term, because it's expensive for Intel to buy chips from TSMC and then package the dies and memory into a single package.
@@sydguitar99 Same thing with Arrow Lake. CPUs do more than just gaming. Arrow Lake is pretty good for productivity, and it consumes less power, which means less heat.
@@terminator. I somewhat agree with this sentiment and think that Arrow Lake is a semi-step in the right direction. It still doesn't take away from the fact that most public-facing CPU benchmarks are for gaming, and that Arrow Lake is fairly overpriced for what it offers on that front. If they could just slash prices by around 10-15%, they probably wouldn't have got the drubbing they got from 90% of outlets. It being more meh news in between dumpster fires and meh news doesn't help matters.
Get 60% off an annual Incogni plan here (sponsored): incogni.com/techaltar
Where can I find that shirt?
Hey
Just FYI
Intel doesn't only do the equivalent of ARM, Nvidia, and TSMC
They also do the equivalent work of many EDA tool providers (Cadence, Synopsys, Mentor Graphics), wafer and mask manufacturers, and many, many more
They are in a different league
Incogni is a scam.
The supposedly "private" data is actually public. There is nothing to hide.
Genocide supporting hardware as spyware
You didn't even mention how Intel will lose its entire PC market with the 13th and 14th gen CPU disaster?
There is a slight slip of the tongue at 11:40: Apple - not Intel - took on the task of moving macOS to its own ARM chips.
Same, just noticed; went on to comment, but you already did! 👍
Wanted to comment the same 😁
Queen's gambit
Apple, after Intel failed to deliver on their promises
@@Souls4Roca How do you mean that? 🤔
The problem was that Intel fired all their experienced people in the 2015 and 2016 layoffs. In many ways it was age discrimination; everyone was in their late 40s and 50s... a lot took early retirement, like me, because I could see the writing on the wall. The CEOs after the founders died or retired have not been leaders... they are middle managers at best and bean counters at worst! I am just glad I left when I did. They need a leader with vision!
Elon Musk will buy it and lead with fire for humanity...
Or buy linked in and close it 😂
The old white men had to go as part of their $300m investment in workplace diversity. Just think of how much trouble they would be in now if they hadn't decolonized their workforce.
@@mmuller2402 scavengers are always lurking
@@mmuller2402 Elon Musk has never run a business that didn't either depend on government subsidies or fail.
@@scifino1 How dare you point out this sad reality! :)
As much as we all shit on Intel, let's hope they don't exit the graphics card business, as it would mean AMD and Nvidia lose competition and can jack up prices all they want, and no one is able to do anything about it.
It's cute you consider Intel a viable competitor in the GPU business that's making AMD and Nvidia keep pricing low 😂
Intel really can't leave graphics development, even if they killed discrete Arc tomorrow. There's a reason why those JPR reports always oversell Intel in terms of shipped graphics processors: they need some form of graphics development because a sizable majority of PCs have only integrated graphics, and no one outside of complete autists would want to go back to black-and-white terminals.
If it's any consolation, amid all the mostly justified crapping on Arrow Lake for consumers, the iGPU in those is leaps and bounds better than anything Intel put out before on desktop. If they had, like, a 10% price cut, I would just tell people to go grab a 265K or a 245K over an 8700G, since unlike AMD's APUs, the IO is actually full-fledged and not "gimped" in that regard (PCIe 5.0 is overkill for anyone not running a server farm where SSDs die every six months).
Slava TSMC 🇹🇼
I agree, omg, this price hike was a major money grab, like removing gold from the dollar lol.
Intel graphics cards really suck; they are so weak that AMD's own integrated graphics are far better than Intel's.
People already forgot where AMD was until Ryzen released.
Intel surely can fail and die but they really shouldn't, for many reasons.
Intel is in a worse position than AMD was in the FX era
@@-Ice_Cold- Doubt it.
@@-Ice_Cold- lmao no, Intel still has more market share, which AMD didn't have during the FX era
@@-Ice_Cold- Not at all. AMD only started recovering in 2013, when Sony started using their processors to power the PlayStation 4; they had lost the desktop market entirely at the time.
And they only rose back to glory in 2017, when they released Threadripper and the Ryzen series, for workstations and desktops respectively. And yet they are not the dominant player; Intel still is.
@@-Ice_Cold- Yeah no. AMD during the FX era was teetering on the edge of bankruptcy. Intel is probably never going to find themselves in that state, but don't jinx me if they do in the future.
I do not envy Intel's engineers and researchers. Those people are just expected to perform miracles in physics. Probably the only people whose earning beyond $400k a year I wouldn't question.
No. Just no. Maybe you want to pay the guy who made the fiber optic cable, but I'm certainly not paying the guy who puts glue on the fiber optic cable.
Engineers don't earn that much. The sales people do earn that.
It's mostly a matter of optimizing all the small things until you get a cumulative big improvement. I work on cellular modems; it's really amazing how often a 1% power improvement here and 1% there suddenly adds up to a chip being 15% more efficient than its predecessor. The important thing is that you have the quick pace and the money/manpower to do all those small improvements and analyses.
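A back-of-the-envelope sketch of how those small wins compound; the count of 16 optimizations and the flat 1% figure are illustrative assumptions, not numbers from the comment:

```python
# Back-of-the-envelope: ~16 independent optimizations, each shaving ~1%
# off the remaining power draw, compound to roughly 15% total savings.
# All numbers are illustrative assumptions.
power = 1.0                      # normalized power of the previous chip
for _ in range(16):
    power *= 1 - 0.01            # each win removes 1% of what's left
print(f"remaining power: {power:.3f}")   # ~0.851 -> ~15% more efficient
```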
Intel doesn’t pay well. R&D engineer base pay typically
It's not a physics issue. It's a business decision issue
As an Intel ex-employee, I can affirm that Intel employees are extremely skilled and the emphasis is always on innovation. The problem is always with the decision-making and leadership. They never envisioned the future or front-ran in any domain. We always used to work to catch up with competitors' products while they kept releasing next versions. The complete leadership ladder needs an overhaul, and they need to bring in outside talent to at least survive in the first place.
What is crazy is how Intel gets so little done with so much staff. What are they all doing?
@@knabbagluon Performance reviews every 3 months and weekly 1-on-1s with your manager and your manager's manager. Nobody dares to do anything risky for fear of being "off-track" or "falling behind." Middle management maintains obedient employees who do nothing.
@@knabbagluon Middle management. I used to report to a manager who had no idea what I was doing and never bothered to find out. I worked on a product with a team of 20 people, and we had 6 different managers. This is just one team. Imagine the teams on different components of the same product and their managers. On top of them there are program managers whose job is only Jira and milestone tracking. And this was an extremely high-priority, time-bound product. Imagine the fate of a standard product with feature requests, enhancements, or maintenance. Intel has at least 20-30% middle managers.
@@b2wme So basically what Lip-Bu Tan said and why he left.
*I thought so. You would not go off like this otherwise.*
They should make him CEO.
You are bogged down by lots of bad managers, that's why; they keep giving dumb ARs and asking for status updates.
I do not think that most of the problems Intel is facing today can be blamed on Pat Gelsinger. You have to understand that chip architectures take two to three years to develop. The chips that are out now were probably designed in 2021-2022, and while Gelsinger was already CEO, it's not like he was going to come in and upend everything the engineers and managers had already planned.
Additionally, he was handed a bloated organization with too many middle managers who have been known to be resistant to change and entitled. When death came knocking was the first time he was able to fire most of them without risking antagonizing the people he depended on to make the organization work.
Lastly, any technology cycle for fabs takes 5 to 10 years. This is the first time he has been able to make decisions on the future of his fabs, and Intel is investing more in the next tech than any other fab builder. And yes, that includes TSMC, which has refused to buy the new machines; they have said it is not good bang for the buck. Obviously I do not know if that is true.
Oh, and the GPUs came too late for the Gaming GPU crunch of 2020 onward and the current AI mania.
Thanks.
The GPUs came right on time. It was a good business idea. What failed was the technical execution: the chips were not competitive in performance. Had that been different, everything would have been just fine.
@@rosomak8244 I'd say it's both technical execution and timing. Broken as it was, people would have embraced Alchemist had it just come a few months earlier, when everyone was desperate for GPUs.
And even now, it's not too late to get into the GPU game. Nvidia still hoards everything from gaming to AI and data-center stuff, and AMD still hasn't taken off outside of gaming; there are spots available for anyone who delivers.
Astute observation. Pat was handed a sinking ship and he needed to make some radical changes.
@@rosomak8244 And the driver situation at Arc's launch was pretty abysmal too.
Slava TSMC 🇹🇼
If AMD came back, I'm sure Intel can too
AMD came back because Intel had already fallen; there just was no competition. But as long as governments back Intel up, they'll come back.
@@rattlehead999 Don't forget: AMD gave up its foundry business. Not saying Intel should, but tough decisions... etc.
@@rattlehead999 Not sure if that would be correct. Before the Ryzen architecture, their chips were considered subpar and low-end, so that made a big contribution to their comeback.
@@rattlehead999 The best FX-series chip was beaten by a 4-core Intel chip at the time. Intel got complacent and never remembered how to compete.
Relax, Intel is fine.
Although, I genuinely hope Intel goes away for good. I don't like their policies.
Why is Intel insisting on the fabs? Geopolitical strategic reasons. The government will support Intel only if Intel has fabs.
Talking about the geopolitical side: Intel can sell some fabs, the others in North America will stay for that specific reason, and everybody is happy, except Intel, who becomes just another fab like GlobalFoundries.
@@arch1107 Intel can't sell the new fabs they are building because the U.S. government has prohibited it after investing billions in these facilities. Production at the new fabs is expected to begin in mid to late 2025.
@@n.d.n.e1805 It all depends on what countries we are talking about; nothing is written in stone.
Perhaps nothing is sold, perhaps all of it is sold. Intel is doing really badly right now, but it still makes money, so time will tell. One thing is certain: I never imagined this happening; it is almost incredible.
The more things change, the more I doubt Intel will be sold. But things will change a lot for Intel, for sure.
The government wants the most high-tech fabs in the USA, and having the most high-tech fab gives Intel the edge, so it makes sense for Intel to want to have their own fabs.
Intel doesn't have the IP or the engineers to run fabs below 7nm. Waste of money.
Like Churchill once said: X86 will never surrender!
Or as Thomas Arne once sang:
Rule, X86! X86, rule the waves!
X86 never, never, never will be slaves.
as logan paul once said: I like my architecture X86 bruh
@@docilelikewintercatfish I don't really know who I despise more
IBM once thought that the IBM 360/370 architecture would last forever. Architectures have a life span. I have seen numerous ISAs come and go over the last 40 years. x86 is just a very successful one, like the IBM 360, but not an immortal one. We are seeing the transition from the x86 era to ARM. x86 will be a thing of the past within 10 years or so.
@@youcantata You really think so?? ARM could give competition to x86 but can't replace it fully. Most servers run x86 and might continue to do so, as it offers broad application support compared to ARM. Most companies will not switch to ARM just because it's so costly to do so. ARM will give x86 competition but will not remove it from computing devices like laptops, PCs, and servers.
"Why is Apple failing? When Apple first started, we were 10 years ahead of the competition. What happened in 10 years? The competition caught up and Apple stood still. Apple's solution is not to slash and burn but to innovate out of the problem." Steve Jobs, 1997.
This. The Macintosh was 10 years ahead of its time; the only thing that could come close at the same time was an Amiga.
But from 1995 to 2005 the Amiga was basically dead (they were really only used for early Hackintoshes and legacy media).
I 100% agree with this.
OS X came out around the same time as Windows XP; it took Windows 5 to 7 years to get Aero with Vista and Seven.
I deeply respect Apple. I just wish I could still buy an x86 Mac Pro that I could hackintosh the hell out of,
or the best OS 9-compatible PPC Mac.
I just don't really care about Apple silicon.
Sane thought.
@@nxtvim2521 Both are still obtainable
Apple was never 10 years ahead of the competition, lol. Xerox was, but he was saying that in reference to the decline of Apple in the mid-to-late 90s.
Pat inherited this mess. Intel previously was consistently investing in their R&D division but stopped under the previous CEO (who was more marketing and business focused). Now that they have fallen way behind the competition, they have to play catch-up. If the US government wasn't worried about China annexing Taiwan and TSMC (the world's best silicon wafer manufacturer), Intel would have long disappeared. Intel now represents the American hold on technology and is critical. With their current short-term cash raising (selling their land and leasing it back), they should survive long enough to become at the very least competitive with the other companies. Pat is currently engineering- and talent-focused and should be able to sort them out of this mess, as TSMC is in contract with the government to aid Intel. This is the make-or-break era for them, but they have the right parts to get them through.
Pat's gone!
Good summary
The valley of death is when you get complacent with your leading tech and start hiring MBAs to run your company
The executives have other degrees, with a handful having an MBA in conjunction with another degree, so this is not true.
@@zerog1037 The term MBA no longer colloquially refers to the college degree but to a certain type of manager: someone who focuses on spreadsheets instead of people, shareholders instead of success, and profits over people.
@CausticLemons7 Literally does not
@@zerog1037 You're exactly right. Which is exactly why I included the term colloquially. Perhaps you could verify the meaning of that word, and then ask any questions you may have. Repetition does not make interesting discussion.
@@CausticLemons7 Oof, acting condescending is not helping your case, mate, especially when your statement is so ridiculously stupid.
I acknowledge your use of the word, and I am telling you that this is only true in your bubble. In the real world, nobody calls such managers MBAs. That doesn't even make any sense.
Funny how Intel currently runs a big ad campaign in Germany's biggest tech news outlet, c't, explaining why the future lies in x86.
Nah, everyone knows the future is with RISC-V
The real future is clearly on PowerPC; don't fall for the lies spewed by Intel /s
it is though 😅
@@e8root just wait for teams of RISC-V cores wired together to emulate x86 better than Intel!
@@kayakMike1000 That will take a while if it ever happens. Right now x86 is still printing money because there is so much of it.
How is incogni so expensive?
It's literally an automated bot sending the same request just changing the name.
I could understand if it was a couple of euros since it's not funded by ads, but €14 per month?! CRAZY
Also, in the EU at least, this service is completely unnecessary, because we have the GDPR.
@@cryonuess GDPR doesn't solve the issue Incogni is trying to solve. You are still dealing with many companies who hold your data, each of which you have an individual relationship with.
How else could they pay for all these ads? :)
@@HildeTheOkayish Under the GDPR, companies are not allowed to sell your data to random data brokers. So you know exactly who has your data (companies that you have contracts with, plus credit bureaus that they exchange data with after you explicitly agreed to it). Also, it isn't allowed to collect more data than necessary (i.e. social security numbers may not be collected when there's no need for them). Furthermore, Incogni claims that it needs to be a subscription service because companies will re-add your data shortly after it has been deleted, so Incogni needs to stay active all the time. That, however, isn't allowed under the GDPR either.
@@cryonuess 1) Hackers exist, so your data can get stolen even if it's not sold.
2) Just because it's not allowed doesn't mean no one does it.
19:36 That's why I hate public companies. The stock price is irrelevant; Intel still has good revenue, and had good profits.
Furthermore, people with no idea about chips or computers get to decide who should lead the business.
Also, these companies just prioritise shareholders over everyone.
Arguably, they pissed away a lot of cash on share buybacks. The CHIPS Act effectively reimburses them for this.
@@hilal_younus Boeing agrees with you!
No, it didn't.
Keep in mind that Intel had something like 5 fabs under construction, each costing around $20 billion to build; as you know, those projects were cancelled or suspended.
They have debt, 3 years of bad products, and AMD is eating their lunch in servers and desktops; it has all been eaten now.
The size Intel had 4 years ago has vanished. It is not doing well, and it will not do better unless it improves soon, gets converted into a fab-only company, or the government takes control of the fabs.
It is not looking good for Intel; they are not doing well on any project.
@boltez6507 That's why I buy the dip on Intel :P
I often watch your videos in awe of just how focused, clear, and articulate you can be
awe?
Don't get me wrong, I enjoy Tech Altar, but maybe you should expand your reading and watching if this gives you "awe", ya boob
Slight correction: Intel acquired StrongARM from DEC as part of a patent lawsuit. StrongARM later became the XScale architecture. Great overview of Intel’s woes.
It's the age old story that plagues all tech giants. They become giants and then they turn complacent because there's a lack of competition. Innovation stagnates and the rest of the world catches up and eventually overtakes. The complacent company at that point has gotten rid of most of their original passionate founders with a management that only cares about shareholders who only care about profits. In this process, they forget how to make a good product. Without passion, people just fail to see the potential of any startup. Google is going through the same fate, microsoft something similar.
All that said, I think appointing Pat as CEO was the right move. I do believe that Intel will pull through in the end. I really want them to develop ARM chips or something similar. x86 is dying for consumer products and there's no doubt about it.
Nah, Intel does do R&D; it's just that in hindsight other technologies turned out to be more popular.
Guess Apple is next
@@TheSerpentDK Apple may be very successful in multiple product categories, but they're not really dominant in any of them, so I don't think they fit that archetype.
@@Frostbiyt Not dominant in market share, but I'd argue that their Mx chips are dominant in performance, as are their Ax chips (at least by 6-9 months compared to Qualcomm et al.). Having said that, nobody can survive 10 years of stagnation...
A tech CEO who quotes the Bible was the right appointment?
@18:12, I believe the AI fabric chip that Amazon is interested in having Intel manufacture is just a customized Intel design, just like the custom Xeon chips Amazon will be buying. This is a less rosy indicator for Intel, because it means there's still zero indication of interest among chip designers in using Intel's foundry to manufacture their own designs.
In other words, Intel Foundry Services still doesn't have any clients besides Intel itself, which doesn't bode well for that side of the business. Even if the node performs well, if nobody but Intel understands how to design for the node, IFS will continue to struggle.
Behind every single foundry there is a huge software stack that customers need in order to use its services. What Intel has in this regard is a huge pile of home-made dung. Don't bet on them getting any better here.
Intel Foundry Services has the University Shuttle program, and probably a few startups who do use them. However, as someone who has used their Process Development Kit as part of my university work, I can attest that you can really tell that no-one outside of Intel uses it. The documentation is cryptic, and the software infrastructure is not really there, and so it can be more difficult to design for. I wasn't even using one of the cutting edge nodes, so I could imagine it being even worse for those nodes.
11:40 Frustrated with Intel's slow progress
*APPLE* switched to their own chips.
You mistakenly said Intel switched to Intel chips
that's what I call improvement
Yeah noticed that slip too. It was Apple, not Intel.
Another slip at 12:26. It's not i48, but rather i486.
Apple had already been developing their own CPUs for their phones/tablets. Those chips just got good enough to use in laptops. That in itself is a damning indictment of Intel, when a company that designs chips as a side business produces one that crushes Intel's in performance.
apple chips? im more of a potato chip person
Their CEO just "retired" effective immediately Dec 1st lol. oof
Yeah, that's convenient! 😂
...with a big fat $12 million in his pocket. 😅
I do think they have made the right decision. The world right now is literally depending on TSMC for chips; if someone managed to achieve parity, that would be huge, and it's what they need in order to go forward.
It's just a matter of whether they can hold on long enough to see their vision realised. For a company this desperate, this isn't the time to play it safe.
Did they make the right decision when they hid their known 13th and 14th gen critical hardware issues? Or when they pointed the finger at everyone else for crashes when they knew it was their fault? Or when they only came clean after Gamers Nexus exposed them by having the chips independently tested, revealing the oxidation issue?
Intel lied to us for years. And they would have forever if not exposed.
That Asus Zenfone was actually an incredible phone. Its performance was good and the camera quality was actually good too. The biggest problem was that ASUS didn't do enough software support, and it got sluggish over time due to the heavy skin on it. I absolutely loved the customization that phone had, though. Thanks for reminding me about that phone lol
What I can say about Intel, and the way Samsung is struggling right now, is to never put someone from an accounting or finance department in as CEO of a high-tech company instead of someone from R&D and engineering, people who know what the product needs for the company to succeed.
Just in: Pat Gelsinger has resigned
He lied to us. For so so so long.
Intel decided to hide a critical hardware issue caused by a bad oxidation coating, then lied and blamed the motherboard companies and anyone else for their 13th and 14th gen chips failing. They only came clean after being fully exposed by Gamers Nexus.
So although I believed in their mission, they destroyed consumer trust.
Came back to this video to see if anyone else saw that. "resigned" more like fired
Well looks like his dream was crushed
Booting Gelsinger before he could finish de-Otellini-ing the company just sealed Intel's fate.
@@CitAllHearItAll It wasn't the oxidation: they released the i9 chips with an already insane 1.5V core voltage, and a tendency toward voltage spikes ended up breaking many i9s over months.
0:10 That is such an irrelevant stat. Let's say Intel had a loss of only $1 billion but AMD was also struggling and just about broke even with a profit of only $1 million. Headlines: "Intel has 1,000 times as big a loss as AMD made in profits!"
Indeed. Also doesn’t account for big investment and big write-offs that take a bite out of profitability. Would be much more useful to look at profit margins or market share.
It was bean-counters that destroyed Intel's innovation culture, like they destroyed Boeing's engineering culture. That sort of thing doesn't seem easy to reverse.
This is what I'm sensing as well. Intel decades ago had weird hiring practices where you'd get sudden hiring freezes. As soon as they started to dominate AMD, there wasn't as much focus on CPU innovation. Lean Manufacturing went too hard, too fast. I'm also thinking they got hit by the interest rate hikes used to combat inflation.
Spot on, especially for a decade, which is an eternity in tech.
Crazy how the start of Intel's fall from grace correlates with the start of Apple's worst Macs, 2015-2020.
correlates?
Apple told Intel to make chips that did not overheat in the thin laptops Apple was designing. Intel said sure, but nothing had happened by the time Apple expected it, in 2014, so Apple went ahead with the ARM project. Intel knew what Apple was asking for and couldn't deliver, or didn't care to, or a mix of both.
Intel refused to face ARM, so ARM could keep progressing like this nonstop.
@ I honestly don't know what you are trying to convey. You are just stating what happened, which is exactly my point. Apple was looking forward and Intel couldn't keep up, hence the worst Macs Apple has made are from 2015-2020. 2015 wasn't bad, but it for sure got worse with time.
The success of Apple's M chips is the success of TSMC; Lunar Lake is on par with the M3.
@@jakejoyride Even though TSMC has a big role when it comes to implementing new technologies, Apple also needs to produce a great design. Saying that Apple has no part in the success of the M series chips is absurd; it's like saying that Intel didn't make good chips, it was the machines provided by ASML. Intel "innovation" is just giving a chip more cores at a higher clock speed with more power, instead of making a more efficient chip. You are also comparing Lunar Lake to Apple's worst M series chips; there is a reason why the M3 Macs didn't do so well and why they got discontinued so fast. Intel is a company that is stuck in its high-school-football glory days. My hope is that they recover, if only so that AMD and Apple don't lose sight of innovation just like Intel did.
Intel has so much sway that even if they are not competitive, Uncle Sam needs fabs on home soil, so it will get fabs on home soil. We will figure it out, or spend more until we do. It's one of the non-negotiables, like our nuclear triad. It's also why Taiwan is so contested right now: because we don't have that ability at home, and we desperately need it to stay alive, literally. The USA needs fabs in the USA, and Intel will deliver, eventually.
So I guess now we kind of know why Intel's consumer CPUs had a high failure rate in the past two gens: they rushed them.
Also, there is a rumor saying Intel fired a bunch of the engineers in charge of the firmware and microcode. The chips couldn't really evolve: it was 12th gen but with more watts being pushed through a poor implementation, and there you had the result.
It took them 2 years to fix the problem, and some say it is not really fixed, just patched.
@@arch1107 In IT terms, a patch is generally considered a fix.
@@shobuddy Not in this case, where it has had to be patched 4 times so far.
That is not a fix; it was a way to calm the masses, who did not understand that they should have applied for a warranty claim, an RMA, or their money back, because Intel screwed them.
@@shobuddy what does IT have to do with engineering?
@@shobuddy When's a patch not a fix? Removal of a feature?
I remember reading some investing articles from a few years ago (2021/2022, I think) saying Intel was one of the 10 largest companies in the world, not in terms of market cap but in terms of cash reserves, because of how much money they had accumulated over years of profitability. If Intel goes bankrupt or insolvent, they'll have wasted an insanely large amount of money.
If I remember correctly, for some time, while AMD was more consumer-friendly and cheaper, Intel was more prestigious.
It was noticeable that the chips were kind of the same, but Intel clearly behaved like a monopolist, making consumers buy a new motherboard with each generation, while AMD was often compatible with older generations. So the same chip kind of lowered its own value, since you knew a new chip would mean a new motherboard, and thus even more cost on an already more expensive product if you wanted to upgrade.
Who could have foreseen that it could have negative consequences for Intel to knowingly sell CPUs for years that could oxidize and thus be irreparably damaged, and to try to weasel out of it in response?
Apart from that, I would like to point out that without the IBM PC, Intel would have long been a forgotten IT company that produced legendarily bad CPUs in the 1980s.
Beg to differ. As a former telecom/cellular engineer who had the opportunity to design Intel, Hitachi, Motorola, and ARM micro-controllers ranging from 8-bit to 32/64-bit (8080/8051/HC05/HC11 - 80386/MC680XXX - arm64) into things ranging from pagers to large central-office switches, my experience differs. I still remember our attempts in the late 80s/early 90s to switch to AMD/OKI/TI parts, only to be bitten by silicon bugs and reliability issues! Intel might not always have been the best, but they were not far from it most of the time!
IMHO Intel is a large dinosaur, and large dinosaurs don't evolve and move quickly enough in the face of change.
Large dinosaurs are a popular backing choice for other rich dinosaurs in the US government.
Over its lifetime, Intel has pivoted its main business a few times.
In the chip business, being a dinosaur or not doesn't matter, you just need to have the most cash
You forget that these dinosaurs still work with older apps and systems that are still running 😊
It costs hundreds of billions of dollars to start a cost-competitive cluster of fabs (you need volume). Good luck replacing the dinosaurs.
Intel will probably have to pivot like IBM did to stay alive. The question is what they will pivot into.
IBM still exists, but it is not what IBM was known for; right now IBM is nothing, and Intel can't be another nothing in the market.
IBM wants to push quantum computers, but that is not happening, at least not at the speed they expected. What will Intel do, be a fab for IBM?
I bet they will survive, but not as a company that makes products you want to buy; rather as a company doing projects for someone else, like the IBM you mention, so everyone forgets them.
+
Kinda sad, but Intel had it coming.
PowerPC is still around. IBM slipped up though, and it can be easy to slip up. IBM had the ability to leapfrog Intel, but Intel eventually pulled ahead again after a few years. This made PowerPC really good for something like a game console.
IBM screwed up with the Cell processor. The Cell was ahead of its time but, more importantly, took a hit when it came to general-purpose processing. It's an amazing DSP-style processor that's probably still used today.
We still need general-purpose processing. IBM wanted the Cell CPU in the Mac, and Apple didn't like it, which is a major factor in why they went with Intel. Intel's Core CPUs at the time were incredible on top of that. If the Cell CPU had been more balanced, it might have made it into the Mac. In the PlayStation 3, the Cell was kind of used like a second GPU, but the problem is that it had to compete with a dedicated GPU, Nvidia's RSX (the Xbox 360's GPU was the one made by ATI, now part of AMD).
They can license CPU designs from AMD to make Ryzen-compatible CPUs that fit in the same motherboards. Intel would have something to sell and AMD gets free money.
Keep this forever at the forefront of your mind, and I speak as someone who currently prefers AMD Ryzen. If Intel dies, AMD gets to price processors exactly like NVIDIA. Terrifying and it CANNOT be allowed to happen!
Why do you prefer the Ryzen? Pricing?
Intel fired their CEO the day after this video dropped....coincidence?
Intel needed to be humbled, and I hope this was a good wake-up call for them and that they have started serious reforms within the company. I hope in a couple of years we will see affordable and good Intel chips, like we saw with the introduction of Ryzen. I am also glad Intel is pushing on the GPU front. We need competition, and you can hate Intel all you want, but consumers benefit from multiple companies competing on level ground.
Well-made vid! Intel basically got too comfortable being the kind-of-untouchable king of CPUs and woke up way too late, so it still has plenty of catching up to do. It won't be easy for them, as competing with TSMC in chip manufacturing is extremely hard. Let's see where it all goes and whether there's light at the end of the tunnel...
19:56 I doubt focusing on AI would have helped, considering everything is written in Nvidia's CUDA programming language and Intel's equivalent is completely different.
Even AMD's is closer to it. Until recently they even supported the ZLUDA project, which emulates CUDA on non-Nvidia GPUs.
The funniest part is that the project was initially made by an Intel employee to use an Intel API.
CUDA is out. AWS and Azure are where it's at. No one does AI stuff on desktops, as it's all cloud-based now.
If Nvidia screws up, everyone may write everything in Vulkan.
@@timothygibney159 ...CUDA is a software layer for working with Nvidia hardware. "Cloud based" just means that you're using someone else's desktop - you still have to use CUDA to interface with your accelerator hardware, no matter where the computer is.
@ You are uninformed. It's an operating system complete with its own APIs, services, networks, daemons, quantum computing, and a complete datacenter. Python functions call Lambda in AWS via APIs, while Azure does so with Python functions. Completely abstracted from your hardware. 12 years ago you would have been correct.
We use AWS for RDBMS, web hosting, and DNS services, while only hosting 5 virtual machines, which they call instances. SaaS and API calls to AI are cloud now, on enterprise AI hardware.
@@oflameo8927 OpenCL is the competitor to CUDA; Vulkan is just a graphics API. CUDA and OpenCL are a different thing.
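A minimal sketch of the point made above about CUDA being the interface layer: whether the GPU sits under your desk or in an AWS/Azure instance, code targeting Nvidia hardware still goes through CUDA somewhere in the stack. Shown here via Numba's CUDA bindings (this assumes an Nvidia GPU and the CUDA toolkit are available; the kernel itself is a toy example):

```python
# Toy kernel: each GPU thread scales one array element.
# Requires an Nvidia GPU; every step below is a CUDA operation under the hood.
import numpy as np
from numba import cuda

@cuda.jit
def scale(x, a):
    i = cuda.grid(1)              # absolute index of this GPU thread
    if i < x.size:
        x[i] *= a

arr = np.ones(1024, dtype=np.float32)
d_arr = cuda.to_device(arr)       # host -> device copy via the CUDA runtime
scale[4, 256](d_arr, 2.0)         # launch 4 blocks x 256 threads
print(d_arr.copy_to_host()[:4])   # [2. 2. 2. 2.]
```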
Watched this not even 2 hours ago and now the news of Pat's retirement comes in. Oof.
Hilarious
AND THEY FIRED PAT.
The funny thing is that NVIDIA wants to make their own fabs to vertically integrate like Intel.
That is actually a good strategy. By the looks of things, the US will do everything in its power to cut China off from the semiconductor supply chain, and when they push too far, China, the largest consumer of these products, will push back, and it has the capability to take out TSMC.
They may even be able to develop their own chips on par with TSMC's in 10 years or less. So it's best to diversify and have the option to make things in-house, because the US is not the global hegemon anymore and it's no longer free rein for US companies.
Not gonna happen... chip making isn't like automotive, where it's primarily an integration problem; chip making is a technical problem that's constantly evolving. TSMC will tweak their internal fab process every few weeks to improve yield; sometimes the only way their customer finds out is through their own acceptance testing. Also, a company like TSMC is really, really compartmentalized to avoid copy-cats and poaching by other companies, so only very few people are aware of the entire process, and they lack the hands-on experience (more like they haven't done hands-on work for a decade or more) to know the nitty-gritty details. The only way I can see Nvidia pulling it off is to buy a company like GlobalFoundries (which famously failed at making EUV work) and invest hundreds of billions to catch up in 10-15 years' time.
@@evangellydonut Doesn't TSMC use ASML's machines as well? That's the core technology, and it's not TSMC's technology.
@@wopmf4345FxFDxdGaa20 Everyone except China can buy EUV machines from ASML. It's not a matter of just buying a machine, or else Intel wouldn't be so far behind. The entire manufacturing process is extremely complex and requires specialized knowledge to make things work.
I've not heard anything about Nvidia wanting to make their own fabs. I thought they wanted to make their own CPUs
They survived their chips not being able to do math (the Pentium FDIV bug). I think they're going to be just fine.
They will survive for sure, but when they will make a comeback is still uncertain. The competition between the two is what got us customers the best products.
The competition right now is AMD versus the entire ARM world. x86 can hold on for some time; then AMD will jump to ARM, I bet.
Intel might survive, but only as a foundry, it seems. Time will tell. Qualcomm might buy just some of the fabs, while others remain under Intel's control; they are not saying much right now.
Had to come back to this video after hearing the news
Same lol
Quite the interesting play by the board, let's see how it'll play out...
@@Kromunos Maybe the board is the problem and not Gelsinger? None of them is an engineer.
Intel missed out on Atom. Intel missed out on the mobile market.
Intel missed out on CUDA. Intel missed out on AI. Intel lost the server market.
Intel refused to acknowledge AMD's Ryzen. Quote: Ryzen is just two dies glued together.
Then Intel started losing the desktop market.
Intel joined the GPU market, but Arc was a bomb at release.
It's like they are purposefully missing every single market.
Whether or not you think Gelsinger was right, him leaving is almost guaranteed to be a bad move. Intel is already halfway through an incredibly large plan, and whoever runs the company next is almost guaranteed to back out of it and will probably not commit to anything else instead, further exacerbating the decline.
The "Intel inside" logo and commercial jingles represent growing up in the early 2000s.. great nostalgia.
There have been some questionable choices like selling off and renting back some of their buildings. Hopefully they can climb back.
5:54 And this is where Intel messed up. Instead of killing off XScale for Atom, they should have done both!!! I.e., continued ARM CPU development for phones/PDAs (still relevant then)/etc. AND developed a small x86 core for more power-constrained devices. 🤷
I like when you dedicate a video to a topic instead of jumbling different topics together
I dunno how they survive, tbh. Their reputation in the CPU space was demolished by the i9 degradation issues and their denial of them; the Arc GPU line failed, and the announced successors are delayed or cancelled and either way do not challenge AMD or Nvidia.
I think overstretching themselves is bad; they really need to focus on simplifying and doing one thing, but better than anyone else.
You do know that the Battlemage reveal was scheduled for literally 2 days after you said this right?
Being that he just quit yesterday, I don't think it's going well. I hate working in this industry
I had never considered before that vertical integration creates a weak spot through interdependence, but it makes so much sense.
AMD: First time?
AMD was in a worse place internally, but Intel is in a worse market (Qualcomm wants to buy them; Nvidia and AMD could also buy them), and we're going into a GPU world while Intel is still CPU-focused... And the worst part is that they seem to only be digging themselves further into a hole, whereas AMD actually had a sensible way out.
Let's not forget the foundations Intel helped create that propelled an entire industry. Intel chips have been the standard for a long time. Very reliable 99.9 percent of the time. I'm an Intel fan and will continue to support them.
They certainly did have very reliable chips and were known for it for a long time. Which makes it so much worse that they managed to fumble RPL so badly now, right when they've already been on the struggle bus. CPUs just outright degrading themselves to death is essentially unheard of. That reputation for reliability was one of maybe two giant legs they still had to stand on, the other being the momentum of having been entrenched as the go-to standard for years in many areas like servers, OEMs, or simply as the platform software was written for. That other leg is being chipped away by AMD right now. Unless they really get something akin to a Ryzen moment soon, these things are not going to keep them above water for much longer.
While I'm a lifelong Team Red since my first K6 rig in the 90s, after having seen Intel biff it this hard with how their CPUs have struggled to keep up with AMD's in terms of performance-to-dollars ever since Ryzen caught them off guard when it was launched, I feel bad now, and want Intel to succeed. That being said, let it be known that I have faith. These new fabs they're building stateside are the ticket. I just hope they can afford to hold out long enough to realize the gainz that having fabs at home will bring. If they can bust out a rad CPU in the next 5 years that competes better with AMD's offerings, I will go Team Blue just to show my support for a company that can bring itself back from the brink.
Important to remember that Intel still holds 2x the market share of AMD, and up until recently Intel was making multiple times the revenue AMD was.
Even today, Intel has outsold AMD handily in desktop and laptop markets during 2024. They also didn't end up in this situation randomly; they made bad choices and are being bankrolled by the American taxpayer.
I feel this, I want them to succeed too, would be very interesting to see
It's no longer red vs. blue; now it's x86 vs. ARM, and ARM is winning.
@@EngineeringNibbles It's an enigma how they keep 2x or more of AMD's market share while making crappy products. People really don't understand what brand loyalty is.
My last Intel CPU was the i7-4770K. By the time I needed to upgrade, AMD had Ryzen, and the rest is history. AMD's stuff is so good I don't even bother with any CPU that consumes more than 65W. The stock cooler is fine; I'm done messing with liquid coolers or massive air coolers. Both AMD and Intel need to stop trying to make faster chips and instead focus on power efficiency. A Ryzen 5 5600G is more than powerful enough for 99.99% of users and will be for a decade to come. Give me a CPU with the same performance as a 5600G that consumes 5W, and then you can focus on making things faster again.
13:48 LOL What a smooth brain CG artist.
Wow! You did an amazing job explaining this whole topic
Why are you even interested in a dying company?
And Pat goes splat now.
7:20 and so, Intel's tick tock model became more like a tiktok model
The answer is NO. The US "tech war" backfired as China went from being a huge chip buyer six years ago to being the world's biggest chip producer, with a 70% global market share today!
You pointed out how Intel stopped coming out with new nodes, but you didn't cover why. I mention this because I've yet to see a video covering that "why".
It's getting incredibly hard to shrink nodes further.
And the CEO steps down a couple of days after the video 😂. Whenever I see Intel's struggles with keeping up in the manufacturing business, I get more amazed at China's SMIC being able to produce the latest Huawei chips without the latest cutting-edge equipment.
9:27 This is just false. The advanced technology in Intel's 10nm, like using cobalt, wasn't the "wrong direction" at all. Basically every leading-edge fab has since adopted most of the things Intel tried with 10nm. They just tried to do too much all at once.
Also, EUV wasn't going to be ready in time for 10nm's initially planned launch date!!! 🤦 THAT'S WHY IT DIDN'T USE IT!!! They only looked stupid for not using it because the process got delayed by so many years that EUV had not only shipped but become commonplace.
6 days later and their CEO is out. Look forward to seeing your take on Friday!
Missed these longer form videos. Thank you!
2:39 ISA != microarchitecture. The ISA is nothing more than a list of instructions and their semantics. The µ-arch and the processor core are where the money (and effort) is, with implementations like the Cortex-A/R/M series, Apple's *storm series, whatever Nuvia cooked up, etc.
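To make that ISA-vs-µarch distinction concrete, here's a toy sketch with a made-up two-instruction ISA (purely illustrative): the ISA only pins down what each instruction must compute, while two different "cores" are free to execute a program however they like, as long as the visible results match.

```python
# A made-up two-instruction ISA: names mapped to semantics, nothing more.
ISA = {
    "ADD": lambda a, b: a + b,
    "MUL": lambda a, b: a * b,
}

def simple_core(program):
    """One implementation: executes instructions strictly one at a time."""
    results = []
    for op, a, b in program:
        results.append(ISA[op](a, b))
    return results

def batched_core(program):
    """A different implementation of the SAME ISA. A real core might pipeline,
    reorder, or fuse ops; only the architecturally visible results must match."""
    return [ISA[op](a, b) for op, a, b in program]

prog = [("ADD", 1, 2), ("MUL", 3, 4)]
assert simple_core(prog) == batched_core(prog) == [3, 12]
```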
The problem with everyone using TSMC isn't only that TSMC's capacity is limited; it's also a single point of failure for global high-end chip manufacturing, with a big red target painted on it. If one day a mentally challenged head of state decides that military support for Taiwan should be limited, it could doom all tech companies in the West. Intel is one of the few companies in the West which can take the pressure off this single point of failure. So it's in everyone's best interest that Intel's manufacturing capacity is the best it can be. IMHO.
That is true, but it does not apply only to Intel. Encouraging TSMC to build foundries outside of Taiwan decentralizes the risk as well. Keeping Samsung, and maybe GloFo, in a state where they can survive and compete at least in some segments of the market is just as important, which is exactly what things like the CHIPS Act are intended to do.
Also important to remember is that bleeding-edge foundries aren't the only thing tech companies rely on. They are currently the most centralized dependency and probably the hardest to replicate somewhere else, but if said mentally challenged head of state decides to, say, really go into a trade war with countries like China, tech companies will have a pretty terrible time too. A lot of the components and production steps behind their products, besides the silicon itself, rely on a functioning global market and are often done at scale only in a handful of countries.
I can’t imagine how stressful it must be to implement such a bold restructuring plan
Their fault for trying to just give us quad-core CPUs and forcing motherboard upgrades. Look at how many CPUs AM4 supports.
Bought an AM4 platform this year because I had 40 gigs of RAM from my Skylake setup. I am very happy with the R9 5900X (12C/24T).
Whelp:
> Intel Announces Retirement of CEO Pat Gelsinger
Lol. I watched this yesterday. Today's news: Intel is doomed.
Intel regaining manufacturing dominance is in the top 5 geopolitical crucial issues for the next 5-7 years. It's really hard to overstate how important this turnaround is if technology keeps its pace of progress. If computational power demands expand exponentially as they're predicted to, the control of semiconductor manufacturing will be of unfathomable importance.
Pat's retired.
What's crazy is that AMD still, 10 years on from the ++ nodes, hasn't crossed 30% CPU market share.
Just goes to show how crazy the momentum of Intel being the entrenched de facto monopoly in x86 has been. It has kept Intel alive even as AMD has been beating them left and right with better-performing, more efficient, and cheaper-to-make products. OEMs, data centers, and enterprise customers stuck with what they knew, what their software was optimized for, and who they had long-term contracts with, even if it meant getting a worse deal. But the fact that even one of the slowest-moving of those markets, servers, has been significantly shifting towards AMD for a while now indicates that this grace period for Intel to get their stuff together is coming to an end. They really needed some big win soon, or rather: yesterday. But alas, there's no clear indication that they've got any kind of Ryzen-moment-like thing happening any time soon.
Ah I didn't realize that the long march had already begun .....
lol that says it so well, let's hope the chip market turns red... I mean blue
The timing of this video was interesting. Gelsinger was forced out yesterday.
Intel is allegedly already cancelling their Royal Core project. So sad.
Completely false. 18A, which is their "core project", is almost in production.
The 18A node has nothing to do with Royal Core, which is a CPU design.
And yep, the Royal Core project is dead; it looked insane, though.
@@stefanbucur6472 i heard that before, many times, but here we are
@@aravindpallippara1577 Insane good or insane bad?
@@GreyDeathVaccine Insanely ambitious (so good, I hope). It was a design for a super-core with insane IPC that could dynamically split into 2 or 4 cores to focus on multithreading, depending on workload. The project was worked on by Jim Keller, one of the best CPU architects ever (formerly of AMD and Apple, and now ex-Intel as well).
When the time comes, we'll need a sequel to this
Pat just resigned. Final nail in the coffin. 😂
Since we will never see macOS on Intel again, and I will never use Windows again, I really no longer care what happens to Intel. R.I.P.
Trump will pump it with tax $$. I would put my money on Intel. It'll recover. The only bet for the US to relocate chip manufacturing is Intel.
This has been your best effort to date IMHO. Thanks.
We criticise Google for having 2 (or more) products for every category, but I think that's the key to innovation and success. If you're competing with yourself all the time, it's hard to get comfortable with your own success.
One of the most mismanaged companies in the world.
This became outdated quickly
Lmao yeah Pat is gone. Intel is dead
CEO Pat Gelsinger stepped down from Intel not even 3 days later; the timing is crazy.
Intel is the frog in the boiling pan of water, and it was the huge number of bugs in Skylake that finally shoved Apple off Intel; half of those bugs were discovered by Apple engineers.
Skylake was my last Intel CPU. I switched camps and am not planning to return.
Developing the 10nm process from 14nm took 6 years.
It was a fatal failure for Intel.
Lunar Lake has been so good. It has the best performance-to-efficiency ratio on Windows when you consider compatibility. Just saying it's "fine" seems like an understatement.
Lunar Lake's design, while brilliant, is something Intel doesn't want to keep doing long-term, because it's expensive for Intel to buy chips from TSMC and then pack the dies and memory into a single package.
@Jabid21 I mean, it's a good starting point. I have the Zenbook 14, and it lasts me all day. Just saying it's fine is dishonest.
@@sydguitar99 Same thing with Arrow Lake. CPUs do more than just gaming; Arrow Lake is pretty good for productivity, and it consumes less power, which means less heat.
@terminator. Pretty much all the reviews for Arrow Lake and Lunar Lake have been very positive, and they seem to be selling quite well.
@@terminator. I somewhat agree with this sentiment and think that Arrow Lake is a semi-step in the right direction. It still doesn't take away from the fact that most public-facing CPU benchmarks are for gaming, and that Arrow Lake is fairly overpriced for what it offers on that front. If they could just slash prices by around 10-15%, they probably wouldn't have gotten the drubbing they got from 90% of outlets. It being more meh news in between dumpster fires and meh news doesn't help matters.
Well, and now Pat Gelsinger is not CEO anymore. You certainly had good timing with this video!
I have a strange feeling a lot of Intel people are commenting
On the incessant speed of the narrator?😂
They expected Huawei to die; instead it's Intel. What a plot twist.
Poor Huawei. I would like to use their phones instead if they had Google.
Huawei has Google again.
I love these amazing in depth analysis from TechAltar. This type of content makes my day!
Thank you!
Glad you enjoy it!
Excellent analysis, top presentation. I am super happy to have come across this channel.
11:40 Little slip-up of words.