The core problem at Intel is that they promoted the myth that ISA has no impact on performance to such a degree that they started fully believing it, while also somehow believing their process advantage was unassailable. By that time they'd accumulated so many worthless departments that turning it around at any point after 2010 was an impossibility.
You could be the greatest business leader in history but you cannot save Intel without making most of the company hate you, so it will not happen. Just look at the blame game being played in these threads, where somehow it's always the fault of these newly-discovered-to-be-inept individuals, and never the blundering morass of the bureaucratic whole.
https://chipsandcheese.com/p/arm-or-x86-isa-doesnt-matter
>You could be the greatest business leader in history but you cannot save Intel without making most of the company hate you, so it will not happen.
This is deep. It also highlights why it is easier to hire somebody from outside the company rather than promoting from within.
I started at Intel in 1988 and loved working there up until about 2005. The author of this article did a fantastic job enumerating the launched products, but there were twice as many that were cancelled. It became such a clusterfuck of leaders vying for promotion to bigger projects and taking over flailing ones only to can them after a year. The 80s and 90s were hyper-efficient and focused on churning out clear roadmaps. But the fragmentation of the market was something Intel couldn't handle: its platform didn't cover all segments, and no matter how hard it tried it couldn't do everything. I think the market is still reconverging after all the segmentation. The term "ubiquitous computing" was thrown around a lot in 2000, and it finally happened, but it is ARM that won. I think there will be a reconvergence of personal computing platforms and I can't wait to see who vacuums up all the little guys. But after reading this, damn, I miss launching the 486 and Pentium. Those were some of the best days of my career.
I'll offer a viewpoint: the article reads like a listing of spec sheets and process improvements for the CPUs of that era and not much else. Not really worth reading, imho.
I'd love some discussion on why Intel left XScale and went to Atom, and I think Itanium is worthy of discussion in this era too. I don't really want a raw listing of [In year X Intel launched Y with SPEC_SHEET_LISTING features].
>Itanium
IMO, Intel took us from common, affordable CPUs to high-priced, "Intel-only" CPUs. It was originally designed to use Rambus RAM, and it turned out Intel had a stake in that company. Intel got greedy and tried to force the market to go the way it wanted.
Honestly, AMD saved the x86 market for us common folks. Their approach of extending x86 to 64-bit and adopting DDR RAM allowed for the continuation of affordable, mainstream CPUs. This enabled companies to buy tons of servers for cheap.
Intel’s u-turn on x86-64 shows even they knew they couldn’t win.
AMD has saved Intel's x86 platform more than once. The market wants a common, gradual upgrade path for the PC platform, not a sudden, expensive, single-vendor ecosystem.
Itanium didn't support RDRAM until Itanium 2.
> I'd love some discussion on why Intel left XScale and went to Atom
I thought it was pretty obvious. They didn't control the ARM ISA, and ARM Ltd's designs had caught up to and surpassed XScale's innovations (superscalar, out-of-order pipelining, MIPS/W, etc.). So instead of innovating further, they decided to launch a competitor based on their own ISA.
Intel at the time was clear about it: they wanted to concentrate fully on x86. They thought they could do everything with x86; hadn’t they already won against their RISC competitors by pushing billions into x86? Why would ARM be any different? Shortsighted, in hindsight, but you can see how they got there.
> i think Itanium is worthy of discussion in this era too
Itanium was a massive technical failure but a massiver business success.
Intel spent a gigabuck and drove every single non-x86 competitor out of the server business with the exception of IBM.
Mr. Magoo-ism galore.
Intel constantly tried to bring in visionaries, but failed over and over. With the exception of Jim Keller, Intel was duped into believing in incompetent people. At a critical juncture during the smartphone revolution it was Mike Bell, a full-on Mr. Magoo. He never did anything worth mentioning after his stint with Intel - he was exposed as a pretender. Eric Kim would be another. Murthy Renduchintala is another. It goes on and on.
Also critical was the failure of an in-house exec named Anand Chandrasekher, who completely flubbed the mega-project cooperation between Intel and Nokia to bring about Moblin OS and create a third phone ecosystem in the marketplace. WHY would Anand be put in charge of such an important effort?????? In Intel's defense, this project was submarined by Nokia's Stephen Elop, who took over as their CEO and left Intel standing at the altar. (Elop was a former Microsoft exec, and Microsoft was also working on its own foray into smartphones at the time .. very suspicious.) XScale was mishandled; Intel had a working phone with XScale prior to the iPhone being released, but Intel was afraid of fostering a development community outside of x86 (Ballmer once chanted -> developer, developer, developer).
My guess is that ultimately, Intel suffers from the Kodak conundrum, i.e. they have probably rejected true visionaries because their ideas would always threaten the sacred cash cows. They have been afraid to innovate at the expense of profit margins (short-term thinkers).
> Murthy Renduchintala
He was a joke at Qualcomm before he went to Intel too. That Intel considered snagging him a coup was a consistent source of amusement.
What's interesting to me is that Intel was constantly shedding people in 2008 and 2009 despite high revenues, high market share, tech leads, etc.
Smacks of financialization and Wall Street-centric managerial groupthink, rather than keeping the talented engineers to fight the coming mobile wars, which were already very, very apparent (thus the Atom), or even the current losing war in discrete graphics.
Once the MBAs gain control of a dynamic technology company (I saw it at Medtronic personally), the technology and talent soul of the company is on a ticking death timer. Medtronic turned into acquiring tech and products via buyouts rather than building in-house, and Intel was also a treadmill of acquire-and-destroy (at least from my perspective Medtronic sometimes acquired companies that became successful product lines, but Intel always seemed clueless in executing its acquisitions).
Looking at all of Intel's 2000s acquisitions: it sure shows they were "trying" at mobile, in the "signal Wall Street we are trying by acquiring companies so we keep our executive positions" sense, but nothing about actually chasing what mobile needed: low power, high performance.
Shedding people in the USA, yes, while bringing on hordes of cheap engineers from India and Malaysia at the same time. Labor arbitrage was probably MBA-think as well, to your point. (Also, Intel was sued along with other big wheels for collusion, i.e. agreeing not to hire from one another in the US to keep salaries down - they settled that class action suit.) Managed demolition of a once-great company.
Is Raja Koduri another phony?
I don't know tbh, heard both good and bad things .. he was brought in after many of the problems had already become serious. He probably had a very difficult charter.
The site’s domain name is the best use of a .fail tld ever.
OT from TFA, so hijacking your thread …
I don’t recall if there was ever a difference between “abort” and “fail.” I could choose to abort the operation, or tell it … to fail? That this is a failure?
¯\_(ツ)_/¯
Take reading a file from disk.
Abort would cancel the entire file read.
Retry would attempt that sector again.
Fail would fail that sector, but the program might decide to keep trying to read the rest of the file.
In practice abort and fail were often the same.
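For what it's worth, here's a rough sketch in C of the distinction described above - not the actual DOS INT 24h critical-error interface, just the control flow, with hypothetical stand-ins (read_sector, ask_user) invented for illustration:

    /* Sketch only: models Abort / Retry / Fail for a sector-by-sector read. */
    #include <stdio.h>

    enum choice { ABORT, RETRY, FAIL };

    /* Stub device: pretend sector 3 is bad, everything else reads fine. */
    static int read_sector(int n, char *buf) { (void)buf; return (n == 3) ? -1 : 0; }

    /* Stub for the "Abort, Retry, Fail?" prompt; always picks Fail here. */
    static enum choice ask_user(void) { return FAIL; }

    static int read_file(int sector_count, char *buf, int sector_size)
    {
        for (int s = 0; s < sector_count; s++) {
            while (read_sector(s, buf + (long)s * sector_size) != 0) {
                switch (ask_user()) {
                case ABORT:           /* cancel the whole operation */
                    return -1;
                case RETRY:           /* loop back and try the same sector again */
                    break;
                case FAIL:            /* give up on this sector only, keep going */
                    printf("sector %d unreadable, skipping\n", s);
                    goto next_sector;
                }
            }
    next_sector: ;
        }
        return 0;
    }

    int main(void)
    {
        char buf[8 * 512];
        printf("read_file returned %d\n", read_file(8, buf, 512));
        return 0;
    }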
The article mostly focuses on the 2008-2014 era.
Yes. It is part of a series in which I cover Shockley -> Fairchild -> Intel, up to last month.
These are the years when Intel lost dominance, right? This article doesn't seem to show much insight as to why that happened or what caused the missteps.
Intel really lost dominance when 14nm stagnated. This article only goes up to that point.
Yep, in 2014 Intel's Haswell architecture was a banger. It was one of those occasional node+design intersections which yields a CPU with an unusually long useful lifespan due to a combination of Haswell being stronger than a typical gen and the many generations that followed being decidedly 'meh'. In fact, I still run a Haswell i5 in a well-optimized, slightly overclocked retro gaming system (with a more modern SSD and GFX card).
About a year ago I looked into what practical benefits I'd gain if I upgraded the CPU and mobo to a more recent (but still used) spec from eBay. Using it mainly for retro game emulation and virtual pinball, I assessed single-core performance, and no CPU/mobo upgrade looked compelling in real-world performance until at least 2020-ish - which is pretty crazy. Even then, one of the primary benefits would be access to NVMe drives. It reminded me how much Intel under-performed and, more broadly, how the end of Moore's Law and Dennard Scaling combined around roughly 2010-ish to end the 30+ year 'Golden Era' of scaling that gave us computers which often roughly doubled performance across a broad range of applications which you could feel in everyday use - AND at >30% lower price - every three years or so.
Nowadays an 8% to 15% performance uplift across mainstream applications at the same price is considered good, and people are delighted if the performance gain is >15% OR if the price for the same performance drops >15%. If a generation delivered both >15% performance AND a >15% lower price it would be stop-the-presses newsworthy. Kind of sad how far our expectations have fallen compared to 1995-2005, when >30% perf at <30% price was considered baseline, >50% at <50% price was good, and ~double perf at around half price was "great deal, time to upgrade again boys!".
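To put rough numbers on that contrast, here's a back-of-the-envelope sketch using my own assumed figures (a ~3-year upgrade cycle then, roughly one generation per year today):

    /* Sketch: perf-per-dollar multipliers over a ~3-year upgrade cycle. */
    #include <stdio.h>

    int main(void)
    {
        /* 'Golden era': roughly double the performance at around half the price. */
        double golden = 2.0 / 0.5;               /* = 4.0x perf per dollar */

        /* Today: ~15% more performance per generation at the same price. */
        double today = 1.15 * 1.15 * 1.15;       /* ~= 1.52x perf per dollar */

        printf("golden era: %.1fx perf/$ per cycle\n", golden);
        printf("today:      %.2fx perf/$ per cycle\n", today);
        return 0;
    }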
Intel lost dominance in the 2017-2019 era. The rise of Ryzen and Apple finally deciding to switch to Apple Silicon were the two fundamental blows to Intel. They have been able to make a brief comeback in 2021-2022 with Alder Lake but quickly fell behind again and now have staked everything on 18A being competitive with TSMC N2 this year.
The Atom line was the breaking point for Intel. No one forgives them for wasting their money on Atom-based laptops, which were slower than a tortoise. Never toy with your customers' intelligence.
I was working as a contractor in this period and remember meeting a thermometer company. They had made the extremely questionable decision to build their product on Intel Edison, which used an even lower-performance product line called Quark. The Edison chips baffled me: worse performance than many ARM SoCs at the time, far worse efficiency, and they cost so much. That thermometer had a BOM cost of over $40 and barely enough battery life for its intended purpose.
I've always wondered: how can some smart companies, or smart film directors, or smart musicians, fail so hard? I understand that sometimes it's a matter of someone abusing a project for personal gain. Some CEOs and workers just want to pitch, pocket the money, and move on, but the level of absurdity of some of the decisions made is counter-productive to the 'get rich quick' scheme too. I think there are self-perpetuating echo-chamber self-delusions. Perhaps this is why an outside perspective can see the painfully obvious. This is probably why having some churn with the outside world, and also understanding what is the periphery of the outside, unbiased opinion is, is very important.
At some point organizations get taken over by the 9-5 crowd who just want to collect a paycheck and live a nice life. This also leads the hard-driving talent to leave for more aggressive organizations, leaving behind a more average team. Whatever leaders remain will come up with not-so-great ideas, and the rank and file will follow along because there won't be a critical mass of passionate thought leaders to find a better way.
I don't mean to look down on this kind of group; I am probably one of them. There is nothing wrong with people enjoying a good work-life balance at a decent-paying job. However, I think there is a reality that if one wants a world-best company creating world-best products, this is simply not good enough. Just like a team of weekend warriors would not be able to win the Super Bowl (or even ever make it anywhere close to an NFL team) - which is perfectly fine! - in the same way it's not fair to expect an average organization to perform world-champion feats.
Disagree. 9-5 working is fine, and probably more efficient long term than permanent crunches.
Organisations fail when the 'business' people take over. People who let short-term money-thinking make the decisions, instead of good taste, vision or judgement.
Think of Intel turning down making the iPhone chips because they didn't think it'd be profitable enough, or Google's head of advertising (the same guy who killed Yahoo search) degrading search results to improve ad revenue.
Apple have been remarkably immune to it post-Jobs, but it’s clear that’s on the way out with the recent revelations about in-app purchases.
Nah I’ve been on both sides of the fence. 9-5ers may reliably accomplish tasks through superior discipline, but they don’t do the heroics that really move individual teams forward.
Relying on "heroics" often indicates a process problem. This thread is kind of giving me a "Grindset / HustlePorn" vibe. With good decision making, focus, and discipline, 9-5 employees absolutely can make great things. And history is littered with the burnt-out husks of "hero" engineers working 120 hour weeks only to have their company fail and get sold for pennies on the dollar.
Once the MBAs take over there is less incentive provided to staff to innovate and disrupt internal products and services.
The innovators in a company are likely correlated with the people doing more than 9-5. These people get frustrated that their ideas no longer get traction and leave the company.
Eventually what's left are the people happy to just deliver what they're told without much extra thought. These people are probably more likely to just clock in the hours. Any remaining innovators now have another reason to become even more frustrated and leave.
I wonder if there will be a similar situation at Nvidia, which apparently faces a challenge with so many of its employees becoming rich as the stock has rocketed up in value, which in turn could raise concerns about motivation or about whether skilled and knowledgeable employees will leave.
I think many Nvidia employees will stick around because their newfound wealth, plus being at the biggest, most important company in the world, gives them insight into the market they invest in. I make an order of magnitude more day trading than as a software engineer at a Mag7 company, and I stay employed for the access to the way modern businesses think. Companies like mine are an amalgamation of management and engineering from other Silicon Valley companies, so the tribal knowledge gets spread around to my neck of the woods.
I don’t think that’s common
Essentially no organizations actually reward telling your superiors that they're wrong. You pretend to sip the Kool-Aid and work on your resume. If one or two high-ranking leaders are steering the ship onto the rocks, there's basically nothing the rank-and-file can do.
Who doth not answer to the rudder shall answer to the rock.
> This is probably why having some churn with the outside world, and also understanding what is the periphery of the outside, unbiased opinion is, is very important.
Maximally efficient is minimally robust.
Squeezing every penny out of something means optimizing perfectly for present conditions--no more, no less. As long as those conditions shift slowly, slight adjustments work.
If those conditions shift suddenly though, death ensues.
It doesn’t even have to be that negative. With the best intentions in the world, it’s rare to have a CEO who is fundamentally capable of understanding both the technology and the viable market applications of that technology. Steve Jobs didn’t manage to do it at NeXT.
NeXT was a failed rocket launch (analogous to early rocket failures within SpaceX): a great step forward and a necessary step in the evolution of the PC. I thought NeXT workstations were pretty bad-ass for their time and place. Recall that only 3 years prior to NeXT came computers like the Atari ST .. what a vast difference!!
The original NeXT computer was a gorgeously sexy machine but slow compared to competitive workstations and considered very expensive for what it was at the time. It also didn't have the software ecosystem of a less expensive loaded PC or loaded Mac II. It's easy to look back with hindsight and rose-tinted glasses, squint a little, and see a macOS machine but it wouldn't be that for many years.
I mean, the NeXT, Atari ST and Mac computers around that time were all m68k-based... And the Atari ST was the cheapest by far, since it was competing in the home computer market.
I could tell they were cooked when they bought McAfee.
Yes, this was a direct consequence of the Craig Barrett mentality. Intel wanted a finger in many pies, since it could not predict what the next 'thing' would be. So they went on multiple acquisition sprees hoping to strike gold on something. I can't think of a single post-2000 acquisition that succeeded.
They what??
Oh I forgot that one. That's hilarious.
> McAfee Corp. ... Intel Security Group from 2014 to 2017
https://en.wikipedia.org/wiki/McAfee
Atom was shit. A desperation move. I was so embarrassed to have recommended a Poulsbo laptop to a friend; it was the worst machine I have ever seen.
The early Atoms had pretty good performance per watt compared to Intel's other offerings. The whole 'netbook' and 'nettop' market segment was pretty much enabled by the Atom chips, and similar machines are still around nowadays. The E-cores found in recent Intel generations are also very Atom-like.
About a year after 'netbooks' came out, the iPad was in the wild and it destroyed any chance of these ever catching on. Sure, they were cheaper, but the user experience on a tablet was just so much better (and tablets got cheaper fast).
I feel like the 2012 Atoms made some sense. What baffles me is that Atom stayed complete shit until 2020. Intel sold laptop chips in 2022 that didn't support FMA or AVX2 because they used an Atom-derived E-core that didn't support them.
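That's the kind of gap software had to probe for at runtime rather than assume away. A minimal sketch, assuming GCC or Clang on x86 (where the __builtin_cpu_supports() builtin exists); on a part like those 2022 chips both checks would come back false:

    /* Sketch: runtime check for AVX2 and FMA before picking a code path. */
    #include <stdio.h>

    int main(void)
    {
        __builtin_cpu_init();  /* initialize CPU feature detection (older GCC needs this) */

        int has_avx2 = __builtin_cpu_supports("avx2");
        int has_fma  = __builtin_cpu_supports("fma");

        printf("AVX2: %s\n", has_avx2 ? "yes" : "no");
        printf("FMA:  %s\n",  has_fma ? "yes" : "no");

        /* Real code would dispatch to a vectorized or plain-scalar kernel here. */
        return 0;
    }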
Intel is a failed monopolist, unlike Apple! So is IBM with MCA, the Micro Channel Architecture.
I would argue that IBM failed with the BIOS, since they assumed nobody could make a compatible machine without their authorization since they controlled the BIOS and the idea of a clean room reimplementation never dawned on them. It did dawn on Compaq. MCA came afterward.
> they assumed nobody could make a compatible machine without their authorization since they controlled the BIOS and the idea of a clean room reimplementation never dawned on them
So you're saying that they were somehow unaware of how new BIOS implementations were used in CP/M to port it to new systems?
And that they distributed the BIOS source code with every IBM PC... to make it harder for competitors to build compatible machines due to copyright claims?
And that they were somehow unaware DOS had largely reimplemented the CP/M design and API? (Though DOS's FAT filesystem was a successor to FAT8 from Microsoft's Disk BASIC rather than CP/M's filesystem.)
Yes, they tried with the 'Compute Continuum', but it never panned out. They spent loads of bandwidth and money trying to bring this reality into being, but it failed miserably. They assumed every user would have a smart TV, smartphone, tablet, and desktop .. all running their hardware/software. Turns out, no - they won't. They didn't "see" that the phone would dominate the non-business segment as it has.
I think a key reason they missed mobile is that it was during Intel's peak dominance and growth. Mobile was smaller, less powerful chips at lower prices and lower margins than Intel's flagship CPUs in that era. The founders who built the company were gone and Intel was a conglomerate run by people hired/promoted for managing existing product/category growth not discovering and homesteading new categories. They managed the conglomerate with a portfolio approach of assessing new opportunities on things Wall Street analysts focus on: margins, total revenue, projected market size and meta-metrics like 'return on capital'.
It's classic Christensen "Innovator's Dilemma" disruption. Market-leading incumbents run by business managers won't assess emerging, unproven new opportunities as being worth serious sustained investment compared to the existing categories they're currently dominating.
managers, yeh, intel luvs managers ;)
They wasted the $$ that could have saved Intel on buying shares back into the treasury to appease hedge fund managers and accountants and prop up the share price/yield - a true 'Bonfire of the Vanities'. Not to mention the 'Shitanium': born dead, and all attempts at resuscitation failed. That one also almost killed HP - it limps along, a broken thing.
Yes, the flowering of Moore's Law - especially with SSD and memory density - is still unfolding, to the point that an iPhone/Android has the power of a high-end workstation from around the year 2000; the same goes for CMOS optical sensor density and patterned lenses.
I don't think it's just the performance .. it's a form-factor paradigm shift on the consumer end. The younger generations just don't care about screen real estate as much as Gen X and early Millennials did. The devices became (surprisingly) much more addictive than people expected, and consequently the devices went into pockets, into bed with them .. etc. Sad, really.
Their domain name is probably most of their market cap
"They" in this case is just me, and I make very little money off of my writing. I write tech history because I want to, and there's little other reason.