I'm not a fan of Edison; he and his companies did plenty of shady and awful things. That said, Edison can't really be implicated too strongly here.
Edison didn't invent the light bulb so much as take existing work, combine it, and add enough refinements to make a bulb people could actually buy and use.
Two of the big improvements were lower costs and longer bulb lifespans.
The actual conspiracy, a literal global conspiracy of the kind cranks speculate about all the time except this one was real, came decades later, in the Roaring Twenties, once mass-produced consumer bulbs could last 2,000+ hours. Light bulb makers from around the world, big names that are still around like GE and Osram, met and created what was pretty much a real-world supervillain league: a shadowy council with a legitimate public face, named after an old god. The Phoebus cartel, yes, an actual cartel, conspired to limit bulb life to around 1,000 hours to drive sales. And they largely succeeded until about WW2.
Prior to that, bulb technology had improved at a pretty steady and earnest pace. The oldest continuously operating bulb in America, the Centennial Light in Livermore, California, has clocked over a million hours and was first switched on in 1901. It is still burning today and can be viewed in person or via a live webcam.

So why don't all bulbs last that long? A few reasons. First, it generally isn't running that kills bulbs so much as being switched off and on. It sounds counterintuitive, but with common incandescent bulbs it's the cycling between hot/energized and cold/off, plus the inrush "surge" of current when a cold filament is first switched on, that wears them out fastest. In general, bulbs left on rack up more total hours than ones that get switched on and off constantly.

Incandescent bulbs generally die heat-related deaths. A bulb basically turns electricity into heat and light, usually more heat than light, and the filament gets hot. The current through the filament is effectively constant along its length, since it's a single series path, so the heat produced at any point comes down to that current and the local thickness of the filament: for a given current, a thicker section doesn't get as hot. But it isn't possible to manufacture a filament that is perfectly even in thickness and conductivity down to the atomic level, not today and certainly not in past centuries. That creates "hot spots" along the filament. As the filament gets hot, bits of it vaporize and it gets thinner. Sections that are already thinner run hotter, so they vaporize faster, which makes them thinner still, which speeds the process up until there's a spot where the metal is all gone and current can't flow. The bulb "burns out."
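To make that runaway concrete, here's a toy simulation of the feedback loop (a rough sketch with made-up constants, not real tungsten physics): the filament is modeled as a chain of series segments with slightly uneven thickness, every segment carries the same current, thinner segments dissipate more power, and hotter segments lose material faster.

```python
# Toy model of filament "hot spot" runaway. All numbers here are made up for
# illustration -- not real tungsten data, just the feedback described above:
# same current through every segment of a series filament, thinner segments
# run hotter, hotter segments lose material faster.
import random

random.seed(1)

NUM_SEGMENTS = 50
CURRENT = 1.0        # arbitrary units; constant along the filament (series path)
RESISTIVITY = 1.0    # arbitrary units
EVAP_RATE = 1e-4     # made-up constant for how fast hot material boils off

# Nearly uniform thickness, with a little manufacturing variation.
areas = [1.0 + random.uniform(-0.02, 0.02) for _ in range(NUM_SEGMENTS)]

steps = 0
while min(areas) > 0.05:                       # "burnout" once any segment is nearly gone
    for i, a in enumerate(areas):
        power = CURRENT**2 * RESISTIVITY / a   # P = I^2 * R, and R scales as 1/area
        # Evaporation rises steeply with temperature; squaring the local power
        # is a crude stand-in for that steepness.
        areas[i] = a - EVAP_RATE * power**2
    steps += 1

weakest = areas.index(min(areas))
print(f"Filament failed at segment {weakest} after {steps} time steps")
```

However you tweak the constants, the segment that started out marginally thinner is the one that gives out first, which is the whole "burns out at the weak spot" story.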
Most incandescent bulbs use a tungsten or tungsten-alloy filament, and it has been that way for a loooong time.
Bulbs like that record-holding, million-plus-hour bulb use different materials. It has a carbon filament, which isn't as susceptible to this heat-driven "evaporation" of material. The problem is, and especially was in the early days of electric lighting, that the materials and processes needed to make bulbs that way were prohibitive: rare, expensive, inconsistent, or just difficult to work with. There are other issues and compromises as well. The earliest bulbs were essentially "open," with the filament exposed to the same atmosphere as the room. Vacuum-sealing the glass bulb and removing the oxygen improved the efficiency, life, and safety of the bulb. Manufacturers then worked out practical ways to replace the removed air with an inert gas fill to further help longevity and efficiency, and over time the gas mix, sealing, and so on kept improving.
Most makers settled on tungsten because it was a good compromise. Tungsten bulbs can last thousands of hours, and tungsten's properties make it well suited to consumer use, to the infrastructure available at the time, and to the infrastructure that became established since. The supply situation and logistics also made it a material that could be used while keeping costs relatively affordable. You have to remember that before electric lights, the sun and candles, and later gas lamps, were the world's dominant light sources. Candles and gas were at times very expensive and not always easy to get, they posed various dangers, and thousands of hours of continuous operation wasn't generally practical with them. For most of history, light outside of daylight's reach was a sort of luxury.
So while tungsten doesn't last as long as some other options available early in the life of electric lighting, it was actually a pretty good compromise. It would be a bit like future historians calling out chip designers or manufacturers because not every computer shipped with a server-grade processor or could run for 40+ years. We know that isn't practical. You probably don't want a digital wristwatch the size of a phone or tablet with a 100-core processor just to tell the time, and your phone might be lucky to last a decade; but if they made it last 50 years by using a processor out of a digital calculator, you wouldn't care that it only cost $5, because it would be slow and useless. Even if they could put the best technology in the world into every electronic device without raising the price, most people would still trade their electronics in within a few years, a decade at most. Go try to use a first-generation smartphone, or boot a machine from 2000 and watch some YouTube on dial-up, and tell me you'd take that over a newer model. Most people aren't using hardware that old in their daily lives. Technology, especially electronics, tends to leap ahead at a rapid pace. There is arguably little sense in making "forever" electronics that are resource-, cost-, and labor-intensive to build but can last a century, because they'll likely be obsolete long before that.

Say you bought an incandescent bulb in 1980 designed to last 100 years. Say it cost 10x the price of a regular bulb but lasted 100x as long as the average bulb; good deal, right? But by the early 2000s there were LED lights that could last a long time AND cost less to operate, with lower resource use because they draw less power. Home LEDs were about 6-10x the cost of equivalent incandescent bulbs when they started hitting the mass market. So your 100-year bulb would be 20 years old and there would already be better technology to upgrade to.
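As a rough back-of-envelope on that 100-year-bulb scenario (the prices are the hypotheticals from above; the wattages, daily usage, and electricity rate are my assumptions, not sourced figures), the purchase-price math stops mattering once a far more efficient technology shows up:

```python
# Hypothetical "100-year bulb" bought in 1980 vs. just buying ordinary bulbs,
# and what happens once LEDs arrive ~20 years in. Illustrative numbers only.

REGULAR_BULB_PRICE = 1.00                     # assumed price of an ordinary incandescent
CENTURY_BULB_PRICE = 10 * REGULAR_BULB_PRICE  # "10x the cost" from above
REGULAR_BULB_LIFE_YEARS = 1                   # so the century bulb lasts ~100x longer

ordinary_purchases = (100 / REGULAR_BULB_LIFE_YEARS) * REGULAR_BULB_PRICE
print(f"Century bulb up front: ${CENTURY_BULB_PRICE:.2f}")
print(f"Ordinary bulbs bought over 100 years: ${ordinary_purchases:.2f}")

# Roughly 20 years in, LEDs arrive. Running cost dwarfs purchase price.
HOURS_PER_DAY = 3
YEARS_REMAINING = 80
PRICE_PER_KWH = 0.15         # assumed electricity rate, $/kWh
INCANDESCENT_WATTS = 60
LED_WATTS = 9                # roughly the same light output as a 60 W incandescent

def energy_cost(watts: float) -> float:
    """Electricity cost of running one bulb for the remaining decades."""
    kwh = watts / 1000 * HOURS_PER_DAY * 365 * YEARS_REMAINING
    return kwh * PRICE_PER_KWH

print(f"Energy to keep the incandescent going 80 more years: ${energy_cost(INCANDESCENT_WATTS):,.0f}")
print(f"Energy for an LED over the same 80 years:            ${energy_cost(LED_WATTS):,.0f}")
```

With those assumptions the long-life bulb saves maybe $90 in purchases, while the extra electricity to keep running it instead of an LED runs to several hundred dollars. "It still works" isn't the same as "it's still worth using."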
What's more, the LED technology itself generates only a fraction of the heat of a comparable incandescent bulb. That not only lowers cooling costs in all but the coldest climates, it also means fixtures and housings can be designed differently, because the fire danger from a hot bulb is reduced. You get simpler, less resource-hungry fixtures that don't need as much space or material to manage heat around surrounding materials like the drywall in recessed lighting, which opens new avenues of design and engineering for lighting and other improvements. And that further undercuts the long-term usefulness of a "100-year bulb," because 100 years from now we might commonly generate household light with zero waste heat, or without electricity at all.
And if you move or update your fixtures, those old bulbs basically become useless, even though the original buyer, and the world's resources, already paid for 100 years of light.
It is important to distinguish between planned obsolescence and inherent obsolescence, and to understand when they coincide. There is no reason to print the newspaper on granite slabs that can withstand 10,000 years of time, because the odds of anyone holding on to a daily paper for 10,000 years are slim, making and distributing all those slabs is a lot of work, and even if the slab outlasts the rain, few people will pay $2,400 for the morning edition. There is a distinction between intentionally designing a product to break so you'll buy a new one, and making a product that can only be expected to last so long because its use and specifics indicate the average buyer won't hold on to it longer than that. Running shoes get sweaty and stained. The insoles wear out and stop offering support. The fabric stretches and the fit gets worse. Most people will not pay $100 to have new soles cobbled onto their $100 running shoes, or $20 for inserts and $12 for laces to keep a stinky five-year-old pair going. More advanced shoes may use gas pockets and materials engineering for support and cushioning, and repairing those as they wear might require machines costing hundreds of millions of dollars. Even if a cobbler invested that, what are the odds they'd see a return on a machine that can only repair certain running shoes from one ten-year period, half a century later? That repair might cost hundreds or thousands of dollars, more than buying new shoes with the latest style and technology, a fresh look, and none of the smell, sweat, and other wear and tear from age and use. So if we know the average owner only keeps a thing so long, why does it need to last longer, when lasting longer tends to increase the costs, complexity, and materials that go into making it?
Cars from 50+ years ago were, in general, extremely serviceable. A lot of them are still around, but are most cars on the road today in most of the world 50+ years old? No. The average American keeps a car around 6-11 years. It isn't that the cars don't still run after that; most could run for a century or more with routine upkeep. Newer cars are fashionable, but they also have better technology, safety, efficiency, performance, features, and compatibility with the modern world. They reflect the developments in society and technology over time. I drive old cars and don't care for most newer ones, but I can't argue that for most people the newer cars make sense. Most people aren't going to keep a single car their whole life, and when the average driver using a car as an appliance is faced with a repair that costs thousands of dollars, they don't want to put that much money into an old car that will likely need even more before long as other things wear out or break. They just buy another one.
So that distinction is worth keeping in mind: things that last "forever" aren't always better for anyone or everyone. Oh, and that world's-longest-running bulb? It still burns, but for a long time now it has put out only about as much light as a 4-watt bulb, nowhere near bright enough for most modern homes, where the equivalent of 60 watts is typical.