
Fedora 23: 64-bit Only?

So let’s see how this flies in the wide world of FOSS….

Stephen Smoogen recently blogged that he’s pitching a proposal for a 64-bit-only Fedora starting with Fedora 23 — that’s not the next release, but the one after that; or maybe Fedora 24, if it can’t be done by Fedora 23.

For those of you keeping score at home, Smoogen is a long-time Fedora contributor who now serves on Fedora’s EPEL Steering Committee. And EPEL? That’s what’s commonly known as Extra Packages for Enterprise Linux, “a Fedora Special Interest Group that creates, maintains, and manages a high quality set of additional packages for Enterprise Linux, including, but not limited to, Red Hat Enterprise Linux (RHEL), CentOS and Scientific Linux (SL), Oracle Linux (OL),” according to their wiki.

[Fedora logo] Fedora 23: 32-bit users need not apply?
Smoogen writes in introducing his self-described “Devil’s Proposal” the following: “I am going to make the uncomfortable and ugly proposal to drop 32 bit in Fedora 23 and only look at 64 bit architectures as primary architectures.”

Continuing, and I’ll paraphrase here, all 32 bit architectures would be moved to being secondary architectures, with their own build teams to maintain builds. Then he wraps up the initial paragraph with, “At the moment that would make the only 64 bit primary architecture x86_64 with arm64 and ppc64 possible candidates for mainstream support in F24 (if they aren’t ready by Fedora 23).”

Let’s put aside the sheer enormity of bad in this proposal. Let’s go instead to Smoogen’s reasons: First, he has a graph showing that 32-bit use is down; the same graph also shows 64-bit use down, but let’s not quibble. Second, and probably the only rational reason (albeit a stretch), he notes that builder options work better and faster, if at all, on 64-bit. Third, and this is the humdinger, “I don’t have a Pentium III to try and replicate your problem with” as a reason developers are developing solely for 64-bit.

You know, I get it. I understand the symbiotic relationship between the Fedora Project and Red Hat, and how the former serves as a de facto test bed for development that, sooner or later, ends up in Red Hat Enterprise Linux, Scientific Linux and CentOS (and even Oracle Linux). I even get that Red Hat may no longer have a need for 32-bit development. And while I get that 32-bit hardware use in developed countries is on the decline, I’d like to see statistics on 32-bit hardware use in developing countries before I’m ready to say that 32-bit is irrelevant.

Yeah, there’s subtext here: This type of first-world thinking where we can’t be bothered with anything less than the latest and greatest contributes to the digital divide between rich and poor nations. But I digress.

Closer to home, here’s something else I understand: The complete myth that Fedora can only be used by the Linux wizards out there because of the tiresome “bleeding edge” mantra many spout for no other reason than they don’t know better. The reality is that with a few very, very simple tweaks (like this, for starters), Fedora is an excellent daily-use distro that anyone — anyone — can use.

Which ironically leads us to this oh-so-welcoming paragraph toward the end of Smoogen’s blog post: “This may also mean that people with older hardware end up dropping Fedora altogether and going to Debian or Arch. I would actually say that the people doing so are being active and taking control of their destinies which is better than waiting for hand-scraps.”

So that’s redlining the GTFO tachometer for those who have used 32-bit Fedora, no? Got a 32-bit machine? Go be active and control your own destiny, and don’t let the door hit you in the butt on the way out. We’re only offering you scraps anyway….

I’d make popcorn and get comfortable to watch how Fedora Ambassadors spin that to the wider FOSS public, especially in explaining how that dovetails into the “friends” aspect of Fedora’s four foundations.

My guess is that this proposal will be debated among those in the Fedora Project, and my hope is that it crashes and burns. Smoogen made a “Devil’s Proposal,” but I hope he was prepared to catch hell for it.


  1. Larry Cafiero (Post author) | January 21, 2015

    Upon further review, Stephen has given his proposal a second thought and offers a mea culpa.

    It should be noted, too, that the linked blog post appeared shortly before the commentary above was posted, so there was no cause-and-effect here.

    Thanks, Stephen.

  2. David | January 21, 2015

    What an irresponsible proposal. Building on 32-bit helps catch bugs that upstream developers miss because they only test on 64-bit, so that helps everyone. Also, Fedora needs to recognize its place as a leader in the Linux distribution world and how many other distros sync with it. Many of those other distros intend to continue to support 32-bit, so this could be damaging for them and only make their jobs more difficult.

  3. Sum Yung Gai | January 21, 2015

    Thanks for the update on this, Larry. Glad to see he either changed his mind or clarified his original position. Either way, yes, it would be absurd to go solely 64-bit.

    Where’s Weird Al Yankovic when ya need him?

    I’ve been overseas, too, like many others. I’m a GNU/Linux evangelist. Many people in, say, South America are still on Pentium III Tualatins with, say, 1GB DRAM. They’re on AMD Semprons and Athlon XP’s with 1 or 2GB DRAM. Those computers still work. They’re running Windows XP or even Windows 2000 in some cases. I put something like Debian or Ubuntu (en espanol) on those boxes, and not only are they delighted with the updated functionality (in their language, no less!), but that “old” computer runs at a decent speed with it. Ta-da, GNU/Linux converts. Yes, I *do* put Flash on there. Yes, I *do* put the proprietary nVidious driver on there if need be. Yes, I *do* make sure their iJewelry or smartphones or whatever can connect to said computer. And now they love GNU/Linux. That’s a win.

    And I did it on their “old” 32-bit hardware.

    Gotta love that.


  4. Andrey | January 22, 2015

    I am absolutely sure this is going to be a bad step even for the developed world. There are projects in which old computers running Linux are donated to poor people, and schools with low budgets will no longer be able to reuse their equipment.

    Speaking for myself – I have a couple of old machines that are now being investigated by my son and his friends. Yes, they have built their own server based on a motherboard from 2001 with 512 MB of RAM and two 10 GB HDDs. They were really happy to get their hands on the ‘ancient’ HW. But on top of it they installed a 32-bit Fedora 21 server, which runs pretty well for their small needs.

  5. Monopoly | January 22, 2015

    See, this is the problem with RHEL and its spin-offs. They are the “leaders” in the Linux world, so what they say is expected to be set in stone — systemd being a great example of it; read deeper into RHEL and you will understand why.

    Anyway, great idea, Smoogen: the entire world is running the x86_64 architecture, because everyone has a computer no older than four years.

    With Linux ever more popular in devices, more and more stupid ideas head into the wind and more and more stupid adaptations have been made. Maybe it is time to go back to a safer “BSD” world.

  6. JFM | January 22, 2015

    The 64-bit-capable Core 2 has been around since June 2006. That makes eight and a half years, not four. Because of the greater number of registers, it is about 30% faster if you use it in 64-bit mode. I still have a 32-bit computer around for digitizing vinyl, but since all the software I need is already on it and the machine is cut off from the Internet, I couldn’t care less about new versions of Fedora.

    So while for Third World countries it could make sense to maintain, for security, one old 32-bit version of Fedora for the years to come (notice that, aside from the 64-bit question, every new version requires more memory and horsepower), using scarce resources on releasing both a 32-bit and a 64-bit version of every new Fedora is becoming ever more silly. Replace Fedora with your favorite distribution.

  7. corneliu dabija | January 22, 2015

    Actually, the 64-bit architecture is way older than 2006. The first use of 64-bit in servers started in the ’80s, and the first 64-bit desktops appeared in 2003. That’s 12 years already.
    64-bit processors are very cheap now, under $50, and memory is also cheap, at roughly $10 per GB, and even if one does not use more than 4 GB, the system is still faster just because a 64-bit instruction can carry more addresses. So I don’t see why the third world can’t afford 64-bit computers.

    @Monopoly: It was idiotic to bring systemd to this discussion. Please leave systemd out of this. There is no monopoly in Linux. Only an imbecile would believe that.

  8. Neko Nata | January 22, 2015

    32-bit is important for PCs with up to 2GB of RAM — I suspect eventual speed gains are offset by the need to deal with larger executables and libraries.

    This problem is pervasive in Linux and not restricted to Fedora. And not restricted to the 32-bit/64-bit dichotomy.

    A certain “discomfort” is caused by Ubuntu’s focus on recent architectures, which led Lubuntu to state it would support i586 as a minimum.

    As a matter of fact, there’s a convenient instruction set (SSE2) which is not supported even by some i686 processors. The result is that Lubuntu support is now restricted to Pentium 4 / Pentium M and later processors. But these processors are not always old — even the Z80 is still made today — and they are actually PAE-capable and quite adequate even for KDE (given enough memory, that is).

    In my tests, two applications were consistently compiled using that instruction set: Midori and Qupzilla. That means distros which are supposed to be lightweight are actually not fit for some older processors. In fact, only Handylinux and Antix had those applications compiled without the SSE2 requirement. I’m not affiliated with any of these; if anyone knows of others that work with older CPUs, I’d be thankful to hear about it. The distribution I normally use, Mageia, despite being geared towards i586s, unfortunately has the same problem.

    Perhaps we could collect (privacy considerations aside) the output of “lshw”, “hwinfo” or whatever and, based on that output, use a script to automatically select which distributions would be fit for use…
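    [Editor’s note: the idea above can be sketched in a few lines. On Linux, CPU feature flags such as `sse2` and `pae` appear in `/proc/cpuinfo`, so a small script could compare them against a distro’s stated requirements. This is a rough, hypothetical sketch, not an official tool; the sample `flags` line is illustrative (a Pentium III-class CPU, which has PAE and SSE but not SSE2), and the requirement lists are made up for the example.]

    ```python
    # Sketch: decide whether a distro's CPU-feature requirements are met,
    # based on the "flags" line of /proc/cpuinfo (Linux-specific).
    # Shown against a sample string so the sketch is self-contained.

    def cpu_flags(cpuinfo_text):
        """Return the set of CPU feature flags from /proc/cpuinfo-style text."""
        for line in cpuinfo_text.splitlines():
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
        return set()

    def distro_ok(flags, required):
        """True if every required feature flag is present on this CPU."""
        return set(required) <= flags

    # Illustrative flags for a Pentium III-class CPU (PAE and SSE, no SSE2).
    # On a real system, read the text from open("/proc/cpuinfo") instead.
    sample = "flags\t\t: fpu vme de pse tsc msr pae mce cx8 sse"

    flags = cpu_flags(sample)
    print(distro_ok(flags, ["pae"]))          # PAE-only distro -> True
    print(distro_ok(flags, ["pae", "sse2"]))  # SSE2-requiring distro -> False
    ```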

  9. Bucky | January 22, 2015

    On the one hand, it’s okay to recognize that resources are finite, and sometimes decisions have to be made to stop supporting something–not because that thing doesn’t have merits, but because the resources you have to allocate to it are too large in the grand scheme of things.

    I get that. Eventually, the 32-bit architecture WILL BE dropped as too much for too little.

    But my reading of that graph is that the x86_32 usage is pretty much holding steady. It might be that kissing it goodbye by this time next year is a little too soon, yet.

  10. David Anderson | January 22, 2015

    Developing countries? We don’t need to go that far.

    I just got an x86-based Pentium 4 machine out of a garage where it’d been for a few years, upgraded it with approx £35/$60 of new parts to give it a new hard drive (it was still using ATA and you can’t buy ATA drives any more, so I had to get a SATA-to-ATA converter), more RAM, and installed Fedora 21 with XFCE. The family are very grateful; it does word processing, web browsing, DVDs, audio books, email and much else for them – and all at a very acceptable speed. Ditching x86 now is far too early.


  11. JFM | January 23, 2015

    Yes, but does your old Pentium 4 really need to run the latest and greatest for what it does?

    Also, since your typical Pentium II, Pentium III or Pentium 4 motherboard supports only so much memory, and applications keep getting hungrier and hungrier (additional features don’t come free), old machines become increasingly unable to run the latest and greatest.

    It would probably make more sense, at some point, to stop issuing 32-bit releases but keep releasing security fixes for the last 32-bit version for, say, five years.

  12. Larry Cafiero (Post author) | January 23, 2015

    David is right: You don’t have to go that far, and of all the machines in my lab (well, collection in the workroom, really), only two are 64-bit, and one of those is a Sun Ultra 10 running Solaris. Not a complaint, but I’ve rarely had enough disposable income to afford the latest and greatest hardware.

    SYG – It’s been awhile and great to hear from you! I agree completely – it’s a win. Ping me and let me know about some of your activities around promoting FOSS.

  13. JFM | January 23, 2015

    Neko Nata: The Z80 is still being made today, but as a processor for microwave ovens and washing machines, not for desktops.

    Also, a computer is not just a processor; it is also a motherboard. Even when the CPU has enough horsepower, you may find the motherboard only supports an amount of memory that makes it unusable for a modern distribution. At that point it does not make much sense to support, say, a Pentium II whose motherboards, from very distant memories, supported at most 128 or 256 megs of memory and were typically sold with only 64 megs.

    It doesn’t make sense to make a significant effort for supporting the 0.1% of users that a) are still running PIIs and b) upgraded their machine to the maximum memory supported by the motherboard. Better have them use versions of distributions contemporary to their processors.

  14. Neko Nata | January 23, 2015

    @ JFM, Hi.

    Yes, the Z80 is now used in special applications and would not be fit for desktop use. 100% agreed. OTOH, there’s nothing wrong in wanting to run Linux on it (which won’t happen either; Linux has minimum requirements, and a 386 processor was one of them IIRC). My example of the Z80 was just to say old things are still being made today. If I were to give an example of a processor which can still be used for a desktop, I’d mention the AMD Geode — it’s useful for desktops of limited power (both processing and energy). I actually use one such machine (as a server, in fact, with 512 MB RAM which goes mostly unused).

    Regarding old machines which either have a slow processor or small memory, I also have an AMD Sempron, 1.6 GHz, 1.5 GB RAM, with a quite fast NVidia video card, which can run KDE fast enough but on which I cannot use *any* Ubuntu-derived distribution. And it’s not a 486, nor a 586… it’s defined as a 686. My point is that those numbers are no longer useful for knowing whether a distro will run on one’s hardware — that’s why Lubuntu now states clearly that it requires a Pentium 4 or better. For the record, a Pentium 4 is also a 686, and some of them are older than my AMD. So, it’s not an age problem.

    Some poor people in remote places can only get by with what they get. It is of course unrealistic to expect a Pentium II to show 1920×1080 video, but a doctor could use such an old machine for basic patient scheduling (for instance). A kid with such old equipment would be better off than without it. Using an old Linux version is surely a good suggestion, but it’s not always easy to find (for cost reasons, too) and it is unsafe (some of the recent vulnerabilities will come up again if we use older versions without restraint).

    I understand cost is a factor and there are limits to what a distribution can do. The problem is that people have different expectations: if I get a distro for 586 computers, I expect it (naively, it seems) to support all 586 CPUs. Maybe the distro builders mean “it even works on a single particular 586, so it’s a 586-based distro”… then I (and others) will be out of luck… we won’t be able to tell whether a distro will be useful based on the target architecture; we’ll have to read the fine print.

    I’m not angry, just a little upset. We get things for free, there’s no place for anger.

    Now, in the opposite direction from your approach, I’d like to say 64-bit will never be necessary in some applications. I’ve read that even some 16-bit applications are still used (old Win16 ones, run under Wine), but many tools work in problem spaces which are not helped by using more bits. It’s hard to justify a 64-bit build of a simple text editor, for instance.

    Or, to put it another way: starting from 8-bit, there are lots of applications which would benefit from 16-bit. Fewer would benefit from 32-bit; fewer yet would require 64-bit; and, right now, there aren’t many I can think of that would need 128-bit. And what would require a 256-bit CPU? The mind boggles! :^)

    Thanks for your considerations; despite my uncommon exceptions, your position is what I would call common sense.

