Android

'Motorola Has Let Me Down For the Last Time' (androidcentral.com) 114

Jerry Hildenbrand, writing for AndroidCentral: If you're ever in the mood to think about a "how the mighty have fallen" story, you need to look no further than Motorola. The company used to be at the forefront of technology in everything digital, but buyouts, restructuring, and eventually becoming another OEM nameplate have left Motorola little more than a memory that old tech dudes like me will fondly look back on with melancholy reflections of the good old days. If I sound bitter, it's because I am, just a little. [...] The company has had a very poor record regarding updates since it was sold to Lenovo; both the big grand Android platform updates and the important but overlooked security patch updates. This compounds the whole issue, as the only realistic chance Z2 Force owners have to get those critically important updates they have missed is when they are bundled into the Android 9 release. These patches have no glitz or glamour associated with them, but they are the types of updates that keep you and your personal information safer. I've mentioned it before and I'll say it again: manufacturers owe us security patches on a regular basis if they expect us to owe them our allegiance.

Comments Filter:
  • by Anonymous Coward

    What critical patches do you think you're missing out on by not getting the latest Android bloat update? What vulnerabilities are exploited through a scenario you personally will experience?

    • Dude, you didn't even read the 7/1 release notes.

      Send me your # so I can text you a Banking RAT. I mean, so I can send you nudes.

  • by Snotnose ( 212196 ) on Tuesday July 02, 2019 @07:35PM (#58863960)
    That is what decided the future. By all accounts the 68k was the better processor, yet Intel won the war.

    I was writing assembly back then (early 80s, pre-PC); the Motorola chips were a hell of a lot easier to program than the Intel chips. Hell, the Zilog chips (Z-80) beat the hell out of Intel's.
    • by oddtodd ( 125924 ) <<oddtodd67> <at> <gmail.com>> on Tuesday July 02, 2019 @07:46PM (#58864012)

      The IBM PC was Intel inside and when Compaq cloned the BIOS and unleashed the market it was game over.

      • by Anonymous Coward

        Many other companies focused on their design and neglected the fabs.

        As a result, up until recently there was no competitor fab-wise to Intel in the consumer market. The end result was that other processors were more expensive and slower due to their older and larger processes, leaving Intel competitive on both speed and cost.

        Given the process hiccups Intel and others have had the past 3 generations or so, anything 28nm and below has a chance of being competitive in various market segments, and if raw speed rather than

        • by Luthair ( 847766 ) on Tuesday July 02, 2019 @08:43PM (#58864174)
          You forgot the part where Intel abused their monopoly, creating 'shortages' of parts for OEMs who sold AMD devices and secretly changing their commercial, paid-for compiler to provide only non-optimized code paths to non-Intel CPUs.
          • I remember seeing that crap as late as 2008ish, "Compile with ICC for the fastest build, but it seems to work better with Intel chips..." regarding various A/V codecs. Are they still doing this? I imagine they have only gotten more worried about AMD since then. After all the speculative execution attacks recently, I started to feel vindicated about supporting AMD exclusively during their dark period where they were "slower" than Intel (until they secured their CPUs anyway...)
            • by Luthair ( 847766 )
              The FTC didn't force them to stop; the agreement that ended the antitrust investigation merely required them to prominently disclose it, so I assume they're still doing it.
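
              For anyone curious what that compiler dispatch pattern actually looks like, here's a minimal, hypothetical C sketch (my own illustration, not ICC's code): a runtime dispatcher that gates the fast path on the CPUID vendor string rather than on the feature flags the CPU actually reports. Only __get_cpuid from GCC/Clang's cpuid.h is a real API here; the sum_* functions are made up.

              /* Hypothetical sketch of vendor-gated dispatch - not actual ICC code. */
              #include <cpuid.h>
              #include <stdio.h>
              #include <string.h>

              static int vendor_is_intel(void) {
                  unsigned int eax, ebx, ecx, edx;
                  char vendor[13] = {0};
                  if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
                      return 0;
                  memcpy(vendor + 0, &ebx, 4);   /* vendor string order: EBX, EDX, ECX */
                  memcpy(vendor + 4, &edx, 4);
                  memcpy(vendor + 8, &ecx, 4);
                  return strcmp(vendor, "GenuineIntel") == 0;
              }

              static int cpu_has_sse2(void) {
                  unsigned int eax, ebx, ecx, edx;
                  if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
                      return 0;
                  return (edx >> 26) & 1;        /* SSE2 feature bit */
              }

              static void sum_fast(void) { puts("optimized (SIMD) path"); }
              static void sum_slow(void) { puts("baseline path"); }

              int main(void) {
                  /* The complaint: the fast path only runs when the vendor matches,
                     even though any CPU advertising SSE2 could execute it. */
                  if (vendor_is_intel() && cpu_has_sse2())
                      sum_fast();
                  else
                      sum_slow();                /* non-Intel CPUs land here */
                  return 0;
              }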
          • Their battle with AMD didn't affect the architecture war. Their compiler being optimised for their own architecture is not an abuse of monopoly.

            Yeah, both are ethically dick moves, but in a legal sense neither affected the outcome being discussed right now.

      • by AmiMoJo ( 196126 )

        But why did IBM select the 8088 instead of the 68000? According to Bill Gates it was because the 68k wasn't quite ready when they were designing the machine, so they went with 8088. They wanted a 16 bit CPU but chose the '88 for the cheaper 8 bit external bus and because Intel gave them a good price on it.

        It's hard to say what effect using a different CPU would have made. The 68k line eventually died out, but Motorola might have continued it if they had been getting volume from PCs. Or maybe there would hav

          • There was a common perception at the time that the 68000 wasn't ready.

          THAT PERCEPTION WAS FALSE TO FACT. Motorola was doing a lousy job of tooting their horn about production delivery schedules and quantities.

          I ran into this myself. I was investigating 16-bit business microcomputers in Summer of 1981. At that time, the industry perception was that the 68000 was not yet available. Wicat told me otherwise: THEY HAD BEEN *SHIPPING* 68000 SYSTEMS FOR OVER A YEAR, IN QUANTITY.

          The 8-bit data bus

        • IBM PC and 68k (Score:4, Informative)

          by DrYak ( 748999 ) on Wednesday July 03, 2019 @10:46AM (#58867110) Homepage

          But why did IBM select the 8088 instead of the 68000? According to Bill Gates it was because the 68k wasn't quite ready when they were designing the machine, so they went with 8088. They wanted a 16 bit CPU but chose the '88 for the cheaper 8 bit external bus and because Intel gave them a good price on it.

          "Weren't quite ready" in this context simply meant that there wasn't a prolific eco system of readily available off-the-shelves parts for peripherals and other extra functionality.

          The 68k was internally a 16/32-bit CPU running on a *16-bit* bus, meaning it would have to talk mostly to new 16-bit memory-mapped chips. There weren't many of them, they weren't widely tested, and lots of functions would still be missing. Also, RAM would need to be 16-bit too, etc.

          That's entirely acceptable when you're, e.g., an arcade manufacturer and your hardware is filled to the brim with special-purpose custom chips (random example: the SEGA Megadrive/Genesis' VDP also acts as a 16-bit bridge to the 8-bit peripherals; it's a southbridge in addition to being a GPU) or you have a couple of genius guys who'll quickly wire-wrap a test solution and then build it out of PAL chips (Apple's Macintosh - see posts on folklore.org).

          But again, IBM wanted their PC to be damn cheap to manufacture, built only out of cheap, off-the-shelf parts, nothing expensive requiring custom design. (They were horrendously late to this home-PC market, and any time spent designing some custom southbridge for the 68k would be a delay making their entrance even later, and would raise the cost even more. Not acceptable.)

          The 8088 was internally a 16-bit CPU running on an *8-bit* bus with a separate *8-bit* I/O channel. Exactly the same configuration as both older Intel chips (such as the 8-bit 8080) and competitors (the Z80 has the same world view), and also numerous others (the Motorola 6800 and MOS 650x are all 8-bit, though memory-mapped instead of using a separate I/O port).
          You could just grab any existing chip that worked with them and slap it next to an 8088 in the PC. It was already well-known technology and would enable fast development. (Again, very critical, because IBM was a very, very late comer to the market.)
          (As a bonus, the 8088's assembly is a bit reminiscent of the 8080's and Z80's if you squint at it, so ports of popular software could be achieved almost by mass-renaming mnemonics in the assembly source.)
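
          (Aside: a tiny C sketch, assuming Linux/x86 and made-up addresses, of the two peripheral-access styles mentioned above - the x86 family's separate I/O space versus the memory-mapped style of the 6800/650x world. outb from <sys/io.h> is the real glibc call; the port number and register address are only for illustration, and real use needs ioperm permissions.)

          /* Sketch only: x86 port I/O vs. memory-mapped register access. */
          #include <stdint.h>
          #include <sys/io.h>   /* outb(): x86's separate I/O address space */

          #define UART_PORT 0x3F8                          /* classic PC serial port */
          #define MMIO_REG  ((volatile uint8_t *)0xF000)   /* made-up device address */

          void write_port_style(uint8_t byte) {
              outb(byte, UART_PORT);   /* dedicated IN/OUT instructions, 8088 style */
          }

          void write_memory_mapped_style(uint8_t byte) {
              *MMIO_REG = byte;        /* the device register is just an address, 6800/650x style */
          }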

          Which is basically what IBM did: they just grabbed a bunch of already existing chips and duct-taped them together into the IBM PC, enabling them to quickly move into the already existing market despite starting much later than everyone else.
          (Cue the citation from folklore.org about engineers being disappointed when they started dissecting IBM PCs, because there's nothing interesting nor clever in them.)

          This in turn enabled the explosion of the clone PC market afterward:
          - First, IBM is a *big-name* company, so obviously they are the one you want to copy/emulate/be compatible with.
          - Second, the IBM PC is entirely made out of already available chips. No custom specially designed stuff involved. That means anyone could buy the same chips as IBM did and build their own PC clone.

          Which is what most cloners did. Grabbed standard chips, and cobbled together a PC-compatible out of them.

          It's hard to say what effect using a different CPU would have made.

          Depending on how much custom silicon IBM threw at it, it could have been a world where the "[IBM] PC compatible" never emerged.
          That would have prevented cloners from simply gluing together off-the-shelf parts, if one critical piece happened not to be easy to find on said metaphorical shelf.

          So perhaps IBM PCs wouldn't have been what cloners started to massively clone. (And they'd lose out a bit on IBM's popularity as a mechanism to bring popularity onto the clones.)
          Maybe some *other* standard would have become the dominant one: maybe some 16bit evolution of the MSX standard? (Which, out of backward compatibility and due to its Z80

          • "Weren't quite ready" in this context simply meant that there wasn't a prolific eco system of readily available off-the-shelves parts for peripherals and other extra functionality.

            The geek can overthink questions like this.

            The "Holy Grail" in 1980 was a 16 bit CP/M system for small business use. The upgrade path in which every software niche would be filled with brand-name products within a yeat or two, The use of generuc parts, a $50 O/S and a modular design that paved the way towards massive improvements in sound, video and so on drove things along quite nicely. The MS-DOS PC was BTW a commercially viable product before the cloning of the IBM PC BIOS.

        • Andy Grove at Intel was worried about the rumored Motorola 68000, so when the 8086 team told him they needed six more weeks to implement memory management, he wouldn't give it to them. You can blame him, and not Bill Gates, for most of the Windows blue screens of the 1990s, when software designed to run without memory management was put on later chips that had it.
    • by Anonymous Coward

      Sure, 68K was better in terms of user-facing instruction set (very regular), and adopted features like flat addressing before x86. Also performance, until about the 386.

      But none of that mattered in the end. Almost nobody writes assembler by hand any more, so how "nice" 68K was to program compared to x86 hasn't mattered since forever. x86 adopted flat addressing. x86 was more I-cache efficient due to how compact (irregular!) the instructions were, which helped it for years, and it caught up and then far s

      • Still a few differences.

        The 68k internally had 32-bit registers (and memory pointers).
        Externally it had a 24-bit address bus, meaning up to 16MiB worth of RAM and memory-mapped peripherals directly visible to it.

        Starting in an era where 64KiB was a lot, it could evolve quite a lot before feeling constrained by memory.
        (The same CPU that ran with 64KiB on a SEGA Megadrive/Genesis could address 8MiB on a Palm IIIc.)

        When newer 68k parts got an external 32-bit bus, the same 32-bit pointers kept working (now addressing up to 4GiB
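
        (Illustration of my own, not from the parent: why a 32-bit pointer on the original 68000 only ever "sees" 16MiB - the top 8 bits simply never leave the chip, which is also why software that stashed flags in the high byte of pointers later broke on fully 32-bit buses.)

        #include <stdint.h>
        #include <stdio.h>

        /* Only address lines A1-A23 exist on the original 68000, so the top
           byte of a 32-bit pointer is ignored by the external bus. */
        #define M68000_ADDR_MASK 0x00FFFFFFu

        int main(void) {
            uint32_t tagged_ptr = 0xAB123456u;                 /* high byte (ab)used for flags */
            uint32_t on_the_bus = tagged_ptr & M68000_ADDR_MASK;

            printf("logical pointer: 0x%08X\n", tagged_ptr);   /* 0xAB123456 */
            printf("address on bus:  0x%08X\n", on_the_bus);   /* 0x00123456 */
            printf("addressable:     %u MiB\n", (M68000_ADDR_MASK + 1u) >> 20);   /* 16 */
            return 0;
        }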

    • by alexo ( 9335 ) on Tuesday July 02, 2019 @08:38PM (#58864164) Journal

      That is what decided the future. By all accounts the 68k was the better processor, yet Intel won the war.

      I was writing assembly back then (early 80s, pre-PC); the Motorola chips were a hell of a lot easier to program than the Intel chips. Hell, the Zilog chips (Z-80) beat the hell out of Intel's.

      The 8088 was significantly cheaper than the 68K, was available in larger quantities, was more mature due to being a slight modification of the earlier 8086, could use existing 8-bit bus designs and support chips, and the software abundant at the time, written for the earlier 8080/8085 and Z80 CPUs, was easily portable to the 8088 thanks to the similar architecture.

      Motorola had a cleaner CPU architecture but lost on logistics.

      • by epine ( 68316 ) on Tuesday July 02, 2019 @10:31PM (#58864612)

        Motorola had a cleaner CPU architecture but lost on logistics.

        The fetish on the cleanliness of your CPU architecture never really panned out. The death of x86 was predicted over and over again, but never came to pass.

        Modern x86 needs a few more transistors to achieve this, because of the erratic ISA and variable instruction encoding, but the net performance doesn't suffer much.

        Plus, most of the silicon and complexity is now in the cache hierarchy and memory controller. On the execution side, no architecture really enjoys having 200 instructions in flight because DRAM access time is fixed in amber. Nor are your relatively bulky x86 FMA execution units substantially different from those of a hypothetical 68000 from 2019.

        And a few things even worked out in Intel's favour, like the read-modify-write instruction format (denser code, better icache hit rates, free macro-op fusion, allows you to keep your register set smaller, which in turn improves context switching speed).

        What really made x86 a PITA was the tiny segment register shift of 4 bits (giving you a 20-bit physical address) instead of 8 bits (giving you a 24-bit physical address).
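
        To make the arithmetic concrete, here's a small C sketch (mine, not the parent's) of real-mode address formation versus the 8-bit shift the parent wishes Intel had used. The 16-byte-paragraph version is how the 8086 actually works; the 256-byte variant is purely hypothetical.

        #include <stdint.h>
        #include <stdio.h>

        /* Real 8086: physical = (segment << 4) + offset, a 20-bit (1 MiB) space. */
        static uint32_t phys_8086(uint16_t segment, uint16_t offset) {
            return ((uint32_t)segment << 4) + offset;
        }

        /* Hypothetical 8-bit shift: physical = (segment << 8) + offset, a 24-bit (16 MiB) space. */
        static uint32_t phys_wished_for(uint16_t segment, uint16_t offset) {
            return ((uint32_t)segment << 8) + offset;
        }

        int main(void) {
            /* Classic CGA text buffer at segment 0xB800, offset 0. */
            printf("0xB800:0x0000 -> 0x%05X with the real 4-bit shift\n",
                   phys_8086(0xB800, 0x0000));       /* 0xB8000  */
            printf("0xB800:0x0000 -> 0x%06X with an 8-bit shift\n",
                   phys_wished_for(0xB800, 0x0000)); /* 0xB80000 */
            return 0;
        }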

        In hindsight, a 16 MB address space can dedicate a fixed 64 KB segment to every expansion slot, give the system itself up to 8 MB, and still have room for a 4 MB linear-address frame buffer (a megapixel display at 32-bit colour) and maybe a smaller second display. 8 MB for the base system takes you to the early days of virtual memory and demand paging; 16 MB Linux boxes were entirely viable in the early Pentium era.

        Intel didn't want the world to work this way, because they weren't in the memory business by then, and the extra address space would have encouraged people to spend more on memory and less on compute. In the 286 era, it was hard to spend on memory cost-effectively, because you had to buy expensive memory expansion cards that worked in wonky ways and only worked with specially coded applications.

        Just that single change alone would have eliminated most of the himem.sys and Plug and Play catastrophes.

        The compiler writers still would have bitched about having to support large model, and weird model, and small model. But that would have been fairly small fish.

        Also, as I recall, the real purity whores loved the Nat Semi 16032 most of all. It was reputed to give compiler writers orthogonality orgasms.

        The first chip in the series was originally referred to as the 16032, but later renamed 32016 to emphasize its 32-bit internals.

        This contrasts it with its primary competitor in this space, 1979's Motorola 68000. The 68k used 32-bit instructions and registers, but its ALU was only 16-bit. ... In contrast, the NS32000 has a 32-bit ALU, so that 16-bit and 32-bit instructions take the same time to complete.

        The 32016 first shipped in 1982 in a 46-pin DIP package and may have been the first 32-bit chip to reach mass production and sale. Although this post-dates the 68k by about two years, the 68k was not yet being widely used in the market and the 32016 generated significant interest.

        Unfortunately, the early versions were filled with bugs and could rarely be run at their rated speed. By 1984, after two years, the errata list still contained items specifying uncontrollable conditions that would result in the processor coming to a halt, forcing a reset.

        National changed its design methodology to make it possible to get the part into production and a design system based on the language "Z" was developed. The "Z" language is similar to today's Verilog and VHDL, but has a Pascal-like syntax and is optimized for two-phase clock designs.

        However, by the time the fruit of these efforts was being felt in the design, numerous 68k machines were already on the market, notably the Apple Macintosh, and the 32016 never saw widespread use. [redacted for brevity]

        So there's that, but I really think the 20-bit vs. 24-bit address space issue on the x86 turned out to be the more important one, our real cross to bear.

        • Modern x86 needs a few more transistors to achieve this, because of the erratic ISA and variable instruction encoding, but the net performance doesn't suffer much.

          In fact, Intel proved the opposite when CPUs went superscalar. While RISC processors were more than competitive prior to executing more than one operation at a time, the shortfall was code density and its knock-on effects. While Intel's version of a read-modify-write operation was 6 bytes of machine code and was broken down into 3 micro-ops within the pipeline, the RISC equivalent was 14 bytes (6 + 2 + 6) and was actually three instructions that each needed to be decoded individually.

          RISC became dec

        • most of the silicon and complexity is now in the cache hierarchy and memory controller

          Common wisdom, but wrong. At least, it is wrong to minimize the amount of silicon the decoder takes up. Have you looked at it recently? It's big; it's easily comparable to the memory controller. All those garbage, half-thought-through instructions Intel and AMD have been heaping on, Intel being the worst offender. I mean, my god, even forgetting about all the brain-damaged attempts at SIMD, how about sysenter? What a complete piece of crap. So many instructions like that. It's a big disgusting rambling mess,

        • by AmiMoJo ( 196126 ) on Wednesday July 03, 2019 @05:57AM (#58865940) Homepage Journal

          Modern x86 needs a few more transistors to achieve this, because of the erratic ISA and variable instruction encoding, but the net performance doesn't suffer much.

          Modern x86 CPUs treat the x86 instructions more like an intermediate bytecode similar to Java or .NET than machine code.

          They effectively recompile it on the fly, optimizing instruction order as they go and producing microcode that is what actually gets executed. As it turns out that's ultimately faster than things like RISC, because the optimization is tailored to the specific CPU and exceeds what any generic compiler is capable of. It also works very well for parallelizing code across multiple execution units, ALUs, FPUs etc.

          The only problem is that it requires a lot of transistors and a lot of power. Early Intel "low power" processors like Atom were very slow because they lacked most of that stuff, and still were not very low power really.

          That's why ARM is still very strong in any area where power consumption matters. Not just battery powered stuff, but appliances where generating massive amounts of heat is undesirable. Being simpler they are also very cheap to produce.

        • by DrYak ( 748999 )

          The fetish on the cleanliness of your CPU architecture never really panned out. The death of x86 was predicted over and over again, but never came to pass.

          Somebody hasn't been paying attention to what is running inside their pocket ~ :-D

          The only reason x86 didn't die but lasted so long boils down to one thing: legacy code.
          There are tons of binary code written for x86.
          Meaning that being able to run it (at native speed, without relying on emulation) is desirable, and trumps any other argument.

          Only when new classes of device not relying on this legacy code emerged - PDA, then smartphones, then tablets - only then did the world switch to

          • The only reason x86 didn't die but lasted so long boils down to one thing: legacy code.

            I guess that explains why Apple switched FROM a legacy of 68k TO x86, huh?

            • I guess that explains why Apple switched FROM a legacy of 68k TO x86, huh?

              Notice how they also switched to PPC somewhere in between those, and are extremely likely to switch yet again, to ARM.

              Apple is a relatively tiny market, with very tight top-down control.
              They can relatively easily switch and force their tiny market to follow; otherwise the devs will be missing out on the rabid fanboi fanbase.

              They are also a very tiny fraction of the overall laptop/desktop/workstation/server super-family group, so they don't have a strong influence there.

              Windows used to have a very

      • Also note that not only did Intel maintain backward compatibility after the 8086/8088, they also targeted a level of compatibility (source-code compatibility) with its forerunners. That's why the low and high bytes of its 16-bit registers were also individual 8-bit registers.
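
        That register aliasing is easy to picture; here's a throwaway C sketch of my own (little-endian x86 layout assumed) showing how a 16-bit AX-style register overlays its AL/AH halves:

        #include <stdint.h>
        #include <stdio.h>

        /* Mirrors how AX overlays AL (low byte) and AH (high byte) on x86. */
        typedef union {
            uint16_t ax;
            struct { uint8_t al, ah; } half;   /* little-endian: low byte first */
        } reg16;

        int main(void) {
            reg16 r;
            r.half.ah = 0x12;                  /* like writing AH */
            r.half.al = 0x34;                  /* like writing AL */
            printf("AX = 0x%04X\n", r.ax);     /* prints AX = 0x1234 on x86 */
            return 0;
        }
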
    • by _merlin ( 160982 ) on Tuesday July 02, 2019 @11:36PM (#58864808) Homepage Journal

      Engineers at IBM wanted to use the 68k in what became the PC, but several factors meant the 8088 was the more rational choice:

      • At the time, Moto was getting low yields on the 68k. This meant quantities were limited and prices were high. Sure this improved later, but IBM didn't have a crystal ball to be able to foresee this.
      • At the time, there was no second source for 68k processors. Intel already had second source agreements in place, and IBM had a license to manufacture the 8088 themselves if they needed to.
      • At the time, RAM was very expensive, so the higher code density of the 8088 was an advantage.
      • At the time, 16-bit peripherals were expensive. Having a variant with an 8-bit bus with the possibility to switch to a 16-bit bus version later was an advantage. The 68008 with an 8-bit data bus wasn't available until much later.

      If the PC had been designed a couple of years later, it might have used the 68k, and then we might have been living in a world with Moto as the abusive CPU monopoly, with some hacked-up 64-bit 68k derivative dominating the desktop. But at the time, there were good reasons for IBM to choose the 8088. I don't think they anticipated the clone market, and the PC becoming a "standard" the way it did.

      • Given this...

        At the time, 16-bit peripherals were expensive. Having a variant with an 8-bit bus with the possibility to switch to a 16-bit bus version later was an advantage. The 68008 with an 8-bit data bus wasn't available until much later.

        ...given that they wanted, on purpose, something where they could slap in cheap off-the-shelf 8-bit parts without needing to develop custom 16-bit silicon
        (mostly because they were late to the home computer market and didn't want to lose too much time or money in the attempt)...

        But at the time, there were good reasons for IBM to choose the 8088. I don't think they anticipated the clone market, and the PC becoming a "standard" the way it did.

        ...in hindsight it's absolutely evident.

        - IBM is a big brand name; it was clear that they were going to be a good target to copy/emulate/clone
        - As said above, the IBM PC was on purpose - for time/cost reasons - thrown together ou

    • Others have already answered your question. But do note that the part of Motorola which made the 68000 microprocessor was spun off into ON Semiconductor and Freescale [wikipedia.org] two decades ago. Motorola Mobility has nothing to do with them, other than once being part of the same company.
    • The story I heard was that Intel had chipsets readily available for the 8088, whereas Motorola did not for their 68k.
  • by melted ( 227442 ) on Tuesday July 02, 2019 @07:44PM (#58864004) Homepage

    Same as Nokia. It's a Chinese company, with everything that entails: nonexistent tech support, not giving a shit about repeat business, shoddy quality.

    • by AmiMoJo ( 196126 )

      The only "Chinese" part is Motorola Mobility that was sold to Lenovo, a Taiwanese firm.

      And Lenovo is legendary for excellent tech support, long term support, caring about repeat business and bullet proof quality. Thinkpads are the machines of choice for people who want reliability and serviceability.

      Other remains of the old Motorola include ON Semi, NXP Semi, and Motorola Solutions which is American owned and I don't know much about.

      • by iserlohn ( 49556 )

        Lenovo is Mainland Chinese. Its HQ is in Beijing.

        • by AmiMoJo ( 196126 )

          Yes, you are right. My mistake. Well, that's even better. Proof positive that a Chinese company can produce the best laptops on the market, with the best support too.

    • What? My Nokia 6.1 is using the Android One system and I get a security patch update every month. The quality is top-notch: one-piece aluminium.

    • The network side still exists and has bought out two of its main competitors, making it one of the "big three" mobile network hardware makers.

      The cellphone side was sold to Microsoft, who then closed much of it down before selling what was left to a new company, started by the old management of said division, called HMD Global. That company is not only headquartered in Finland; up until late last year its headquarters were literally on the Nokia campus in Karaportti, Espoo.

      You're also wrong on the quality
  • by Anonymous Coward

    The company used to be at the forefront of technology in everything digital, but buyouts, restructuring, and eventually becoming another OEM nameplate have left Motorola little more than a memory that old tech dudes like me will fondly look back on with melancholy reflections of the good old days

    The set of American companies on that list is quite long.

    Motorola is on it, but as one of many. Xerox. Kodak. HP. ...

    • Don't forget the once-behemoth AT&T, which now exists in name only (the last remnants, and name, purchased by SBC, who rebranded themselves AT&T)
      • (the last remnants, and name, purchased by SBC, who rebranded themselves AT&T)

        After SBC purchased SNET, their marketing changed to emphasize their "100+ year heritage." I still shake my head slowly, not only because of the gaucheness but because SNET's service went from very modest and uninspired to downright poor practically overnight.

  • by 93 Escort Wagon ( 326346 ) on Tuesday July 02, 2019 @08:09PM (#58864090)

    Let's not blame the giant company who bought them, absorbed all their patents and then cut what was left adrift.

  • There is no "allegiance". No matter which brand you buy, the copyrights and patents go into the same pockets, and the product is from the same factory. If you want to correct the problem, take away the copyrights/patents when they drop support.

  • by skam240 ( 789197 ) on Tuesday July 02, 2019 @08:14PM (#58864102)

    When I was shopping around for a new phone a year and a half ago I found Motorola definitely had the mid/low tier done right. Aside from a slightly better camera the top tier phones selling for near or over $1k do nothing that I need them to do that a $300 Motorola can't do. Sure, I care about security updates a little but what am I doing on my phone I'm so worried about?

    Now if they would only release another smartphone with a built-in physical keyboard. The only phone I miss owning is their Droid series of phones https://www.google.com/url?sa=... [google.com] with its built-in fold-out keyboard. Man, I'd pay Apple or Samsung money just to be able to feel the fucking keys I'm typing on.

    • Yep. Removable SD card for cloudfree storage/data copy. Removable battery on some models (glory be). Headphone jack. Unlike Apple crapple where all three are MIA.
  • Android One (Score:5, Interesting)

    by Uncle_Meataxe ( 702474 ) on Tuesday July 02, 2019 @08:29PM (#58864132)

    Try an Android One phone. I'm happy with the two Nokias I've bought for family members. They get updates every month for three years. Haven't missed a single one yet. So far have not had any experience with their tech support so can't comment on that. But note that Nokia is now owned by the Finnish company HMD Global, i.e., it's not Chinese.

    • Re:Android One (Score:4, Interesting)

      by Waffle Iron ( 339739 ) on Tuesday July 02, 2019 @09:28PM (#58864330)

      Funny enough, I recently bought a Motorola "Android One" phone direct from the manufacturer at fire sale prices.

      I had two primary requirements for a phone: (A) a real headphone jack, and (B) not totally dependent on the OEM for updates. The Android One program will certainly help with that for a while, and the relative lack of bloatware is nice. More importantly though, the phone has an unlocked boot loader, which I assume I'll eventually have to utilize. (I don't see any reason to ditch a phone after only three years of use.)

    • by sad_ ( 7868 )

      This. If you want your Android phone to have update support, be sure to buy one that's in the Android One program.
      If you don't, well, don't complain about it; we know that all brands are crap at updating Android, no matter the price of the phone.
      When looking for a phone, this is my nr. 1 requirement: it must be Android One compliant. I don't care about the lens, screen, whatever; all those come second.

  • by Anonymous Coward

    I know it's beyond most phone users, but this being /. it boggles the mind that posters care about the OS that ships with their phone. Motorola is one of the few that will unlock the bootloader and that has multiple devices supported by Android forks like LineageOS [lineageos.org].

    Before I bought my last phone I did a bunch of research and ended up with a MotoX4. I'm still happy with that decision and could care less when/if Motorola releases an Android update because I don't run their OS. LineageOS is great and, aside

  • Come on, Motorola hasn't been more than a husk since the RAZR flip phone (Yes I had one also).

    May as well lament the fall of Palm while you're at it; it's at the same level of sad as far as I'm concerned.

    • Motorola was just a husk of its former self years and years before they got into the flip phone game.

      Motorola once competed with Intel, the two of them accounting for the vast majority of all semiconductor fabrication in the world (by dollar value.)
  • Buy a Pixel (Score:4, Interesting)

    by surfdaddy ( 930829 ) on Tuesday July 02, 2019 @09:58PM (#58864488)

    You know, you can complain or you can buy the right hardware. If security is important to you (and I don't know why it shouldn't be), buy either an iPhone or a Pixel. Those are the phones that are going to give you the up-to-date software on a regular basis. This has been known for years so you must have bought your Motorola either not caring or not investigating that fact before purchase. I'm not trying to be harsh, it is just a fact. Even Samsung has gotten better at the security patches although not nearly as good as Google. And Samsungs are bloated with duplicate apps for a bunch of things, and you won't get a major OS upgrade for over six months after a Pixel will get it. You have choices.

    • I would jump on the Pixel phones if they had a microSD slot. I was really big on "native" Google hardware for timely updates, and got a Nexus 5 when it first came out. What I learned is that if you're just going to use it as a vanilla phone, it's fine. But if you're going to root it and customize settings or install your own firmware, the lack of a microSD slot is a huge impediment. You have to backup everything to the cloud, install the new firmware, then restore everything from the cloud. It's a hell
    • by AmiMoJo ( 196126 )

      To be fair to Samsung they have got a hell of a lot better when it comes to bloatware. Most of it has been removed, available via their app store instead. There are a few things still there, like the app store and Bixby, but even those can be disabled at least.

      Not perfect but far better than they used to be.

    • Got my wife an Android One phone. A Nokia 6.1.

      3 years of security updates, and it cost about $175 at the time.

      Seriously, it hasn't missed a monthly security patch yet.

  • Talk about how the once mighty have fallen! HP, Motorola, DEC - they were huge in their day and inspired many others to try and do the right thing when it came to technology and their consumers. Those days are over kids, I am so sad to say.
  • by rsilvergun ( 571051 ) on Tuesday July 02, 2019 @10:19PM (#58864572)
    and a user-serviceable battery. That's all I need to know. In 2019 I don't need or want cutting edge. I'm not playing Fortnite on my bloody phone. I just want a good radio/GPS, fast enough for 1080p video (and I'm quite happy with 480p when I can get it) and enough space to load my apps (64 gigs with an SD card slot is plenty).
  • 1st rectangular color CRTs,
    1st totally solid-state color TV,
    Motorola Quasar "Works in a Drawer" modular TVs. Now THAT was Motorola....

  • by epine ( 68316 ) on Tuesday July 02, 2019 @10:46PM (#58864676)

    From the data sheet, the Moto G7 Power XT1955-5 would exactly suit my needs, if it was available in Canada yet. This particular SKU is VoLTE capable on 700 MHz, and I might finally get enough signal indoors not to have to sit by the window for every darn call. It would still xmit at very high power, so I'd need the big fat battery after all.

    But only if the XT1955-5's IMEI is on my local tower's VoLTE white list. Which you can't count on when you buy an unlocked phone, not without making annoying inquiries to your local phone carrier, involving so many deferrals, hand-offs, and holds it could take years off your life.

  • by Anonymous Coward

    For me, the problem with the "modern era" Motorola phones is their choice of battery supplier. Every Motorola phone user I know of (2 Moto Zs, 1 Moto Z Play) has had major battery failures after about a year (Z play battery swelled so much the screen detached from the frame, one Moto Z battery holds no charge at all and the other holds only a little - this is only after a year of light use).

  • Do you want some cheese to go with that whine? You whine as if Motorola owes you something, you are the president of the Motorola fanboi network, or you have a giant Motorola tattoo on your face. Get over it.

  • I remember watching a documentary about land mines in Cambodia. They open up an antipersonnel mine, and there it is: a Motorola-brand chip. Seemed to be common knowledge, at least among the bomb defusers.

    IMO Motorola then began a concerted cover-up, joining various coalitions against landmines and loudly expressing shock to the controlled oligopoly that was the media in 1994 that their product had any association with land mines. It must be repurposed Chinese washing machines! A whole batch of those chips mus

  • by Anonymous Coward

    both the big grand Android platform updates and the important but overlooked security patch updates

    This is why I keep saying Android is literally dead to me and I will no longer buy anything that runs it.

    What you get is fragmented abandon-ware that the company stops supporting the moment they ship it.

    They put a lot of time and energy into skinning it, branding it, and building things to track you and make money from you ... they put zero effort into maintaining it or giving a shit if you get any updates.

    And

  • Apple releases the iPhone. Motorola's response? Check out this new Razr we have, but in GOLD... or HOT PINK!

  • Couldn't agree more. For the last two years I've been cursing at Motorola (now Lenovo, or crap, it's the same thing). I also have problems with Lenovo in my line of work, IT infrastructure for the last 25 years. They are LOUSY. It seems IBM left half of the incompetents to Lenovo and kept the best ones to work with their RISC hardware. With Motorola it's so clear that after the Lenovo brand started to show on the phones, the thing is not the same anymore. Bugs, slow, slow slow updates and
