POLL: x86 vs ARM vs RISC-V; What is your favourite CPU ISA?


Heartbreaker

Diamond Member
Apr 3, 2006
You make it sound as if people explaining why they prefer an ISA over other ones are childish

If you are just interested in complaining about things that have absolutely ZERO impact on your life, I do consider that kind of childish. I see way too much hot air vented on "informing" everyone that "x86 is the worst architecture ever"...

It was not about what is the best ISA for end users at the moment,

Actually there was no context given at all, so it could be about that.

I interpreted it that way, and I'd bet others did as well.

My favorite is reasonably the one I spend my money and time on. Hence I chose x86.
 

Nothingness

Platinum Member
Jul 3, 2013
If you are just interested in complaining about things that have absolutely ZERO impact on your life, I do consider that kind of childish. I see way too much hot air vented on "informing" everyone that "x86 is the worst architecture ever"...
You're barking up the wrong tree. I explained why I think x86 is worse than the other contenders here from an ISA point of view. I don't find that any less childish than saying x86 is my favorite because it runs my game.
 

Heartbreaker

Diamond Member
Apr 3, 2006
You're barking up the wrong tree. I explained why I think x86 is worse than the other contenders here from an ISA point of view. I don't find that any less childish than saying x86 is my favorite because it runs my game.

If you aren't just complaining when it doesn't actually affect you (as most who complain are), then it doesn't apply to you. But too many of these discussions are just people complaining when it doesn't affect them at all, and my comment was aimed generically at those who do.

BTW, why are you programming ARM in Assembly?

The last time I programmed in Assembly was microcontrollers in the early 1990's. These had less than 16 KB of PROM storage and less than 512 bytes of RAM (often less than 128 bytes), so we had to wring out every byte by hand.

It's EXTREMELY rare to require assembly programming on modern high performance cores today.

Back in the early 1990's our shop exclusively used Motorola 8-bit microcontrollers, but on one project we ran out of parts allocation for them and could only get an Intel Architecture microcontroller. It wasn't as clean and simple as the Motorola architecture, where everything was just memory-mapped together. With Intel, everything was segmented (ROM, RAM, ports).

But calling the Intel architecture, even at the assembly level, an "abomination" is laughable. Sure, it's very slightly more complicated, but it might have taken me all of 2 days to get used to it. After that it was a non-issue.

But this was pre-internet, and exaggerated complaining about everything wasn't a major pastime yet.
 

Nothingness

Platinum Member
Jul 3, 2013
BTW, why are you programming ARM in Assembly?
I work in a CPU design team doing modeling of upcoming CPUs. So I'm confronted with all the intricacies of ISAs and with their implementation in silicon.

It's EXTREMELY rare to require assembly programming on modern high performance cores today.
I definitely agree with that (even though outside of my day job, I sometimes need assembly language for high performance computational number theory code).

But calling the Intel architecture, even at the assembly level, an "abomination" is laughable. Sure, it's very slightly more complicated, but it might have taken me all of 2 days to get used to it. After that it was a non-issue.
Being complicated is not what makes something horrible; it makes things funny. But x86 is overly complicated due to its roots. It's getting better with time, but it still shows that it was designed to be compatible at the source level with the 8085.

You might consider it laughable to call the x86 ISA an abomination, but IMHO that proves you don't know enough about x86; and that's not an issue as far as I'm concerned, since knowing too much of it is not worth the pain. As I previously wrote, look at shr reg, cl and tell me it's sane.
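To make that concrete: x86's variable-count shifts hard-wire the count to the CL register, so the count has to be shuffled into ECX before the shift can run, and a count of zero leaves the flags untouched, which adds a dependency on the previous flags value. A minimal GCC/Clang inline-asm sketch (my own illustration, not from this thread) shows the pinned register; the portable fallback is what any-register ISAs let you write directly:

```c
#include <stdint.h>

/* Variable shift on x86: the count MUST live in CL, so the "c"
 * constraint below forces 'count' into ECX before the instruction
 * executes. The other ISAs discussed here let the count come from
 * an arbitrary register. */
static inline uint32_t shr_var(uint32_t value, uint32_t count)
{
#if defined(__x86_64__) || defined(__i386__)
    __asm__("shrl %%cl, %0"        /* SHR r/m32, CL                  */
            : "+r"(value)          /* value may sit in any GPR       */
            : "c"(count)           /* count is pinned to ECX (CL)    */
            : "cc");               /* flags are (usually) clobbered  */
    return value;
#else
    return value >> (count & 31);  /* portable equivalent            */
#endif
}
```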
 
Jul 27, 2020
Waow. I guess you wouldn't tell us where you are working, though.
If his CPU is available for sale and runs something useful, it could be worthy of being added to my collection. That of course depends on whether I'm able to afford it.
 

Nothingness

Platinum Member
Jul 3, 2013
Indeed I won't tell. I will just say the likelihood that you've used a device which contains a CPU I was involved in is close to 100%.

Enough bragging, let's get back to it: x86 STINKS!
An interesting discussion about shr cl on RWT: https://www.realworldtech.com/forum/?threadid=216194&curpostid=216214

EDIT - Of course involvement doesn't mean I was a major contributor, just that I was on the team. I'm no Keller.
 

Nothingness

Platinum Member
Jul 3, 2013
YOU NO FUN!

Prohibited to tell due to employment contract clause?
Yes. Most companies are strict about that. I'm already very careful not to tell things that are not public (I basically search the Web for information to double-check). I follow every rule my company dictates, but who knows. I'm sure several other people in this forum are also not revealing what semiconductor company they are working for (and as far as I can tell, they are also quite cautious about what they write).
 

Tuna-Fish

Golden Member
Mar 4, 2011
Interesting. That's quite a lot of information actually.

I might guess...

It's a lot less info than you'd think. There are tens if not hundreds of different microcontrollers that are found in millions/billions of devices that we have all used. We like to talk about the big chips that go into large computers and cell phones, but there is a long tail of devices that never get more than a few tens of kB of memory, and while they are less sexy, in a way they are just as critical.

A USB power brick, a microwave oven, and a digital wall clock each has a CPU much more powerful than the ones that were used to get to the moon and land on it.
 
Jul 27, 2020
Indeed I won't tell. I will just say the likelihood that you've used a device which contains a CPU I was involved in is close to 100%.
So it's a teensy weensy ubiquitous CPU found almost everywhere in a device that everyone at one time or another has used.

Hmmm....

You didn't say ISA. You said CPU.

Sounds to me like a StrongARM CPU in some storage device (HDD most likely).

But it could also be some tiny CPU taking care of the NVMe protocol in SSDs.

And if it's not a storage related CPU, then it's gotta be some minuscule almost invisible CPU taking care of some mundane task in a smartphone.
 
Jul 27, 2020
@Nothingness

A question for you since I'm hopeful that you will have a good answer for it.

Why isn't there some open source tool that takes an EXE as source and spits out a rewritten EXE with all kinds of badly-coded bottlenecks removed? In other words, something that analyzes the binary code of an EXE and replaces the bad instructions with equivalent instructions that are known not to give seizures to a particular CPU family? So suppose we give this tool a CRYSIS.EXE and it adds multiple execution paths to the newly rewritten CRYSIS-OPTIMIZED.EXE, which detects the CPU family at startup and executes the appropriate codepath? Then CRYSIS would easily run at max possible fps on Pentium 4, AMD Bulldozer, Rocket Lake and maybe even Lakefield!
 

FlameTail

Platinum Member
Dec 15, 2021
I'm sure several other people in this forum are also not revealing what semiconductor company they are working for (and as far as I can tell, they are also quite cautious about what they write).
Not everybody here is working for a semiconductor company, right?

There are some basement dwelling cavemen as well....
 

Nothingness

Platinum Member
Jul 3, 2013
@Nothingness

A question for you since I'm hopeful that you will have a good answer for it.

Why isn't there some open source tool that takes an EXE as source and spits out a rewritten EXE with all kinds of badly-coded bottlenecks removed? In other words, something that analyzes the binary code of an EXE and replaces the bad instructions with equivalent instructions that are known not to give seizures to a particular CPU family? So suppose we give this tool a CRYSIS.EXE and it adds multiple execution paths to the newly rewritten CRYSIS-OPTIMIZED.EXE, which detects the CPU family at startup and executes the appropriate codepath? Then CRYSIS would easily run at max possible fps on Pentium 4, AMD Bulldozer, Rocket Lake and maybe even Lakefield!
That's a complex problem that requires decompilation, optimization, etc.

Binary instrumentation tools can be used for runtime optimization. DynamoRIO used to be able to do that. See section 9.2 of Derek Bruening's PhD thesis, where he presents the speedups he got on IA-32 for SPEC CPU2000. I don't know if anyone is still using it for dynamic code optimization.

As far as static code optimization of binaries goes, it's much more complex, as you have to statically identify code paths. Digital's FX!32 tool, which was used to run IA-32 code on Alpha, used a combination of static and dynamic optimizations to efficiently translate code.

I guess Rosetta 2 also does some limited static optimization (limited because being aggressive might change behavior and break compatibility).
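The "multiple execution paths chosen at startup" part of the idea is, by the way, something compilers can already do from source. A minimal sketch using GCC/Clang function multiversioning (the function name and target list here are just an illustration, not anything from CRYSIS):

```c
#include <stddef.h>

/* The compiler emits one clone of this function per listed target plus
 * a resolver that picks the best clone once, at program start, based on
 * the CPU it detects. */
__attribute__((target_clones("avx2", "sse4.2", "default")))
void scale_samples(float *dst, const float *src, float gain, size_t n)
{
    for (size_t i = 0; i < n; i++)
        dst[i] = src[i] * gain;
}
```

Doing the same thing to an already-linked EXE is what runs into the static-analysis problems above: the rewriter would first have to recover the function boundaries and data flow that the compiler threw away.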
 

Nothingness

Platinum Member
Jul 3, 2013
So it's a teensy weensy ubiquitous CPU found almost everywhere in a device that everyone at one time or another has used.

Hmmm....

You didn't say ISA. You said CPU.

Sounds to me like a StrongARM CPU in some storage device (HDD most likely).

But it could also be some tiny CPU taking care of the NVMe protocol in SSDs.

And if it's not a storage related CPU, then it's gotta be some minuscule almost invisible CPU taking care of some mundane task in a smartphone.
I was involved in several projects ranging from small embedded DSPs (last century) up to application CPUs. So, different ISAs.
 

Heartbreaker

Diamond Member
Apr 3, 2006
I definitely agree with that (even though outside of my day job, I sometimes need assembly language for high performance computational number theory code).

That looks more like using the tools of your job in your hobby. It's probably around 1 in 10,000 programmers that uses Assembly code today, and obviously that number shrinks vastly in the general public.

I worked in Telecom from the early 90's to 2009, much of it on low-level switching gear like optical interface cards, directly setting up and controlling all the interfacing chips. After about 1993 I never saw assembly programming again, and even in the early 90's it was only because I was using low-end 8-bit microcontrollers where I needed to squeeze out every byte. So even 30 years ago, when I was doing it, Assembly was kind of rare.

It's almost circular now. The only people that need to be concerned with the architecture enough to get down to the Assembly level are people like you, involved in the design of the architecture.

Which is also why (present company excluded) most of the complaints about the x86 architecture are from people who have no clue about it and are not impacted by it in any way.

Which makes these, "Let's attack x86 architecture" threads tedious.
 

kschendel

Senior member
Aug 1, 2018
Why isn't there some open source tool that takes an EXE as source and spits out a rewritten EXE with all kinds of badly-coded bottlenecks removed?

Doing a peephole optimization is pretty simple; however, any decent compiler should be doing it as well, so there's not much hope of improvement there.
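For a sense of scale, a peephole pass really is just a small loop over decoded instructions. A toy sketch (the Insn struct and opcode names are invented for illustration) of the classic "mov reg, 0" to "xor reg, reg" rewrite; note that even this tiny rewrite needs flag-liveness information, which is exactly what is hard to recover from a bare binary:

```c
#include <stdbool.h>
#include <stddef.h>

/* Invented decoded-instruction form, purely for illustration. */
typedef enum { OP_MOV_IMM, OP_XOR_RR, OP_OTHER } Opcode;

typedef struct {
    Opcode op;
    int    dst;         /* destination register number                */
    long   imm;         /* immediate operand, if any                  */
    bool   flags_live;  /* are the flags read again before redefined? */
} Insn;

/* Classic peephole: "mov reg, 0" -> "xor reg, reg" (smaller encoding).
 * XOR writes the flags while MOV does not, so the rewrite is only safe
 * when no later instruction consumes the flags set at this point. */
static void peephole(Insn *code, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        if (code[i].op == OP_MOV_IMM && code[i].imm == 0 && !code[i].flags_live)
            code[i].op = OP_XOR_RR;
    }
}
```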

The big wins are algorithmic 99.999% of the time, and that's not something you can figure out easily at all. You'd have to decompile the binary, then somehow magically figure out what it's trying to accomplish, and plug in a better way. That's very difficult for people working from commented source, never mind computer programs working from uncommented binary.

BTW, I realize that in my original response to this thread, I should have included the DEC VAX. It was a lot of fun to write assembler in, even if it wasn't an easily implemented ISA from a hardware standpoint. Writing x86 assembly is depressing; fortunately I rarely have to deal with it.
 
Jul 27, 2020
BTW, I realize that in my original response to this thread, I should have included the DEC VAX. It was a lot of fun to write assembler in, even if it wasn't an easily implemented ISA from a hardware standpoint. Writing x86 assembly is depressing; fortunately I rarely have to deal with it.
Help us understand why DEC VAX assembly was fun and why x86 assembly is depressing. We (or even just I) would like to feel the despair without actually touching the turdy code.
 

Nothingness

Platinum Member
Jul 3, 2013
One thing I've constantly read in articles complaining about x86 is that it doesn't have enough registers. I assume they mean General Purpose registers coz this dang ISA has more than 556 registers!

The way he counts is silly: he counts sub-registers as distinct registers. This makes no sense in most cases (the exception being the H/L parts of 16-bit registers, which can be accessed as independent 8-bit registers).

The lack of registers in the ISA is a pain mostly for assembly language programmers. From a performance point of view it doesn't matter too much, thanks to out-of-order execution and register renaming. But yeah, as a programmer it's a pain, in particular when some registers have a dedicated usage (should I mention SHR again?).
 

Tuna-Fish

Golden Member
Mar 4, 2011
BTW, why are you programming ARM in Assembly?
I wrote assembly for a real product less than 10 years ago!

It was a cellphone app that did some audio processing in software. The routine that did the work was really simple and very amenable to SIMD, and it took enough time and power on older models that we wanted it in SIMD, but for some reason autovectorization was making a total mess of the code. We tried massaging it into a form that the compiler liked; that didn't work. We tried intrinsics and ran into a different problem (the compiler was allocating too many registers, forcing excessive spills). In the end I said screw it and wrote the function in assembly, and it turned out really nice.
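For a feel of why such a routine is "very amenable to SIMD", the kernel might look something like this with ARM NEON intrinsics (the function, its parameters, and the gain operation are invented; this only sketches the pattern, not the actual product code):

```c
#include <arm_neon.h>
#include <stddef.h>

/* Apply a fixed gain to an audio buffer, four float samples at a time.
 * n is assumed to be a multiple of 4 to keep the sketch short. */
void apply_gain(float *samples, float gain, size_t n)
{
    float32x4_t vgain = vdupq_n_f32(gain);       /* broadcast the gain   */
    for (size_t i = 0; i < n; i += 4) {
        float32x4_t v = vld1q_f32(samples + i);  /* load 4 samples       */
        v = vmulq_f32(v, vgain);                 /* scale them           */
        vst1q_f32(samples + i, v);               /* store them back      */
    }
}
```

A loop this regular keeps only a couple of vector registers live, which is why a hand-written version can stay tight where a compiler that insists on spilling makes a mess.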
 