Can a computer invent its own instruction set architecture?

This is an archive of a topic from NESdev BBS, taken in mid-October 2019 before a server upgrade.
Can a computer invent its own instruction set architecture?
by on (#234208)
Take an existing CPU instruction set, take a big chunk of code written by human programmers, and have a computer analyze how frequently each instruction is used. When it detects repeated groups of instructions, those groups get added to the instruction set as new instructions, and infrequent instructions either get removed or moved behind a prefix byte.
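The frequency-analysis step described above is easy to prototype. Here's a minimal sketch in Python: it counts how often each run of consecutive opcodes appears in a trace and picks the most common pairs as candidates for fused instructions. The mnemonics and function names are made up for illustration, not taken from any real tool.

```python
from collections import Counter

def ngram_frequencies(opcodes, n=2):
    """Count how often each run of n consecutive opcodes appears."""
    counts = Counter()
    for i in range(len(opcodes) - n + 1):
        counts[tuple(opcodes[i:i + n])] += 1
    return counts

def fusion_candidates(opcodes, n=2, top=4):
    """Return the `top` most frequent opcode groups -- candidates
    for new fused instructions."""
    return [grp for grp, _ in ngram_frequencies(opcodes, n).most_common(top)]

# Toy trace of mnemonics standing in for disassembled real code.
trace = ["LDA", "STA", "LDA", "STA", "INX", "LDA", "STA", "CMP", "BNE"]
print(fusion_candidates(trace))  # the LDA/STA pair dominates this trace
```

A real tool would of course run over gigabytes of disassembled binaries and weight candidates by dynamic execution counts, not just static occurrence.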
Re: Can a computer invent its own instruction set architect
by on (#234209)
That's an interesting question. Computerphile videos are all into machine learning, and I guess this fits the bill -- have a neural network iterate over many different instruction [types] and choose the best n for the job.
Re: Can a computer invent its own instruction set architect
by on (#234211)
This would be the part of the Terminator movie when the machine becomes smart enough to wipe 99% of humanity off the planet.
Re: Can a computer invent its own instruction set architect
by on (#234217)
Doubt you'd gain much performance just by tweaking the instruction set. See, for example, x86 (a terrible ISA) having better performance than Itanium. And fewer instructions (RISC) tend to be better than more.

Computers are clearly the solution to chip design, but there are so many factors it's a really tough problem.
Re: Can a computer invent its own instruction set architect
by on (#234220)
Regarding the thread starter: there are people who make a point of studying "optimal" ISAs. You can look for discussions around the invention of the RISC-V architecture to see how people approach this. There's also the "Mill" architecture, which is doing some really wacky things ... so wacky it's hard to evaluate. (But the inventor's lecture series on YouTube about how the Mill architecture works does present some fascinating ideas.)

pubby wrote:
And fewer instructions (RISC) tend to be better than more.
Seemed to be.

RISC is the right way to get "any computer" as cheaply as possible, but the past twenty years have shown that simpler and more orthogonal instruction sets are actually not very helpful for performance. The superscalar machinery needed to make them fast eats a large amount of die space.

Ideally, you'd have an ISA with an infinite number of registers, and every instruction has complete orthogonality, and every instruction fits in 0 bits. Something has to give for a real-world thing; orthogonality is often a first victim. Fancy-seeming instructions (like bit test and bit set), despite being redundant with other instructions, are too critical for real-world application performance to not include as first-class operations. The performance of signal processing instructions (such as multiply-and-accumulate, or various SIMD things) is also paramount. Before long you discover you've built a weird CISC ISA that sure isn't RISC by anything but the most generous definitions of the term—it's just that it's a very different set of first-class instructions than the x86 or 68k had.
Re: Can a computer invent its own instruction set architect
by on (#234234)
There is a paper on Huffman-compressing instructions, then running that with realtime decoding, without decompressing the entire thing. IIRC it was on ARM, and achieved decent performance with code size about halved. You should find it on arxiv.
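The basic effect is easy to demonstrate: since a handful of opcodes dominate real code, giving them short variable-length encodings shrinks the stream. Below is a minimal sketch (not the paper's actual scheme, and ignoring the hard part of decoding at speed) that builds Huffman code lengths from opcode frequencies of a toy instruction stream and compares the total size against fixed 8-bit opcodes. All opcode names and frequencies are invented for illustration.

```python
import heapq
from collections import Counter

def huffman_code_lengths(freqs):
    """Return {symbol: code length in bits} for a Huffman code
    built from a {symbol: frequency} mapping (needs >= 2 symbols)."""
    # Each heap entry: (total frequency, tiebreaker, {symbol: depth}).
    heap = [(f, i, {sym: 0}) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tick = len(heap)
    while len(heap) > 1:
        fa, _, a = heapq.heappop(heap)
        fb, _, b = heapq.heappop(heap)
        # Merging two subtrees pushes every symbol one level deeper.
        merged = {s: d + 1 for s, d in {**a, **b}.items()}
        heapq.heappush(heap, (fa + fb, tick, merged))
        tick += 1
    return heap[0][2]

# Toy instruction stream: a few opcodes dominate, as in real code.
stream = ["LD"] * 40 + ["ST"] * 25 + ["ADD"] * 20 + ["BR"] * 10 + ["MUL"] * 5
lengths = huffman_code_lengths(Counter(stream))
fixed_bits = 8 * len(stream)                   # fixed 8-bit opcodes
huff_bits = sum(lengths[op] for op in stream)  # variable-length opcodes
print(f"{huff_bits / fixed_bits:.0%} of fixed-width size")
```

Real instructions also carry operands, so the achievable ratio is far less dramatic than this opcode-only toy suggests; roughly halving code size, as the paper reportedly did, is more realistic.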
Re: Can a computer invent its own instruction set architect
by on (#234235)
calima wrote:
There is a paper on Huffman-compressing instructions, then running that with realtime decoding, without decompressing the entire thing. IIRC it was on ARM, and achieved decent performance with code size about halved. You should find it on arxiv.

Does that have any advantage over the THUMB instruction set, which also halves code size (but typically hurts performance)?

As for the original question, it's complex but I don't think that's the case - a computer could be useful in designing instruction sets for other computers at design time, but not so much for modifying its own instruction set - that would require an FPGA reprogramming itself at runtime, which is technically possible but ridiculously complex for little gain. What modern CPUs really do, however, is transcode x86 instructions into some internal instruction set in real time before executing them, because supposedly that's more performant than running x86 instructions directly (something which was given up somewhere around the Pentium 4). I could be wrong or have misunderstood something.

In any case, the beauty and simplicity of the 6502 is missed on modern CPUs :)
Re: Can a computer invent its own instruction set architect
by on (#234262)
Bregalad wrote:
As for the original question, it's complex but I don't think that's the case - a computer could be useful in designing instruction sets for other computers at design time, but not so much for modifying its own instruction set


I meant "its own ISA" as in "its own invention".