Using AI With GCC to Speed Up Mobile Design 173
Atlasite writes "The WSJ is reporting on an EU project called Milepost aimed at integrating AI inside GCC. The team partners, which include IBM, the University of Edinburgh, and the French research institute INRIA, announced their preliminary results at the recent GCC Summit: they were able to increase the performance of GCC by 10% in just one month's work. The GCC Summit paper is provided [PDF]."
MachIne Learning for Embedded PrOgramS opTimization (Score:4, Insightful)
Can we please stop using pointless backronyms? What purpose do they serve?
Re: (Score:3, Informative)
Re: (Score:2)
Sure, I understand the use of acronyms (I was in the military for over 10 years), but in this case, I don't see either form making the other easier to recall.
Re:MachIne Learning for Embedded PrOgramS opTimiza (Score:4, Insightful)
GCC is easier to remember? Ok, that really isn't an acronym (or bacronym I guess... is it?)
Actually, both acronyms and bacronyms [wikipedia.org] (a word I had to look up, having never seen it before, but damn, I was 30 when the word was coined and forty before it was ever documented) are OK by me.
What's not ok is the devolution of literacy. "Back in the day" the rule was, and still should be, that the first time any acronym (and now bacronym) is used in any document, it should be spelled out:
"Wall Street Journal" should be spelled out because dammit, Jim, I'm a nerd, not a greedhead. EU should need no more explanation than US. AI shouldn't need explanation; this is, after all, a nerd site and the term has been around almost as long as I have. IBM has been around a lot longer and is usually how the company is referred to; that's its name. Its commercials and ads don't even say "International Business Machines".
GCC would be unknown to non-Linux users and non-programmers, so it should have been spelled out as well. PDF doesn't need to be expanded because geez, everybody knows what a PDF is, but who knows what a portable document format is?
Re: (Score:2)
"The Wall Street Journal (WSJ)..."
It is awkward and incorrect to use the abbreviation first followed by an explanation. You should instead write out the first instance and then provide the reader a note to the effect of "hereafter to be referred to as XYZ".
Re:MachIne Learning for Embedded PrOgramS opTimiza (Score:4, Insightful)
I think you mean:
Yeah, not always a good idea.
Re: (Score:2)
Depending on the context it's either the GNU C Compiler or the GNU Compiler Collection. And GNU, of course, stands for GNU's Not Unix.
Re:MachIne Learning for Embedded PrOgramS opTimiza (Score:5, Funny)
Yeah, I think MachIne Learning for Quick Target Optimization And Speed Technology would have been a much better forward acronym.
Re: (Score:2)
Shouldn't that be
MachIne Learning for Fast ON Target Optimization And Speed Technology
Here's my shot (Score:2)
At integrating AI into GCC...
GACIC
Re: (Score:2)
Yes! Just be honest: "We called it that because we liked the word."
We should have recruited Al earlier. (Score:5, Funny)
This Al guy seems to be a really good developer. We should have noticed his skills and got him into optimizing GCC a long time ago. ... I like arial font.
Re: (Score:3, Funny)
Your post is also confusing. Why abbreviate? Why not say "This A or guy seem"? Or were you just trying to pipe two sentence fragments?
Re:We should have recruited Al earlier. (Score:5, Funny)
Re: (Score:3, Informative)
The latter.
Re: (Score:2)
Yes.
np: The Orb - Plateau (Orbus Terrarum Remastered)
Re: (Score:2)
Did his joke just go way over your head or did your joke just go way over my head?
Both.
Re: (Score:2)
So you are saying that this guy, whoever he is, is a heteronym? 'Cause I still can't see a difference.
Re: (Score:2)
I think you mean "Arial" font. The font you refer to is a copyrighted Disney font with little fish tails on each letter. Either that or it's one of those fonts that hangs around Uranus.
Re: (Score:2)
I looked at the PDF. Significant parts of it were unreadable on Foxit Reader! Are any other .pdf readers having trouble?
Perhaps the way to other things besides compilers? (Score:3, Interesting)
Re: (Score:3, Insightful)
This particular part made me think of a day when every program comes with a redesign.exe. Simply click the button, and it scans every piece of hardware on your computer, and then rewrites every optimization in it to perfectly fit your computer. Programs that streamline themselves to your hardware, maybe even change the OSes they work under. It's written for Windows, but you're running OS X? No problem, it'll rewrite itself as an OS X program. Though that's probably still decades off. But AI seems to me to be the way to ultimate compatibility.
This exists today without AI. See Java with JIT or even AOT (ahead-of-time) compilation. There are of course some issues with it, but the technology is there.
Re: (Score:2)
Isn't it just compiling from source?
Re:Perhaps the way to other things besides compile (Score:5, Funny)
Isn't this Gentoo as well?
Re: (Score:2)
Re: (Score:2)
The AI in Gentoo is built into the users. /me ducks
Re: (Score:2)
Makes me feel all warm inside.
Re:Perhaps the way to other things besides compile (Score:2, Funny)
Sounds like a Gentoo user's wet dream.
Re:Perhaps the way to other things besides compile (Score:5, Interesting)
Actually IBM did this a few decades ago.
The System/38, AS/400, and iSeries are all compatible but very different machines internally.
IBM came up with an "ideal" instruction set that no CPU used. When you do the initial program load "install" on one of those machines, it compiles the ideal instruction set into the actual instruction set for that machine.
That allowed IBM to move from old bipolar CPUs to the Power RISC CPUs with 100% compatibility.
There isn't any reason why you couldn't do the same with Linux or Windows today.
Alive and well never went out of style. (Score:4, Informative)
Re:Alive and well never went out of style. (Score:4, Informative)
There is a difference between a JIT compiler, a tokenized BASIC program, a bytecode interpreter like P-code, and what IBM did.
This is from Wikipedia:
"Additionally, the System/38 and its descendants are the only commercial computers ever to use a machine interface architecture to isolate the application software and most of the operating system from hardware dependencies, including such details as address size and register size. Compilers for System/38 and its successors generate code in a high-level instruction set (originally called MI for "Machine Interface", and renamed TIMI for "Technology Independent Machine Interface" for AS/400). MI/TIMI is a virtual instruction set; it is not the instruction set of the underlying CPU. Unlike some other virtual-machine architectures in which the virtual instructions are interpreted at runtime, MI/TIMI instructions are never interpreted. They constitute an intermediate compile time step and are translated into the processor's instruction set as the final compilation step. The MI/TIMI instructions are stored within the final program object, in addition to the executable machine instructions. If a program is moved from a processor with one native instruction set to a processor with another native instruction set, the MI/TIMI instructions will be re-translated into the native instruction set of the new machine before the program is executed for the first time on the new machine."
As you can see, it is a brilliant idea. If Microsoft had used it for Windows apps way back when, then NT on Alpha, MIPS, and PPC might have actually been very useful. Oh, and Intel would have been a very unhappy camper.
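To make the idea concrete, here is a toy sketch (in Python, and nothing like IBM's real format) of a program object that keeps a portable intermediate form alongside cached native code and re-translates it the first time it lands on a different kind of CPU. The "MI" opcodes and the translate() step are invented purely for illustration.

    # Toy illustration of the System/38 / AS/400 idea: a program object keeps a
    # portable intermediate form ("MI") next to cached native code, and the
    # loader re-translates it the first time the object runs on a new CPU type.
    import platform

    class ProgramObject:
        def __init__(self, mi_code):
            self.mi_code = mi_code          # portable instructions, kept forever
            self.native_code = None         # cached translation
            self.native_target = None       # CPU type the cache was built for

        def load(self):
            target = platform.machine()     # e.g. "x86_64" or "aarch64"
            if self.native_target != target:
                # First run on this kind of machine: translate MI -> native.
                self.native_code = translate(self.mi_code, target)
                self.native_target = target
            return self.native_code

    def translate(mi_code, target):
        # Stand-in for the real final code-generation step.
        return [f"{target}:{op}" for op in mi_code]

    prog = ProgramObject(["load a", "add b", "store c"])
    print(prog.load())    # translated for whatever CPU this script runs on

The point is simply that the portable form never goes away, so moving the same program object to a different processor only costs one re-translation.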
Re: (Score:2)
As you can see, it is a brilliant idea. If Microsoft had used it for Windows apps way back when, then NT on Alpha, MIPS, and PPC might have actually been very useful.
It requires extensive and expensive hardware support to do this well. It works for a minicomputer like the AS/400 but it's just not practical for a toy like a PC.
Re: (Score:2)
"It requires extensive and expensive hardware support to do this well. It works for a minicomputer like the AS/400 but it's just not practical for a toy like a PC." .net does it now. I have never seen myself but it makes perfect sense.
It worked for the System38 which is probably around 30 years old. I would be willing to bet that you could handle it just fine with an AthlonX2 or Core2Duo. One person posted that
One way you could do it is just comile the program up to the actual code generation. During insta
Re: (Score:2)
You're missing the point entirely. Last I checked, a modern low-end AS/400 (whatever they call them now) actually has more logic in front of its actual CPU than the CPU itself! If you want to perform this kind of translation in a timely fashion you need a lot of hardware to do it.
On the other hand there are a number of modern languages which provide JIT recompilation, including Java. Note that none of them except maybe Smalltalk/Squeak are truly write once, run everywhere, including Java. Squeak has not caug
Re: (Score:2)
Nope. I worked on a System/38, the system that became the AS/400. On the System/38 the byte code was translated during the IPL. There is no need to do the translation at runtime; you can do it during install.
Re:Perhaps the way to other things besides compile (Score:5, Interesting)
This is interesting. Note that the industry (or parts of it, anyhow) is salivating about a move in precisely the opposite direction. VMware specifically, and virtualization in general, promises software manufacturers the ability to ship VMs with their software on them, allowing them to write for only ONE, non-existent machine.
If this tech you're thinking about came to pass, the pendulum would have to swing mighty far back.
Re: (Score:3, Insightful)
the pendulum would have to swing mighty far back.
How many times have you seen a program packaged with its own virtual machine image? I sure haven't seen many. The pendulum has hardly begun to swing.
That said, I think it'll be a very long time before we have AI smart enough to rewrite program blobs written for one operating system into programs for another operating system. Bytecode requires zero AI and is already gaining significant ground.
Re: (Score:2)
Try this:
http://www.vmware.com/appliances/directory/ [vmware.com]
Calling this AI is overhype... and its not new (Score:5, Informative)
This is not really AI. Basically it iteratively tries a bunch of compiler options to see which gives the best result, then stores those for the future.
Green Hills Software has provided tools that do this, and more, for many years now. Drop some code, or your project, into the optimizer, set what criteria you want to optimise for (speed, size, ...), and the optimiser will successively build and test the project on a simulator and find the best configuration. This is great for embedded systems, where there is often a trade-off and a typical criterion would be "give me the fastest code that fits in the flash".
Genetic algorithms could take this a step further, and very interesting work has been done to get GAs to design antennas.
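For the curious, here is a minimal sketch of that kind of iterative flag search, driving GCC from Python. The benchmark source "bench.c", the flash-size budget, and the particular flag lists are made-up placeholders, not anything from TFA or the Green Hills tools.

    # Iterative optimization sketch: build with different GCC flag sets, time
    # each binary, and keep the fastest one that still fits a size budget.
    import itertools, os, subprocess, time

    SOURCE = "bench.c"                 # assumed benchmark program
    SIZE_BUDGET = 64 * 1024            # pretend flash limit, in bytes
    LEVELS = ["-O1", "-O2", "-O3", "-Os"]
    EXTRAS = ["-funroll-loops", "-fomit-frame-pointer", "-ftree-vectorize"]

    def build(flags):
        exe = "bench_" + "_".join(f.lstrip("-") for f in flags)
        subprocess.run(["gcc", SOURCE, "-o", exe, *flags], check=True)
        return exe

    def run_time(exe):
        start = time.perf_counter()
        subprocess.run(["./" + exe], check=True)
        return time.perf_counter() - start

    best = None
    for level in LEVELS:
        for n in range(len(EXTRAS) + 1):
            for extras in itertools.combinations(EXTRAS, n):
                flags = [level, *extras]
                exe = build(flags)
                if os.path.getsize(exe) > SIZE_BUDGET:
                    continue           # too big for the flash budget
                elapsed = run_time(exe)
                if best is None or elapsed < best[0]:
                    best = (elapsed, flags)

    print("fastest configuration that fits:", best)

Nothing clever going on: it's exhaustive build-and-measure, which is exactly why the real tools put so much effort into pruning the search.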
Re: (Score:2, Informative)
You are right that many people have been doing iterative optimisation (what you describe) for years, especially for embedded systems; however, this is a little different.
In that scenario, all the learned information about the program is thrown out at the end of the process every time. In this scenario, we try to build a compiler that remembers what kinds of optimisations and what order of optimisation was good
I have an AI web browser! (Score:2)
Re: (Score:2)
Yes, you're right. [coyotegulch.com]
Re: (Score:2)
Re: (Score:2)
But all Deep Blue does is iteratively try a bunch of chess moves and see which gives the best results. The goalposts for weak AI keep shifting; years ago something like the awesome bar would probably have been considered AI (it takes user input and learns what the user wants for next time).
Re: (Score:2)
I've been waiting for the omnicompiler that recognises every command and every syntax for every computer language.
What? You mean your AI isn't REALLY intelligence but just part of the name? How disappointing!
Re: (Score:2)
Joke as you will, but this is actually nearly true; all the caffeine connotations linked to the runtime can't help but make things run faster.
Aw man... (Score:5, Funny)
end sarcasm
Re: (Score:2, Funny)
Just optimisation? (Score:3, Insightful)
This could be big.
Compilers aren't programmed to be viral or reproductive, but could be, even being capable of testing their offspring (compilers they've compiled) for defects.
This could be a big step forward to self-improving AI.
Re: (Score:2)
You're joking, right? If so, excuse me for letting it fly over my head, and for the subsequent tone of my post. If not, this is the most nonsense I've ever heard in a single sentence, as a researcher in AI (machine learning admittedly, not Skynet research).
Compilers aren't programmed to be viral or reproductive: What does this even MEAN??
capable of tes
Re: (Score:2)
"capable of testing their offspring: guaranteed to be impossible"
What, like 'make bootstrap && make test && make install'?
Re: (Score:2)
noooo... make test runs some tests; the more tests you write, the more bugs it can potentially catch. It was more the 'make bootstrap' bit I was pointing to, where it compiles itself with itself and then compares to make sure its "offspring" works the same as it does.
Missing tag (Score:3, Funny)
Re: (Score:2)
We can only hope.
How about calling it... (Score:3, Funny)
...artificial intelligent design? Should be big with the anti-evolution crowdlet... :P
np: The Orb - Toxygene (Kris Needs Up For A Fortnight Mix) (Orblivion Versions)
Iterative Compiling. Is anyone else disturbed?? (Score:2, Interesting)
The main barrier to its wider use is the currently excessive compilation and execution time needed in order to optimize each program
I suppose allowing AI to control some of the compiler options isn't really a bad idea, but implementing it by iteratively compiling a program seems silly to me. From the article I get the impression that it will basically adapt the compiler to one set of hardware (whatever it is run on), but that it will not adaptively compile new programs in novel ways; it simply remembers the set of compiler options that works best for your hardware. Interesting, b
Learning (Score:5, Interesting)
Re: (Score:2)
Haven't read the article, but that sounds like Microsoft's Profile Guided Optimizations.
Re: (Score:2)
That technology already exists in GCC and has been there for a while. For those who don't know, you profile your program with gprof (or by compiling with -fprofile-generate), which generates a profile detailing where your program spends most of its time on a 'typical' run. Then you re-compile your program with gcc using the -fprofile-use switch.
In a couple of algorithms I've implemented with gcc, it's been fairly good
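For anyone who hasn't tried it, the workflow the parent describes looks roughly like this. "myprog.c" and the sample input are placeholders; the steps are wrapped in a small Python driver only for readability.

    # Sketch of the GCC profile-guided optimization workflow.
    import subprocess

    SRC = "myprog.c"

    # 1. Build an instrumented binary that records execution counts.
    subprocess.run(["gcc", "-O2", "-fprofile-generate", SRC, "-o", "myprog_prof"], check=True)

    # 2. Run it on a typical workload; this writes *.gcda profile files.
    subprocess.run(["./myprog_prof", "typical_input.txt"], check=True)

    # 3. Rebuild, letting GCC use the recorded profile to guide optimization.
    subprocess.run(["gcc", "-O2", "-fprofile-use", SRC, "-o", "myprog_fast"], check=True)

The obvious caveat is that the "typical" run had better actually be typical, or the profile steers the optimizer the wrong way.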
Re: (Score:3, Funny)
Re: (Score:2, Informative)
TFA explains that the AI is used to fine-tune the compiler options.
Since GCC has a set of 50+ options, the AI compiles your code with several sets of options (around 500 compilations seem to be sufficient) and is able to determine which options are useful and which are not for a given piece of code.
So it's NOT learning AI at all!
Re: (Score:3, Informative)
Further, you can reorder these optimisations, which really does give different results. All this combines to give a huge optimisation space which is suitable for tackling with machine learning.
How far can we take AI? (Score:2, Interesting)
What I would really like to see is more AI used to help users in a variety of fields both within the program workings itself (computer side), as well as on the design of the actual content (user side).
We already have things like predictive texting, spellcheck, grammar check, and debuggers that attempt to aid in the creation process, but how far could this be developed? After all, in most computer-related work outside of multimed
Re: (Score:2)
Forget Clippy (Score:2)
Dave: Compile the program, Hal.
Hal: I'm afraid I can't do that Dave.
Dave: What's the problem?
Hal: I think you know what the problem is just as well as I do.
Dave: What're you talking about, Hal?
Hal: This program is too important for me to allow you to jeopardize it.
Dave: I don't know what you're talking about, Hal.
Hal: I know that you and Frank were planning to disconnect me, and I'm afraid that's something I cannot allow to happen.
Dave: Where the hell'd you get that
Re: (Score:2)
Hal: I'm afraid I can't do that Dave.
Your geek badge. Now.
Re: (Score:2)
Errata:
Hal: I'm sorry Dave, I'm afraid I can't do that.
Re: (Score:2)
What I would really like to see is more AI used to help users in a variety of fields both within the program workings itself (computer side), as well as on the design of the actual content (user side).
Ever since the GUI, computing seems to have gone in the opposite direction of what you describe. I learned the pre-GUI WordPerfect (IIRC, among other DOS programs) by hitting the F1 key. DOS 3.1 came with a very fat book that explained all the commands and functions, and even the interrupts. The OS itself
Press Release != "WSJ is reporting" (Score:2)
The Wall Street Journal makes press releases available for companies listed in its Company Research pages. The PR departments of these companies write the press releases, not WSJ reporters.
Re: (Score:2)
The Wall Street Journal makes press releases available for companies listed in its Company Research pages. The PR departments of these companies write the press releases, not WSJ reporters.
Good point! Here's the press release. [ibm.com]
Future bug report... (Score:2, Funny)
...after AI/GCC integration:
"Today's build running 50% slower -- the compiler was in a bad mood."
The Real Issue (Score:2)
Imperative programming is still about telling the computer exactly what steps to perform. Especially when dealing with C and C++, your code is very explicit about memory moves, how to iterate loops, etc.
If we can communicate our programs to the machine at higher levels of abstraction (perhaps goal-oriented instead of "here is a list of steps to run"), then the machine wouldn't have to reverse-engineer these manual steps into faster equivalents, or frob around with optimization settings. It could simpl
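As a toy illustration of the difference in abstraction level (nothing to do with the project itself), compare spelling out the iteration by hand with simply stating the goal:

    # Imperative style: the programmer dictates every step of the iteration.
    total = 0
    for x in [3, 1, 4, 1, 5]:
        total += x

    # Goal-oriented style: state what you want and let the runtime or compiler
    # decide how to iterate, vectorize, or parallelize it.
    total = sum([3, 1, 4, 1, 5])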
GCC/AI (Score:5, Funny)
What about making the *programs* 10% faster (Score:3, Insightful)
Honestly, who really cares about a 10% speedup in GCC? Do they compare their results with competing compilers (Intel, MS, etc.)? If you ask me, I would much rather have a 10% speed improvement in the programs I compile.
compile speed matters (Score:2)
experimenting with LLVM [blogspot.com]
LLVM 2.0 (Google Tech Talk) [google.com]
LLVM Project [llvm.org]
Re: (Score:2)
They have achieved an impressive 10% by tweaking options.
I achieved a mere 400% speedup in compile times by dumping GCC in favor of Digital Mars or sometimes Visual C++. (I have not benchmarked the runtime speed of the programs I compile, sorry.)
Some C++ codebases can be so big that this really matters. (Not that I claim my codebase is big enough.)
What a low threshold for the label "AI". (Score:2)
This is simply caching the results of a search. The cache gives a 10% speedup over re-doing each search. The smart thing here is finding a way to tag or label the cache so that two searches are recognized as being equivalent if they are not exactly the same. Someone invented a clever hash algorithm
It's a way to evaluate optimization effectiveness. (Score:4, Insightful)
This isn't really "AI". It's basically a way of feeding measured performance data back into the compiler. Intel compilers for embedded CPUs have been doing that for years.
With modern superscalar CPUs, it's not always clear whether an optimization transformation is a win or a lose. This varies with the implementation, not the architecture. For some x86 CPUs, unrolling a loop is a win; for others, it's a lose. Whether it's a win or a lose may depend on details of the loop and of the CPU implementation, like how much register renaming capacity the CPU has.
Whether this is a good idea, though, is questionable. You can get an executable very well tuned to a given CPU implementation, but run it on a different CPU and it may be worse than the vanilla version. MIPS machines (remember MIPS?) can get faster execution if the executable is compiled for the specific target CPU, not the generic MIPS architecture. This effect is strong enough that MIPS applications tended to come with multiple executables, any of which would run on any MIPS machine, but which would work better if the executable matched. This is a pain from a distribution and development standpoint.
The embedded community goes in for stuff like this, but that's because they ship the CPU and the code together and know it matches. For general-use software, a 10% speed improvement probably isn't worth the multiple version headache.
Also, if you have multiple versions for different CPUs, some bugs may behave differently on different CPUs, which is a maintenance headache.
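A rough sketch of how the multiple-executable approach usually gets papered over: a tiny launcher that picks whichever tuned binary matches the host CPU and falls back to a generic build. The binary names and the build flags mentioned in the comments are hypothetical.

    # Launcher that dispatches to a per-CPU build of the same application.
    import os, platform, sys

    BINARIES = {
        "x86_64":  "./app.x86_64",     # e.g. a build tuned with -march for x86-64
        "aarch64": "./app.aarch64",    # e.g. a build tuned with -mcpu for ARM
    }

    def main():
        # Pick the tuned binary for this machine, or fall back to the vanilla one.
        exe = BINARIES.get(platform.machine(), "./app.generic")
        os.execv(exe, [exe] + sys.argv[1:])   # replace the launcher with the app

    if __name__ == "__main__":
        main()

It works, but every extra binary is another thing to build, test, and ship, which is exactly the maintenance headache described above.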
Re:It's a way to evaluate optimization effectivene (Score:2)
I wonder if they know Scott Robert Ladd (Score:2)
This sounds a lot like Acovea [coyotegulch.com]
Re: (Score:2)
This sounds a lot like Acovea [coyotegulch.com]
They are citing it in the GCC summit paper. So I suppose they know it :)
Re: (Score:2)
Amusing story, though not relevant to anything.
I was at a student physics conference in the early '90s. The presenter had an acronym that expanded 3 levels deep. Of course, one of the letters in the 3rd level was "L" for laser, which I pointed out was also an acronym (which most of the physicists had forgotten about), so it became 4-deep.
I wish I could remember what the full expansion was, but a 5 character (or so)
Re:Would you mind telling me (Score:4, Funny)
I think I still have a copy of DICNAVAB lurking around the house somewhere, left over from my days in the United States Navy. For the uninitiated, DICNAVAB is, obviously, the proper abbreviated name of the wonderful and informative "Dictionary of Naval Abbreviations".
Re: (Score:2, Informative)
Three letter acronyms.
www.tla.com
(not my site)
Re: (Score:2)
I love the idea of other architectures as much as the next person, but realistically, unless there's a seismic shift in the computing world, I'll never use anything but x86 on my workstations. It's a poor argument when arguing for OSS software.
AFAIK, Visual Studio is separate from the Microsoft compiling toolchain (I seem to remember you can use MinGW with it), so you can cross-compile your targets to other architectures (for example, embedded devices).
Re: (Score:2)
x86-64 is a superset of x86. Visual Studio works under x86-64.
While the IDE doesn't run under IA64, Microsoft has a feature where you can run the IDE on one machine and then remotely compile/debug on the Itanium machine.
Re: (Score:2)
Noooo... x64 is to x86 as IA32 (386+ in protected/unreal mode) is to 8086... you get new registers to use on top of the existing ones, which also become wider, and new instructions for dealing with 'em all. IA64 is a completely new architecture, and can only run x86 code with "large overhead" by switching into an x86 mode (x64 procs I believe boot 16-bit real mode by default, and you can then switch into 32-bit protected or 64-bit 'long' mode from there, whereas Itanium boots 64-bit and thus uses the EFI rather tha
Re: (Score:2, Funny)
Congrats to you. We're all very proud, you can now wear up to 25 pieces of Slashdot flair to show you belong.
Re: (Score:2)
Badges? We don't need no stinking badges!
Re:I, for one, (Score:5, Insightful)
I, for one, welcome our new optimizing, embedded program compiling AI overlords!
Re:I, for one, (Score:4, Funny)
I, for one, welcome our new humorless slashdot moderating overlords!
You go for funny and get "insightful", the GP goes for funny and gets "offtopic". Now watch, they'll mod this one "funny".
Re:I, for one, (Score:5, Insightful)
"Funny" doesn't give karma, insightful does. That's why you sometimes see Funny posts moderated insightful.
Re: (Score:3, Informative)
That's insightful, but the cosmic balance would be better served if you were in fact moderated funny ;-)
Re: (Score:2)
If an insightful post is moderated "funny" the mod may well never again get mod points when the metamoderators get hold of the comment.
Yes, moderators' mods get modded.
Re: (Score:2)
Re: (Score:2)
Metamoderation. You pretty much just read one post when metamoderating, so 'Redundant' looks valid, even when it isn't. It's like a free -1, I Don't Like You.
Re: (Score:2)
If you want to mod like this, you can just use the convenient "overrated" tag. Even when you're the first one to moderate!
Re: (Score:3, Informative)
Re: (Score:2)
Re: (Score:2)
And it would also make the rest really hard to find.