Tuesday, March 10, 2009

Good news everyone!

A quick update from the JIT front. As of yesterday, we're able to translate a highly experimental Python interpreter that contains a JIT. It mostly crashes immediately, due to some unsupported operations in the assembler backend, but for a carefully crafted program we're able to get massive speedups. For something as complex as:

  i = 0
  while i < 10000000:
      i = i + 1

our JIT is about 20x faster than CPython. That's still about 3x slower than Psyco, but looking at the assembler code it's obvious that we can speed it up a lot. This is very good news, since we don't encode Python semantics in the JIT at all: the JIT is automatically generated from the Python interpreter's source code. This means we should be able to expand it to handle more complex Python programs relatively quickly (interested assembler experts needed!).
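For anyone who wants to try the comparison themselves, a minimal timing harness like the one below (plain standard-library Python; the exact numbers will of course vary by machine and interpreter) can run the benchmark loop from above:

```python
import time

def loop():
    # the benchmark program from the post
    i = 0
    while i < 10000000:
        i = i + 1
    return i

start = time.time()
result = loop()
elapsed = time.time() - start
print("counted to %d in %.2f seconds" % (result, elapsed))
```

Run the same script under different interpreters and compare the elapsed times; note that wrapping the loop in a function already makes CPython somewhat faster than the module-level version, since local variable access is cheaper.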

This is actually the fifth incarnation of the JIT over the last two years. It's by far simpler and more promising than any of the previous approaches. Expect more details soon!



Anonymous said...

Very exciting news indeed.

Zemantic dreams said...

This is exciting. The world is waiting.

(I am still sad that Psyco development was discontinued and never ported to 64bit)

nekto0n said...

Great news!
The activity on this blog shows that the project is full of enthusiasm.

Anonymous said...

wow, that's really great =)

Eric van Riet Paap said...

Congratulations! Very nice to read about these milestones.

I haven't followed LLVM development, but does anyone know if they have made arrangements by now that would enable the PyPy JIT generator to leverage their optimizers?

Harold Fowler said...

Wow, you are right, that is good news.


Anonymous said...

I'm wondering why something like this would be faster than CPython. I'm new to the whole Python scene, so I'm really just curious.

René Dudfield said...

nice one :)

In the meantime... I wrote an optimized version of that program for CPython:

i = 10000000

CPython is 10000000x faster than the pypy jit!!!!!!

Tim Wintle said...


This is very exciting.

@Anonymous - it's because the standard Python interpreter doesn't use a JIT; interpreting a dynamic language without one is quite slow.

Anonymous said...

I'm waiting for production solution!