I know it's not quite the same, but can't you use GL_ARB_compute_shader? That's assuming you have access to OpenGL 4.3. The overview from the extension spec sums it up:
Recent graphics hardware has become extremely powerful and a strong desire
to harness this power for work (both graphics and non-graphics) that does
not fit the traditional graphics pipeline well has emerged. To address
this, this extension adds a new single-stage program type known as a
compute program. This program may contain one or more compute shaders
which may be launched in a manner that is essentially stateless. This allows
arbitrary workloads to be sent to the graphics hardware with minimal
disturbance to the GL state machine.
In most respects, a compute program is identical to a traditional OpenGL
program object, with similar status, uniforms, and other such properties.
It has access to many of the same resources as fragment and other shader
types, such as textures, image variables, atomic counters, and so on.
However, it has no predefined inputs nor any fixed-function outputs. It
cannot be part of a pipeline and its visible side effects are through its
actions on images and atomic counters.
OpenCL is another solution for using graphics processors as generalized
compute devices. This extension addresses a different need. For example,
OpenCL is designed to be usable on a wide range of devices ranging from
CPUs, GPUs, and DSPs through to FPGAs. While one could implement GL on these
types of devices, the target here is clearly GPUs. Another difference is
that OpenCL is more full featured and includes features such as multiple
devices, asynchronous queues and strict IEEE semantics for floating point
operations. This extension follows the semantics of OpenGL - implicitly
synchronous, in-order operation with a single-device, single-queue
logical architecture and somewhat more relaxed numerical precision
requirements. Although not as feature rich, this extension offers several
advantages for applications that can tolerate the omission of these
features. Compute shaders, for example, are written in GLSL, so code may
be shared between compute and other shader types. Objects are created and
owned by the same context as the rest of the GL, and therefore no
interoperability API is required and objects may be freely used by both
compute and graphics simultaneously without acquire-release semantics or
object type translation.
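To give a feel for how little plumbing is involved, here's a rough, untested sketch that doubles every element of a buffer with a compute shader. It assumes you already have a 4.3 context current and a loader such as GLEW initialised; the helper names (createComputeProgram, runDoubleKernel) are just made up for the example.

```cpp
// Rough sketch only -- assumes a current OpenGL 4.3 context and that GLEW (or a
// similar loader) has already been initialised. Error checking is omitted.
#include <GL/glew.h>
#include <vector>

// Doubles each element of a shader storage buffer bound to binding point 0.
static const char* kComputeSrc = R"(
    #version 430
    layout(local_size_x = 64) in;
    layout(std430, binding = 0) buffer Data { float values[]; };
    void main() {
        uint i = gl_GlobalInvocationID.x;
        if (i >= values.length()) return;   // guard the last, partial work group
        values[i] *= 2.0;
    }
)";

// Hypothetical helper, not part of any API: compile and link a compute program.
GLuint createComputeProgram(const char* src)
{
    GLuint shader = glCreateShader(GL_COMPUTE_SHADER);
    glShaderSource(shader, 1, &src, nullptr);
    glCompileShader(shader);            // check GL_COMPILE_STATUS in real code

    GLuint program = glCreateProgram();
    glAttachShader(program, shader);
    glLinkProgram(program);             // check GL_LINK_STATUS in real code
    glDeleteShader(shader);
    return program;
}

void runDoubleKernel(std::vector<float>& data)
{
    // Upload the data into a shader storage buffer on binding point 0.
    GLuint ssbo;
    glGenBuffers(1, &ssbo);
    glBindBuffer(GL_SHADER_STORAGE_BUFFER, ssbo);
    glBufferData(GL_SHADER_STORAGE_BUFFER,
                 data.size() * sizeof(float), data.data(), GL_DYNAMIC_COPY);
    glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 0, ssbo);

    // local_size_x is 64, so round the group count up.
    GLuint program = createComputeProgram(kComputeSrc);
    glUseProgram(program);
    glDispatchCompute((GLuint)((data.size() + 63) / 64), 1, 1);

    // Make the shader writes visible to the readback below.
    glMemoryBarrier(GL_BUFFER_UPDATE_BARRIER_BIT);
    glGetBufferSubData(GL_SHADER_STORAGE_BUFFER, 0,
                       data.size() * sizeof(float), data.data());

    glDeleteBuffers(1, &ssbo);
    glDeleteProgram(program);
}
```

In a real app you'd obviously check the compile and link logs, and pick the memory barrier bits to match however you actually consume the results (e.g. GL_SHADER_STORAGE_BARRIER_BIT if another shader reads the buffer next).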
An alternative would be to turn your thinking upside down. Start off with a very high-resolution mesh in your editor, and use some method to reduce the number of triangles (Delaunay triangulation, for example) in patches where they aren't needed (and at different LODs). That way you can still use the high-res normals (baked into a texture) to give fake detail on distant terrain.
Oogst - do you carry on rendering when your game loses focus? If so, try to stop calling swapbuffers (and pause the game) when you lose focus. I had a similar problem a long time ago, and that fixed it for me.
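Something along these lines, if you happen to be using GLFW (the original post doesn't say which windowing layer is in use, so treat this as a rough sketch and adapt it to whatever you're actually on):

```cpp
// Rough sketch, assuming a GLFW 3 main loop: stop updating and presenting
// while the window is unfocused, instead of spinning and swapping buffers.
#include <GLFW/glfw3.h>

static bool g_hasFocus = true;

static void onFocus(GLFWwindow* /*window*/, int focused)
{
    g_hasFocus = (focused != 0);
}

void runMainLoop(GLFWwindow* window)
{
    glfwSetWindowFocusCallback(window, onFocus);

    while (!glfwWindowShouldClose(window))
    {
        if (g_hasFocus)
        {
            // updateGame() and renderFrame() would go here.
            glfwSwapBuffers(window);   // only swap while we actually have focus
            glfwPollEvents();
        }
        else
        {
            // Paused: no rendering, no swap -- just block until an event
            // (such as regaining focus) wakes us up.
            glfwWaitEvents();
        }
    }
}
```

glfwWaitEvents blocks until something happens (like the window regaining focus), so the app drops to near-zero CPU and GPU use while it's in the background.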
It's not just the appearance of the code that matters: the second example shows clarity of thought in tackling the problem domain, while the first feels more like a hit-and-miss approach (i.e. you started coding before you knew what you wanted to code).
It happens to everyone - sometimes you just can't see a clean solution to a problem. I often find just walking away from the computer for a few moments can help, or in more complex cases getting out a pen and paper and writing down what you want to achieve, be it in English or via a flow chart or similar.
"Lua is a tiny and simple language, partly because it does not try to do what C is already good for, such as sheer performance, low-level operations, or interface with third-party software. Lua relies on C for those tasks. What Lua does offer is what C is not good for: a good distance from the hardware, dynamic structures, no redundancies, ease of testing and debugging. For that, Lua has a safe environment, automatic memory management, and great facility to handle strings and other kinds of data with dynamic size."
In this modern age, battery life is just as important as absolute performance.
If you're starting from scratch, I'd seriously recommend downloading Visual Studio Community 2013 and learning C#. There are loads of online resources, but better still, buy a good C# book.
After several months of hard graft getting your head around programming, look into using an existing game engine, such as Unity (which uses C#) or Unreal (which uses C++ and its Blueprint visual scripting).