Maximizing windows in OS X

So my mouse was acting crazy, and basically it caused me to shrink a window down to like 2x10 pixels... impossible to drag back to maximum size. After some research, I found an AppleScript that can be used to maximize windows:
tell application "System Events"
    if UI elements enabled then
        set FrontApplication to (get name of every process whose frontmost is true) as string
        tell process FrontApplication
            click button 2 of window 1
            -- button 2 is the green "zoom" button for all applications
            -- window 1 is always the frontmost window
        end tell
    else
        tell application "System Preferences"
            activate
            set current pane to pane "com.apple.preference.universalaccess"
            display dialog "UI element scripting is not enabled. Check 'Enable access for assistive devices'"
        end tell
    end if
end tell

I then downloaded Quicksilver (by Blacktree) and bound the script to COMMAND+Z. Now I can just select any window, hit COMMAND+Z, and the window maximizes again. WHEW! What a life-saver.

To add the script to QS, you do the following:
1. Go to the Triggers tab
2. Hit the small + icon at the bottom left
3. Select (add) hotkey
4. Click on the Command-Key column, and set your Hot Key
5. Double-click on the Command column
6. For Select an Item, select the AppleScript you just created
7. The Action should automatically be 'Run'
8. Save and exit

Test out your script by pressing the hotkey... it worked for me :)

OpenGL: Copy PixelBuffers to GL Texture

NOTE: some source is omitted, but it should be relatively easy to fill in.

1. Create the pixel buffer

EGLint configAttributes[] = {
    EGL_RED_SIZE, 5,
    EGL_GREEN_SIZE, 6,
    EGL_BLUE_SIZE, 5,
    //EGL_ALPHA_SIZE, 8,
    EGL_DEPTH_SIZE, 16,
    EGL_STENCIL_SIZE, 0,
    EGL_LUMINANCE_SIZE, EGL_DONT_CARE,
    EGL_SURFACE_TYPE, SURFACE_TYPE,
    EGL_RENDERABLE_TYPE, EGL_OPENGL_ES_BIT,
    EGL_BIND_TO_TEXTURE_RGB, EGL_TRUE,
    EGL_NONE
};

EGLint pbufferAttributes[] = {
    EGL_WIDTH, width,
    EGL_HEIGHT, height,
    EGL_COLORSPACE, COLOR_SPACE,
    EGL_ALPHA_FORMAT, ALPHA_FORMAT,
    EGL_TEXTURE_FORMAT, EGL_TEXTURE_RGB,
    EGL_TEXTURE_TARGET, EGL_TEXTURE_2D,
    EGL_NONE
};

eglDisplay = eglGetDisplay(EGL_DEFAULT_DISPLAY);
eglInitialize(eglDisplay, &major, &minor);
eglChooseConfig(eglDisplay, configAttributes,
                &eglConfig, 1, &numConfigs);
eglContext = eglCreateContext(eglDisplay, eglConfig,
                              EGL_NO_CONTEXT, NULL);
eglSurface = eglGetCurrentSurface(EGL_DRAW);
eglPbuffer = eglCreatePbufferSurface(eglDisplay, eglConfig,
                                     pbufferAttributes);

2. Create empty pbuffer texture (theSource) to render GL onto

glGenTextures(1, &theSource);
glBindTexture(GL_TEXTURE_2D, theSource);
// level 0, RGB565 storage, no initial data
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height,
             0, GL_RGB, GL_UNSIGNED_SHORT_5_6_5, NULL);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

3. Drawing to the pixel buffer surface

eglMakeCurrent(eglDisplay, eglPbuffer, eglPbuffer, eglContext);
glBindTexture(GL_TEXTURE_2D, theSource);
// draw code, glDraw*, etc.
eglMakeCurrent(eglDisplay, eglSurface, eglSurface, eglContext);

4. Creating the target texture (theCopy)

glGenTextures(1, &theCopy);
glBindTexture(GL_TEXTURE_2D, theCopy);
// level 0, RGB565 storage, no initial data
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height,
             0, GL_RGB, GL_UNSIGNED_SHORT_5_6_5, NULL);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

5. Copying the pbuffer to the target texture (theCopy)

eglBindTexImage(eglDisplay, eglPbuffer, EGL_BACK_BUFFER);
glActiveTexture(GL_TEXTURE0);   // texture unit index, not the texture name
glBindTexture(GL_TEXTURE_2D, theCopy);
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 0, 0, 256, 256, 0);

UTF8 in Java

Java handles Unicode strings out of the box; internally a String is a sequence of UTF-16 code units, and each of the characters below fits in a single code unit.

String test = "ÀÁÂÃÄ";
System.out.println(test); // prints out ÀÁÂÃÄ
System.out.println(test.length()); // prints out 5
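One small caveat: length() counts UTF-16 code units, not characters, which only starts to matter for code points outside the Basic Multilingual Plane, such as emoji. A minimal sketch (the class name LengthDemo is just for illustration):

public class LengthDemo {
    public static void main(String[] args) {
        String accented = "ÀÁÂÃÄ";
        String emoji = "\uD83D\uDE00"; // U+1F600: one code point, two code units

        System.out.println(accented.length());                       // 5
        System.out.println(emoji.length());                          // 2
        System.out.println(emoji.codePointCount(0, emoji.length())); // 1
    }
}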

Here's how to convert the test string above into a series of UTF-8 bytes:

byte[] bytes = test.getBytes("UTF8");

And here's how to convert those bytes back into a Java string:

String utf8 = new String(bytes, "UTF8");

Both of these throw the checked UnsupportedEncodingException, so they have to be wrapped in a try/catch, even though every JVM is required to support UTF-8, so in practice the exception will never be thrown.
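If you are on Java 7 or later, a cleaner option is to pass java.nio.charset.StandardCharsets.UTF_8 instead of the charset name, which skips the checked exception entirely. A minimal round-trip sketch (the class name Utf8RoundTrip is just for illustration):

import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class Utf8RoundTrip {
    public static void main(String[] args) {
        String test = "ÀÁÂÃÄ";

        // encode to UTF-8 bytes: each of these characters takes 2 bytes, so 10 in total
        byte[] bytes = test.getBytes(StandardCharsets.UTF_8);
        System.out.println(Arrays.toString(bytes));

        // decode the bytes back into a String
        String utf8 = new String(bytes, StandardCharsets.UTF_8);
        System.out.println(utf8.equals(test)); // true
    }
}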

UTF8 Encoding

In UTF-8, each character (code point) is encoded as 1 to 4 bytes.

The first few bits of each byte mark its role in the sequence: the leading byte tells you how many bytes the sequence has, and every continuation byte starts with 10, so a decoder can always tell where a character begins.

See http://en.wikipedia.org/wiki/UTF-8 for the full details.

U+0000 - U+007F is the simplest case: the ASCII values 0-127. These are encoded as a single byte, and you can tell because the first bit is 0.

VALUE: 0xxxxxxx

U+0080 - U+07FF are the 2-byte characters. The first byte begins with 110, and the second byte begins with 10.

VALUE: 110yyyxx 10xxxxxx

U+0800 - U+FFFF are the 3-byte characters. The first byte begins with 1110, the second with 10, and the third again with 10.

VALUE: 1110yyyy 10yyyyxx 10xxxxxx

U+10000 - U+10FFFF are the 4-byte characters. The first byte begins with 11110, and each byte thereafter begins with 10.

VALUE: 11110zzz 10zzyyyy 10yyyyxx 10xxxxxx

Where x: the lowest 8 bits of the code point
Where y: the middle 8 bits
Where z: the highest 5 bits

The maximum value that can be represented is 2^(5+8+8) - 1 = 0x1FFFFF. However, the UTF-8 standard only allows 0x00 - 0x10 for the highest bits (so that everything stays representable in UTF-16), meaning the maximum value for any UTF-8 character is U+10FFFF.
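To make the bit patterns above concrete, here is a minimal Java sketch that encodes a single code point by hand, following exactly those rules (the class and method names are just for illustration; real code would use String.getBytes with a UTF-8 charset):

public class Utf8Encoder {

    // Encode one Unicode code point (U+0000 - U+10FFFF) into 1-4 UTF-8 bytes.
    public static byte[] encode(int cp) {
        if (cp <= 0x7F) {                       // 0xxxxxxx
            return new byte[] { (byte) cp };
        } else if (cp <= 0x7FF) {               // 110yyyxx 10xxxxxx
            return new byte[] {
                (byte) (0xC0 | (cp >> 6)),
                (byte) (0x80 | (cp & 0x3F)) };
        } else if (cp <= 0xFFFF) {              // 1110yyyy 10yyyyxx 10xxxxxx
            return new byte[] {
                (byte) (0xE0 | (cp >> 12)),
                (byte) (0x80 | ((cp >> 6) & 0x3F)),
                (byte) (0x80 | (cp & 0x3F)) };
        } else if (cp <= 0x10FFFF) {            // 11110zzz 10zzyyyy 10yyyyxx 10xxxxxx
            return new byte[] {
                (byte) (0xF0 | (cp >> 18)),
                (byte) (0x80 | ((cp >> 12) & 0x3F)),
                (byte) (0x80 | ((cp >> 6) & 0x3F)),
                (byte) (0x80 | (cp & 0x3F)) };
        }
        throw new IllegalArgumentException("Not a valid code point: " + cp);
    }

    public static void main(String[] args) {
        // U+00C0 (À) -> C3 80, U+20AC (€) -> E2 82 AC, U+1F600 -> F0 9F 98 80
        for (int cp : new int[] { 0x41, 0xC0, 0x20AC, 0x1F600 }) {
            StringBuilder hex = new StringBuilder();
            for (byte b : encode(cp)) {
                hex.append(String.format("%02X ", b & 0xFF));
            }
            System.out.printf("U+%04X -> %s%n", cp, hex.toString().trim());
        }
    }
}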

See the next post for how to handle these values in Java.