A Unity user has submitted a bug report for a system that features an NVIDIA GeForce 3. When the GeForce 3 was released, NVIDIA had just acquired 3dfx. It was 2001. That's a 10-year-old video card we are talking about.
Anyway, my answer to the question in the title is no. But why? Here are a few reasons:
- Our minimum OpenGL requirement for Unity support is version 1.4. From what I could gather on the web, the GeForce 3 is at best capable of OpenGL 1.3.
- Unity needs support for the OpenGL framebuffer object extension. I can say with certainty that this extension is not available on the GeForce 3.
- The GeForce 3 is a video card that is hard to find these days. Remember, it was released in 2001. You would also need an AGP desktop system to host it. It is hard to provide Unity support for graphics hardware that we don't have on hand.
There are other factors, such as computational power and on-board memory. Today's entry-level GPUs offer far more power and a much richer feature set than what was available 10 years ago. The Unity experience on a graphics card such as the GeForce 3 would have to be severely downgraded. It wouldn't look or feel like Unity anymore.
Yet there is still hope. Unity 2D was designed for hardware that does not meet the base requirements to run the OpenGL-accelerated version of Unity. There is still the performance concern, but at least a GeForce 3 won't be left out in the cold.
glBindBufferARB was defined in the OpenGL extension GL_ARB_vertex_buffer_object, which dates from the OpenGL 1.4 era.
Starting with OpenGL 1.5, GL_ARB_vertex_buffer_object was promoted into the core OpenGL API, and glBindBufferARB became glBindBuffer. glBindBufferARB is still available and performs the same function as glBindBuffer. Still, in source code you have to be careful not to use glBindBuffer if your program is going to run on a GPU that only supports OpenGL 1.4: the core entry point simply isn't there, so your program will crash.
I am stuck using glBindBufferARB instead of glBindBuffer because Unity must run on systems such as the Intel GMA 950, which only supports OpenGL 1.4 on Linux. You may think that all is fine when your code runs on a system with OpenGL 2.0+, only to see it crash on a system with a lower version of OpenGL. So be careful!
I have been doing some prototyping with shaders lately… Here is some work in progress: a blurring algorithm followed by noise distortion.
We are currently adding graphics hardware capability detection to Unity. We are trying to detect a system's ability to run Unity. Unity requires that your GPU support a set of OpenGL extensions such as:
- Frame buffer object
- ARB programs (vertex and fragment)
- Non-power of two textures
- and a few more…
If your system's GPU is no more than 4 to 5 years old, these requirements are trivially met. The reason we need to detect your system's graphics capabilities is that we want to enable the Unity desktop experience on as many systems as possible. If a system does not have the required hardware support, we want to gracefully fall back to an alternative better suited to that system.
A Dell Mini 9 will run Unity without problems. I have one such system, and I frequently make sure that Unity runs fine on it. One feature I often check is the responsiveness of the launcher: when dragged, the launcher returns to an optimum position, and the animation that takes place during this step has to be smooth and consistent.
I have enabled Unity on an ATI FireGL 5200 with no support for non-power-of-two textures. On that system, we rely on the OpenGL ARB rectangle texture extension. Rectangle textures have limitations, but so far we haven't hit a feature in Unity that is limited by them.
We will add more requirements as we identify the features that we want in Unity. There is also the issue of drivers: it is not enough to have the hardware capabilities; the drivers have to support them as well. That is a subject for another article.