So I bought a SATA data/power cable adapter for the BD drive. I also went ahead and got the Nvidia GeForce GT 710 1GB, which is just about the lowest-end Nvidia card I could find that was also supported. It was only about $35, but I wanted to see how it would compare to my 9-year-old ATI Radeon HD 2600 XT 256MB, and make sure I could get drivers and whatnot to work before I invested in something considerably chunkier.
Results are mixed.
Whereas the ATI could run YouTube at 1440p at 60 FPS, at least, the Nvidia stutters. This is shocking because this card is SUPPOSED to be 4K capable. However, I do not have a 4K monitor, and it has a max resolution of 1920x1080 at 60 FPS... but that would not explain why the ATI could seemingly handle 1440p, at least internally. This was just an experiment, since not all of my video sources are even 1080p, and there's nothing actually made in 1440p, let alone 4K or 5K, so it's moot. Still, I'm concerned: if this has something to do with native support for the ATI but not the Nvidia, I'm not sure I want to shell out for some kind of major GeForce GTX Infinity Numbers Here only to hit a compatibility bottleneck.
However, on the positive side, there's a drastic, noticeable improvement in system animations, application launching, web loading, etc. These things are also just plain sharper and more real-looking. The downside, however, is that the sharpness is almost a bit too much for certain photos and videos, where I can see crisp jaggies. Maybe I just need to adjust the sharpness settings on my monitor. But with so many things looking better like this, I'm not sure I want to lose that.
I THINK that for my overall purposes the GT 710 is better than the Radeon HD 2600. But clearly neither is going to be optimal in a few years.
EDIT: Further testing reveals it's the driver. When using Nvidia's Web Drivers with the GT 710, I get the stutter at 1440p, but I also get the quickness and sharpness the ATI doesn't have. When I use the stock OS X drivers, I get 1440p, but the same lack of quickness and sharpness as the ATI. I'm going to hazard a guess here that what's happening is that the Nvidia Web Drivers are trying to make the card render at better quality. Maybe I could play the "1440p" on the ATI because it was being downscaled under the stock drivers? I can clearly see a quality difference on the GT 710, which means it must be pushing more pixels, which means it's probably actually trying to render 1440p, and thus can't keep up, whereas the ATI under the stock drivers isn't even trying.
Or, you know, I could be totally wrong about everything. But if I have this right, then the question is: what am I looking at as far as the next upgrade? A 4K monitor, or a better GeForce with my current 1920x1080 display?
Given the quickness and sharpness, I'm going to say the $35 GT 710 was an acceptable price to pay for the improvement it offers over the stock card, but it's not a permanent solution. I'll use it for now. Keeping the ATI in as well, though, gives me a possible 4-5 displays, which is hilarious, since I might not even go up from one.