I created a massive (7GB) PDF in R, and now every computer/program combination I try to open it with craps out. I'm not even sure why, as the entire file fits easily in RAM on more than one of the machines I used.
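For context, here is a minimal sketch of how a plot like this gets written out in R. This is my assumption of the general shape, not the actual script: the `pdf()` device records every point as a vector object, so a large enough dataset produces an enormous single-page file. The variable names and file name are illustrative.

```r
# Illustrative sketch (not the real script): writing a big scatter plot
# with the base-R pdf() device. Every one of the n points becomes a
# vector object in the output, so the file grows with the data.
n <- 1e5                       # scale this up and the PDF balloons
x <- rnorm(n)
y <- rnorm(n)

pdf("big_plot.pdf", width = 10, height = 10)  # open the PDF device
plot(x, y, pch = ".")                          # draw all n points
dev.off()                                      # close the device, flush the file
```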
I've tried Adobe Reader on Windows and OS X, and also Quick Look and Preview on OS X; this was on machines with up to 16 GB of RAM apiece, but every time the OS or application just crashes.
I can pretty easily get ahold of a Windows 8, Ubuntu (any version), or OS X 10.8-10.9 machine as need be.
I'm fairly confident that the file itself is fine: I've created other files in the same way, just with smaller datasets, and those opened without trouble.
Unfortunately I don't think I can split the file: it's just one big plot, so the PDF has only a single page, and I don't know of any way to split it without opening it. Generating many PDFs from smaller chunks of the input data isn't an option either, since I was supposed to be finding a program that would plot arbitrarily large datasets. Well, R succeeded in plotting the data (the plot is even viewable while the R session is still running), but the saved output is pretty much unusable.
EDIT: So the SVG wound up being only 260MB. I'm guessing that R produces really inefficient PDFs, which just isn't noticeable with smaller datasets. The SVG is a little slow to open, but it does open, and that's all I needed. Thanks everyone.
If the first person to suggest an SVG wants to submit that as an answer, I'll accept it.