View Full Version : PS: Question on huge size images
05-02-2010, 06:50 PM
I have a question on huge maps and file sizes. I have a request to do an A0-sized map, but I ran some tests and I don't think my hardware/software can support this. I have decent stuff (Vista / i7 CPU / 64-bit / 6GB RAM) and currently use PS CS3. Converting some old maps (15 layers or so) to this size was possible, but it sort of blew up my computer, and I couldn't even save the image afterwards. Scrolling/panning also took about 2 minutes per step.
I saw loads of tuning tips on PS sites. I may be able to go through these (although I'm no PC tuning expert) and see what works, but maybe some of you have experience with these large map sizes and PC configurations (so I at least know it's possible to get this working before I try loads of stuff)? Maybe it's the CS3 version, as I read CS4/5 has better performance with large files (and can use 64-bit)...? Any info appreciated!
05-02-2010, 07:05 PM
The biggest resolution I've been able to manage reasonably (Vista / AMD 4-core 64-bit / 6GB RAM) was 4950x3150. Even then I had to keep my total number of layers very small, under 10 or so.
05-02-2010, 07:30 PM
Hm, the biggest image I've ever made was a city map with six or seven layers (max) at a resolution of 32000x32000 at ~90 dpi (or so I think, as it was some time ago). I used our old workstations (Vista / 6x Intel 2-core x64 / 12 GB RAM). The file (.bmp, for curiosity's sake) was at least 20 gigs and I think it took about two hours to save. But I would never do this again! I would work on the map and draw the important stuff, then divide it into several parts and go from there. Or maybe try a vector graphics program, but I never have ... I'm sorry if that wasn't of any help. : /
Edit: Done in PS CS3 & Vue 7 with external eSATA-drives
05-02-2010, 07:30 PM
The largest map I've worked on was 10,000 by 7,500, on a Mac with 4GB of RAM using a 32-bit version of PS CS4. It was slow and a little jittery, but worked fine. One thing I found to be a huge help was an external swap drive - in this case a fast (7200rpm) drive attached by FireWire. That makes a large difference for large file sizes. Also, turning off Maximize Compatibility when you save helps speed up the save time to something reasonable.
05-02-2010, 07:32 PM
Well, you won the challenge and got a copy of ViewingDale, didn't you? So there should be few limits on how big you'll be able to map with that. I have a 160,000-pixel-square map running (OK, so it's slow), but it can do it. That burns multiple tens of gigs of HDD space, but it doesn't map quite the same way as PS or a bitmap editor, so you may find you have to adjust your usual techniques. You will have to tile a really big image, but the app will take the chore out of compositing the tiles. What DPI are you aiming for - what final pixel size? 46 inches at 300 dpi is only 14K pixels, which is no big deal at all. Because you're 64-bit, you have higher max sizes than 32-bit too. 6GB is more than me, not that it will make much difference anyway.
05-03-2010, 02:51 AM
My star chart map is 7500x5000 and has around 300 layers. It does run rather slow when zooming and such, but it's not unbearable. And my machine is about 3 years old (Vista / AMD 4400 / 4GB).
I've recently upgraded to PS CS5 and it definitely runs smoother than previous versions.
05-03-2010, 04:37 AM
If you want a resolution of 300 dpi, you'd end up with a file of about 14000 x 9000 pixels. That's darn big. I strongly suggest going the vector way and giving Illustrator or Xtreme a try. Obviously, you'll need to learn some new techniques to get good results.
05-03-2010, 06:39 AM
Crikey, this thread is more interesting than I would have guessed. Obviously my idea of what is big is way different from everyone else's. My star chart challenge entry (Map for print) was 8192x12288 pixels and the map I made for the Galrion commission was 12288x16384. The MeDem map is 40K square. I would say 4K is a good size, 8K is quite big, but you need to get to about 20K before it gets app-bustingly big. After that I would say you absolutely have to tile, though I would recommend tiling at 8K squares. That's only a 2x2 set to cover your 14K square image.
Gidde, if your machine is dying at 5Kx3K with 6GB of RAM then something must be wrong! I mean, we're talking factor-of-100 size differences here. 5Kx3K = 15 Mpix, and a big digital camera can take photos of that size and it doesn't have 6GB of RAM on board! 15 Mpix x 4 bytes per pixel = 60MB per layer, so that's 100 layers of that to fill 6GB. Sounds like a setup issue.
16Kx16K = 256 Mpix = 1GB per layer (or per undo buffer) if you're on an old-style app like PS or GIMP. This is more of a problem for these apps, especially in 32-bit land.
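The arithmetic above is easy to sanity-check. A minimal Python sketch (the helper names are mine; this counts only raw uncompressed RGBA pixels, ignoring undo buffers, caches, and compression):

```python
def layer_bytes(width, height, bytes_per_pixel=4):
    """Raw memory for one uncompressed RGBA layer."""
    return width * height * bytes_per_pixel

def total_gib(width, height, layers):
    """Raw memory for a stack of layers, in GiB."""
    return layer_bytes(width, height) * layers / 2**30

# 5000x3000 (Gidde's size): ~57 MiB per layer
print(round(layer_bytes(5000, 3000) / 2**20))   # 57

# 16K x 16K: exactly 1 GiB per layer, as stated above
print(layer_bytes(16384, 16384) / 2**30)        # 1.0

# Even 6 such layers already fill a 6 GiB machine
print(total_gib(16384, 16384, 6))               # 6.0
```

Real editors add undo history, tile caches, and composite buffers on top of this, so treat these as lower bounds.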
05-03-2010, 08:08 AM
I agree, Redrobes, but all my geek-fu plus a thread here failed to solve it :( My latest map hates me if I have more than 20 or so layers, and it's not even that big.
I do need to clarify though ... my machine doesn't die, GIMP does. I can zip through a ton of other stuff no problem while GIMP chugs away.
05-03-2010, 08:37 AM
I found that GIMP had horrible memory management as soon as maps start getting large. One thing to check is the dimensions of your layers. PS only uses memory for the areas of the layers that have something in them (I think). GIMP uses memory for all pixels in a layer, even if those pixels are totally transparent. You want to make sure that any layer in GIMP is as small in dimensions as possible, rather than letting all the layers be the size of the image. I found that helped a lot when I was using it.
05-03-2010, 08:38 AM
Are you running a 64-bit version of GIMP? Because at 20 layers, plus maybe some hidden undo buffers, it's perhaps getting close to the 3-4GB limit of 32-bit apps. If it were me, I'd run the Sysinternals process viewer and see how much memory and how many handles it's burning with that map open. You can look at the pagefile delta too, and see if it's spending all its time thrashing the pagefile. Well, that's a topic for another thread, but something is not quite right there. Staying on topic, though, it is worth ensuring your system is tuned up to cope with big maps if you're heading into a big map job. It just seems odd to me that PS and GIMP don't handle large images very well. I thought PS had a large-image mode, or is that just for the file save options? With all the resources of Adobe, can't they figure it out? What do the pro design outfits do when a customer asks for a huge billboard with photos on it, viewed up close like at an expo?
05-03-2010, 02:31 PM
Guess I'm not the only one struggling with big files. Thanks all so much for the input so far. (And you're right RR, I still need to really test-drive your ViewingDale software, shame on me :( ) ... I'm afraid that for this I won't have time to learn another tool (like ViewingDale or a vector program). I did some tuning and it now works with 3 to 4 layers at 300 dpi, but after that things start going downhill fast. As far as I can tell, it appears to be a CS3 (+32-bit?) thing, in that it can only use about 3GB of my 6GB of RAM. My CPU is almost asleep but my memory is overpacked (at least, that's what the little graphs in my Resource Monitor tell me ;) ... The saved PS file itself was not even 3GB, so I would think it could run completely in memory... hmm... all this is very confusing to me :( ... So I guess it's either upgrading to 64-bit CS4/5 and/or buying a fast external drive ... to shoot an elephant, one needs a cannon, it seems... cheers
05-03-2010, 02:47 PM
I have CS3 myself, so I looked for you. Here is what I got:
Allocating memory above 2 GB with 64-bit processors
Photoshop CS3 is a 32-bit application. When it runs on a 32-bit operating system, such as Windows XP Professional and some versions of Windows Vista, it can access the first 2 GB of RAM on the computer. The operating system uses some of this RAM, so the Photoshop Memory Usage preference displays only a maximum of 1.6 or 1.7 GB of total available RAM. If you are running Windows XP Professional with Service Pack 2, you can set the 3 GB switch in the boot.ini file, which allows Photoshop to use up to 3 GB of RAM.
Important: The 3 GB switch is a Microsoft switch and may not work with all computers. Contact Microsoft for instructions before you set the 3 GB switch, and for troubleshooting the switch. You can search on the Microsoft support page for 3gb for information on this switch.
When you run Photoshop CS3 on a computer with a 64-bit processor (such as an Intel Xeon processor with EM64T, an AMD Athlon 64, or an Opteron processor) running a 64-bit version of the operating system (Windows XP Professional x64 Edition or Windows Vista 64-bit) and with 4 GB or more of RAM, Photoshop will use 3 GB for its image data. You can see the actual amount of RAM Photoshop can use in the Let Photoshop Use number when you set the Let Photoshop Use slider in the Performance preference to 100%. The RAM above the 100% used by Photoshop, which is from approximately 3 GB to 3.7 GB, can be used directly by Photoshop plug-ins (some plug-ins need large chunks of contiguous RAM), filters, or actions. If you have more than 4 GB (to 6 GB), the RAM above 4 GB is used by the operating system as a cache for the Photoshop scratch disk data. Data that previously was written directly to the hard disk by Photoshop is now cached in this high RAM before being written to the hard disk by the operating system. If you are working with files large enough to take advantage of these extra 2 GB of RAM, the RAM cache can speed up Photoshop's performance. Additionally, in Windows Vista 64-bit, processing very large images is much faster if your computer has large amounts of RAM (6-8 GB).
The default RAM allocation setting is 55%. This setting should be optimal for most users. To get the ideal RAM allocation setting for your system, change the RAM allocation in 5% increments and watch the performance of Photoshop in the Performance Monitor. You must quit and restart Photoshop after each change to see the change take effect.
The available RAM shown in the Performance preference automatically deducts an amount that is reserved for the operating system from the total RAM in your computer. You shouldn't set the percentage of RAM to be used by Photoshop to 100% because other applications which run at the same time as Photoshop (for example, Adobe Bridge) need a share of the available RAM. Some applications use more RAM than you might expect. For example, web browsers can use 20-30 MB of RAM, and music players can use 20-50 MB RAM. Watch the Performance Monitor to view the RAM allocations on your computer.
Watch your efficiency indicator while you work in Photoshop to determine the amount of RAM you'll need to keep your images in RAM. The efficiency indicator is available from the pop-up menu (choose Show > Efficiency) on the status bar of your image and from the Palette Options on the Info Palette pop-up menu. When the efficiency indicator goes below 95-100%, you are using the scratch disk. If the efficiency is around 60%, you'll see a large performance increase by changing your RAM allocation or adding RAM.
Hello mind, are you there somewhere? *sigh*
Maybe I should drop the link for you, guys ...
Click me! (http://kb2.adobe.com/cps/401/kb401088.html)
05-03-2010, 03:58 PM
Antheon, that's great info! I had seen it but didn't understand it ... I tried understanding it again (and I still don't :) ), and I somehow 'tweaked' the 'RAM thingie' and now have it running with 10 layers, and it's pretty smooth :D Have some rep! Cheers!
05-03-2010, 04:07 PM
I'm glad to hear it was of help. If you haven't looked at the link, I suggest you do, because the page is about CS3 handling large files, with further tuning tips. : )
05-03-2010, 10:14 PM
I've been working on a large map for a pen&paper RPG I've been designing. I'll have to check the specs when I get home, but in my experience the amount of RAM in relation to the saved size of the file has made a huge difference, as has 64-bit. A previous PSD version of the map on a computer with 2-4GB of RAM and PS CS3 would take at least 30 min to open and a good hour to save. Everything was slow and jittery like so many of you have described.
I started from scratch for my current iteration, and this file also marks the first time I discovered the 4GB file-size limit for PSD files. As I said, I'll need to check the exact dimensions, but I believe the image is somewhere around 37800x21600 (42"x24" @ 900px/in) with at least 30-40 layers. My new computer (Core i7 920, 6GB DDR3 RAM, 2x 1TB SATA HDD, PS CS4, Win7 x64) handled it with no problem. Everything is smooth, opening took around 2min and saving around 8-10min. The PSB file was over 7GB, so it would not all fit in memory. I recently upgraded to 12GB of RAM and that has shown a marked improvement. The ever-growing 7.5+ GB file opens in 1-2min, saves in about 5-6min, and everything runs smoothly. PS is generally using 7.5-8.5GB of memory in the resource monitor. I typically have several instances of IE, Word, Excel, and a video playing on a second monitor while working on this.
My suggestion for anyone working with large PS files would be to make sure that the amount of RAM available to PS is at least equal to the size of the saved file. This may mean increasing the amount of RAM as well as upgrading to a 64-bit version of your OS and PS. Of course, you also want to make sure you've got a fair amount of free space for your scratch disk.
Otherwise, djespek, sketch the whole thing up and divide it into 4 or more documents for the fine details, then flatten the layers before putting the results together at the end. And if you're thinking of changing stuff, my bet would be on switching to Windows 7 instead of Vista, since that takes a lot fewer resources (I'm using Vista myself but thinking of upgrading when I get rich *lol*)
05-04-2010, 07:23 AM
Hi Natai & welcome to the guild. Good info there.
Here's the bottom line. Computers nowadays run 32-bit or 64-bit. To be fully 64-bit you need: a) a 64-bit OS - Windows XP x64, Vista 64, or 7 (64-bit); all the Linuxes have had 64-bit versions for years; not sure about Macs. b) A 64-bit CPU, which is required for a 64-bit OS. And c) the app you're running has to be 64-bit. If you have all three, then you're 64-bit.
With 32-bit, all your memory pointers are 32 bits. This means 2^32 = 4GB, i.e. you cannot point to memory beyond 4GB because you have run out of addresses. Now, these pointers are virtual RAM, not physical RAM. I won't go into that much, but the effect is that EACH app can access up to 4GB, so you can blow 4GB out of a 6GB system and run out, and yet still play some video and browse the web with another app. If an app runs out of memory, it pages out to the hard drive. You can in theory install a RAM drive, point your pagefile at the RAM drive, and then the app will page out into a different app's 4GB of space, so one app is kind of using more than 4GB. But let's face it, this is all a whole lot of hassle, and you should not install more than 4GB on a 32-bit system.
With 64-bit, the pointers are 64 bits and therefore absolutely huge - the address space works out to around 17 billion gigabytes - so it's never going to run out. Each app can now access as much RAM as you can get into the PC. There is no practical limit any more.
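The pointer-size argument boils down to an N-bit pointer being able to index 2^N bytes of virtual memory. A quick sketch (the helper name is mine):

```python
def addressable_gib(pointer_bits):
    """GiB of virtual memory an N-bit pointer can address: 2**N bytes."""
    return 2**pointer_bits / 2**30

print(addressable_gib(32))   # 4.0 -> the 4 GB per-process ceiling
print(addressable_gib(64))   # 17179869184.0 GiB, i.e. 16 EiB
```

In practice the OS reserves part of the 32-bit space for the kernel, which is why apps see more like 2-3 GB, as discussed later in the thread.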
Your HDD is very, very slow: being generous, it typically has a sustained write speed of about 50MB per second. So 1GB takes about 20 seconds, i.e. about 3GB per minute at a fully maxed-out sustained rate. Therefore, if you have 6 or 8GB of RAM, there is not much point in having a pagefile, because your system hangs for tens of seconds whenever it must page out. With small-RAM systems a pagefile is OK; with multi-gig RAM systems there's little point. Compare that with RAM write speeds of a few gigs per second - there's always been about a 100:1 speed ratio between RAM and HDD.
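Those drive-speed estimates work out like this (a rough sketch; the 50 MB/s and few-GB/s figures are the post's own back-of-envelope numbers, not benchmarks):

```python
def seconds_to_write(gib, mb_per_sec):
    """Time to stream `gib` GiB at a sustained rate of `mb_per_sec` MiB/s."""
    return gib * 1024 / mb_per_sec

print(seconds_to_write(1, 50))    # ~20 s for 1 GiB to a spinning disk
print(seconds_to_write(6, 50))    # ~2 minutes to page out 6 GiB
print(seconds_to_write(1, 5000))  # ~0.2 s for the same GiB in RAM
```

This is why falling into the pagefile feels like a cliff: the same data movement suddenly costs roughly 100x as long.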
The PSD file is very likely compressed, like a PNG. I don't know this for sure, but as a rule it's quicker to compress with the CPU and save the smaller data to HDD than to skip compression and wait longer for the HDD to save it. Therefore, I would guess that the amount of RAM used by an image with all its layers is bigger than the PSD file. Perhaps, as a rule, having RAM = PSD file size is not such a bad thing. My suggestion, though, is to run the process viewer and see how much RAM it's actually using.
So here are my top tips. If in 32-bit land, get 4GB or so and no more. If you want to go bigger, then preferably upgrade to 64-bit; if not, then either get more RAM, install a RAM drive, and put all swap data onto the RAM drive, or get some super-fast 10,000rpm drives, RAID them into a striped array to get the read/write speed up, and use that as the pagefile drive. Or, if you have the dosh, get solid-state flash drives and use them for that job. Upgrading is easier though ;)
Or change to an app which doesn't use memory like PS does... ;)
That's a good rundown, Redrobes - very nice... definitely makes me think about the specs for my next machine ... in about 2020 or so... ;)
Concerning Photoshop - the rule of thumb I learned some years ago is that it uses about 3 times as much RAM as the size of the image, so if you have a 1-gig image, it uses 3 gigs to handle it.
05-04-2010, 10:33 AM
Thanks for the welcome, Redrobes.
Okay, my file specs were definitely off a bit. The image is 20122x11634 @ 450 px/in. Apparently the file size was way off too; I must have been thinking of the old version of the file. The PSB file is about 3.7GB saved, and PS consumes around 7.75GB of memory while I'm working with it. So tilt's point is a good one - go with 2-3 times the amount of RAM relative to the saved file size.
Definitely a good explanation of the 64-bit and RAM limitations. A couple of other points I seem to remember from when I was researching prior to transitioning from 32-bit to 64-bit Vista: because of how the pointers work, the most RAM any given application can use on a 4GB 32-bit system will probably be in the neighborhood of 3.2 to 3.7GB. Most 32-bit OSes will only actually detect 3.7 to 3.8GB of available RAM, depending on your setup. Sometimes the type and configuration of your video card will also affect this, as some setups allow the video card to use a portion of system RAM as a sort of virtual VRAM. So, bottom line: if you're working with files larger than about 1.5GB, you will probably benefit from upgrading to 64-bit. Just be sure to keep in mind what Redrobes posted - your entire system has to be 64-bit: the CPU, the OS, and the application.
For the most part (with the exception of the VRAM and memory pointer issue) the video card has very little impact. CS4 enables GPU support to increase draw speeds on the monitor, but I think that's about it.
However, that may be changing now. The new DX11 cards let the computer offload computing tasks to the graphics card. This isn't something CS4 is really capable of taking advantage of, but I've heard CS5 might be able to utilize this feature. I'm not sure how much of a performance boost you would see; it might not be that great if only the GPU itself is used, as PS doesn't really use that much CPU power. It would probably be of greater benefit if this setup allowed PS to utilize the VRAM on the card. Has anybody tried anything like this to see if it makes a difference? And then there's the issue of cost: upgrading to CS5 aside, the new NVIDIA DX11 cards start at over $400. Upgrading to 64-bit with more RAM is definitely a better option.
05-04-2010, 02:03 PM
In an OS you have user space and kernel space, and a while ago MS didn't think we would get to 4GB, so they divided it up evenly at 2GB each. The kernel has to have some RAM with a critical lock on it, because it's the low-level stuff and your machine will die without it. Anyway, all the apps you run, apart from heavy device drivers, are in the user space. So on the old setups, user apps could get at 2GB. Then, when we got near to 4GB, MS realized that the 2/2 split was not very sensible, because the kernel does not need a full 2GB to run, so they split it differently. Now, I have heard all sorts of numbers here, but yours are about the norm - 3.5GB or so for user and 0.5GB for kernel. You're right, too, in that if you happen to have certain graphics cards that share system RAM, it can eat into some of that, reserving more for the kernel and giving less to user space. Most cards have their own video RAM and need only an "aperture" as a kind of cache to transfer data from system RAM into video RAM. So, a bit more detail there, but you're right on the money.
With the cards, though, it's been possible for some time to use them as processing engines, ever since we got programmable shaders. The programs were very limited and simple at first. As time went on the shader complexity went up, and then we got a full-scale shader language built into OpenGL 2, so basically your graphics card needed to run arbitrary code. nVidia and ATI both made processors that now run arbitrary code, with interfaces anyone can use - right now that's CUDA for nVidia, and Close to the Metal / Stream for ATI, I think it's called. It's down to the card, not the DX version, though. There's a newer, non-company-specific interface done by the same people who did OpenGL, called OpenCL - the open compute language - and the idea is that once nVidia and ATI both have unrestricted licensed drivers for it, we can write in OpenCL and run on either ATI or nVidia. Not sure where that stands now, but last I heard you still had to be a registered dev with nVidia to get access. When it happens, graphics-intensive apps like PS will make more use of these APIs and accelerate some of the processing in hardware, which will speed things up a lot.
I would say, though, that in the meantime you can expect your CPU to be in overdrive when doing large-area computes like a blur. If the image is so large you're into pagefile land, then as said earlier, you're at that 100:1 HDD ratio - your CPU does 1% of the work and the HDD does 99%+, and you fall off the performance cliff. It's in these instances that you get a 100x speed increase if you can prevent the system paging! I.e., that's the point you must stay behind in order to work effectively. So my tip is to tile the image and stay behind that point, or it gets painful.
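The tiling tip can be sketched as plain coordinate math (pure Python, stdlib only; the function name is mine - feed the resulting boxes to whatever editor or exporter you use):

```python
def tile_boxes(width, height, tile_px=8192):
    """Split a width x height canvas into (left, top, right, bottom)
    boxes no larger than tile_px on a side, so each piece stays below
    the size where the editor starts paging."""
    boxes = []
    for top in range(0, height, tile_px):
        for left in range(0, width, tile_px):
            boxes.append((left, top,
                          min(left + tile_px, width),
                          min(top + tile_px, height)))
    return boxes

# A 14K-square image tiles into a 2x2 set of 8K squares, as noted above.
print(len(tile_boxes(14000, 14000)))  # 4
```

Edge tiles are clamped to the canvas, so no tile ever exceeds the image bounds.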
05-04-2010, 02:17 PM
I've been manipulating imagery at 21601 x 10801 resolution recently. My record. This is using PS CS4 on my 2-year-old base MacBook (2.1 GHz Core 2 Duo, 2 GB RAM, Mac OS 10.6.3).
A little slow generally. Flexify takes a couple of minutes. Not too bad. Saving onto my 1.5 TB external drive is a bit excruciating, but that's because the drive is cheap and slow. With my old 1 GHz single-core PowerBook and CS1, 8192 x 4096 was more of a challenge than this (actually painful in extremis dei, amen). I like my little bargain-basement Mac laptop.
05-04-2010, 06:23 PM
My gawd. 20k x 10k?? I'm incredibly jealous of everyone posting in this thread :(
05-15-2010, 10:05 PM
I recently completed a very large map, 9200px x 7200px, using Photoshop and Illustrator CS4 - a total of 1800 MB. It has about 40 layers, although many of those were solid fills with an alpha channel, or placed PDFs from Illustrator. (The Illustrator file, with all the linework and text, was around 2 gigs. Breaking it up this way made it much more digestible to Photoshop.)
My biggest tip is to use the PSB format (not PSD). This is Photoshop's 'large image format' and it makes a huge difference in performance. I was having lots of problems initially, with Photoshop freaking out, before switching to PSB. After that, it was always able to deal with the big file, and performance improved significantly.
05-17-2010, 11:57 AM
Just a note that I'm working with a 30K by 25K image with 27 layers in Photoshop CS4 (64-bit) and it's running like a champ. With CS4, the trick is that it offloads OpenGL functions to the video card to improve performance. It's only an issue if you have many windows open (I open each element in a separate window and then composite into another window entirely), but there's one possible bottleneck.
The next thing to remember when using huge files in Photoshop is to zoom in to what you need rather than working with the whole image all the time. Newer editions of Photoshop finally realized they didn't need to spend memory on complete renders of things that are completely off screen.
Next, make sure you have a good scratch disk. Allocating space for a scratch disk increases Photoshop's performance drastically.
Lastly, remember to use Smart Objects. Anything that repeats in your file and is made of multiple layers should be reduced to a smart object. That way Photoshop only processes the object once, then applies transformations/filters to each instance of the object. I'll use Smart Objects even for simple items with 2 layers (city markers come to mind as a good example).
All in all, my specs aren't too different from the OP's.
NVidia 280GTX (1GB on-board VRAM)
What I probably have that's drastically different is my drive setup. I have a system drive, an apps drive, several storage drives and space that serves as a 'work space' area that I never fill. That gets used for that scratch space mentioned earlier.
Pixelpusher: I've never tried PSB. I may give it a shot. Performance is fine now, but I'll always take a tweak.
05-17-2010, 12:01 PM
Good info. Sounds like Adobe have finally woken up.
05-17-2010, 01:27 PM
Good info. Sounds like Adobe have finally woken up.
More they finally learned to work with modern memory management.
That said, a 2500-pixel smudge brush? Still makes it chug. Fair enough, though - the smudge brush has to sample across multiple layers and average as it draws, in real time. It's always been one of the functions that eats Photoshop if you use it much.
05-22-2010, 05:07 PM
Interesting thread. Here's from the Photoshop help: "With a few exceptions (for instance Large Document Format (PSB), Photoshop Raw, and TIFF), most file formats cannot support documents larger than 2 GB."
"The Large Document Format (PSB) supports documents up to 300,000 pixels in any dimension. All Photoshop features, such as layers, effects, and filters, are supported. (With documents larger than 30,000 pixels in width or height, some plug-in filters are unavailable.)"
30,000 x 30,000 pixels works out to about 100 x 100 inches at 300 dpi. That is a pretty big image :)
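The pixels-to-print-size arithmetic that keeps coming up in this thread is just pixels = inches x dpi; a tiny sketch (the function names are mine, and the 46.8 in figure is the long edge of an A0 sheet):

```python
def pixels(inches, dpi=300):
    """Pixel count needed to print a length at a given dpi."""
    return round(inches * dpi)

def print_inches(px, dpi=300):
    """Printed length of a pixel count at a given dpi."""
    return px / dpi

print(print_inches(30000))  # 100.0 inches, the example above
print(pixels(46.8))         # 14040 px, roughly the A0 long edge at 300 dpi
```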
That's slightly bigger than my desktop printer can print - so I guess I'm safe *lol*
Powered by vBulletin® Version 4.2.3 Copyright © 2016 vBulletin Solutions, Inc. All rights reserved.