Page 3 of 3 FirstFirst 123
Results 21 to 29 of 29

Thread: PS: Question on huge size images

  1. #21 Redrobes — Software Dev/Rep (England, joined Dec 2007, 4,775 posts)

    In an OS you have user space and kernel space, and a while ago MS didn't think we would ever reach 4 GB of RAM, so it divided the address space evenly, 2 GB each. The kernel has to keep some RAM under a critical lock because it's the low-level stuff and your machine will die without it. All the apps you run, apart from heavy device drivers, live in user space, so on the old setup user apps could get at 2 GB. When machines got near 4 GB, MS realized the 2/2 split wasn't very sensible, since the kernel doesn't need a full 2 GB to run, so it split the space differently. I've heard all sorts of numbers here, but yours are about the norm: roughly 3.5 GB for user and 0.5 GB for the kernel. You're also right that certain graphics cards that share system RAM can eat into some of that, reserving more for the kernel and leaving less for user space. Most cards have their own video RAM and only need an "aperture", a kind of cache for transferring data from system RAM into video RAM. So that's a bit more detail, but you're right on the money.
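The address-space arithmetic above can be sketched in a few lines; this is purely illustrative, using the 3.5/0.5 split reported in the thread (the exact numbers vary by system and boot options):

```python
# Illustrative arithmetic only: the 32-bit virtual address space splits
# described above, expressed in bytes.
GIB = 1024 ** 3
address_space = 2 ** 32          # 4 GiB total on a 32-bit OS

# Old Windows default: even 2 GB / 2 GB split
old_user, old_kernel = 2 * GIB, 2 * GIB

# Later split mentioned in the thread: ~3.5 GB user / ~0.5 GB kernel
new_user = int(3.5 * GIB)
new_kernel = address_space - new_user

print(new_user / GIB, new_kernel / GIB)  # 3.5 0.5
```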

    As for the cards, it's been possible for some time to use them as processing engines, ever since they got programmable shaders, though the early shader programs were very limited and simple. As time went on shader complexity went up, and then a full shader language was built into OpenGL 2, so your graphics card basically needed to run arbitrary code. nVidia and ATI both now make GPUs that run arbitrary code through interfaces anyone can use: right now that's CUDA for nVidia, and Close to the Metal / Stream for ATI, I think it's called. It's down to the card, not the DirectX version. There's also a newer vendor-neutral interface from the same people who did OpenGL, called OpenCL (Open Computing Language); the idea is that once nVidia and ATI both ship unrestricted licensed drivers for it, we can write in OpenCL and run on either vendor's hardware. Not sure where that stands now, but last I heard you still had to be a registered dev with nVidia to get access. When it happens, graphics-intensive apps like PS will make more use of these APIs and accelerate some of the processing in hardware, which will speed things up a lot. In the meantime, though, expect your CPU to be in overdrive when doing large-area computes like a blur. If the image is so large you're into pagefile territory, then as said earlier you're at that 100:1 HDD ratio: your CPU does 1% of the work and the HDD does 99%+, and you fall off the performance cliff. It's in these instances that you get a 100x speed increase if you can prevent the system from paging! That's the point you must stay behind in order to work effectively. So my tip is to tile the image and stay behind that point, or it gets painful.
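The tiling tip at the end can be sketched like this; a minimal illustration, with the tile size and function name invented for the example (nothing here is from PS itself):

```python
def tiles(width, height, tile_size):
    """Yield (x, y, w, h) rectangles covering a width x height image.

    Processing one tile at a time keeps the working set small enough
    to stay in RAM and avoid the paging cliff described above.
    """
    for y in range(0, height, tile_size):
        for x in range(0, width, tile_size):
            yield (x, y,
                   min(tile_size, width - x),
                   min(tile_size, height - y))

# e.g. a 20000 x 10000 image in 4096-px tiles -> 5 x 3 = 15 tiles
print(sum(1 for _ in tiles(20000, 10000, 4096)))  # 15
```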

  2. #22 su_liam — Guild Artisan (Port Alberta, Regina (IRL: Eugene, OR), joined Aug 2007, 701 posts)

    I've been manipulating imagery at 21601 × 10801 recently — my record. This is using PS CS4 on my two-year-old base MacBook (2.1 GHz Core 2 Duo, 2 GB RAM) on Mac OS X 10.6.3.

    A little slow generally. Flexify takes a couple of minutes — not too bad. Saving onto my 1.5 TB external drive is a bit excruciating, but that's because the drive is cheap and slow. With my old 1 GHz single-core PowerBook and CS1, 8192 × 4096 was more of a challenge than this (actually painful in extremis dei, amen). I like my little bargain-basement Mac laptop.

  3. #23 Gidde — Community Leader (Michigan, USA, joined May 2009, 2,992 posts)

    My gawd. 20k × 10k?? I'm incredibly jealous of everyone posting in this thread.

  4. #24 Guild Novice (joined Jun 2009, 5 posts)

    Hi Djekspek,

    I recently completed a very large map, 9200 px × 7200 px, using Photoshop and Illustrator CS4 — 1800 MB in total. It has about 40 layers, although many of those were solid fills with an alpha channel or placed PDFs from Illustrator. (The Illustrator file, with all the linework and text, was around 2 GB.) Breaking it up this way made it much more digestible to Photoshop.

    My biggest tip is to use the PSB format (not PSD). This is Photoshop's "large document format", and it makes a huge difference in performance. I was having lots of problems initially with Photoshop freaking out before switching to PSB. After that, it was always able to deal with the big file, and performance improved significantly.
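A rough back-of-the-envelope for why a file like this outgrows PSD: the raw, uncompressed pixel data alone dwarfs PSD's 2 GB file-size cap (quoted from the Photoshop help later in this thread). The numbers below assume 8-bit RGBA and ignore compression, so real on-disk sizes are smaller, but the order of magnitude is the point:

```python
# Illustrative estimate only: raw, uncompressed RGBA at 8 bits/channel.
width, height, layers = 9200, 7200, 40
bytes_per_pixel = 4                      # R, G, B + alpha, one byte each

per_layer = width * height * bytes_per_pixel
total = per_layer * layers

print(per_layer // 2**20, "MiB per layer")   # 252 MiB per layer
print(round(total / 2**30, 1), "GiB raw")    # 9.9 GiB raw, far past PSD's 2 GB cap
```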

  5. #25 minimal — Guild Novice (joined Aug 2009, 21 posts)

    Just a note that I'm working with a 30k × 25k image with 27 layers in Photoshop CS4 (64-bit) and it's running like a champ. With CS4 the trick is that it offloads OpenGL functions to the video card to improve performance. That's only an issue if you have many windows open (I open each element in a separate window and then composite into another window entirely), so there's one possible bottleneck.

    The next thing to remember when using huge files in Photoshop is to zoom in to what you need rather than working with the whole image all the time. Newer editions of Photoshop finally realized they didn't need to spend memory on complete renders of things entirely off screen.

    Next, make sure you have a good scratch disk. Allocating space for a scratch disk drastically increases Photoshop's performance.

    Lastly, remember to use Smart Objects. Anything in your file that repeats and is made of multiple layers should be reduced to a Smart Object. That way Photoshop only processes the object once, then applies transformations/filters to each instance of it. I'll use Smart Objects even for simple items with two layers (city markers come to mind as a good example).

    All in all, my specs aren't too different from the OP's.

    2.4 GHz quad-core
    6 GB RAM
    nVidia GTX 280 (1 GB on-board VRAM)

    What I probably have that's drastically different is my drive setup. I have a system drive, an apps drive, several storage drives and space that serves as a 'work space' area that I never fill. That gets used for that scratch space mentioned earlier.

    Pixelpusher: I've never tried PSB. I may give it a shot. Performance is fine now, but I'll always take a tweak.

  6. #26 Redrobes — Software Dev/Rep (England, joined Dec 2007, 4,775 posts)

    Good info. Sounds like Adobe have finally woken up.

  7. #27 minimal — Guild Novice (joined Aug 2009, 21 posts)

    Quote Originally Posted by Redrobes View Post
    Good info. Sounds like Adobe have finally woken up.
    More that they finally learned to work with modern memory management.

    That said, a 2500-pixel smudge brush? Still makes it chug. To be fair, though, the smudge brush has to sample across multiple layers and average as it draws in real time. It's always been one of the functions that eats Photoshop if you use it much.

  8. #28 Carnifex — Professional Artist (Sweden, joined Feb 2008, 220 posts)

    Interesting thread. Here's from the Photoshop help: "With a few exceptions (for instance Large Document Format (PSB), Photoshop Raw, and TIFF), most file formats cannot support documents larger than 2 GB."
    and:
    "The Large Document Format (PSB) supports documents up to 300,000 pixels in any dimension. All Photoshop features, such as layers, effects, and filters, are supported. (With documents larger than 30,000 pixels in width or height, some plug-in filters are unavailable.)"

    30,000 × 30,000 pixels is about 100 × 100 inches at 300 dpi. That is a pretty big image.

  9. #29 tilt — Community Leader (Trelleborg, Sweden, joined May 2010, 4,657 posts)

    Quote Originally Posted by Carnifex View Post
    30 000 x 30 000 pixels will be about 100 x 100 inch on a 300 dpi resolution image. That is a pretty big image
    that's slightly bigger than my desktop printer can print - so I guess I'm safe *lol*
    regs tilt

