ChangeScreenBuffer() quit problem - is it OK to use windows?

This forum is for general developer support questions.
ChrisH
Beta Tester
Posts: 920
Joined: Mon Dec 20, 2010 9:09 pm

Re: ChangeScreenBuffer() quit problem - is it OK to use windows?

Post by ChrisH »

@broadblues
Yes, you COULD interpret it like that. I was merely HOPING that interpretation was wrong, because it makes things more complicated (although I've now worked around it).

HOWEVER, the following quote seems to suggest that you ARE supposed to be able to use windows on a double-buffered screen (unless gadgets can be attached directly to the screen, without a window?)
Only a small subset of gadgets are supportable in double-buffered screens. These gadgets are those whose imagery returns to the initial state when you release them (e.g., action buttons or the screen's depth gadget). To use other kinds of gadgets (such as sliders or string gadgets) you need to put them on a separate screen, which can be an attached screen.
ChrisH
Beta Tester
Posts: 920
Joined: Mon Dec 20, 2010 9:09 pm

Re: ChangeScreenBuffer() quit problem - is it OK to use windows?

Post by ChrisH »

Just to summarise what I posted on Amigans.net: AmigaOS *3* seems to have a similar problem when ChangeScreenBuffer() is called an *odd* number of times. This suggests it is an OS *limitation*, rather than an OS bug. So there is no need for me to file a BugZilla report.

HOWEVER, it also appears perfectly fine to create screen buffers after opening a window, as well as close a window after destroying screen buffers, and in fact write to the window after destroying screen buffers.
Hans
AmigaOS Core Developer
Posts: 703
Joined: Tue Dec 21, 2010 9:25 pm
Location: New Zealand

Re: ChangeScreenBuffer() quit problem - is it OK to use windows?

Post by Hans »

@ChrisH
I have just replied on amigans.net. For the benefit of others:

@all
As Chris said, this is an OS limitation. You should not be rendering via windows on double-buffered screens. ChangeScreenBuffer() was designed for direct rendering to the screen's back-bitmap only. Windows have no concept of double-buffered screens, and they will not change which buffer they render to in response to ChangeScreenBuffer(); each window has a private rast-port with a pointer to its target bitmap. The only use for a window on such a screen is to capture input events. ChangeScreenBuffer() also has no way of knowing which buffer you intend to render to next: while the next render buffer would be obvious in a double-buffered situation, ChangeScreenBuffer() supports having even more buffers, and does not enforce a buffer swap order.

If there is a need to render and use windows on a double-buffered screen, then a new intuition function (e.g., ChangeScreenRenderBuffer()) would have to be created. Personally, I like that idea; being able to render windows and gadgets to a double-/triple-buffered screen does have its uses.

Hans

P.S. Whatever you do, please do NOT go poking around in the Window and Screen structures to try to manually work around this problem. Those structures could change, so treat them as black boxes and use OS functions.
http://hdrlab.org.nz/ - Amiga OS 4 projects, programming articles and more. Home of the RadeonHD driver for Amiga OS 4.x project.
ChrisH
Beta Tester
Posts: 920
Joined: Mon Dec 20, 2010 9:09 pm

Re: ChangeScreenBuffer() quit problem - is it OK to use windows?

Post by ChrisH »

Hans wrote:P.S. Whatever you do, please do NOT go poking around in the Window and Screen structures to try to manually work around this problem. Those structures could change, so treat them as black boxes and use OS functions.
I assume you were hinting that people should NOT try changing the bitmap used by windows (in a misguided attempt to get windows to work with double-buffering)?

Rather than closing & reopening the whole screen to disable/enable double-buffering, my solution was to render into the screen's bitmap at the location of my (fixed-position) backdrop window while double-buffering is enabled (as well as ensuring that the original screen bitmap is visible when disabling double-buffering). This might not work as expected if any 'foreign' windows ever open above the window (while double-buffering is enabled), but I'll cross that bridge if I come to it...
broadblues
AmigaOS Core Developer
Posts: 600
Joined: Sat Jun 18, 2011 3:40 am
Location: Portsmouth, UK

Re: ChangeScreenBuffer() quit problem - is it OK to use windows?

Post by broadblues »

ChrisH wrote:This might not work as expected if any 'foreign' windows ever open above the window (while double-buffering is enabled), but I'll cross that bridge if I come to it...
'Foreign' windows can only open on a public screen, and only custom screens can/should be double-buffered, so in principle that shouldn't be possible.
ChrisH
Beta Tester
Posts: 920
Joined: Mon Dec 20, 2010 9:09 pm

Re: ChangeScreenBuffer() quit problem - is it OK to use windows?

Post by ChrisH »

broadblues wrote:only custom screens can / should be double buffered
Something else to add to the missing list of limitations of ChangeScreenBuffer()!

Having thought about it further, there are cases where I might open (GUI) windows on the screen. So I will have to switch back to "fake" double-buffering in those cases (i.e. WaitTOF() + BltBitMap()).
Last edited by ChrisH on Thu Jun 07, 2012 10:32 am, edited 1 time in total.
Hans
AmigaOS Core Developer
Posts: 703
Joined: Tue Dec 21, 2010 9:25 pm
Location: New Zealand

Re: ChangeScreenBuffer() quit problem - is it OK to use windows?

Post by Hans »

@ChrisH
ChrisH wrote:
broadblues wrote:only custom screens can / should be double buffered
Something else to add to the missing list of limitations of ChangeScreenBuffer()!
I imagine that allowing public screens to be double-buffered would be a nightmare. Those "foreign" programs would have no knowledge of the double-buffering, and so wouldn't know when the buffer swap was going to occur. It also isn't practical to redraw windows every frame, so they would all have to render to off-screen bitmaps which would then be composited onto the screen. At that point, the benefit of having a double-buffered screen would be almost gone. I say "almost," because it would still eliminate tearing artifacts. To top it off, deciding when the screen compositing should happen would be complex, if not impossible, unless intuition took over full maintenance of that screen (i.e., it calls ChangeScreenBuffer() etc., and not your program). All in all, I don't think that relaxing this restriction would be a sensible thing to do.

Thinking about this a bit more, the same reasons also apply to why having any windows on double-buffered screens isn't particularly practical. Those windows would either have to be redrawn every frame (impractical), or rendered to an off-screen bitmap, and then blitted in before swapping the buffer (in which case double-buffering is almost redundant). Multiple windows and application-controlled double-buffering don't work together.

Please do take the time to submit bugzilla tickets against the SDK documentation for those items that you think need clarification.

Hans
http://hdrlab.org.nz/ - Amiga OS 4 projects, programming articles and more. Home of the RadeonHD driver for Amiga OS 4.x project.
ChrisH
Beta Tester
Posts: 920
Joined: Mon Dec 20, 2010 9:09 pm

Re: ChangeScreenBuffer() quit problem - is it OK to use windows?

Post by ChrisH »

Hans wrote:I imagine that allowing public screens to be double-buffered would be a nightmare.
I never intended to suggest that the OS SHOULD allow it. I have edited my post to say what I had intended.
Hans wrote:Please do take the time to submit bugzilla tickets against the SDK documentation for those items that you think need clarification.
Already done.