[Zope-dev] Streaming Response

Danny W. Adair danny@adair.net
Tue, 08 Apr 2003 12:20:31 +1200


################################################################################
# I first posted this to zope@zope.org, without success. Although it is not
# about the development of Zope itself, it deals with its internals, so now
# I post it here.
# I hope everyone is alright with that. :-)
################################################################################


Hi,

I would like to show status messages while calling a couple of functions 
from within a Python Script: while the script is running, I want to 
successively output messages like

------------------------
Doing this...ok
Doing that...ok
Done
------------------------
and then either redirect or display a button to move on.

1. I do _not_ want to use JavaScript
2. I do _not_ want to use the meta tag "refresh"

I would like to use Response.write() to push my status messages on the fly.
Since I don't know which function calls will succeed and which will fail, I 
do not know "Content-Length" in advance. Therefore, I would like to use a 
streaming/chunked response as specified in HTTP 1.1.
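
To make it a bit more concrete, this is the kind of Python Script I have 
in mind (only a sketch: doThis() and doThat() stand in for my real 
function calls, and I am assuming RESPONSE.write() really pushes each 
message out immediately, which is exactly what I cannot get to happen):

------------------------
# Sketch of the intended script. doThis()/doThat() are placeholders
# for my real long-running calls.
response = context.REQUEST.RESPONSE
response.setHeader('Content-Type', 'text/plain')

response.write('Doing this...')
ok = context.doThis()
if ok:
    response.write('ok\n')
else:
    response.write('failed\n')

response.write('Doing that...')
ok = context.doThat()
if ok:
    response.write('ok\n')
else:
    response.write('failed\n')

response.write('Done\n')
------------------------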

How do I do this in Zope?

I tried "response.setHeader('Content-Type', 
'multipart/x-mixed-replace')" but couldn't get it to work. (Also, it 
looks as if this only works in Netscape, not in IE. True? Then 
unfortunately I can't go that route anyway.)
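
Roughly what I tried, as a sketch ("StatusBoundary" is just a boundary 
string I made up, and the messages are hard-coded here):

------------------------
# Server-push attempt: multipart/x-mixed-replace with boundary-
# separated parts written via RESPONSE.write().
response = context.REQUEST.RESPONSE
response.setHeader('Content-Type',
                   'multipart/x-mixed-replace; boundary=StatusBoundary')

for message in ('Doing this...ok', 'Doing that...ok', 'Done'):
    response.write('--StatusBoundary\r\n')
    response.write('Content-Type: text/html\r\n\r\n')
    response.write('<p>%s</p>\r\n' % message)

response.write('--StatusBoundary--\r\n')
------------------------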

I also tried to leave the header alone and just "Response.write()" the 
multipart content-type with its boundary and the content-types of the 
parts, but ZPublisher just returned it all as plain text...
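
That attempt looked roughly like this (again only a sketch, with 
hard-coded messages); the browser just shows all of it as plain text:

------------------------
# Header left alone, multipart syntax written straight into the body.
response = context.REQUEST.RESPONSE
response.write('Content-Type: multipart/x-mixed-replace; '
               'boundary=StatusBoundary\r\n\r\n')
response.write('--StatusBoundary\r\n')
response.write('Content-Type: text/plain\r\n\r\n')
response.write('Doing this...ok\r\n')
------------------------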

I guess the answer lies somewhere in "Transfer-Encoding: chunked" but I 
can't figure out how to actually do it. :-)
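
Just so it is clear what I mean: as far as I understand HTTP/1.1, each 
chunk is its size in hex, a CRLF, the data, another CRLF, and a 
zero-sized chunk ends the body. Something like this (pure illustration 
of the wire format, not Zope code):

------------------------
# Illustration of the chunked transfer-coding from HTTP/1.1.
def as_chunk(data):
    # hex length, CRLF, data, CRLF
    return '%x\r\n%s\r\n' % (len(data), data)

body = (as_chunk('Doing this...ok\n')
        + as_chunk('Doing that...ok\n')
        + as_chunk('Done\n')
        + '0\r\n\r\n')    # zero-sized chunk terminates the body
------------------------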

Maybe one of you guys working on HTTPResponse.py (Fred, Chris, Brian, Toby, 
Andreas) knows how to accomplish this? Anyone else? I appreciate the 
smallest hint. ;-)

Thank you very much for your help,
Danny

P.S.: Please include my address in your reply ("Reply All") since I am 
currently not subscribed to this list. Thank you.