hc-dev mailing list archives

From Oytun Turk <oytun_t...@yahoo.com>
Subject Using InputStreamEntity for streaming data from Http server to client
Date Mon, 01 Dec 2008 08:46:29 GMT
Hi,

We are trying to implement an HTTP server for our open source
text-to-speech synthesis system Mary (mary.dfki.de) based on
HttpComponents. We have completed all of the functionality
except for one missing piece:
streaming audio from the HTTP server to HTTP clients.

For this purpose, we use InputStreamEntity, pass it an input
stream to which the audio data is written, and set the content
length to -1. Note that the final size of the audio data is
unknown at synthesis time, and we want the clients to receive
the audio from the server as it becomes available there.

The above approach works OK in the sense that the whole audio
content is received by the client. However, we have problems getting
streaming to work properly. We have tried different audio formats
and focused on streaming mp3, since it's a common streaming format
supported by most browsers and plug-ins.

To reproduce the problem, I have put together a simple server
(by modifying the NIO example NHttpServer.java) and
an example HTML page that connects to it.
The sample code is available at the following links as zip or rar archives:

http://www.dfki.de/~otuerk/http/streamingAudio.zip

http://www.dfki.de/~otuerk/http/streamingAudio.rar


The content is as follows:

SimpleStreamingServer/Server.java: Server simulating streaming
SimpleStreamingServer/Test.html: HTML page that connects to Server
SimpleStreamingServer/Client.java: Java client that connects to Server
SimpleStreamingServer/src/1.mp3: Sample mp3 file to be streamed

To test this, the Server should be run first.
Then, upon a connection from the client
(i.e. by opening "Test.html" in a web browser or running Client.main),
the server responds with content read from the mp3 file.
Note that, normally, we will *not* be reading from a file 
but directly sending the mp3 stream as the server generates it.
I have based this scenario on "reading from a file"
to keep things simple here...
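
For reference, the request side of such a client might look roughly
like the sketch below. This is only an illustration based on the
HttpClient 4.x API; the host, port, path and variable names are
assumptions and are not taken from Client.java:

// Hypothetical request side; the server address is an assumption
HttpClient httpclient = new DefaultHttpClient();
HttpGet request = new HttpGet("http://localhost:8080/");
HttpResponse response = httpclient.execute(request);
// the returned entity is then consumed as in question (3) below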

The reading and response generation thread is implemented
in the class "SimpleStreamingServer/src/AudioStreamer.java",
within its run() method. There, mp3 audio data is read in small chunks,
and an artificial delay is added using sleep() to simulate
processing time.
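
In essence, that run() method does something along the lines of the
following simplified sketch; the field names, chunk size and delay
value are placeholders, and the actual AudioStreamer.java in the
archive above is authoritative:

// Simplified sketch of AudioStreamer.run(); "pos" (the PipedOutputStream)
// and "mp3File" are assumed to be fields set in the constructor, and the
// chunk size and sleep time are placeholder values.
public void run()
{
    try
    {
        FileInputStream mp3 = new FileInputStream(mp3File);
        byte[] chunk = new byte[4000];
        int n;
        while ((n = mp3.read(chunk)) != -1)
        {
            pos.write(chunk, 0, n);  // hand the chunk to the piped stream
            pos.flush();
            Thread.sleep(200);       // artificial delay simulating processing time
        }
        mp3.close();
    }
    catch (Exception e)
    {
        e.printStackTrace();
    }
    finally
    {
        try { pos.close(); } catch (IOException e) {}
    }
}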

The questions are:

(1) Is "InputStreamEntity" the correct entity type to generate 
the HTTP response for our purposes? 
Could someone please comment on whether it's being used properly 
in the handle() function in "Server.java", i.e.:

... //between lines 142-150
PipedOutputStream pos = new PipedOutputStream();
PipedInputStream pis = new PipedInputStream(pos);
AudioStreamer streamer = new AudioStreamer(response, pos, "1.mp3");

streamer.start();  // AudioStreamer.run() writes the mp3 data to "pos"
InputStreamEntity body = new InputStreamEntity(pis, -1);  // -1: content length unknown
body.setContentType("audio/x-mp3");
response.setEntity(body);
...

Here we want to write the audio output to "pos" in AudioStreamer.run(), 
which will then be piped to the corresponding input stream "pis". 
"pis" then goes into the InputStreamEntity and is sent to the client.


(2) If the above approach for generating the response entity is wrong, 
is there another way to respond to the client so that it starts receiving 
data with minimal delay, without having to wait for the server thread 
to complete?


(3) The Java-based "Client" receives the packages from the "Server" 
and writes them to disk:

...
//Change file path to get this working on another computer
String outFile = "d:\\received.mp3"; 
FileOutputStream outputStream = new FileOutputStream(new File(outFile));
HttpEntity entity = response.getEntity();

InputStream stream = entity.getContent();
byte[] vals = new byte[1];  // read one byte at a time until EOF
int counter = 0;
int packageNo = 0;
while (stream.read(vals)==vals.length)
{
   counter++;
   outputStream.write(vals);
   if (counter==4000)  // report every 4000 bytes ("package")
   {
     System.out.println("Received package #"+String.valueOf(++packageNo));
     counter = 0;
   }
}

entity.consumeContent();
outputStream.close();
stream.close();
...

This works fine, but the client only starts receiving the data after 
the server has sent all of the audio data, not as it becomes available 
in the input stream ("pis"), i.e. the stream that was passed 
to the InputStreamEntity.
Are we doing something wrong on the client side to receive 
the streaming data properly?

Thank you very much in advance...

Oytun Turk, DFKI Speech Group

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@hc.apache.org
For additional commands, e-mail: dev-help@hc.apache.org

