Upstream with multiple request reply

Bertrand Paquet  Mon, 16 Jan 2012 20:40:39 +0000 (UTC)
Hi all,

I'm trying to write an upstream module that has to do multiple request/reply
exchanges with the backend for a single frontend request.

Currently:
- I write the first request to the backend in the create_request handler,
put the data into r->upstream->request_bufs, and return NGX_OK (a minimal
sketch of this step follows after this list).
- nginx then calls me in process_header. I can read data from
r->upstream->buffer, and I can loop on process_header by returning NGX_AGAIN.
But I'm not able to send more data to the backend: I tried adding a buffer to
r->upstream->request_bufs and to r->upstream->request_bufs->next, and nothing
works.
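
For reference, a minimal create_request handler along the lines described
above might look roughly like this (hypothetical handler name and payload,
not the actual module code):

/* Minimal sketch of the create_request step described above: build one
 * buffer, chain it, and hand it to the upstream machinery. */
static ngx_int_t
ngx_http_my_create_request(ngx_http_request_t *r)
{
    ngx_buf_t    *b;
    ngx_chain_t  *cl;
    ngx_str_t     msg = ngx_string("HELLO\n");

    b = ngx_create_temp_buf(r->pool, msg.len);
    if (b == NULL) {
        return NGX_ERROR;
    }

    b->last = ngx_cpymem(b->last, msg.data, msg.len);

    cl = ngx_alloc_chain_link(r->pool);
    if (cl == NULL) {
        return NGX_ERROR;
    }

    cl->buf = b;
    cl->next = NULL;

    /* nginx sends this chain to the backend after connecting;
       process_header then reads the reply from r->upstream->buffer */
    r->upstream->request_bufs = cl;

    return NGX_OK;
}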

Does anybody know how to send data to the backend from the process_header callback?

Regards,

Bertrand
Maxim Dounin  Mon, 16 Jan 2012 21:20:24 +0000 (UTC)
Hello!

On Mon, Jan 16, 2012 at 09:40:26PM +0100, Bertrand Paquet wrote:

> Hi all,
> 
> I'm trying to write an upstream which have to do multiple request / reply
> with backend, for one frontend request.
> 
> Currently,
> - I'm writing first request to backend in create_request handler, write
> data into r->upstream->request_bufs, and returning NGX_OK
> - Nginx call me on process_header. I can read data from
> r->upstream->buffer. I can loop on process_header if I return NGX_AGAIN.
> But I'm not able to send more data to backend. I try to add buffer in
> r->upstream->request_bufs, in r->upstream->request_bufs->next, nothing work.
> 
> Anybody know how to send data to backend in the process_header callback ?

The upstream module is designed to handle the "single request - single
response" model; it is not capable of sending multiple requests to the
backend.

Maxim Dounin
Bertrand Paquet  Mon, 16 Jan 2012 21:46:04 +0000 (UTC)
Hi,

Argh, that's not good news :)

Do you think I can use ngx_http_subrequest to make additional requests to the
backend?
And how do I tell nginx to wait for the subrequest to finish before calling
create_request?

Regards,

Bertrand

On Mon, Jan 16, 2012 at 22:20, Maxim Dounin  wrote:

> Hello!
>
> On Mon, Jan 16, 2012 at 09:40:26PM +0100, Bertrand Paquet wrote:
>
> > Hi all,
> >
> > I'm trying to write an upstream which have to do multiple request / reply
> > with backend, for one frontend request.
> >
> > Currently,
> > - I'm writing first request to backend in create_request handler, write
> > data into r->upstream->request_bufs, and returning NGX_OK
> > - Nginx call me on process_header. I can read data from
> > r->upstream->buffer. I can loop on process_header if I return NGX_AGAIN.
> > But I'm not able to send more data to backend. I try to add buffer in
> > r->upstream->request_bufs, in r->upstream->request_bufs->next, nothing
> > work.
> >
> > Anybody know how to send data to backend in the process_header callback ?
>
> The upstream module is designed to handle "single request - single
> response" model, it's not capable of sending multiple requests to
> backend.
>
> Maxim Dounin
>
> _______________________________________________
> nginx mailing list
> 
> http://mailman.nginx.org/mailman/listinfo/ngi...
>
Maxim Dounin  Mon, 16 Jan 2012 22:17:30 +0000 (UTC)
Hello!

On Mon, Jan 16, 2012 at 10:45:58PM +0100, Bertrand Paquet wrote:

> Hi,
> 
> Arg, it's not a good news :)
> 
> Do you think I can do ngx_http_subrequest to do some additional request to
> back end ?
> How to say to nginx to wait for subrequest before calling create_request ?

If you are ok with *independent* requests to the backend (that is, you
just need the two requests to happen, but don't need to implement some
complex protocol), then using subrequests is the right way to go.

See e.g. the addition filter module sources for a simple example of
subrequest usage, or the ssi module sources for a more complex one.
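
For illustration only (this is not taken from either module; the URI and
handler name are made up), the basic ngx_http_subrequest() call looks roughly
like this:

/* Bare-bones subrequest sketch. The subrequest runs as an independent
 * internal request, which is why this approach does not help with a
 * stateful multi-exchange backend protocol. */
static ngx_int_t
ngx_http_my_start_subrequest(ngx_http_request_t *r)
{
    ngx_http_request_t  *sr;
    ngx_str_t            uri = ngx_string("/extra-backend-call");

    if (ngx_http_subrequest(r, &uri, NULL, &sr, NULL,
                            NGX_HTTP_SUBREQUEST_IN_MEMORY)
        != NGX_OK)
    {
        return NGX_ERROR;
    }

    /* a ngx_http_post_subrequest_t handler can be passed instead of the
       second NULL to get a callback when the subrequest finishes */

    return NGX_OK;
}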

Maxim Dounin

> 
> Regards,
> 
> Bertrand
> 
> 
> On Mon, Jan 16, 2012 at 22:20, Maxim Dounin  wrote:
> 
> > Hello!
> >
> > On Mon, Jan 16, 2012 at 09:40:26PM +0100, Bertrand Paquet wrote:
> >
> > > Hi all,
> > >
> > > I'm trying to write an upstream which have to do multiple request / reply
> > > with backend, for one frontend request.
> > >
> > > Currently,
> > > - I'm writing first request to backend in create_request handler, write
> > > data into r->upstream->request_bufs, and returning NGX_OK
> > > - Nginx call me on process_header. I can read data from
> > > r->upstream->buffer. I can loop on process_header if I return NGX_AGAIN.
> > > But I'm not able to send more data to backend. I try to add buffer in
> > > r->upstream->request_bufs, in r->upstream->request_bufs->next, nothing
> > > work.
> > >
> > > Anybody know how to send data to backend in the process_header callback ?
> >
> > The upstream module is designed to handle "single request - single
> > response" model, it's not capable of sending multiple requests to
> > backend.
> >
> > Maxim Dounin
> >
> > _______________________________________________
> > nginx mailing list
> > 
> > http://mailman.nginx.org/mailman/listinfo/ngi...
> >

> _______________________________________________
> nginx mailing list
> 
> http://mailman.nginx.org/mailman/listinfo/ngi...
bigplum  Tue, 17 Jan 2012 03:15:01 +0000 (UTC)
Hi,

This demo may be helpful:
https://github.com/bigplum/nginx-http-nphase-...
Bertrand Paquet  Tue, 17 Jan 2012 10:29:06 +0000 (UTC)
Hi all,

In the end I did not use subrequests; I wrote the following code, inspired by
the upstream module. I call ngx_http_upstream_send_another_request from
process_header (a simplified sketch of the call site appears just below), and
it's working fine.
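
For context, a simplified, hypothetical sketch of that call site in
process_header could look like the code below. It assumes the new payload is
placed in u->request_bufs and u->request_sent is cleared before the call, so
that ngx_http_upstream_send_another_request() actually passes the new chain
to ngx_output_chain(); that part is not shown in the posted function.

/* Hypothetical sketch of a process_header that queues a second request
 * (not the posted code). */
static ngx_int_t
ngx_http_my_process_header(ngx_http_request_t *r)
{
    ngx_buf_t            *b;
    ngx_chain_t          *cl;
    ngx_http_upstream_t  *u;
    ngx_str_t             next = ngx_string("NEXT\n");

    u = r->upstream;

    /* ... parse the first reply, available in u->buffer, here ... */

    /* build the second request */
    b = ngx_create_temp_buf(r->pool, next.len);
    if (b == NULL) {
        return NGX_ERROR;
    }
    b->last = ngx_cpymem(b->last, next.data, next.len);

    cl = ngx_alloc_chain_link(r->pool);
    if (cl == NULL) {
        return NGX_ERROR;
    }
    cl->buf = b;
    cl->next = NULL;

    u->request_bufs = cl;
    u->request_sent = 0;    /* assumption: force a real send of the new chain */

    if (ngx_http_upstream_send_another_request(r, u) == NGX_ERROR) {
        return NGX_ERROR;
    }

    /* NGX_AGAIN makes nginx call process_header again when more data
       arrives from the backend */
    return NGX_AGAIN;
}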

Feel free to comment on my code if you think it could have problems: memory
allocation, segfaults, or anything else.

Regards,

Bertrand


static void
ngx_http_upstream_send_another_request_dummy_handler(ngx_http_request_t *r,
    ngx_http_upstream_t *u)
{
    ngx_log_debug0(NGX_LOG_DEBUG_HTTP, r->connection->log, 0,
                   "http upstream send another request dummy handler");
}

static ngx_int_t ngx_http_upstream_send_another_request(ngx_http_request_t *r,
    ngx_http_upstream_t *u);

/* write event handler used while the additional request is being sent */
static void
ngx_http_upstream_send_another_request_handler(ngx_http_request_t *r,
    ngx_http_upstream_t *u)
{
    ngx_log_debug0(NGX_LOG_DEBUG_HTTP, r->connection->log, 0,
                   "http upstream send another request handler");

    ngx_http_upstream_send_another_request(r, u);
}

static ngx_int_t
ngx_http_upstream_send_another_request(ngx_http_request_t *r,
    ngx_http_upstream_t *u)
{
    ngx_int_t          rc;
    ngx_connection_t  *c;

    c = u->peer.connection;

    ngx_log_debug0(NGX_LOG_DEBUG_HTTP, c->log, 0,
                   "http upstream send another request");

    // the connect test from ngx_http_upstream_send_request() is disabled here:
    //
    // if (!u->request_sent && ngx_http_upstream_test_connect(c) != NGX_OK) {
    //     ngx_http_upstream_next(r, u, NGX_HTTP_UPSTREAM_FT_ERROR);
    //     return;
    // }

    c->log->action = "sending request to upstream";

    rc = ngx_output_chain(&u->output,
                          u->request_sent ? NULL : u->request_bufs);

    u->request_sent = 1;

    if (rc == NGX_ERROR) {
        return rc;
    }

    if (c->write->timer_set) {
        ngx_del_timer(c->write);
    }

    if (rc == NGX_AGAIN) {
        ngx_log_debug0(NGX_LOG_DEBUG_HTTP, c->log, 0,
                       "ngx_output_chain return NGX_AGAIN");

        /* not everything was written: wait for the write event and retry */
        u->write_event_handler =
                              ngx_http_upstream_send_another_request_handler;

        ngx_add_timer(c->write, u->conf->send_timeout);

        if (ngx_handle_write_event(c->write, u->conf->send_lowat) != NGX_OK) {
            return NGX_ERROR;
        }

        return NGX_AGAIN;
    }

    /* rc == NGX_OK */

    if (c->tcp_nopush == NGX_TCP_NOPUSH_SET) {
        if (ngx_tcp_push(c->fd) == NGX_ERROR) {
            ngx_log_error(NGX_LOG_CRIT, c->log, ngx_socket_errno,
                          ngx_tcp_push_n " failed");
            return NGX_ERROR;
        }

        c->tcp_nopush = NGX_TCP_NOPUSH_UNSET;
    }

    ngx_add_timer(c->read, u->conf->read_timeout);

// #if 1
//     if (c->read->ready) {
//
//         /* post aio operation */
//
//         /*
//          * TODO comment
//          * although we can post aio operation just in the end
//          * of ngx_http_upstream_connect() CHECK IT !!!
//          * it's better to do here because we postpone header buffer allocation
//          */
//
//          return u->process_header(r);
//     }
// #endif

    /* the request has been sent entirely: just wait for the reply */
    u->write_event_handler =
                        ngx_http_upstream_send_another_request_dummy_handler;

    if (ngx_handle_write_event(c->write, 0) != NGX_OK) {
        return NGX_ERROR;
    }

    return NGX_AGAIN;
}

On Tue, Jan 17, 2012 at 04:14, bigplum  wrote:

> Hi,
>
> This demo maybe helpful.
> https://github.com/bigplum/nginx-http-nphase-...
>
> Posted at Nginx Forum:
> http://forum.nginx.org/read.php?2,221329,2213...
>
> _______________________________________________
> nginx mailing list
> 
> http://mailman.nginx.org/mailman/listinfo/ngi...
>