Cloudflare Implementation #49


Closed
sideshot opened this issue Jun 6, 2025 · 2 comments
Labels: discussion, package:agents-core, question (Further information is requested)

Comments


sideshot commented Jun 6, 2025

Is this the intended setup for a Cloudflare Worker? I'm particularly interested in how the streaming is handled.

I'm running into an issue when following the cookbook instructions for the prompt: "Before any tool, tell the user 'I will be right back'." The tool initiates correctly, but there's no response afterward—it just ends the session.

/*────────────────────────  Manual SSE chat handler  ─────────────────*/
async function handleChat(
  req: Request,
  env: Env,
  ctx: ExecutionContext
): Promise<Response> {
  /* Parse JSON body */
  const { message, lastResponseId, trace_id } = (await req.json()) as {
    message: string;
    lastResponseId?: string;
    trace_id?: string;
  };

  /* Get or generate a trace_id for grouping conversations.
     crypto.randomUUID() yields 32 hex characters once the dashes are stripped,
     which is longer and more collision-resistant than a Math.random() slice. */
  const traceId = trace_id || `trace_${crypto.randomUUID().replace(/-/g, '')}`;

  /* Initialise tracing/exporter once */
  await initTracing(env.OPENAI_API_KEY);

  /* Dynamically import SDK pieces that may use crypto or fs */
  const { Agent, Runner, withTrace } = await import("@openai/agents");

  /* Build agent & runner */
  const agent = new Agent({
    name: "Site Assistant",
    instructions: "You will answer questions by using the search_tool(query).",
    outputType: "text",
    model: "gpt-4.1",
    // tools: [searchTool],
  });

  const runner = new Runner({
    tracingDisabled: false,          // default
  });

  let runResult: any;
  await withTrace('cf-assistant-ui', async () => {
    runResult = await runner.run(agent, message, {
      stream: true,
      previousResponseId: lastResponseId,
    });
  }, {
    traceId: traceId,
    groupId: traceId, // Use same ID for grouping conversations
  });

  /* Create manual SSE stream */
  const stream = new ReadableStream({
    async start(controller) {
      const encoder = new TextEncoder();
      
      try {
        for await (const ev of runResult) {
          // Handle raw model stream events which contain the delta text
          if (ev.type === 'raw_model_stream_event') {
            // Stream out incremental text. Per the SSE format, every line of a
            // payload needs its own "data:" prefix, so split multi-line deltas
            // before enqueueing them.
            if (ev.data && ev.data.type === 'output_text_delta' && ev.data.delta) {
              const textDelta: string = ev.data.delta;
              const payload = textDelta.split('\n').map((line) => `data: ${line}`).join('\n');
              controller.enqueue(encoder.encode(`event: text-delta\n${payload}\n\n`));
            }
            // Handle completion
            else if (ev.data && ev.data.type === 'response_done') {
              controller.enqueue(encoder.encode(`event: lastResponseId\ndata: ${ev.data.response.id}\n\n`));
              controller.enqueue(encoder.encode(`event: trace_id\ndata: ${traceId}\n\n`));
              controller.enqueue(encoder.encode('event: done\ndata: [DONE]\n\n'));
              break;
            }
          }
        }
      } catch (error) {
        console.error('Error processing stream:', error);
        controller.enqueue(encoder.encode(`event: error\ndata: ${String(error)}\n\n`));
      } finally {
        controller.close();
      }
    }
  });

  return new Response(stream, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
      'Connection': 'keep-alive',
      'Access-Control-Allow-Origin': '*',
    }
  });
}

I'd be happy to add the full script if you prefer. The extra tracing code is left over from debugging why tracing stopped working for me.
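As a side note for anyone reading along: the agent's instructions reference search_tool(query) while the tools array is commented out. A rough sketch of what such a tool could look like with the SDK's tool() helper is below; the zod schema, the search endpoint, and the tool body are illustrative assumptions, not part of the worker above. If the dynamic-import pattern from the handler is needed, tool can be pulled from the same await import('@openai/agents') call instead of a top-level import.

/*──────────────  Illustrative search tool (sketch only)  ──────────────*/
import { tool } from '@openai/agents';
import { z } from 'zod';

const searchTool = tool({
  name: 'search_tool',
  description: 'Search the site and return matching snippets.',
  parameters: z.object({ query: z.string() }),
  // Hypothetical backend call; replace with the real search endpoint.
  execute: async ({ query }) => {
    const res = await fetch(
      `https://example.com/search?q=${encodeURIComponent(query)}`
    );
    return await res.text();
  },
});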

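On the consuming side, EventSource cannot send the JSON POST body this handler expects, so the stream has to be read with fetch and parsed by hand. A minimal browser-side sketch for the events emitted above (the event names come from the handler; the /chat route and the #output element are placeholders):

/*──────────────  Illustrative SSE consumer (sketch only)  ─────────────*/
async function chat(message: string, lastResponseId?: string): Promise<string | undefined> {
  const res = await fetch('/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message, lastResponseId }),
  });

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = '';

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // SSE frames are separated by a blank line.
    let idx: number;
    while ((idx = buffer.indexOf('\n\n')) !== -1) {
      const frame = buffer.slice(0, idx);
      buffer = buffer.slice(idx + 2);

      const event = frame.match(/^event: (.*)$/m)?.[1];
      // Join all data: lines so multi-line payloads survive intact.
      const data = frame
        .split('\n')
        .filter((line) => line.startsWith('data: '))
        .map((line) => line.slice(6))
        .join('\n');

      if (event === 'text-delta') {
        const out = document.querySelector('#output')!;
        out.textContent = (out.textContent ?? '') + data;
      } else if (event === 'lastResponseId') {
        lastResponseId = data; // feed into the next request to continue the thread
      }
    }
  }

  return lastResponseId;
}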
sideshot added the question label on Jun 6, 2025

dkundel-openai (Collaborator) commented

Hey @sideshot! We are still improving the Cloudflare Worker setup. @threepointone has made some helpful contributions to it, and #50 should further improve the situation, including the tracing setup.

dkundel-openai (Collaborator) commented

I think there were multiple issues you were running into:

  1. A combined tool call and response would result in the tool not being executed.
  2. Cloudflare Workers support was not ideal.

Both should be fixed now, so I'm going to close this ticket.
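Independently of those SDK fixes, the handler posted above breaks out of the event loop on the first response_done. If a turn ends with a tool call, any events from later turns never get forwarded, so it can be worth draining the whole stream and only emitting the terminal SSE events once the iterator is exhausted. A sketch only, reusing the variable names from that handler (runResult, controller, encoder, traceId) as the body of start(controller), and untested against the fixed SDK:

/*──────  Sketch: drain the whole stream before closing the SSE response  ──────*/
let lastSeenResponseId: string | undefined;

for await (const ev of runResult) {
  if (ev.type !== 'raw_model_stream_event' || !ev.data) continue;

  if (ev.data.type === 'output_text_delta' && ev.data.delta) {
    controller.enqueue(encoder.encode(`event: text-delta\ndata: ${ev.data.delta}\n\n`));
  } else if (ev.data.type === 'response_done') {
    // Remember the id but keep reading: more turns may follow a tool call.
    lastSeenResponseId = ev.data.response.id;
  }
}

// Only once the iterator is exhausted is the run really finished.
if (lastSeenResponseId) {
  controller.enqueue(encoder.encode(`event: lastResponseId\ndata: ${lastSeenResponseId}\n\n`));
}
controller.enqueue(encoder.encode(`event: trace_id\ndata: ${traceId}\n\n`));
controller.enqueue(encoder.encode('event: done\ndata: [DONE]\n\n'));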
