Using OpenZiti in a distributed surveillance system

Thanks immensely for the example; it really does help us help you. I tried it out and I can replicate the exception too:

  File "/home/cd/.local/lib/python3.10/site-packages/openziti/zitilib.py", line 219, in check_error
    raise Exception(err, msg)
Exception: (11, 'unexpected error')

But it returns Hello, world! properly for me:

curl http://predalertui.simple/
Hello, world!

There definitely shouldn't be an error, but what sort of problem is it actually creating for you right now? We'll have to figure out why and where that error is popping up, but in the meantime we can move forward with whatever concrete problem you're hitting.

@papiris Thank you for your reports and code samples.
I am working on making the Ziti service/bind side more robust and usable by various frameworks. I hope to have an update soon.

This, and the (not-so-minimal) code in my project repo, returns static content (HTML, JS, CSS, JSON) properly for me as well, although it seems to take a rather long time to do so when testing in a browser.
The problem for my use case, I believe, lies with fastapi.responses.StreamingResponse. I'll build on the minimal example to try to validate this guess.

Thanks a bunch! This issue is kind of a blocker in my thesis work, so I'm happy to contribute in any way that enables progress on it.

Thanks. It'll definitely help us to have a minimal example.

Back again with a minimal example!

Turns out that Starlette's (and thereby FastAPI's) StreamingResponse is fairly finicky.
With some effort, I finally managed to use StreamingResponse in a normal, simple FastAPI app.
I'm not sure why the video displays sort of worked in my non-zitified webapp, but there's likely some bug in my implementation that borked things once Ziti entered the mix.

Here is a minimal (working!) example of a successfully zitified (uvicorn + FastAPI) webapp using StreamingResponse:

import openziti
import uvicorn
from fastapi import FastAPI
from fastapi.responses import JSONResponse
from fastapi.responses import StreamingResponse
import asyncio


app = FastAPI()


@app.route("/")
async def indexpage(request):
    return JSONResponse({"message": "Hello World"})


async def slow_numbers(minimum, maximum):
    yield '<html><body><ul>'
    for number in range(minimum, maximum + 1):
        yield '<li>%d</li>' % number
        await asyncio.sleep(0.5)
    yield '</ul></body></html>'


@app.route("/stream")
async def stream_func(request):
    generator = slow_numbers(1, 15)

    return StreamingResponse(generator, media_type='text/html')


# bind the Ziti service on the same host:port the server listens on;
# 'ztx' points at the enrolled identity file, 'service' at the Ziti service name
@openziti.zitify(bindings={
    ("127.0.0.1", 8443): {
        'ztx': "./predalert_server1.json", 'service': "predalert_ui"}
    })
def run_webapp():
    # "main:app" assumes this file is saved as main.py
    uvicorn.run(
        "main:app",
        host='127.0.0.1',
        port=8443
        )


if __name__ == "__main__":
    run_webapp()

Visit "/stream" to see the magic :sunglasses:

Next on my todo list is reworking my webapp to actually work when zitified, since this example shows it must be possible.

magical!

This worked for me zitified. Are you saying it still doesn't work for you as shown? EDIT: I see now you meant you still need to rework YOUR app, not this one. I got it :slight_smile: CONGRATS! :slight_smile: (I updated 1,15 to 1,256 so it'd run longer.)

The only thing I still see is this:

Exception in callback BaseSelectorEventLoop._accept_connection(<function Ser...x7f409ab8d360>, <openziti.dec...27.0.0.1', 0)>, None, <Server socke...0.0.1', 0)>,)>, 2048, None)
handle: <Handle BaseSelectorEventLoop._accept_connection(<function Ser...x7f409ab8d360>, <openziti.dec...27.0.0.1', 0)>, None, <Server socke...0.0.1', 0)>,)>, 2048, None)>
Traceback (most recent call last):
  File "/usr/lib/python3.10/asyncio/events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "/usr/lib/python3.10/asyncio/selector_events.py", line 159, in _accept_connection
    conn, addr = sock.accept()
  File "/home/cd/.local/lib/python3.10/site-packages/openziti/zitisock.py", line 119, in accept
    fd, peer = zitilib.accept(self.fileno())
  File "/home/cd/.local/lib/python3.10/site-packages/openziti/zitilib.py", line 276, in accept
    check_error(clt)
  File "/home/cd/.local/lib/python3.10/site-packages/openziti/zitilib.py", line 219, in check_error
    raise Exception(err, msg)
Exception: (11, 'unexpected error')
INFO:      - "GET /stream HTTP/1.1" 200 OK

I had a lot of trouble trying to get proper video/image streams using StreamingResponse, so I went looking for alternatives.
My first thought was WebRTC, but that doesn't seem to be supported in OpenZiti yet. I had a look at WebSockets, but noticed that WS is disabled on the edge router in my NF CloudZiti network, with no simple option to enable it.

The last option was Server-Sent Events (SSE), which turned out to be pretty elegant :dancer:
Here's a minimal, working, zitified example webapp using SSE in two different scenarios: a text/log stream and an image stream:

import openziti
from fastapi import FastAPI, Request
from fastapi.responses import HTMLResponse
import asyncio
from sse_starlette.sse import EventSourceResponse

from hypercorn.config import Config
from hypercorn.asyncio import serve
import signal
import cv2 as cv
import base64

shutdown_event = asyncio.Event()

# for hypercorn
address = "localhost"
port = 8443
video_file = "lambs-and-cat.mp4" # Change to local file
identity = "./predalert_server2.json" # ChangeMe
service_name = "predalert_service_2" # ChangeMe

config = Config()
config.bind = [f"{address}:{port}"]
config.errorlog = "-" #stderr
config.loglevel = "DEBUG"

# Point these variables to  SSL cert/key files for HTTP/2 in hypercorn,
# which is necessary to support more than 6 concurrent SSE connections per domain
# config.certfile = "ssl_cert.pem"
# config.keyfile = "ssl_key.pem"


app = FastAPI()


@app.get("/")
async def indexpage(request: Request):
    html_content = """
    <html>
        <head>
            <style>
                #numbers {
                    background-color: black;
                    color:white;
                    height:600px;
                    overflow-x: hidden;
                    overflow-y: auto;
                    text-align: left;
                    padding-left:10px;
                }
            </style>
        </head>

        <body>

            <h1>Numbers:</h1>
            <div id="numbers">
            </div>
            <script>
                var source = new EventSource("/stream-sse");
                source.onmessage = function(event) {
                    document.getElementById("numbers").innerHTML += event.data + "<br>";
                };
            </script>

        </body>
    </html>
    """
    return HTMLResponse(content=html_content, status_code=200)


async def text_message_generator(request):
    for number in range(1, 256 + 1):
        if await request.is_disconnected():
            print("client disconnected!!!")
            break
        yield number
        await asyncio.sleep(0.1)


@app.get('/stream-sse')
async def text_message_stream(request: Request):
    event_generator = text_message_generator(request)
    return EventSourceResponse(event_generator)


@app.get("/video")
async def videopage(request: Request):
    html_content = """
    <html>
        <head>
            <title>SSE Image Streaming</title>
        </head>

        <body>
            <img id="image" src="#" alt="Streamed Image" >
            <script>
                const imageElement = document.getElementById('image');
                const eventSource = new EventSource('/vid-stream-sse'); // Replace '/stream' with your SSE endpoint

                eventSource.onmessage = function(event) {
                    const imageData = event.data;
                    imageElement.src = 'data:image/webp;base64,' + imageData; // Change datatype if necessary, always ;base64
                };

                eventSource.onerror = function(error) {
                    console.error('EventSource failed:', error);
                    eventSource.close();
                };
            </script>

        </body>
    </html>
    """
    return HTMLResponse(content=html_content, status_code=200)


async def vid_message_generator(request: Request):
    stream = cv.VideoCapture(video_file)
    # loop over frames
    while True:
        if await request.is_disconnected():
            print("client disconnected!!!")
            break
        # read frame from provided source
        (ret, frame) = stream.read()

        # break if video is done
        if not ret:
            break

        # handle WEBP encoding
        _ , image = cv.imencode(".webp", frame)

        # encode webp to base64 string in utf-8 format
        base64_image = base64.b64encode(image).decode("utf-8")

        yield base64_image
        await asyncio.sleep(0.001)


@app.get('/vid-stream-sse')
async def vid_message_stream(request: Request):
    event_generator = vid_message_generator(request)
    return EventSourceResponse(event_generator)


def _signal_handler(*_: any) -> None:
    shutdown_event.set()


@openziti.zitify(bindings={
    (address, port): {
        'ztx': identity, 'service': service_name}
    })
def run_webapp():
    loop = asyncio.get_event_loop()
    loop.add_signal_handler(signal.SIGTERM, _signal_handler)
    loop.run_until_complete(
        serve(app, config, shutdown_trigger=shutdown_event.wait)
    )


if __name__ == "__main__":

    run_webapp()

You'll need these:
pip install openziti hypercorn opencv-python-headless sse_starlette fastapi
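
If you want to try the HTTP/2 route, a throwaway self-signed cert/key pair for local testing can be generated with something like the sketch below (not part of my app; it assumes the third-party cryptography package, and the file names just match the commented config above):

import datetime

from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa

# generate a private key and a 30-day self-signed certificate for "localhost"
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "localhost")])
cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.datetime.utcnow())
    .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=30))
    .sign(key, hashes.SHA256())
)

with open("ssl_key.pem", "wb") as f:
    f.write(key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.TraditionalOpenSSL,
        serialization.NoEncryption(),
    ))
with open("ssl_cert.pem", "wb") as f:
    f.write(cert.public_bytes(serialization.Encoding.PEM))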


Yeah, (11, 'unexpected error') seems persistent across all the implementations I've tried.

Firstly, noice wrt SSE! :slight_smile:

I would think that it depends on one's definitions. I'd expect WebRTC to work just fine through OpenZiti; it just won't work without using the overlay network itself. For example, I just searched up this GitHub - shanet/WebRTC-Example: A dead simple WebRTC example project. I cloned it, ran it, exposed it from my AWS VPC, added an identity for myself and -- I mean -- it seems to work just fine :point_down:

It's just not DIRECTLY peer to peer; it's peer to peer facilitated by OpenZiti... There might be some complexities I don't understand, but it seemed fine to me in the 60 seconds I spent on all this... :slight_smile:

You're making some great progress though! Looking forward to whatever you make.

Audio even worked :slight_smile: I didn't know it captured audio too lol

Cool! Guess I'll try WebRTC as well, to compare its performance with SSE (which seems kiiinda on the slow side).

Mostly nailed down the issue: Ziti-SDK-py doesn't play nice with multiprocessing.Manager.Queue objects (at least the way I use them).

Below is a no-longer-very-minimal example webapp that serves you your own webcam over Ziti if you visit http://yourdomain.ziti/video.
This new example is getting rather close to my actual project, so I'm considering putting the same license on it. The point of doing my project as AGPL was to make sure my efforts toward secure, freedom- and privacy-respecting surveillance software wouldn't be co-opted into proprietary products...
If you'll allow it, I'd like this example to be AGPLv3 :slight_smile:

The software consists of:

  • main process, which creates an mp.Manager.Queue and an mp.Pipe to be shared between the processes, and starts the other processes.
  • frame_producer process, which fetches frames from your webcam and puts them in both the queue and the pipe.
  • web_app process, which tries to consume both the shared queue and the pipe, sending the images to the frontend at http://yourdomain.ziti/video.
  • web_config, a class that eases sharing of variables and objects between routes in web_app.

Put them all in the same directory with an __init__.py, and make sure all dependencies are installed in your venv.

When the program doesn't use ziti, some well-placed print statements give this:

q_img in run_webapp: <queue.Queue object at 0x7f31142f2650>
pipe_img_recv in run_webapp: <multiprocessing.connection.Connection object at 0x7f31197da650>
q_img in frame_producer: <queue.Queue object at 0x7f31142f2650>
pipe_img in frame_producer: <multiprocessing.connection.Connection object at 0x7f311423ac90>

^-- all is well!

Whereas if ziti is used, either with the decorator or the monkeypatch, this is produced:

q_img in run_webapp: <AutoProxy[Queue] object, typeid 'Queue' at 0x7ff6e338da90; '__str__()' failed>
pipe_img_recv in run_webapp: <multiprocessing.connection.Connection object at 0x7ff6e142a510>
q_img in frame_producer: <queue.Queue object at 0x7ff6dc933f10>
pipe_img in frame_producer: <multiprocessing.connection.Connection object at 0x7ff6e0bd4290>

^--- queues broken, so not very nice...


main.py:

import multiprocessing as mp
import time

import webapp
import frame_producer


def main():
    manager = mp.Manager()
    q_img = manager.Queue(maxsize=5)
    pipe_img_recv, pipe_img_send = mp.Pipe(duplex=False)

    webapp_process = mp.Process(
        target=webapp.run_webapp,
        args=(q_img,
              pipe_img_recv
              ),
        daemon=True
        )
    webapp_process.start()
    video_process = mp.Process(
        target=frame_producer.produce,
        args=(q_img,
              pipe_img_send
              ),
        daemon=True
        )
    video_process.start()


    # NOTE: this loop keeps the daemon children alive; the joins below are
    # never reached as written
    while True:
        time.sleep(1)
    webapp_process.join()
    video_process.join()


if __name__ == "__main__":
    main()

frame_producer.py:

import cv2 as cv
import base64
import queue
import time
from multiprocessing import Queue, Pipe


def produce(q_img: Queue, pipe_img_send: Pipe):
    stream = cv.VideoCapture(0)
    print(f"q_img in frame_producer: {q_img}")
    print(f"pipe_img in frame_producer: {pipe_img_send}")

    while True:
        (ret, frame) = stream.read()

        # break if video is done
        if not ret:
            break

        _, image = cv.imencode(".webp", frame, params=[cv.IMWRITE_WEBP_QUALITY, 5])

        # encode the webp image to a base64 string in utf-8 format
        base64_image = base64.b64encode(image).decode("utf-8")

        try:
            q_img.put(base64_image, timeout=1)
        except queue.Full:
            print("q_img full")
        except Exception as e:
            print(f"frameproducer queue: {e}")

        try:
            pipe_img_send.send(base64_image)
        except Exception as e:
            print(f"frameproducer pipe: {e}")
        time.sleep(1)

webapp.py:

import openziti
from fastapi import FastAPI, Request
from fastapi.responses import HTMLResponse
import asyncio
from sse_starlette.sse import EventSourceResponse

from hypercorn.config import Config
from hypercorn.asyncio import serve
import signal
import sys

import webconfig

shutdown_event = asyncio.Event()

# for hypercorn
address = "localhost"
port = 8443
identity = "./predalert_server2.json"
service_name = "predalert_service_2"
config = Config()
config.bind = [f"{address}:{port}"]
config.errorlog = "-" #stderr
config.loglevel = "DEBUG"

# Point these variables to  SSL cert/key files for HTTP/2 in hypercorn,
# which is necessary to support more than 6 concurrent SSE connections per domain
# config.certfile = "ssl_cert.pem"
# config.keyfile = "ssl_key.pem"

web_config = webconfig.webConfigHolder()


app = FastAPI()


@app.get("/")
async def indexpage(request: Request):
    html_content = """
    <html>
        <head>
            <style>
                #numbers {
                    background-color: black;
                    color:white;
                    height:600px;
                    overflow-x: hidden;
                    overflow-y: auto;
                    text-align: left;
                    padding-left:10px;
                }
            </style>
        </head>

        <body>

            <h1>Numbers:</h1>
            <div id="numbers">
            </div>
            <script>
                var source = new EventSource("/stream-sse");
                source.onmessage = function(event) {
                    document.getElementById("numbers").innerHTML += event.data + "<br>";
                };
            </script>

        </body>
    </html>
    """
    return HTMLResponse(content=html_content, status_code=200)


async def text_message_generator(request):
    for number in range(1, 256 + 1):
        if await request.is_disconnected():
            print("client disconnected!!!")
            break
        yield number
        await asyncio.sleep(0.1)


@app.get('/stream-sse')
async def text_message_stream(request: Request):
    event_generator = text_message_generator(request)
    return EventSourceResponse(event_generator)


@app.get("/video")
async def videopage(request: Request):
    html_content = """
    <html>
        <head>
            <title>SSE Image Streaming</title>
        </head>

        <body>
            <img id="image" src="#" alt="Streamed Image" >
            <script>
                const imageElement = document.getElementById('image');
                const eventSource = new EventSource('/vid-stream-sse'); // Replace '/stream' with your SSE endpoint

                eventSource.onmessage = function(event) {
                    const imageData = event.data;
                    imageElement.src = 'data:image/webp;base64,' + imageData; // Change datatype if necessary, always ;base64
                };

                eventSource.onerror = function(error) {
                    console.error('EventSource failed:', error);
                    eventSource.close();
                };
            </script>

        </body>
    </html>
    """
    return HTMLResponse(content=html_content, status_code=200)


async def vid_message_generator(request: Request):
    q_img = web_config.read_key("q_img")
    pipe_img_recv = web_config.read_key("pipe_img_recv")

    # loop over frames
    while True:
        if await request.is_disconnected():
            print("client disconnected!!!")
            break

        # retrieve image from frame_producer_process
        try:
            base64_image = q_img.get()
        except Exception as e:
            print(f"couldn't get img from q due to: {e}")
        else:
            yield base64_image

        try:
            base64_image = pipe_img_recv.recv()
        except Exception as e:
            print(f"couldn't get img from pipe due to: {e}")
        else:
            yield base64_image

        await asyncio.sleep(0.001)


@app.get('/vid-stream-sse')
async def vid_message_stream(request: Request):
    event_generator = vid_message_generator(request)
    return EventSourceResponse(event_generator)


def _signal_handler(*_: any) -> None:
    shutdown_event.set()


@openziti.zitify(bindings={
    (address, port): {
        'ztx': identity, 'service': service_name}
    })
def run_webapp(q_img, pipe_img_recv):

    web_config.modify_key("q_img", q_img)
    print(f"q_img in run_webapp: {q_img}")

    web_config.modify_key("pipe_img_recv", pipe_img_recv)
    print(f"pipe_img_recv in run_webapp: {pipe_img_recv}")

    loop = asyncio.get_event_loop()
    loop.add_signal_handler(signal.SIGTERM, _signal_handler)
    loop.run_until_complete(
        serve(app, config, shutdown_trigger=shutdown_event.wait)
    )


if __name__ == "__main__":
    # note: run_webapp() needs q_img and pipe_img_recv, so this module is
    # normally started from main.py rather than run directly
    run_webapp()

webconfig.py:

# This file is part of Predalert
# Copyright (c) 2023-2024 Jacob Dybvald Ludvigsen (contributions@ingeniorskap.no)
# SPDX-License-Identifier: AGPL-3.0-or-later


class webConfigHolder:
    """
    Holds a dict of objects and variables that will be available to all
    functions in web_ui
    """

    def __init__(self):
        self.web_config = {}

    def update_config(self, input_dict):
        self.web_config.update(input_dict)

    def modify_key(self, key, value):
        self.web_config[key] = value

    def read_key(self, key):
        return self.web_config.get(key)

With this example you'll get a number of "couldn't get img from q due to: [Errno 107] Transport endpoint is not connected" messages due to the queue mismatch, and the image queue will fill up completely in frame_producer.

You'll still be able to see your webcam on the website, since the Pipe works. I hope we can figure out a way to make the queues work as well :slight_smile:
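
One idea I haven't verified yet: since the plain Pipe survives the monkeypatch, a plain multiprocessing.Queue (which is built on a pipe plus a feeder thread rather than on a Manager proxy socket) might dodge the problem. A rough, untested sketch of that swap in main.py, assuming nothing else changes:

import multiprocessing as mp
import time

import webapp
import frame_producer


def main():
    # plain mp.Queue instead of manager.Queue(): no Manager proxy socket involved
    q_img = mp.Queue(maxsize=5)
    pipe_img_recv, pipe_img_send = mp.Pipe(duplex=False)

    webapp_process = mp.Process(
        target=webapp.run_webapp, args=(q_img, pipe_img_recv), daemon=True)
    webapp_process.start()

    video_process = mp.Process(
        target=frame_producer.produce, args=(q_img, pipe_img_send), daemon=True)
    video_process.start()

    # keep the daemon children alive
    while True:
        time.sleep(1)


if __name__ == "__main__":
    main()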

Is it easy enough to turn all this into another repo that we could clone/run/debug? You can use any license you like; AGPLv3 is fine with me. I do think having the minimal example in a repo somewhere will not only make it easier for me to check out and run, but also, if changes happen, we could pick them up more easily than by copy/pasting from Discourse. :slight_smile:

Would you mind pushing it into a repo somewhere? I would be happy to then add instructions for getting it all running locally using ziti CLI commands, which would help me try it out. I'd like you to put the repo up yourself, though, so you can add a license to it and 'own' it.

Sure, made a repo here :slight_smile: https://gitlab.com/papiris/ziti-streaming-webapp

I tried it out and followed the README but ended up with:

ModuleNotFoundError: No module named 'tomllib'

I'll give it another go later. Maybe I missed a step.

Which Python version do you have installed?
I've got Python 3.11.8, and tomllib is a new built-in library in Python 3.11.
I guess the minimum Python version for the example (and my webapp) needs to be bumped to 3.11...
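
If 3.10 support ends up mattering, one option would be to fall back to the third-party tomli package that tomllib was based on. A small sketch, not what the repo currently does:

try:
    import tomllib  # built into Python 3.11+
except ModuleNotFoundError:
    import tomli as tomllib  # pip install tomli; same API on 3.10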

Thanks... 3.10 is what Ubuntu gave me. I used apt to install 3.11; it gives 3.11.0rc1, but that gets past the toml issue. I'm no Python expert, I just write it from time to time. What's your preferred method of installing 3.11? I got the service up and running and I'm able to reproduce the error.

Here's how you can use the ziti CLI to reproduce:

# get the latest ziti cli and put it on your path:
source /dev/stdin <<< "$(wget -qO- https://get.openziti.io/ziti-cli-functions.sh)"; getZiti yes

# start a locally running ziti instance
ziti edge quickstart --home /tmp/persistence-dir

# start a new console, run the getZiti command or put ziti on your path however you want
# then make a directory for two identities
mkdir -p /tmp/persistence-dir/identities

ziti edge create identity server -o /tmp/persistence-dir/identities/server.jwt
ziti edge enroll /tmp/persistence-dir/identities/server.jwt

ziti edge create identity client -o /tmp/persistence-dir/identities/client.jwt
ziti edge create config pred.int intercept.v1 '{"addresses":["pred.ziti"],"portRanges":[{"high":80,"low":80}],"protocols":["udp","tcp"]}'
ziti edge create service pred --configs pred.int 
ziti edge create service-policy pred.bind Bind --service-roles @pred --identity-roles @server
ziti edge create service-policy pred.dial Dial --service-roles @pred --identity-roles @client

# update the toml:
[ziti]
identity = "/tmp/persistence-dir/identities/server.json" # relative path to the .json identity file
service = "pred" # name of ziti service to bind to


# start the app
python -m ziti_streaming_webapp

# add the identity to your tunneler of choice then go to http://pred.ziti/
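
Side note: if you'd rather not add the identity to a tunneler just for a quick check, something like the sketch below should also work, dialing the service from a zitified Python client. It assumes you also enroll the client identity to a .json file the same way as the server one, and that I'm remembering the SDK's load()/monkeypatch() client-side calls correctly:

import urllib.request

import openziti

# load the enrolled client identity, then monkeypatch the socket layer so the
# plain urllib call below is dialed over the Ziti service instead of the underlay
openziti.load("/tmp/persistence-dir/identities/client.json")
openziti.monkeypatch()

print(urllib.request.urlopen("http://pred.ziti/").read().decode())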

@ekoby, I set ZITI_LOG=9 and reproduced. I don't see anything in the logs, but maybe you'll see something. I DM'ed the full logs to you since Discourse only allows 32k chars; around 7s is when I made it to the browser and went to /video...

(343135)[        7.482]   TRACE ziti-sdk:channel.c:838 on_channel_data() ch[0] on_data [len=325]
(343135)[        7.482]   TRACE ziti-sdk:channel.c:525 process_inbound() ch[0] <= ct[ED73] seq[2] len[36] hdrs[269]
(343135)[        7.482]   TRACE ziti-sdk:channel.c:535 process_inbound() ch[0] completing msg seq[2] body+hrds=36+269, in_offset=0, want=305, got=305
(343135)[        7.482]   TRACE ziti-sdk:channel.c:546 process_inbound() ch[0] message is complete seq[2] ct[ED73]
(343135)[        7.482]   DEBUG ziti-sdk:bind.c:343 on_message() received msg ct[ed73] code[0] from quickstart-router
(343135)[        7.482]   DEBUG ziti-sdk:zitilib.c:859 on_ziti_client() incoming client[client] for service[pred]/fd[23]
(343135)[        7.482]   DEBUG ziti-sdk:zitilib.c:890 on_ziti_client() server[23] no active accept: putting connection in backlog and sending notify
(343135)[        7.483]   DEBUG ziti-sdk:zitilib.c:1046 Ziti_accept() fd[23] waiting for future[0x3626500]
(343135)[        7.483]   DEBUG ziti-sdk:zitilib.c:1011 do_ziti_accept() server[23]: pending connection[client] for service[pred]
(343135)[        7.483]   DEBUG ziti-sdk:channel.c:212 ziti_channel_add_receiver() ch[0] added receiver[1]
(343135)[        7.483]   TRACE ziti-sdk:connect.c:1078 ziti_accept() conn[0.1/Accepting] ch[0] => Edge Accept parent_conn_id[0]
(343135)[        7.483]   TRACE ziti-sdk:channel.c:400 ziti_channel_send_for_reply() ch[0] => ct[ED74] seq[2] len[4]
(343135)[        7.483]   TRACE ziti-sdk:channel.c:330 on_channel_send() ch[0] write delay = 0.000d q=1 qs=60
(343135)[        7.483]   TRACE ziti-sdk:channel.c:838 on_channel_data() ch[0] on_data [len=56]
(343135)[        7.483]   TRACE ziti-sdk:channel.c:525 process_inbound() ch[0] <= ct[ED70] seq[3] len[0] hdrs[36]
(343135)[        7.483]   TRACE ziti-sdk:channel.c:535 process_inbound() ch[0] completing msg seq[3] body+hrds=0+36, in_offset=0, want=36, got=36
(343135)[        7.483]   TRACE ziti-sdk:channel.c:546 process_inbound() ch[0] message is complete seq[3] ct[ED70]
(343135)[        7.483]   TRACE ziti-sdk:connect.c:937 connect_reply_cb() conn[0.1/Accepting] accepted
(343135)[        7.483] VERBOSE ziti-sdk:connect.c:79 conn_set_state() conn[0.1/Accepting] transitioning Accepting => Connected
(343135)[        7.483] VERBOSE ziti-sdk:zitilib.c:492 connect_socket() connecting client socket[25]
(343135)[        7.483] VERBOSE ziti-sdk:zitilib.c:514 connect_socket() connected client socket[25] <-> ziti_fd[29]
(343135)[        7.483]    INFO ziti-sdk:zitilib.c:834 on_ziti_accept() bridging socket for fd[25]
(343135)[        7.483]   DEBUG ziti-sdk:conn_bridge.c:95 ziti_conn_bridge() br[0.1] connected
(343135)[        7.483]   DEBUG ziti-sdk:zitilib.c:846 on_ziti_accept() completing accept future[0x3626500] with fd[25]
(343135)[        7.483]   DEBUG ziti-sdk:zitilib.c:1049 Ziti_accept() fd[23] future[0x3626500] completed err = 0
(343135)[        7.483]   DEBUG ziti-sdk:zitilib.c:1057 Ziti_accept() fd[23] future[0x3626500] completed with caller client
(343135)[        7.483]   TRACE ziti-sdk:connect.c:734 flush_connection() conn[0.1/Connected] starting flusher
(343135)[        7.483]   DEBUG ziti-sdk:zitilib.c:1067 Ziti_accept() fd[23] future[0x3626500] returning clt[25]
(343135)[        7.483]   TRACE ziti-sdk:channel.c:330 on_channel_send() ch[0] write delay = 0.000d q=1 qs=68
(343135)[        7.483]   TRACE ziti-sdk:connect.c:215 on_write_completed() conn[0.1/Connected] status 0
(343135)[        7.484] VERBOSE ziti-sdk:connect.c:777 flush_to_client() conn[0.1/Connected] 0 bytes available
(343135)[        7.484]   TRACE ziti-sdk:connect.c:764 flush_to_service() conn[0.1/Connected] flushed 0 messages
(343135)[        7.484]   TRACE ziti-sdk:connect.c:727 on_flush() conn[0.1/Connected] stopping flusher
(343135)[        7.484]   DEBUG ziti-sdk:zitilib.c:1046 Ziti_accept() fd[23] waiting for future[0x3626cc0]
(343135)[        7.484]   DEBUG ziti-sdk:zitilib.c:1031 do_ziti_accept() fd[23] is_blocking[0]
(343135)[        7.484]   DEBUG ziti-sdk:zitilib.c:1033 do_ziti_accept() no pending connections for server fd[23]
(343135)[        7.484]   DEBUG ziti-sdk:zitilib.c:1049 Ziti_accept() fd[23] future[0x3626cc0] completed err = 11
(343135)[        7.484]   DEBUG ziti-sdk:zitilib.c:1067 Ziti_accept() fd[23] future[0x3626cc0] returning clt[-1]
(343135)[        7.484]   TRACE ziti-sdk:channel.c:838 on_channel_data() ch[0] on_data [len=92]
(343135)[        7.484]   TRACE ziti-sdk:channel.c:525 process_inbound() ch[0] <= ct[ED72] seq[4] len[24] hdrs[48]
(343135)[        7.484]   TRACE ziti-sdk:channel.c:535 process_inbound() ch[0] completing msg seq[4] body+hrds=24+48, in_offset=0, want=72, got=72
(343135)[        7.484]   TRACE ziti-sdk:channel.c:546 process_inbound() ch[0] message is complete seq[4] ct[ED72]
(343135)[        7.484]   TRACE ziti-sdk:connect.c:734 flush_connection() conn[0.1/Connected] starting flusher
(343135)[        7.484]   TRACE ziti-sdk:connect.c:1264 process_edge_message() conn[0.1/Connected] <= ct[ED72] edge_seq[1] body[24]
(343135)[        7.484] VERBOSE ziti-sdk:connect.c:825 conn_inbound_data_msg() conn[0.1/Connected] processing crypto header(24 bytes)
(343135)[        7.484] VERBOSE ziti-sdk:connect.c:828 conn_inbound_data_msg() conn[0.1/Connected] processed crypto header
(343135)[        7.484] VERBOSE ziti-sdk:connect.c:777 flush_to_client() conn[0.1/Connected] 0 bytes available
(343135)[        7.484]   TRACE ziti-sdk:connect.c:764 flush_to_service() conn[0.1/Connected] flushed 0 messages
(343135)[        7.484]   TRACE ziti-sdk:connect.c:727 on_flush() conn[0.1/Connected] stopping flusher
Exception in callback BaseSelectorEventLoop._accept_connection(<function sta...x7fda4bf7b740>, <openziti.dec...27.0.0.1', 0)>, None, <Server socke...0.0.1', 0)>,)>, 100, None, None)
handle: <Handle BaseSelectorEventLoop._accept_connection(<function sta...x7fda4bf7b740>, <openziti.dec...27.0.0.1', 0)>, None, <Server socke...0.0.1', 0)>,)>, 100, None, None)>
Traceback (most recent call last):
  File "/usr/lib/python3.11/asyncio/events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "/usr/lib/python3.11/asyncio/selector_events.py", line 165, in _accept_connection
    conn, addr = sock.accept()
                 ^^^^^^^^^^^^^
  File "/mnt/wsl/dev/git/github/external/papris/ziti-streaming-webapp/.env/lib/python3.11/site-packages/openziti/zitisock.py", line 119, in accept
    fd, peer = zitilib.accept(self.fileno())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mnt/wsl/dev/git/github/external/papris/ziti-streaming-webapp/.env/lib/python3.11/site-packages/openziti/zitilib.py", line 276, in accept
    check_error(clt)
  File "/mnt/wsl/dev/git/github/external/papris/ziti-streaming-webapp/.env/lib/python3.11/site-packages/openziti/zitilib.py", line 219, in check_error
    raise Exception(err, msg)
Exception: (11, 'unexpected error')
(343135)[        7.533]   TRACE ziti-sdk:channel.c:838 on_channel_data() ch[0] on_data [len=487]
(343135)[        7.533]   TRACE ziti-sdk:channel.c:525 process_inbound() ch[0] <= ct[ED72] seq[5] len[419] hdrs[48]
(343135)[        7.533]   TRACE ziti-sdk:channel.c:535 process_inbound() ch[0] completing msg seq[5] body+hrds=419+48, in_offset=0, want=467, got=467
(343135)[        7.533]   TRACE ziti-sdk:channel.c:546 process_inbound() ch[0] message is complete seq[5] ct[ED72]
(343135)[        7.533]   TRACE ziti-sdk:connect.c:734 flush_connection() conn[0.1/Connected] starting flusher
(343135)[        7.533]   TRACE ziti-sdk:connect.c:1264 process_edge_message() conn[0.1/Connected] <= ct[ED72] edge_seq[2] body[419]
(343135)[        7.533] VERBOSE ziti-sdk:connect.c:835 conn_inbound_data_msg() conn[0.1/Connected] decrypting 419 bytes
(343135)[        7.533] VERBOSE ziti-sdk:connect.c:839 conn_inbound_data_msg() conn[0.1/Connected] decrypted 402 bytes
(343135)[        7.533] VERBOSE ziti-sdk:connect.c:777 flush_to_client() conn[0.1/Connected] 402 bytes available
(343135)[        7.533]   TRACE ziti-sdk:conn_bridge.c:264 on_ziti_data() br[0.1] received 402 bytes from ziti
(343135)[        7.533]   TRACE ziti-sdk:connect.c:783 flush_to_client() conn[0.1/Connected] client consumed 402 out of 402 bytes
(343135)[        7.533]   TRACE ziti-sdk:connect.c:764 flush_to_service() conn[0.1/Connected] flushed 0 messages
(343135)[        7.533]   TRACE ziti-sdk:connect.c:727 on_flush() conn[0.1/Connected] stopping flusher
(343135)[        7.536]   TRACE ziti-sdk:conn_bridge.c:313 bridge_alloc() br[0.1] alloc live
(343135)[        7.536]   TRACE ziti-sdk:connect.c:1132 ziti_write() conn[0.1/Connected] write 1002 bytes
(343135)[        7.536]   TRACE ziti-sdk:connect.c:734 flush_connection() conn[0.1/Connected] starting flusher
(343135)[        7.536] VERBOSE ziti-sdk:connect.c:777 flush_to_client() conn[0.1/Connected] 0 bytes available
(343135)[        7.536]   TRACE ziti-sdk:connect.c:764 flush_to_service() conn[0.1/Connected] flushed 1 messages
(343135)[        7.536]   TRACE ziti-sdk:connect.c:727 on_flush() conn[0.1/Connected] stopping flusher
(343135)[        7.536]   TRACE ziti-sdk:channel.c:330 on_channel_send() ch[0] write delay = 0.000d q=1 qs=1063
(343135)[        7.536]   TRACE ziti-sdk:connect.c:215 on_write_completed() conn[0.1/Connected] status 0
(343135)[        7.564]   TRACE ziti-sdk:channel.c:838 on_channel_data() ch[0] on_data [len=458]
(343135)[        7.564]   TRACE ziti-sdk:channel.c:525 process_inbound() ch[0] <= ct[ED72] seq[6] len[390] hdrs[48]
(343135)[        7.564]   TRACE ziti-sdk:channel.c:535 process_inbound() ch[0] completing msg seq[6] body+hrds=390+48, in_offset=0, want=438, got=438
(343135)[        7.564]   TRACE ziti-sdk:channel.c:546 process_inbound() ch[0] message is complete seq[6] ct[ED72]
(343135)[        7.564]   TRACE ziti-sdk:connect.c:734 flush_connection() conn[0.1/Connected] starting flusher
(343135)[        7.564]   TRACE ziti-sdk:connect.c:1264 process_edge_message() conn[0.1/Connected] <= ct[ED72] edge_seq[3] body[390]
(343135)[        7.564] VERBOSE ziti-sdk:connect.c:835 conn_inbound_data_msg() conn[0.1/Connected] decrypting 390 bytes
(343135)[        7.564] VERBOSE ziti-sdk:connect.c:839 conn_inbound_data_msg() conn[0.1/Connected] decrypted 373 bytes
(343135)[        7.564] VERBOSE ziti-sdk:connect.c:777 flush_to_client() conn[0.1/Connected] 373 bytes available
(343135)[        7.564]   TRACE ziti-sdk:conn_bridge.c:264 on_ziti_data() br[0.1] received 373 bytes from ziti
(343135)[        7.564]   TRACE ziti-sdk:connect.c:783 flush_to_client() conn[0.1/Connected] client consumed 373 out of 373 bytes
(343135)[        7.564]   TRACE ziti-sdk:connect.c:764 flush_to_service() conn[0.1/Connected] flushed 0 messages
(343135)[        7.564]   TRACE ziti-sdk:connect.c:727 on_flush() conn[0.1/Connected] stopping flusher
(343135)[        7.565]   TRACE ziti-sdk:conn_bridge.c:313 bridge_alloc() br[0.1] alloc live
(343135)[        7.565]   TRACE ziti-sdk:connect.c:1132 ziti_write() conn[0.1/Connected] write 1002 bytes
(343135)[        7.565]   TRACE ziti-sdk:connect.c:734 flush_connection() conn[0.1/Connected] starting flusher
(343135)[        7.565] VERBOSE ziti-sdk:connect.c:777 flush_to_client() conn[0.1/Connected] 0 bytes available
(343135)[        7.565]   TRACE ziti-sdk:connect.c:764 flush_to_service() conn[0.1/Connected] flushed 1 messages
(343135)[        7.565]   TRACE ziti-sdk:connect.c:727 on_flush() conn[0.1/Connected] stopping flusher
(343135)[        7.565]   TRACE ziti-sdk:channel.c:330 on_channel_send() ch[0] write delay = 0.000d q=1 qs=1063
(343135)[        7.565]   TRACE ziti-sdk:connect.c:215 on_write_completed() conn[0.1/Connected] status 0
(343135)[        7.566]   TRACE ziti-sdk:channel.c:838 on_channel_data() ch[0] on_data [len=325]
(343135)[        7.566]   TRACE ziti-sdk:channel.c:525 process_inbound() ch[0] <= ct[ED73] seq[7] len[36] hdrs[269]
(343135)[        7.566]   TRACE ziti-sdk:channel.c:535 process_inbound() ch[0] completing msg seq[7] body+hrds=36+269, in_offset=0, want=305, got=305
(343135)[        7.566]   TRACE ziti-sdk:channel.c:546 process_inbound() ch[0] message is complete seq[7] ct[ED73]
(343135)[        7.566]   DEBUG ziti-sdk:bind.c:343 on_message() received msg ct[ed73] code[0] from quickstart-router
(343135)[        7.566]   DEBUG ziti-sdk:zitilib.c:859 on_ziti_client() incoming client[client] for service[pred]/fd[23]
(343135)[        7.566]   DEBUG ziti-sdk:zitilib.c:890 on_ziti_client() server[23] no active accept: putting connection in backlog and sending notify
(343135)[        7.567]   DEBUG ziti-sdk:zitilib.c:1011 do_ziti_accept() server[23]: pending connection[client] for service[pred]
(343135)[        7.567]   DEBUG ziti-sdk:channel.c:212 ziti_channel_add_receiver() ch[0] added receiver[2]
(343135)[        7.567]   TRACE ziti-sdk:connect.c:1078 ziti_accept() conn[0.2/Accepting] ch[0] => Edge Accept parent_conn_id[0]
(343135)[        7.567]   DEBUG ziti-sdk:zitilib.c:1046 Ziti_accept() fd[23] waiting for future[0x3627280]
(343135)[        7.567]   TRACE ziti-sdk:channel.c:400 ziti_channel_send_for_reply() ch[0] => ct[ED74] seq[3] len[4]
(343135)[        7.567]   TRACE ziti-sdk:channel.c:330 on_channel_send() ch[0] write delay = 0.000d q=1 qs=60
(343135)[        7.567]   TRACE ziti-sdk:channel.c:838 on_channel_data() ch[0] on_data [len=56]
(343135)[        7.567]   TRACE ziti-sdk:channel.c:525 process_inbound() ch[0] <= ct[ED70] seq[8] len[0] hdrs[36]
(343135)[        7.567]   TRACE ziti-sdk:channel.c:535 process_inbound() ch[0] completing msg seq[8] body+hrds=0+36, in_offset=0, want=36, got=36
(343135)[        7.567]   TRACE ziti-sdk:channel.c:546 process_inbound() ch[0] message is complete seq[8] ct[ED70]
(343135)[        7.567]   TRACE ziti-sdk:connect.c:937 connect_reply_cb() conn[0.2/Accepting] accepted
(343135)[        7.567] VERBOSE ziti-sdk:connect.c:79 conn_set_state() conn[0.2/Accepting] transitioning Accepting => Connected
(343135)[        7.567] VERBOSE ziti-sdk:zitilib.c:492 connect_socket() connecting client socket[27]
(343135)[        7.567] VERBOSE ziti-sdk:zitilib.c:514 connect_socket() connected client socket[27] <-> ziti_fd[31]
(343135)[        7.567]    INFO ziti-sdk:zitilib.c:834 on_ziti_accept() bridging socket for fd[27]
(343135)[        7.567]   DEBUG ziti-sdk:conn_bridge.c:95 ziti_conn_bridge() br[0.2] connected
(343135)[        7.567]   DEBUG ziti-sdk:zitilib.c:846 on_ziti_accept() completing accept future[0x3627280] with fd[27]
(343135)[        7.567]   TRACE ziti-sdk:connect.c:734 flush_connection() conn[0.2/Connected] starting flusher
(343135)[        7.567]   DEBUG ziti-sdk:zitilib.c:1049 Ziti_accept() fd[23] future[0x3627280] completed err = 0
(343135)[        7.567]   TRACE ziti-sdk:channel.c:330 on_channel_send() ch[0] write delay = 0.000d q=1 qs=68
(343135)[        7.567]   TRACE ziti-sdk:connect.c:215 on_write_completed() conn[0.2/Connected] status 0
(343135)[        7.567] VERBOSE ziti-sdk:connect.c:777 flush_to_client() conn[0.2/Connected] 0 bytes available
(343135)[        7.567]   TRACE ziti-sdk:connect.c:764 flush_to_service() conn[0.2/Connected] flushed 0 messages
(343135)[        7.567]   TRACE ziti-sdk:connect.c:727 on_flush() conn[0.2/Connected] stopping flusher
(343135)[        7.567]   DEBUG ziti-sdk:zitilib.c:1057 Ziti_accept() fd[23] future[0x3627280] completed with caller client
(343135)[        7.567]   DEBUG ziti-sdk:zitilib.c:1067 Ziti_accept() fd[23] future[0x3627280] returning clt[27]
(343135)[        7.567]   DEBUG ziti-sdk:zitilib.c:1046 Ziti_accept() fd[23] waiting for future[0x3627c40]
(343135)[        7.567]   DEBUG ziti-sdk:zitilib.c:1031 do_ziti_accept() fd[23] is_blocking[0]
(343135)[        7.567]   DEBUG ziti-sdk:zitilib.c:1033 do_ziti_accept() no pending connections for server fd[23]
(343135)[        7.567]   DEBUG ziti-sdk:zitilib.c:1049 Ziti_accept() fd[23] future[0x3627c40] completed err = 11
(343135)[        7.567]   DEBUG ziti-sdk:zitilib.c:1067 Ziti_accept() fd[23] future[0x3627c40] returning clt[-1]
(343135)[        7.568]   TRACE ziti-sdk:channel.c:838 on_channel_data() ch[0] on_data [len=445]
(343135)[        7.568]   TRACE ziti-sdk:channel.c:525 process_inbound() ch[0] <= ct[ED72] seq[9] len[377] hdrs[48]
(343135)[        7.568]   TRACE ziti-sdk:channel.c:535 process_inbound() ch[0] completing msg seq[9] body+hrds=377+48, in_offset=0, want=425, got=425
(343135)[        7.568]   TRACE ziti-sdk:channel.c:546 process_inbound() ch[0] message is complete seq[9] ct[ED72]
(343135)[        7.568]   TRACE ziti-sdk:connect.c:734 flush_connection() conn[0.1/Connected] starting flusher
(343135)[        7.568]   TRACE ziti-sdk:connect.c:1264 process_edge_message() conn[0.1/Connected] <= ct[ED72] edge_seq[4] body[377]
(343135)[        7.568] VERBOSE ziti-sdk:connect.c:835 conn_inbound_data_msg() conn[0.1/Connected] decrypting 377 bytes
(343135)[        7.568] VERBOSE ziti-sdk:connect.c:839 conn_inbound_data_msg() conn[0.1/Connected] decrypted 360 bytes
(343135)[        7.568] VERBOSE ziti-sdk:connect.c:777 flush_to_client() conn[0.1/Connected] 360 bytes available
(343135)[        7.568]   TRACE ziti-sdk:conn_bridge.c:264 on_ziti_data() br[0.1] received 360 bytes from ziti
(343135)[        7.568]   TRACE ziti-sdk:connect.c:783 flush_to_client() conn[0.1/Connected] client consumed 360 out of 360 bytes
(343135)[        7.568]   TRACE ziti-sdk:connect.c:764 flush_to_service() conn[0.1/Connected] flushed 0 messages
(343135)[        7.568]   TRACE ziti-sdk:connect.c:727 on_flush() conn[0.1/Connected] stopping flusher
Exception in callback BaseSelectorEventLoop._accept_connection(<function sta...x7fda4bf7b740>, <openziti.dec...27.0.0.1', 0)>, None, <Server socke...0.0.1', 0)>,)>, 100, None, None)
handle: <Handle BaseSelectorEventLoop._accept_connection(<function sta...x7fda4bf7b740>, <openziti.dec...27.0.0.1', 0)>, None, <Server socke...0.0.1', 0)>,)>, 100, None, None)>
Traceback (most recent call last):
  File "/usr/lib/python3.11/asyncio/events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "/usr/lib/python3.11/asyncio/selector_events.py", line 165, in _accept_connection
    conn, addr = sock.accept()
                 ^^^^^^^^^^^^^
  File "/mnt/wsl/dev/git/github/external/papris/ziti-streaming-webapp/.env/lib/python3.11/site-packages/openziti/zitisock.py", line 119, in accept
    fd, peer = zitilib.accept(self.fileno())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mnt/wsl/dev/git/github/external/papris/ziti-streaming-webapp/.env/lib/python3.11/site-packages/openziti/zitilib.py", line 276, in accept
    check_error(clt)
  File "/mnt/wsl/dev/git/github/external/papris/ziti-streaming-webapp/.env/lib/python3.11/site-packages/openziti/zitilib.py", line 219, in check_error
    raise Exception(err, msg)
Exception: (11, 'unexpected error')
(343135)[        7.613]   TRACE ziti-sdk:channel.c:838 on_channel_data() ch[0] on_data [len=92]
(343135)[        7.613]   TRACE ziti-sdk:channel.c:525 process_inbound() ch[0] <= ct[ED72] seq[10] len[24] hdrs[48]
(343135)[        7.613]   TRACE ziti-sdk:channel.c:535 process_inbound() ch[0] completing msg seq[10] body+hrds=24+48, in_offset=0, want=72, got=72
(343135)[        7.613]   TRACE ziti-sdk:channel.c:546 process_inbound() ch[0] message is complete seq[10] ct[ED72]
(343135)[        7.613]   TRACE ziti-sdk:connect.c:734 flush_connection() conn[0.2/Connected] starting flusher
(343135)[        7.613]   TRACE ziti-sdk:connect.c:1264 process_edge_message() conn[0.2/Connected] <= ct[ED72] edge_seq[1] body[24]
(343135)[        7.613] VERBOSE ziti-sdk:connect.c:825 conn_inbound_data_msg() conn[0.2/Connected] processing crypto header(24 bytes)
(343135)[        7.613] VERBOSE ziti-sdk:connect.c:828 conn_inbound_data_msg() conn[0.2/Connected] processed crypto header
(343135)[        7.613] VERBOSE ziti-sdk:connect.c:777 flush_to_client() conn[0.2/Connected] 0 bytes available
(343135)[        7.613]   TRACE ziti-sdk:connect.c:764 flush_to_service() conn[0.2/Connected] flushed 0 messages
(343135)[        7.613]   TRACE ziti-sdk:connect.c:727 on_flush() conn[0.2/Connected] stopping flusher
^CTraceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
Process Process-3:
  File "/mnt/wsl/dev/git/github/external/papris/ziti-streaming-webapp/ziti_streaming_webapp/__main__.py", line 45, in <module>
    main()
  File "/mnt/wsl/dev/git/github/external/papris/ziti-streaming-webapp/ziti_streaming_webapp/__main__.py", line 39, in main
    time.sleep(1)
KeyboardInterrupt
Traceback (most recent call last):
  File "/usr/lib/python3.11/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/usr/lib/python3.11/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/mnt/wsl/dev/git/github/external/papris/ziti-streaming-webapp/.env/lib/python3.11/site-packages/openziti/decor.py", line 80, in zitified
    func(*args, **kwargs)
  File "/mnt/wsl/dev/git/github/external/papris/ziti-streaming-webapp/ziti_streaming_webapp/webapp.py", line 206, in run_webapp
    loop.run_until_complete(
  File "/usr/lib/python3.11/asyncio/base_events.py", line 637, in run_until_complete
    self.run_forever()
  File "/usr/lib/python3.11/asyncio/base_events.py", line 604, in run_forever
    self._run_once()
  File "/usr/lib/python3.11/asyncio/base_events.py", line 1909, in _run_once
    handle._run()
  File "/usr/lib/python3.11/asyncio/events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "/mnt/wsl/dev/git/github/external/papris/ziti-streaming-webapp/.env/lib/python3.11/site-packages/sse_starlette/sse.py", line 271, in wrap
    await func()
  File "/mnt/wsl/dev/git/github/external/papris/ziti-streaming-webapp/.env/lib/python3.11/site-packages/sse_starlette/sse.py", line 251, in stream_response
    async for data in self.body_iterator:
  File "/mnt/wsl/dev/git/github/external/papris/ziti-streaming-webapp/ziti_streaming_webapp/webapp.py", line 158, in vid_message_generator
    base64_image = q_img.get()
                   ^^^^^^^^^^^
  File "/usr/lib/python3.11/multiprocessing/queues.py", line 103, in get
    res = self._recv_bytes()
          ^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/multiprocessing/connection.py", line 220, in recv_bytes
    buf = self._recv_bytes(maxlength)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/multiprocessing/connection.py", line 418, in _recv_bytes
    buf = self._recv(4)
          ^^^^^^^^^^^^^
  File "/usr/lib/python3.11/multiprocessing/connection.py", line 383, in _recv
    chunk = read(handle, remaining)
            ^^^^^^^^^^^^^^^^^^^^^^^
KeyboardInterrupt

(.env) cd@192.168.253.239:sg4u22: ~/git/github/external/papris/ziti-streaming-webapp
$

Glad we got that figured out; sorry it wasn't set up properly for 3.10...
I'm on Fedora 40, which provides 3.12 OOTB. I had to install 3.11 manually from the Fedora repos because lots of ML libraries only support up to 3.11.

Neat instructions!
Would you like to open a PR to add them to the example webapp repo, or for me to commit them directly? :slight_smile:

Got Ziti + multiprocessing + hypercorn + queues + SSE working with the latest commit to the example webapp!
The main process passes an mp.Manager dict containing the keys ztx and service to the webserver subprocess.
The OpenZiti monkeypatch gets the address and port directly from the config file, and nests the mp.Manager dict received from the main process.

Error 11 is still prevalent, but the app finally works! :dancer: :sparkles:
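
Roughly, the hand-off looks like this (a simplified sketch of the idea rather than the exact repo code; names and the uvicorn server are just illustrative, and it assumes the monkeypatch(bindings=...) form from the SDK samples):

import multiprocessing as mp

import openziti
import uvicorn
from fastapi import FastAPI

ADDRESS, PORT = "localhost", 8443  # in the repo these come from the TOML config

app = FastAPI()


@app.get("/")
async def index():
    return {"message": "Hello over Ziti"}


def run_webapp(ziti_cfg):
    # ziti_cfg is a manager.dict() proxy holding only 'ztx' and 'service';
    # copy it into a plain dict and build the binding before patching
    bindings = {(ADDRESS, PORT): dict(ziti_cfg)}
    with openziti.monkeypatch(bindings=bindings):
        uvicorn.run(app, host=ADDRESS, port=PORT)


def main():
    manager = mp.Manager()
    ziti_cfg = manager.dict(
        {"ztx": "./predalert_server2.json", "service": "predalert_service_2"})
    web = mp.Process(target=run_webapp, args=(ziti_cfg,), daemon=True)
    web.start()
    web.join()


if __name__ == "__main__":
    main()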

Oh wow! Great!!! I'll put a PR up, sure :slight_smile:

Sounds like we need to track down "Error 11" still, but excited and glad to hear you got it working!
