
WIP: fastapi inference server implementation #73

Merged: 28 commits merged on Oct 22, 2024

Conversation

@che85 che85 commented Jul 25, 2024

Inference server based on FastAPI that can be started from the command line or from 3D Slicer's MONAIAuto3DSeg module.

(Video attachment: Untitled.mov)

addresses #30

- model database class that handles all local models including their download and/or deletion
- extracted functions into utils module
- fastapi server can be run directly from Slicer or from the command line
- make sure loaded terminologies are searched
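
For orientation, here is a minimal sketch of what such a FastAPI inference server could look like. Only the endpoint names (GET /models, POST /infer?model_name=...) are taken from the logs later in this thread; the parameter names and return values are illustrative assumptions, not the PR's actual code.

# Illustrative sketch of the server layout; not the implementation in this PR.
from fastapi import FastAPI, UploadFile

app = FastAPI()

@app.get("/models")
def list_models():
    # Would return the contents of the local model database (e.g. models.json).
    return {"models": []}

@app.post("/infer")
async def infer(model_name: str, image_file: UploadFile):
    # Would save the uploaded volume, run MONAI Auto3DSeg inference with the
    # requested model, and return the resulting segmentation.
    data = await image_file.read()
    return {"model_name": model_name, "received_bytes": len(data)}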

@lassoan lassoan left a comment

Very nice work. I've added a few comments, mainly trying to keep things reasonably simple. Let me know if this is ready to test and merge (maybe until then you can mark the pull request as a draft).

MONAIAuto3DSeg/MONAIAuto3DSegLib/constants.py (review thread, outdated, resolved)
packageName = "torch"
if not self._checkModuleInstalled(packageName):
    logger.info("PyTorch Python package is required. Installing... (it may take several minutes)")
    install(packageName)

Owner

This only works on macOS. The installation command depends on the operating system, hardware, and drivers - see the table at https://pytorch.org/get-started/locally/. To automate it, you would need to use a package like light-the-torch, and even with that it is not completely trivial; that's why we have the Slicer PyTorch extension. Please always use that extension for installing PyTorch.
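
For reference, installing PyTorch through the extension typically looks like the sketch below. It assumes the PyTorchUtils scripted module provided by the PyTorch extension; the exact method names should be verified against the extension's current API.

# Sketch: install PyTorch via the Slicer PyTorch extension instead of plain pip,
# so that the correct wheel for the OS/GPU/driver combination is selected.
try:
    import PyTorchUtils
except ModuleNotFoundError:
    raise RuntimeError("The PyTorch Slicer extension is required. Please install it using the Extensions Manager.")

torchLogic = PyTorchUtils.PyTorchUtilsLogic()
if not torchLogic.torchInstalled():
    torch = torchLogic.installTorch(askConfirmation=True)
else:
    import torch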

@che85 che85 Oct 3, 2024

This may have been misleading then. The dependency_handler's task is to differentiate between LocalPythonDependencies (for running the fastapi server locally) and SlicerPythonDependencies (for running the web server within 3D Slicer).
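
A rough sketch of that split is shown below. The class names follow the comment above; the method names and install logic are assumptions for illustration and do not mirror the PR's dependency_handler.py.

# Illustrative sketch only; not the PR's actual dependency handler code.
import importlib.util
import subprocess
import sys
from abc import ABC, abstractmethod


class PythonDependencies(ABC):
    packages = ["monai", "fastapi", "uvicorn"]

    def allPackagesInstalled(self):
        return all(importlib.util.find_spec(p) is not None for p in self.packages)

    @abstractmethod
    def setupPythonRequirements(self):
        ...


class SlicerPythonDependencies(PythonDependencies):
    """Used when the web server runs inside 3D Slicer's Python environment."""
    def setupPythonRequirements(self):
        import slicer
        for package in self.packages:
            slicer.util.pip_install(package)


class LocalPythonDependencies(PythonDependencies):
    """Used when the fastapi server runs in a plain local Python environment."""
    def setupPythonRequirements(self):
        for package in self.packages:
            subprocess.check_call([sys.executable, "-m", "pip", "install", package])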

Owner

Is PyTorch always running in Slicer's Python environment and installed by the SlicerPyTorch extension?

Contributor Author

When the server is started from Slicer, Slicer's Python environment (PythonSlicer) will be used. If the server is started outside of Slicer, a local Python environment is required.

Contributor Author

PyTorch is installed using the SlicerPyTorch extension.

Contributor Author

Actually, let me check and I will get back to you.

Contributor Author

@lassoan I made a few changes. When starting the server from Slicer, the availability of the PyTorch extension is checked. If it is available, other Python packages (monai) are checked and installed.

As mentioned before, the server can be used independently from Slicer by just running uvicorn from the command line.
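
For illustration, launching such a server standalone can be as simple as the sketch below; the "main:app" module path and the CLI flags are assumptions (the actual entry point in this PR is MONAIAuto3DSegServer/main.py with --host/--port arguments, as visible in the logs further down).

# Sketch of a standalone launch script; module path and defaults are illustrative.
import argparse

import uvicorn

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="MONAIAuto3DSeg inference server")
    parser.add_argument("--host", default="127.0.0.1")
    parser.add_argument("--port", type=int, default=8891)
    args = parser.parse_args()
    uvicorn.run("main:app", host=args.host, port=args.port)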

MONAIAuto3DSeg/MONAIAuto3DSegLib/dependency_handler.py (review thread, outdated, resolved)
@@ -0,0 +1,130 @@
import sys

Owner

I'm not sure if we need this new custom Python dependency installer.

  1. There are efforts to move towards a more standard way of declaring Python dependencies in Slicer Python scripted modules, while still allowing lazy installation and import. See "Allow scripted modules to declare and lazily install pip requirements" Slicer/Slicer#7707 and "Improving Support for Python Package Dependencies in Slicer Extensions" Slicer/Slicer#7171. Therefore, I would not invest time into an alternative way of installing dependencies for one specific extension, but rather spend that time on improving and switching to this new approach.

  2. We could add Slicer-specific improvements that are visible to the user, but I don't see such improvements in this class (it seems to be pretty much the same functionality as before with a slightly different design). For example, if we spend time on improving dependency installation, then we could show progress information and errors in a popup window during pip install, so that the user knows that something is happening and that they have to wait.

If this dependency handler is needed only for the uvicorn server, then it should be moved there to make it clear that it is for that special case only.
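
As an illustration of the popup idea in point 2, a minimal sketch using Slicer's existing helpers (slicer.util.tryWithErrorDisplay and slicer.util.pip_install); the status messages and package requirement below are placeholders.

# Sketch: surface pip install progress and errors to the user instead of installing silently.
import slicer

with slicer.util.tryWithErrorDisplay("Installing required Python packages failed."):
    slicer.util.showStatusMessage("Installing Python packages (this may take several minutes)...")
    slicer.app.processEvents()
    slicer.util.pip_install("monai>=1.3")  # placeholder requirement
    slicer.util.showStatusMessage("Python packages installed.")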

Contributor Author

Until the Slicer core efforts are integrated, I am leaning toward keeping the current version.

MONAIAuto3DSeg/MONAIAuto3DSegLib/utils.py (review thread, resolved)
MONAIAuto3DSeg/MONAIAuto3DSegLib/server.py (review thread, outdated, resolved)
MONAIAuto3DSeg/MONAIAuto3DSegLib/server.py (review thread, outdated, resolved)
MONAIAuto3DSeg/auto3dseg/main.py (review thread, outdated, resolved)
MONAIAuto3DSeg/auto3dseg/main.py (review thread, outdated, resolved)
@che85 che85 marked this pull request as draft October 1, 2024 17:35
che85 added 4 commits October 4, 2024 15:35
- requesting the server from Slicer will use SlicerPythonDependencies and check for the PyTorch extension
  prior to server start. Other modules (e.g. monai) will also be installed prior to server start.
- using fastapi from the command line will use NonSlicerPythonDependencies.

lassoan commented Oct 7, 2024

Thanks for the updates. I'm reviewing the changes now. Are there any additional changes that you plan to make before this branch is ready for integration?

@che85 che85 marked this pull request as ready for review October 7, 2024 16:39

lassoan commented Oct 7, 2024

I've pushed a few minor suggestions instead of commenting on them.

  • Some changes were GUI tweaks; those should be trivial.
  • I've also started turning WebServer in server.py into an inference web server class (it was just a generic process class). It could make sense to either continue specializing this class (move as much as possible into it), or to revert my changes and rename this class to something like BackgroundProcess, so that it can be used for both the web server and the local MONAIAuto3DSeg inference process (currently starting/stopping the process, getting the process outputs in the background, etc. are implemented twice: in server.py and in MONAIAuto3DSeg.py). Or maybe keep a generic BackgroundProcess class and separate LocalInference and InferenceServer classes (using BackgroundProcess as parent).

When the inference is performed locally, there is no progress information in the GUI (even though the segmentation completes successfully). This should be fixed.

Another small thing: when starting a local segmentation server, before installing all the dependencies it would be nice to tell the user that dependencies need to be installed, that this may take several minutes, and to let the user decide whether they are OK with that or would like to cancel instead.
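
A possible way to ask the user first, using Slicer's built-in confirmation dialog (the message text is illustrative):

# Sketch: let the user opt out before a potentially long dependency installation.
import slicer

proceed = slicer.util.confirmOkCancelDisplay(
    "Additional Python packages must be installed before the segmentation server can start. "
    "This may take several minutes. Continue?")
if not proceed:
    raise RuntimeError("Dependency installation was cancelled by the user.")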

che85 commented Oct 7, 2024

In my opinion, the information displayed when running locally or remotely is insufficient as there is barely any feedback provided about the ongoing process.

che85 commented Oct 7, 2024

@lassoan I will experiment with the suggestions you made regarding BackgroundProcess, LocalInference, and InferenceServer.

lassoan commented Oct 7, 2024

In my opinion, the information displayed when running locally or remotely is insufficient as there is barely any feedback provided about the ongoing process.

It may depend on the hardware and model.

On my computer with a GPU all segmentations complete within 1-2 minutes, with a log message printed every couple of seconds (the longest wait between two messages is maybe 10 seconds).

That said, the messages are good for showing that something is happening, but they do not give a good indication of how much longer the segmentation will take. It could be nice to implement overall progress reporting (tqdm reporting is done during inference, but that usually takes a couple of seconds, so again not much info about the overall progress). I've added a separate issue for tracking this feature request: #81

che85 commented Oct 10, 2024

@lassoan I made the changes you requested. I am now using a parent class BackgroundProcess, with LocalInference and InferenceServer inheriting from it. I also made use of the dataclass decorator to keep track of ProcessInfo / SegmentationProcessInfo.

Let me know what you think and we can go from there.
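
For readers following along, a rough sketch of that structure is shown below; apart from the class names mentioned above (BackgroundProcess, LocalInference, InferenceServer, SegmentationProcessInfo), the attributes and method bodies are assumptions and do not mirror the PR's actual code.

# Illustrative sketch of the class layout described above.
import subprocess
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class SegmentationProcessInfo:
    # dataclass keeping track of a running segmentation process
    modelName: str = ""
    proc: Optional[subprocess.Popen] = None
    outputLines: list = field(default_factory=list)


class BackgroundProcess:
    """Generic wrapper for starting/stopping a background process and collecting its output."""
    def __init__(self, cmd):
        self.cmd = cmd
        self.proc = None

    def start(self):
        self.proc = subprocess.Popen(
            self.cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=True)

    def stop(self):
        if self.proc:
            self.proc.terminate()


class LocalInference(BackgroundProcess):
    """Runs the MONAIAuto3DSeg inference script as a local background process."""


class InferenceServer(BackgroundProcess):
    """Runs the uvicorn/fastapi inference server as a background process."""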

lassoan commented Oct 11, 2024

These changes are great! If the segmentation works well at least locally then we can merge this. Two things to fix before merging this:

  • When doing local segmentation, there is still no progress information visible at all, only a "Processing finished." message at the end. With no feedback at all, after 10-15 seconds most people would think that the processing failed to start.
  • I wanted to try segmentation in server mode, so I started the server and typed the server address into the local server URL field. It fails with this error:
Traceback (most recent call last):
  File "C:\Users\andra\AppData\Local\slicer.org\Slicer 5.7.0-2024-09-21\bin\Python\slicer\util.py", line 3295, in tryWithErrorDisplay
    yield
  File "C:/D/SlicerMONAIAuto3DSeg/MONAIAuto3DSeg/MONAIAuto3DSeg.py", line 591, in onApply
    self._segmentationProcessInfo = self.logic.process(inputNodes, self.ui.outputSegmentationSelector.currentNode(),
  File "C:/D/SlicerMONAIAuto3DSeg/MONAIAuto3DSeg/MONAIAuto3DSeg.py", line 1391, in process
    r.raise_for_status()
  File "C:\Users\andra\AppData\Local\slicer.org\Slicer 5.7.0-2024-09-21\lib\Python\Lib\site-packages\requests\models.py", line 1024, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http://desktop-nm47af2:8891/infer?model_name=abdominal-organs-3mm-v2.0.0

The server does not log anything even though log to console and GUI are enabled.

Minor GUI tweaks:

  • Collapse "Local segmentation server" section by default (advanced option)
  • Add a label ("Server:") next to the server address combobox, because without it it is difficult to explain to users what to do (right now I would need to say "type the server URL to the left of the Connect button", or "above the 'Segmentation model' selector"). Maybe even adding a "Local processing" checkbox would make things clearer (if unchecked, the server address combobox and "Connected" button would be hidden).
  • It would be nice to separate the server output from the segmentation output. If the two modes (run segmentation server, run inference) are mutually exclusive, that should be clearly reflected in the GUI, for example by having a selector at the very top that lets the user choose between them (and then the entire GUI of the other function would be hidden).

che85 commented Oct 14, 2024

@lassoan did you check the server logs when trying to run inference with it? I am wondering what happened there, since it works for me...

lassoan commented Oct 14, 2024

No logs are displayed, even if I click log to GUI. Do you see any logs when using local or server inference?

che85 commented Oct 14, 2024

I am not seeing any logs in the UI, but they do show up in Slicer's error log window. Everything works on Linux.

(Screenshot attachment)

I will work on the "Log to GUI". Do you want to see the actual log information there, or just some custom status statements?
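
For context, forwarding Python log records to a Qt text widget typically looks like the sketch below; the widget attribute and logger name are hypothetical, not the module's actual ones.

# Sketch: a logging handler that appends records to a QPlainTextEdit in the module GUI.
import logging


class GuiLogHandler(logging.Handler):
    def __init__(self, textWidget):
        super().__init__()
        self.textWidget = textWidget

    def emit(self, record):
        self.textWidget.appendPlainText(self.format(record))

# Hypothetical usage in the module widget:
# logging.getLogger("MONAIAuto3DSeg").addHandler(GuiLogHandler(self.ui.logText))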

che85 commented Oct 14, 2024

@lassoan I will also try on the Windows server we intended for the SimplifiedViewer.

lassoan commented Oct 14, 2024

When doing local (no server) inference: logs are visible in the application log window. Just need to fix logging to the GUI.

When doing inference on server:

Application log of Slicer that launched the server:

Session start time .......: 20241014_123947
Slicer version ...........: 5.7.0-2024-09-21 (revision 33025 / f46b1e2) win-amd64 - installed release
Operating system .........: Windows /  Professional / (Build 22631, Code Page 65001) - 64-bit
Memory ...................: 65245 MB physical, 74973 MB virtual
CPU ......................: GenuineIntel , 20 cores, 20 logical processors
VTK configuration ........: OpenGL2 rendering, TBB threading
Qt configuration .........: version 5.15.2, with SSL, requested OpenGL 3.2 (compatibility profile)
Internationalization .....: enabled, language=en_US
Developer mode ...........: enabled
Application path .........: C:/Users/andra/AppData/Local/slicer.org/Slicer 5.7.0-2024-09-21/bin
Additional module paths ..: C:/Users/andra/OneDrive/Projects/SlicerTesting2024/20240923-PyFileWriter, C:/D/SlicerHeartPrivate/Import4DFlow, slicer.org/Extensions-33025/TotalSegmentator/lib/Slicer-5.7/qt-scripted-modules, slicer.org/Extensions-33025/PyTorch/lib/Slicer-5.7/qt-scripted-modules, slicer.org/Extensions-33025/Sandbox/lib/Slicer-5.7/qt-loadable-modules, slicer.org/Extensions-33025/Sandbox/lib/Slicer-5.7/qt-scripted-modules, C:/D/SlicerMONAIAuto3DSeg/MONAIAuto3DSeg
Unused annotations: [SingleStep(value=0.01)]
Scripted subject hierarchy plugin registered: SegmentEditor
Scripted subject hierarchy plugin registered: SegmentStatistics
Switch to module:  "Welcome"
Switch to module:  "MONAIAuto3DSeg"
Initializing PyTorch...
Importing torch...
PyTorch 2.4.1+cu118 imported successfully
CUDA available: True
Initializing MONAI...
Requirement already satisfied: monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (1.3.2)
Requirement already satisfied: torch>=1.9 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (2.4.1+cu118)
Requirement already satisfied: numpy>=1.20 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (1.26.4)
Requirement already satisfied: pynrrd in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (1.0.0)
Requirement already satisfied: tqdm>=4.47.0 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (4.66.5)
Requirement already satisfied: fire in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (0.7.0)
Requirement already satisfied: scikit-image>=0.14.2 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (0.24.0)
Requirement already satisfied: tensorboard in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (2.18.0)
Requirement already satisfied: nibabel in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (5.2.1)
Requirement already satisfied: pyyaml in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (6.0.2)
Requirement already satisfied: itk>=5.2 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (5.4.0)
Requirement already satisfied: psutil in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (6.0.0)
Requirement already satisfied: itk-core==5.4.0 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from itk>=5.2->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (5.4.0)
Requirement already satisfied: itk-numerics==5.4.0 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from itk>=5.2->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (5.4.0)
Requirement already satisfied: itk-io==5.4.0 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from itk>=5.2->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (5.4.0)
Requirement already satisfied: itk-filtering==5.4.0 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from itk>=5.2->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (5.4.0)
Requirement already satisfied: itk-registration==5.4.0 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from itk>=5.2->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (5.4.0)
Requirement already satisfied: itk-segmentation==5.4.0 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from itk>=5.2->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (5.4.0)
Requirement already satisfied: scipy>=1.9 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from scikit-image>=0.14.2->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (1.13.1)
Requirement already satisfied: networkx>=2.8 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from scikit-image>=0.14.2->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (3.2.1)
Requirement already satisfied: pillow>=9.1 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from scikit-image>=0.14.2->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (10.0.1)
Requirement already satisfied: imageio>=2.33 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from scikit-image>=0.14.2->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (2.35.1)
Requirement already satisfied: tifffile>=2022.8.12 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from scikit-image>=0.14.2->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (2024.8.30)
Requirement already satisfied: packaging>=21 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from scikit-image>=0.14.2->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (24.0)
Requirement already satisfied: lazy-loader>=0.4 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from scikit-image>=0.14.2->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (0.4)
Requirement already satisfied: filelock in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from torch>=1.9->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (3.16.1)
Requirement already satisfied: typing-extensions>=4.8.0 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from torch>=1.9->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (4.12.1)
Requirement already satisfied: sympy in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from torch>=1.9->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (1.13.3)
Requirement already satisfied: jinja2 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from torch>=1.9->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (3.1.4)
Requirement already satisfied: fsspec in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from torch>=1.9->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (2024.9.0)
Requirement already satisfied: colorama in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from tqdm>=4.47.0->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (0.4.6)
Requirement already satisfied: termcolor in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from fire->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (2.5.0)
Requirement already satisfied: nptyping in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from pynrrd->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (2.5.0)
Requirement already satisfied: absl-py>=0.4 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from tensorboard->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (2.1.0)
Requirement already satisfied: grpcio>=1.48.2 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from tensorboard->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (1.66.2)
Requirement already satisfied: markdown>=2.6.8 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from tensorboard->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (3.7)
Requirement already satisfied: protobuf!=4.24.0,>=3.19.6 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from tensorboard->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (5.28.2)
Requirement already satisfied: setuptools>=41.0.0 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from tensorboard->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (70.0.0)
Requirement already satisfied: six>1.9 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from tensorboard->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (1.16.0)
Requirement already satisfied: tensorboard-data-server<0.8.0,>=0.7.0 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from tensorboard->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (0.7.2)
Requirement already satisfied: werkzeug>=1.0.1 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from tensorboard->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (3.0.4)
Requirement already satisfied: importlib-metadata>=4.4 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from markdown>=2.6.8->tensorboard->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (8.5.0)
Requirement already satisfied: MarkupSafe>=2.1.1 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from werkzeug>=1.0.1->tensorboard->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (2.1.5)
Requirement already satisfied: mpmath<1.4,>=1.1.0 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from sympy->torch>=1.9->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (1.3.0)
Requirement already satisfied: zipp>=3.20 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from importlib-metadata>=4.4->markdown>=2.6.8->tensorboard->monai[fire,itk,nibabel,psutil,pynrrd,pyyaml,skimage,tensorboard,tqdm]>=1.3) (3.20.2)
Dependencies are set up successfully.
Requirement already satisfied: python-multipart in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (0.0.12)
Requirement already satisfied: fastapi in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (0.115.0)
Requirement already satisfied: uvicorn[standard] in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (0.31.0)
Requirement already satisfied: starlette<0.39.0,>=0.37.2 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from fastapi) (0.38.6)
Requirement already satisfied: pydantic!=1.8,!=1.8.1,!=2.0.0,!=2.0.1,!=2.1.0,<3.0.0,>=1.7.4 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from fastapi) (2.9.2)
Requirement already satisfied: typing-extensions>=4.8.0 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from fastapi) (4.12.1)
Requirement already satisfied: click>=7.0 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from uvicorn[standard]) (8.1.7)
Requirement already satisfied: h11>=0.8 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from uvicorn[standard]) (0.14.0)
Requirement already satisfied: colorama>=0.4 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from uvicorn[standard]) (0.4.6)
Requirement already satisfied: httptools>=0.5.0 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from uvicorn[standard]) (0.6.1)
Requirement already satisfied: python-dotenv>=0.13 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from uvicorn[standard]) (1.0.1)
Requirement already satisfied: pyyaml>=5.1 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from uvicorn[standard]) (6.0.2)
Requirement already satisfied: watchfiles>=0.13 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from uvicorn[standard]) (0.24.0)
Requirement already satisfied: websockets>=10.4 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from uvicorn[standard]) (13.1)
Requirement already satisfied: annotated-types>=0.6.0 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from pydantic!=1.8,!=1.8.1,!=2.0.0,!=2.0.1,!=2.1.0,<3.0.0,>=1.7.4->fastapi) (0.7.0)
Requirement already satisfied: pydantic-core==2.23.4 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from pydantic!=1.8,!=1.8.1,!=2.0.0,!=2.0.1,!=2.1.0,<3.0.0,>=1.7.4->fastapi) (2.23.4)
Requirement already satisfied: anyio<5,>=3.4.0 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from starlette<0.39.0,>=0.37.2->fastapi) (4.6.0)
Requirement already satisfied: idna>=2.8 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from anyio<5,>=3.4.0->starlette<0.39.0,>=0.37.2->fastapi) (3.7)
Requirement already satisfied: sniffio>=1.1 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from anyio<5,>=3.4.0->starlette<0.39.0,>=0.37.2->fastapi) (1.3.1)
Requirement already satisfied: exceptiongroup>=1.0.2 in c:\users\andra\appdata\local\slicer.org\slicer 5.7.0-2024-09-21\lib\python\lib\site-packages (from anyio<5,>=3.4.0->starlette<0.39.0,>=0.37.2->fastapi) (1.2.2)
Launching process: ['C:/Users/andra/AppData/Local/slicer.org/Slicer 5.7.0-2024-09-21/bin/PythonSlicer.exe', WindowsPath('C:/D/SlicerMONAIAuto3DSeg/MONAIAuto3DSeg/MONAIAuto3DSegServer/main.py'), '--host', 'DESKTOP-NM47AF2', '--port', '8891']
Server Started
INFO:     Will watch for changes in these directories: ['C:\\Users\\andra\\AppData\\Local\\slicer.org\\Slicer 5.7.0-2024-09-21']
INFO:     Uvicorn running on http://DESKTOP-NM47AF2:8891 (Press CTRL+C to quit)
INFO:     Started reloader process [38308] using WatchFiles
INFO:     Started server process [22604]
INFO:     Waiting for application startup.
INFO:     Application startup complete.

Application log on the client side:

Processing started
Writing input file to C:\Users\andra\AppData\Local\Temp\Slicer\__SlicerTemp__2024-10-14_12+46+18.630\tmpz465g15r\input-volume0.nrrd
Initiating Inference on http://DESKTOP-NM47AF2:8891
Starting new HTTP connection (1): desktop-nm47af2:8891
http://desktop-nm47af2:8891 "POST /infer?model_name=abdominal-organs-3mm-v2.0.0 HTTP/1.1" 500 40
Failed to start processing.

500 Server Error: Internal Server Error for url: http://desktop-nm47af2:8891/infer?model_name=abdominal-organs-3mm-v2.0.0

The server log does not contain any information about a client trying to connect. I've turned off the firewall, but it did not make any difference.

Server connection succeeded (and opening http://desktop-nm47af2:8891/models in the browser opened the models.json file), yet there is still nothing in the application log of the server instance. I guess you log something when a client connects, so there is probably a logging issue. If the logging issue is fixed and we get some information from the server, then maybe we can see what the root cause of the 500 Server Error is.

che85 commented Oct 14, 2024

I just installed everything on the Windows machine and can start a server which I can connect to from my local MacBook Pro. I can retrieve model information, but when trying to run inference, I get the same error after some timeout. There are no signs of an attempted connection on the server side. I will try with uvicorn directly.

lassoan commented Oct 14, 2024

Let me know if you get stuck and then I can try to debug it.

che85 commented Oct 16, 2024

@lassoan I committed a fix for running the server on Windows. Maybe you can test it locally on your machine and confirm. I am going to work on the progress reporting now.

che85 commented Oct 17, 2024

@lassoan Ready to test. Let me know what you think about the progress bar.

A few more thoughts on running remotely: sometimes the files to be sent to the server are huge. It may be good to explore providing more user feedback during upload and inference, and to avoid blocking the main loop for the whole process.

che85 commented Oct 21, 2024

@lassoan I made the requested changes. I improved error reporting on the client side and limited the number of inference requests to 5 per minute.

Let me know what you think.
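
One common way to implement such a per-client limit in fastapi is the slowapi package; the sketch below assumes slowapi is available and is not necessarily how this PR implements the limit.

# Sketch: limit the inference endpoint to 5 requests per minute per client using slowapi.
from fastapi import FastAPI, Request
from slowapi import Limiter, _rate_limit_exceeded_handler
from slowapi.errors import RateLimitExceeded
from slowapi.util import get_remote_address

limiter = Limiter(key_func=get_remote_address)
app = FastAPI()
app.state.limiter = limiter
app.add_exception_handler(RateLimitExceeded, _rate_limit_exceeded_handler)

@app.post("/infer")
@limiter.limit("5/minute")
async def infer(request: Request, model_name: str):
    # ... run inference for the requested model ...
    return {"status": "accepted", "model_name": model_name}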

lassoan commented Oct 21, 2024

Thank you, I'll test this and let you know!

lassoan commented Oct 21, 2024

Thanks for the updates, it is getting really close to being ready. Just a few small things to fix before integration:

  • If remote segmentation mode is selected but the user has not connected to the server, then the segmentation model selector and Apply button (and maybe also the input selectors) should be disabled.
  • There is still no progress information when doing inference locally (without server connection). This is all I see in the textbox:
Starting...
In Progress
Importing Results
Processing finished.

We would need to see something like this:

Starting...
In Progress
You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
`apex.normalization.InstanceNorm3dNVFuser` is not installed properly, use nn.InstanceNorm3d instead.
Model epoch 294 metric 0.9070999026298523
Using crop_foreground
Using resample with resample_resolution [3.0, 3.0, 3.0]
Running Inference ...
`torch.cuda.amp.autocast(args...)` is deprecated. Please use `torch.amp.autocast('cuda', args...)` instead.

0%| | 0/4 [00:00
25%|##5 | 1/4 [00:00
100%|##########| 4/4 [00:00
Logits torch.Size([1, 25, 120, 110, 161])
Converting logits into predictions
preds torch.Size([1, 1, 120, 110, 161])
preds inverted torch.Size([512, 512, 685])
Computation time log:
Loading volumes: 5.59 seconds
Preprocessing: 1.69 seconds
Inference: 1.04 seconds
Logits: 0.37 seconds
Preds: 0.38 seconds
Convert to array: 0.27 seconds
Save: 2.95 seconds
ALL DONE, result saved in C:\Users\andra\AppData\Local\Temp\tmpb3b9bk9t\output-segmentation.nrrd
Importing Results
Processing finished.
  • When segmenting on a remote server, progress information appears on the server - good! However, we don't see anything on the client side, so users may miss warning messages or other useful information. It would be nice if progress information could be displayed continuously, but if that is hard, then at least the segmentation process output should be sent to the client on completion so that it can display it. Since the server interface is a new feature, the lack of this would not be a regression, so if you would prefer not to spend time implementing it now, you can just create an issue for it.

I've fixed a path-related bug that broke things on Windows and pushed it to this branch.

lassoan commented Oct 22, 2024

I've tested it and progress reporting now works well for local inference.

Only two tasks remain now:

  • If remote segmentation mode is selected but the user has not connected to the server, then the segmentation model selector and Apply button (and maybe also the input selectors) should be disabled. Alternatively, the segmentation model list could be cleared. The point of all this is to make it clear to the user whether the models come from the server or are local, and whether the segmentation happens locally or on the server.
  • Send processing output from the server to the client (optional, may submit an issue instead); a possible shape for such an endpoint is sketched below.
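
One possible shape for such an endpoint (purely illustrative; the endpoint name and log storage are hypothetical and not part of this PR):

# Sketch: expose the server-side processing log so the client can fetch and display it.
from fastapi import FastAPI, HTTPException

app = FastAPI()
process_logs = {}  # process id -> list of output lines, filled while inference runs

@app.get("/process-output/{process_id}")
def get_process_output(process_id: str):
    if process_id not in process_logs:
        raise HTTPException(status_code=404, detail="Unknown process id")
    return {"process_id": process_id, "log": process_logs[process_id]}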

che85 commented Oct 22, 2024

Will work on it in a bit. I am helping Nicolas with something right now.

Make Remote processing button state persistent.
Disable model selection and Apply button if remote processing is enabled but not connected to a server yet.

lassoan commented Oct 22, 2024

I've implemented persistence for the remote processing button (the last state is stored in QSettings) and disabling of the model selection and Apply button when not connected to a server.

I've submitted an issue for sending processing output from the server to the client: #85
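
For illustration, QSettings-based persistence of that option can look like the sketch below; the settings key and helper names are hypothetical, not the ones actually used in the module.

# Sketch: persist and restore the "remote processing" option via QSettings.
import qt
import slicer

SETTING_KEY = "MONAIAuto3DSeg/RemoteProcessingEnabled"  # hypothetical key

def saveRemoteProcessingState(enabled):
    qt.QSettings().setValue(SETTING_KEY, enabled)

def restoreRemoteProcessingState():
    # slicer.util.settingsValue handles the string-to-bool conversion QSettings may require
    return slicer.util.settingsValue(SETTING_KEY, False, converter=slicer.util.toBool)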

@lassoan lassoan left a comment

This new remote server interface works well, and all the previous functionality is preserved. It is all ready to be merged.

@lassoan lassoan merged commit 8b70585 into lassoan:main Oct 22, 2024