Merge branch 'openvinotoolkit:master' into quantile
geeky33 authored Jan 30, 2025
2 parents 0381b66 + 9930aea commit dcd43e3
Showing 162 changed files with 3,917 additions and 3,831 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/job_openvino_js.yml
@@ -62,7 +62,7 @@ jobs:

- name: Setup Node ${{ env.NODE_VERSION }}
if: runner.os != 'Linux' # Node is already installed in the Docker image
uses: actions/setup-node@39370e3970a6d050c480ffad4ff0ed4d3fdee5af # v4.1.0
uses: actions/setup-node@1d0ff469b7ec7b3cb9d8673fde0c81c44821de2a # v4.2.0
with:
node-version: ${{ env.NODE_VERSION }}

2 changes: 1 addition & 1 deletion .github/workflows/windows_vs2019_release.yml
@@ -205,7 +205,7 @@ jobs:
working-directory: ${{ env.OPENVINO_JS_LIBS_DIR }}

- name: Setup Node ${{ env.NODE_VERSION }}
uses: actions/setup-node@39370e3970a6d050c480ffad4ff0ed4d3fdee5af # v4.1.0
uses: actions/setup-node@1d0ff469b7ec7b3cb9d8673fde0c81c44821de2a # v4.2.0
with:
node-version: ${{ env.NODE_VERSION }}

8 changes: 4 additions & 4 deletions docs/articles_en/about-openvino.rst
@@ -7,7 +7,7 @@ About OpenVINO

about-openvino/key-features
about-openvino/performance-benchmarks
about-openvino/compatibility-and-support
OpenVINO Ecosystem <about-openvino/openvino-ecosystem>
about-openvino/contributing
Release Notes <about-openvino/release-notes-openvino>

@@ -40,10 +40,10 @@ OpenVINO Ecosystem
Along with the primary components of model optimization and runtime, the toolkit also includes:

* `Neural Network Compression Framework (NNCF) <https://github.com/openvinotoolkit/nncf>`__ - a tool for enhanced OpenVINO™ inference to get performance boost with minimal accuracy drop.
* :doc:`Openvino Notebooks <learn-openvino/interactive-tutorials-python>`- Jupyter Python notebook, which demonstrate key features of the toolkit.
* :doc:`Openvino Notebooks <get-started/learn-openvino/interactive-tutorials-python>`- Jupyter Python notebook, which demonstrate key features of the toolkit.
* `OpenVINO Model Server <https://github.com/openvinotoolkit/model_server>`__ - a server that enables scalability via a serving microservice.
* :doc:`OpenVINO Training Extensions <documentation/openvino-ecosystem/openvino-training-extensions>` – a convenient environment to train Deep Learning models and convert them using the OpenVINO™ toolkit for optimized inference.
* :doc:`Dataset Management Framework (Datumaro) <documentation/openvino-ecosystem/datumaro>` - a tool to build, transform, and analyze datasets.
* :doc:`OpenVINO Training Extensions <about-openvino/openvino-ecosystem/openvino-training-extensions>` – a convenient environment to train Deep Learning models and convert them using the OpenVINO™ toolkit for optimized inference.
* :doc:`Dataset Management Framework (Datumaro) <about-openvino/openvino-ecosystem/datumaro>` - a tool to build, transform, and analyze datasets.
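
To make the NNCF entry above more concrete, here is a minimal post-training quantization sketch. The model path, calibration data, and output file are placeholders, and the exact ``nncf.quantize`` defaults may vary between NNCF releases.

.. code-block:: python

   import numpy as np
   import nncf
   import openvino as ov

   core = ov.Core()
   model = core.read_model("model.xml")      # placeholder FP32 OpenVINO IR

   # Placeholder calibration set: ~100 random samples shaped like the model input.
   calib_items = [np.random.rand(1, 3, 224, 224).astype(np.float32) for _ in range(100)]
   calib_dataset = nncf.Dataset(calib_items)

   # 8-bit post-training quantization with default settings.
   quantized_model = nncf.quantize(model, calib_dataset)
   ov.save_model(quantized_model, "model_int8.xml")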

Community
##############################################################
(another changed file; name not shown)
@@ -68,7 +68,7 @@ without the need to convert.

| **OpenVINO Training Extensions**
| :bdg-link-dark:`Github <https://github.com/openvinotoolkit/training_extensions>`
:bdg-link-success:`Overview Page <https://docs.openvino.ai/2025/documentation/openvino-ecosystem/openvino-training-extensions.html>`
:bdg-link-success:`Overview Page <https://docs.openvino.ai/2025/about-openvino/openvino-ecosystem/openvino-training-extensions.html>`
A convenient environment to train Deep Learning models and convert them using the OpenVINO™
toolkit for optimized inference.
@@ -77,7 +77,7 @@

| **OpenVINO Security Addon**
| :bdg-link-dark:`Github <https://github.com/openvinotoolkit/security_addon>`
:bdg-link-success:`User Guide <https://docs.openvino.ai/2025/documentation/openvino-ecosystem/openvino-security-add-on.html>`
:bdg-link-success:`User Guide <https://docs.openvino.ai/2025/about-openvino/openvino-ecosystem/openvino-security-add-on.html>`
A solution for Model Developers and Independent Software Vendors to use secure packaging and
secure model execution.
@@ -86,7 +86,7 @@

| **Datumaro**
| :bdg-link-dark:`Github <https://github.com/openvinotoolkit/datumaro>`
:bdg-link-success:`Overview Page <https://docs.openvino.ai/2025/documentation/openvino-ecosystem/datumaro.html>`
:bdg-link-success:`Overview Page <https://docs.openvino.ai/2025/about-openvino/openvino-ecosystem/datumaro.html>`
A framework and a CLI tool for building, transforming, and analyzing datasets.
|hr|
(another changed file; name not shown)
@@ -128,7 +128,7 @@ General considerations
When comparing OpenVINO Runtime performance with the framework or reference code,
make sure that both versions are as similar as possible:

* Wrap the exact inference execution (for examples, see :doc:`Benchmark app <../../learn-openvino/openvino-samples/benchmark-tool>`).
* Wrap the exact inference execution (for examples, see :doc:`Benchmark app <../../get-started/learn-openvino/openvino-samples/benchmark-tool>`).
* Do not include model loading time.
* Ensure that the inputs are identical for OpenVINO Runtime and the framework. For example, watch out for random values that can be used to populate the inputs.
* In situations when any user-side pre-processing should be tracked separately, consider :doc:`image pre-processing and conversion <../../openvino-workflow/running-inference/optimize-inference/optimize-preprocessing>`.
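
A rough Python sketch of these rules, timing only the inference call after the model has been read and compiled, and feeding the same fixed input to every run, could look like this (the model path and input shape are placeholders):

.. code-block:: python

   import time
   import numpy as np
   import openvino as ov

   core = ov.Core()
   model = core.read_model("model.xml")         # model loading is not timed
   compiled = core.compile_model(model, "CPU")  # compilation is not timed either
   request = compiled.create_infer_request()

   data = np.random.rand(1, 3, 224, 224).astype(np.float32)  # identical input for every run
   request.infer([data])                        # warm-up run

   runs = 100
   start = time.perf_counter()
   for _ in range(runs):
       request.infer([data])                    # only the inference call is measured
   elapsed = time.perf_counter() - start
   print(f"average latency: {elapsed / runs * 1000:.2f} ms")

Running the same loop, with the same input, against the reference framework keeps the comparison fair.
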
@@ -149,7 +149,7 @@ OpenVINO benchmarking (general)
+++++++++++++++++++++++++++++++

The default way of measuring OpenVINO performance is running a piece of code, referred to as
:doc:`the benchmark tool <../../learn-openvino/openvino-samples/benchmark-tool>`.
:doc:`the benchmark tool <../../get-started/learn-openvino/openvino-samples/benchmark-tool>`.
For Python, it is part of the OpenVINO Runtime installation, while for C++, it is available as
a code sample.

@@ -165,7 +165,7 @@ as:
benchmark_app -m <model> -d <device> -i <input>
Each of the :doc:`OpenVINO supported devices <../compatibility-and-support/supported-devices>`
Each of the :doc:`OpenVINO supported devices <../../documentation/compatibility-and-support/supported-devices>`
offers performance settings that contain command-line equivalents in the Benchmark app.
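
For illustration, the same kind of setting can also be applied directly through the Python API when compiling a model; the sketch below, with a placeholder model path and assuming a GPU device is present, is roughly the counterpart of running benchmark_app with ``-d GPU -hint throughput``:

.. code-block:: python

   import openvino as ov

   core = ov.Core()
   model = core.read_model("model.xml")   # placeholder model path

   # Roughly equivalent to: benchmark_app -m model.xml -d GPU -hint throughput
   compiled = core.compile_model(model, "GPU", {"PERFORMANCE_HINT": "THROUGHPUT"})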

While these settings provide really low-level control for the optimal model performance on a
@@ -186,7 +186,7 @@ Internal Inference Performance Counters and Execution Graphs

More detailed insights into inference performance breakdown can be achieved with device-specific
performance counters and/or execution graphs.
Both :doc:`C++ and Python <../../learn-openvino/openvino-samples/benchmark-tool>`
Both :doc:`C++ and Python <../../get-started/learn-openvino/openvino-samples/benchmark-tool>`
versions of the benchmark_app support a ``-pc`` command-line parameter that outputs an internal
execution breakdown.
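
The Python API exposes a similar per-node breakdown through profiling info; a hedged sketch, assuming performance counters are enabled at compile time and using a placeholder model path and input shape:

.. code-block:: python

   import numpy as np
   import openvino as ov

   core = ov.Core()
   model = core.read_model("model.xml")                                 # placeholder model path
   compiled = core.compile_model(model, "CPU", {"PERF_COUNT": "YES"})   # enable performance counters

   request = compiled.create_infer_request()
   request.infer([np.random.rand(1, 3, 224, 224).astype(np.float32)])

   # Per-node execution times, comparable to the -pc output of benchmark_app.
   for info in request.profiling_info:
       print(info.node_name, info.exec_type, info.real_time)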

(another changed file; name not shown)
@@ -27,7 +27,7 @@ Performance Information F.A.Q.

All of the performance benchmarks on traditional network models are generated using the
open-source tool within the Intel® Distribution of OpenVINO™ toolkit
called :doc:`benchmark_app <../../learn-openvino/openvino-samples/benchmark-tool>`.
called :doc:`benchmark_app <../../get-started/learn-openvino/openvino-samples/benchmark-tool>`.

For diffusers (Stable-Diffusion) and foundational models (aka LLMs) please use the OpenVINO GenAI
opensource repo `OpenVINO GenAI tools/llm_bench <https://github.com/openvinotoolkit/openvino.genai/tree/master/tools/llm_bench>`__
@@ -97,7 +97,7 @@ Performance Information F.A.Q.

Intel partners with vendors all over the world. For a list of Hardware Manufacturers, see the
`Intel® AI: In Production Partners & Solutions Catalog <https://www.intel.com/content/www/us/en/internet-of-things/ai-in-production/partners-solutions-catalog.html>`__.
For more details, see the :doc:`Supported Devices <../compatibility-and-support/supported-devices>` article.
For more details, see the :doc:`Supported Devices <../../documentation/compatibility-and-support/supported-devices>` article.


.. dropdown:: How can I optimize my models for better performance or accuracy?
6 changes: 3 additions & 3 deletions docs/articles_en/documentation.rst
@@ -13,18 +13,18 @@ Documentation

API Reference <api/api_reference>
OpenVINO IR format and Operation Sets <documentation/openvino-ir-format>
Tool Ecosystem <documentation/openvino-ecosystem>
Compatibility and Support <documentation/compatibility-and-support>
Legacy Features <documentation/legacy-features>
OpenVINO Extensibility <documentation/openvino-extensibility>
OpenVINO™ Security <documentation/openvino-security>
Legacy Features <documentation/legacy-features>


This section provides reference documents that guide you through the OpenVINO toolkit workflow, from preparing models, optimizing them, to deploying them in your own deep learning applications.

| :doc:`API Reference doc path <api/api_reference>`
| A collection of reference articles for OpenVINO C++, C, and Python APIs.
| :doc:`OpenVINO Ecosystem <documentation/openvino-ecosystem>`
| :doc:`OpenVINO Ecosystem <about-openvino/openvino-ecosystem>`
| Apart from the core components, OpenVINO offers tools, plugins, and expansions revolving around it, even if not constituting necessary parts of its workflow. This section gives you an overview of what makes up the OpenVINO toolkit.
| :doc:`OpenVINO Extensibility Mechanism <documentation/openvino-extensibility>`
(another changed file; name not shown)
@@ -13,7 +13,7 @@ deep learning models:
:doc:`NPU <../../openvino-workflow/running-inference/inference-devices-and-modes/npu-device>`.

| For their usage guides, see :doc:`Devices and Modes <../../openvino-workflow/running-inference/inference-devices-and-modes>`.
| For a detailed list of devices, see :doc:`System Requirements <../release-notes-openvino/system-requirements>`.
| For a detailed list of devices, see :doc:`System Requirements <../../about-openvino/release-notes-openvino/system-requirements>`.

Beside running inference with a specific device,
@@ -43,7 +43,7 @@ Feature Support and API Coverage
:doc:`Multi-stream execution <../../openvino-workflow/running-inference/optimize-inference/optimizing-throughput>` Yes Yes No
:doc:`Model caching <../../openvino-workflow/running-inference/optimize-inference/optimizing-latency/model-caching-overview>` Yes Partial Yes
:doc:`Dynamic shapes <../../openvino-workflow/running-inference/dynamic-shapes>` Yes Partial No
:doc:`Import/Export <../../documentation/openvino-ecosystem>` Yes Yes Yes
:doc:`Import/Export <../../about-openvino/openvino-ecosystem>` Yes Yes Yes
:doc:`Preprocessing acceleration <../../openvino-workflow/running-inference/optimize-inference/optimize-preprocessing>` Yes Yes No
:doc:`Stateful models <../../openvino-workflow/running-inference/stateful-models>` Yes Yes Yes
:doc:`Extensibility <../../documentation/openvino-extensibility>` Yes Yes No
6 changes: 3 additions & 3 deletions docs/articles_en/documentation/openvino-extensibility.rst
@@ -25,7 +25,7 @@ OpenVINO Extensibility Mechanism

The Intel® Distribution of OpenVINO™ toolkit supports neural-network models trained with various frameworks, including
TensorFlow, PyTorch, ONNX, TensorFlow Lite, and PaddlePaddle. The list of supported operations is different for each of the supported frameworks.
To see the operations supported by your framework, refer to :doc:`Supported Framework Operations <../about-openvino/compatibility-and-support/supported-operations>`.
To see the operations supported by your framework, refer to :doc:`Supported Framework Operations <../documentation/compatibility-and-support/supported-operations>`.

Custom operations, which are not included in the list, are not recognized by OpenVINO out-of-the-box. The need for custom operation may appear in two cases:

@@ -187,6 +187,6 @@ See Also
########

* :doc:`OpenVINO Transformations <openvino-extensibility/transformation-api>`
* :doc:`Using OpenVINO Runtime Samples <../learn-openvino/openvino-samples>`
* :doc:`Hello Shape Infer SSD sample <../learn-openvino/openvino-samples/hello-reshape-ssd>`
* :doc:`Using OpenVINO Runtime Samples <../get-started/learn-openvino/openvino-samples>`
* :doc:`Hello Shape Infer SSD sample <../get-started/learn-openvino/openvino-samples/hello-reshape-ssd>`
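
If a custom operation has already been packaged into an extension library, consuming it from the Python API is typically a single call before reading the model; a sketch with placeholder library and model paths:

.. code-block:: python

   import openvino as ov

   core = ov.Core()
   core.add_extension("libcustom_ops.so")                 # placeholder path to a prebuilt extension library
   model = core.read_model("model_with_custom_op.xml")    # placeholder model that uses the custom operation
   compiled = core.compile_model(model, "CPU")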

(another changed file; name not shown)
@@ -322,7 +322,7 @@ redistributed in the "Saved model" format, converted to a frozen graph using the
Inference
+++++++++

The simplest way to infer the model and collect performance counters is :doc:`Benchmark Application <../../../../learn-openvino/openvino-samples/benchmark-tool>`.
The simplest way to infer the model and collect performance counters is :doc:`Benchmark Application <../../../../get-started/learn-openvino/openvino-samples/benchmark-tool>`.

.. code-block:: sh
4 changes: 2 additions & 2 deletions docs/articles_en/documentation/openvino-security.rst
@@ -8,7 +8,7 @@ with encryption or other security tools.
Actual security and privacy requirements depend on your unique deployment scenario.
This section provides general guidance on using OpenVINO tools and libraries securely.
The main security measure for OpenVINO is its
:doc:`Security Add-on <openvino-ecosystem/openvino-security-add-on>`. You can find its description
:doc:`Security Add-on <../about-openvino/openvino-ecosystem/openvino-security-add-on>`. You can find its description
in the Ecosystem section.

.. _encrypted-models:
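
The encrypted-models section referenced by this label generally comes down to decrypting model data in memory and handing it to the runtime without writing plain files to disk; a hedged sketch, with the decryption routine and file names as placeholders:

.. code-block:: python

   import numpy as np
   import openvino as ov

   def decrypt(path):                    # placeholder for an actual decryption routine
       with open(path, "rb") as f:
           return f.read()

   xml_data = decrypt("model.xml.enc")   # decrypted IR topology (bytes)
   bin_data = decrypt("model.bin.enc")   # decrypted weights (bytes)

   core = ov.Core()
   weights = ov.Tensor(np.frombuffer(bin_data, dtype=np.uint8).copy())
   model = core.read_model(model=xml_data, weights=weights)   # nothing is written back to disk
   compiled = core.compile_model(model, "CPU")
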
@@ -86,4 +86,4 @@ Additional Resources
- Intel® Distribution of OpenVINO™ toolkit `home page <https://software.intel.com/en-us/openvino-toolkit>`__.
- :doc:`Convert a Model <../openvino-workflow/model-preparation/convert-model-to-ir>`.
- :doc:`OpenVINO™ Runtime User Guide <../openvino-workflow/running-inference>`.
- For more information on Sample Applications, see the :doc:`OpenVINO Samples Overview <../learn-openvino/openvino-samples>`
- For more information on Sample Applications, see the :doc:`OpenVINO Samples Overview <../get-started/learn-openvino/openvino-samples>`
7 changes: 4 additions & 3 deletions docs/articles_en/get-started.rst
@@ -12,6 +12,7 @@ GET STARTED
:hidden:

Install OpenVINO <get-started/install-openvino>
Learn OpenVINO <get-started/learn-openvino>
System Requirements <./about-openvino/release-notes-openvino/system-requirements>


@@ -67,7 +68,7 @@ Learn the basics of working with models and inference in OpenVINO. Begin with
Interactive Tutorials - Jupyter Notebooks
-----------------------------------------

Start with :doc:`interactive Python <learn-openvino/interactive-tutorials-python>` that show the basics of model inference, the OpenVINO API, how to convert models to OpenVINO format, and more.
Start with :doc:`interactive Python <get-started/learn-openvino/interactive-tutorials-python>` that show the basics of model inference, the OpenVINO API, how to convert models to OpenVINO format, and more.

* `Hello Image Classification <notebooks/hello-world-with-output.html>`__ - Load an image classification model in OpenVINO and use it to apply a label to an image
* `OpenVINO Runtime API Tutorial <notebooks/openvino-api-with-output.html>`__ - Learn the basic Python API for working with models in OpenVINO
@@ -79,7 +80,7 @@ Start with :doc:`interactive Python <learn-openvino/interactive-tutorials-python
OpenVINO Code Samples
---------------------

View :doc:`sample code <learn-openvino/openvino-samples>` for various C++ and Python applications that can be used as a starting point for your own application. For C++ developers, step through the :doc:`Get Started with C++ Samples <learn-openvino/openvino-samples/get-started-demos>` to learn how to build and run an image classification program that uses OpenVINO’s C++ API.
View :doc:`sample code <get-started/learn-openvino/openvino-samples>` for various C++ and Python applications that can be used as a starting point for your own application. For C++ developers, step through the :doc:`Get Started with C++ Samples <get-started/learn-openvino/openvino-samples/get-started-demos>` to learn how to build and run an image classification program that uses OpenVINO’s C++ API.
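
Before stepping through the full samples, it may help to see the core inference flow they all build on; a minimal sketch with a placeholder model and random input:

.. code-block:: python

   import numpy as np
   import openvino as ov

   core = ov.Core()
   compiled = core.compile_model("model.xml", "AUTO")    # placeholder model; AUTO selects a device
   input_data = np.random.rand(1, 3, 224, 224).astype(np.float32)  # placeholder input

   result = compiled(input_data)                 # run inference
   output = result[compiled.output(0)]           # first output as a numpy array
   print(output.shape)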

.. _integrate-openvino:

Expand Down Expand Up @@ -120,7 +121,7 @@ Pipeline and model configuration features in OpenVINO Runtime allow you to easil
* :doc:`Automatic Batching <openvino-workflow/running-inference/inference-devices-and-modes/automatic-batching>` performs on-the-fly grouping of inference requests to maximize utilization of the target hardware’s memory and processing cores.
* :doc:`Performance Hints <openvino-workflow/running-inference/optimize-inference/high-level-performance-hints>` automatically adjust runtime parameters to prioritize for low latency or high throughput
* :doc:`Dynamic Shapes <openvino-workflow/running-inference/dynamic-shapes>` reshapes models to accept arbitrarily-sized inputs, increasing flexibility for applications that encounter different data shapes
* :doc:`Benchmark Tool <learn-openvino/openvino-samples/benchmark-tool>` characterizes model performance in various hardware and pipeline configurations
* :doc:`Benchmark Tool <get-started/learn-openvino/openvino-samples/benchmark-tool>` characterizes model performance in various hardware and pipeline configurations
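
As a small illustration of the Dynamic Shapes feature listed above, a model can be reshaped to accept a dynamic batch dimension before compilation; a hedged sketch with a placeholder single-input model:

.. code-block:: python

   import numpy as np
   import openvino as ov

   core = ov.Core()
   model = core.read_model("model.xml")     # placeholder single-input model

   # Mark the batch dimension as dynamic; the remaining dimensions stay fixed.
   model.reshape([-1, 3, 224, 224])
   compiled = core.compile_model(model, "CPU")

   # The same compiled model now accepts different batch sizes.
   for batch in (1, 4):
       data = np.random.rand(batch, 3, 224, 224).astype(np.float32)
       print(compiled(data)[compiled.output(0)].shape)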

.. _additional-about-openvino/additional-resources:

(another changed file; name not shown)
@@ -15,6 +15,7 @@ Additional Configurations
For GPU <configurations/configurations-intel-gpu>
For NPU <configurations/configurations-intel-npu>
GenAI Dependencies <configurations/genai-dependencies>
Troubleshooting Guide for OpenVINO™ Installation & Configuration <troubleshooting-install-config.html>

For certain use cases, you may need to install additional software, to benefit from the full
potential of OpenVINO™. Check the following list for components used in your workflow:
(another changed file; name not shown)
@@ -156,7 +156,7 @@ need to install additional components. Check the
to see if your case needs any of them.

With the APT distribution, you can build OpenVINO sample files, as explained in the
:doc:`guide for OpenVINO sample applications <../../../learn-openvino/openvino-samples>`.
:doc:`guide for OpenVINO sample applications <../../../get-started/learn-openvino/openvino-samples>`.
For C++ and C, just run the ``build_samples.sh`` script:

.. tab-set::
@@ -215,23 +215,23 @@ What's Next?
Now that you've installed OpenVINO Runtime, you're ready to run your own machine learning applications!
Learn more about how to integrate a model in OpenVINO applications by trying out the following tutorials:

* Try the :doc:`C++ Quick Start Example <../../../learn-openvino/openvino-samples/get-started-demos>` for step-by-step
* Try the :doc:`C++ Quick Start Example <../../../get-started/learn-openvino/openvino-samples/get-started-demos>` for step-by-step
instructions on building and running a basic image classification C++ application.

.. image:: https://user-images.githubusercontent.com/36741649/127170593-86976dc3-e5e4-40be-b0a6-206379cd7df5.jpg
:width: 400

* Visit the :ref:`Samples <code samples>` page for other C++ example applications to get you started with OpenVINO, such as:

* :doc:`Basic object detection with the Hello Reshape SSD C++ sample <../../../learn-openvino/openvino-samples/hello-reshape-ssd>`
* :doc:`Object classification sample <../../../learn-openvino/openvino-samples/hello-classification>`
* :doc:`Basic object detection with the Hello Reshape SSD C++ sample <../../../get-started/learn-openvino/openvino-samples/hello-reshape-ssd>`
* :doc:`Object classification sample <../../../get-started/learn-openvino/openvino-samples/hello-classification>`

You can also try the following:

* Learn more about :doc:`OpenVINO Workflow <../../../openvino-workflow>`.
* To prepare your models for working with OpenVINO, see :doc:`Model Preparation <../../../openvino-workflow/model-preparation>`.
* See pre-trained deep learning models on `Hugging Face <https://huggingface.co/OpenVINO>`__
* Learn more about :doc:`Inference with OpenVINO Runtime <../../../openvino-workflow/running-inference>`.
* See sample applications in :doc:`OpenVINO toolkit Samples Overview <../../../learn-openvino/openvino-samples>`.
* See sample applications in :doc:`OpenVINO toolkit Samples Overview <../../../get-started/learn-openvino/openvino-samples>`.
* Take a glance at the OpenVINO `product home page <https://software.intel.com/en-us/openvino-toolkit>`__ .
