Conflict resolution
BIN
modules/Face-Detection-SSD-master.zip
Normal file
57
modules/Face-Detection-SSD-master/README.md
Normal file
@ -0,0 +1,57 @@
## Face-Detection-SSD

This repository contains the code for face detection using SSD. It detects faces in a video, crops them, and saves the cropped faces to the given folder.
### Project Structure

```
.
├── ckpt_             # Weight files
├── images            # Images
├── input-data        # Input data for detection
├── README.md         # Readme for Face-Detection-SSD
├── requirements.txt  # Requirements file for Face-Detection-SSD
```

#### Single Shot Detector - SSD

The `Single-Shot MultiBox Detector` is a one-stage object detection algorithm. In contrast to two-stage models, SSD does not need an initial object-proposal generation step, which usually makes it faster and more efficient than two-stage approaches such as Faster R-CNN, although it sacrifices accuracy on small objects to gain speed.
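The detector used here is the OpenCV DNN port of the ResNet-10 SSD face model shipped in `ckpt_`. A minimal sketch of a single forward pass, mirroring `detect_faces` in `crop_face.py` (the input image path below is a placeholder):

```python
import cv2
import numpy as np

# Load the Caffe SSD face detector shipped in ckpt_/
net = cv2.dnn.readNetFromCaffe('ckpt_/deploy.prototxt.txt',
                               'ckpt_/res10_300x300_ssd_iter_140000.caffemodel')

frame = cv2.imread('input-data/frame.jpg')  # placeholder input image
(h, w) = frame.shape[:2]

# SSD expects a 300x300 blob with mean subtraction (104, 177, 123)
blob = cv2.dnn.blobFromImage(cv2.resize(frame, (300, 300)), 1.0,
                             (300, 300), (104.0, 177.0, 123.0))
net.setInput(blob)
predictions = net.forward()  # shape: (1, 1, num_detections, 7)

for i in range(predictions.shape[2]):
    confidence = predictions[0, 0, i, 2]
    if confidence > 0.6:
        # Box coordinates are relative; rescale to the original frame size
        box = predictions[0, 0, i, 3:7] * np.array([w, h, w, h])
        xmin, ymin, xmax, ymax = box.astype('int')
        print('Face at', (xmin, ymin, xmax, ymax), 'confidence', confidence)
```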
#### Required packages

`virtualenv --python=python3 env_fds`

`source env_fds/bin/activate`

`pip install -r requirements.txt`
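With the environment set up, the pipeline can be driven with a few lines, following `app.py` (the video path and output folder name below are placeholders):

```python
import crop_face as cf

# SSD face detector weights shipped in ckpt_/
prototxt = 'ckpt_/deploy.prototxt.txt'
weights = 'ckpt_/res10_300x300_ssd_iter_140000.caffemodel'

face_detection = cf.load_detection_model(prototxt, weights)

# Detect faces in the video and save the crops under cropped_faces/<dir_name>/
status = cf.crop_face('input-data/sample.mp4', face_detection, 'sample')
print('Done', status)
```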
#### [RESULTS](result)

##### Architecture of SSD

A single-class object detector needs fewer learnable features, so the network has fewer parameters. A smaller network runs faster because it requires less computation.




##### Result of face detection SSD






##### Cropped faces

   

   

#### Research Paper References for this repository

1. [SSD: Single Shot MultiBox Detector](https://arxiv.org/abs/1512.02325)
2. [Blog link for SSD and image references](https://towardsdatascience.com/review-ssd-single-shot-detector-object-detection-851a94607d11)
3. [Blog link for SSD and image references](https://towardsdatascience.com/faced-cpu-real-time-face-detection-using-deep-learning-1488681c1602)

### TODO
22
modules/Face-Detection-SSD-master/app.py
Normal file
@ -0,0 +1,22 @@
import crop_face as cf
import cfal as alf

# parameters for loading data and images
prototxt = 'ckpt_/deploy.prototxt.txt'
weights = 'ckpt_/res10_300x300_ssd_iter_140000.caffemodel'

# load the SSD face-detection model (defined in crop_face.py)
face_detection = cf.load_detection_model(prototxt, weights)

filename = "filepath_of_video"

dirName = "dir_name for cropped_images"

# detect faces and save the cropped detections
a = cf.crop_face(filename, face_detection, dirName)

# detect and align faces, then save the crops (cfal.py)
# a = alf.crop_face(filename, face_detection, dirName)

print("Done", a)
171
modules/Face-Detection-SSD-master/cfal.py
Normal file
@ -0,0 +1,171 @@
|
||||
from statistics import mode
|
||||
import imutils
|
||||
import cv2
|
||||
import numpy as np
|
||||
from imutils.video import VideoStream
|
||||
import time
|
||||
|
||||
from preprocessor import preprocess_input
|
||||
|
||||
from imutils.face_utils import FaceAligner
|
||||
from imutils.face_utils import rect_to_bb
|
||||
import dlib
|
||||
|
||||
|
||||
shape_predictor = "shape_predictor_68_face_landmarks.dat"
|
||||
predictor = dlib.shape_predictor(shape_predictor)
|
||||
|
||||
from mtcnn.mtcnn import MTCNN
|
||||
detector = MTCNN()
|
||||
|
||||
|
||||
# Support functions
|
||||
|
||||
def get_labels(dataset_name):
|
||||
if dataset_name == 'KDEF':
|
||||
return {0: 'AN', 1: 'DI', 2: 'AF', 3: 'HA', 4: 'SA', 5: 'SU', 6: 'NE'}
|
||||
else:
|
||||
raise Exception('Invalid dataset name')
|
||||
|
||||
|
||||
def detect_faces(detection_model, gray_image_array, conf):
|
||||
frame = gray_image_array
|
||||
# Grab frame dimention and convert to blob
|
||||
(h,w) = frame.shape[:2]
|
||||
# Preprocess input image: mean subtraction, normalization
|
||||
blob = cv2.dnn.blobFromImage(cv2.resize(frame, (300, 300)), 1.0,
|
||||
(300, 300), (104.0, 177.0, 123.0))
|
||||
# Set read image as input to model
|
||||
detection_model.setInput(blob)
|
||||
|
||||
# Run forward pass on model. Receive output of shape (1,1,no_of_predictions, 7)
|
||||
predictions = detection_model.forward()
|
||||
coord_list = []
|
||||
count = 0
|
||||
for i in range(0, predictions.shape[2]):
|
||||
confidence = predictions[0,0,i,2]
|
||||
if confidence > conf:
|
||||
# Find box coordinates rescaled to original image
|
||||
box_coord = predictions[0,0,i,3:7] * np.array([w,h,w,h])
|
||||
conf_text = '{:.2f}'.format(confidence)
|
||||
# Find output coordinates
|
||||
xmin, ymin, xmax, ymax = box_coord.astype('int')
|
||||
coord_list.append([xmin, ymin, (xmax-xmin), (ymax-ymin)])
|
||||
|
||||
print('Coordinate list:', coord_list)
|
||||
|
||||
return coord_list
|
||||
|
||||
|
||||
def draw_text(coordinates, image_array, text, color, x_offset=0, y_offset=0,
|
||||
font_scale=2, thickness=2):
|
||||
x, y = coordinates[:2]
|
||||
cv2.putText(image_array, text, (x + x_offset, y + y_offset),
|
||||
cv2.FONT_HERSHEY_SIMPLEX,
|
||||
font_scale, color, thickness, cv2.LINE_AA)
|
||||
|
||||
|
||||
def draw_bounding_box(face_coordinates, image_array, color, identity):
|
||||
x, y, w, h = face_coordinates
|
||||
cv2.rectangle(image_array, (x, y), (x + w, y + h), color, 2)
|
||||
cv2.putText(image_array, str(identity), (x+5,y-5), font, 1, (255,255,255), 2)
|
||||
|
||||
|
||||
def apply_offsets(face_coordinates, offsets):
|
||||
x, y, width, height = face_coordinates
|
||||
x_off, y_off = offsets
|
||||
return (x - x_off, x + width + x_off, y - y_off, y + height + y_off)
|
||||
|
||||
|
||||
def load_detection_model(prototxt, weights):
|
||||
detection_model = cv2.dnn.readNetFromCaffe(prototxt, weights)
|
||||
return detection_model
|
||||
|
||||
|
||||
# parameters for loading data and images
|
||||
prototxt = 'trained_models/deploy.prototxt.txt'
|
||||
weights = 'trained_models/res10_300x300_ssd_iter_140000.caffemodel'
|
||||
|
||||
font = cv2.FONT_HERSHEY_SIMPLEX
|
||||
|
||||
frame_window = 10
|
||||
face_offsets = (30, 40)
|
||||
emotion_offsets = (20, 40)
|
||||
confidence = 0.6
|
||||
|
||||
# loading models
|
||||
face_detection = load_detection_model(prototxt, weights)
|
||||
# face_detection_size = (40, 40)
|
||||
counter = 0
|
||||
# frame_process_counter = 0
|
||||
|
||||
def crop_face(file_name, face_detection, name_count):
|
||||
|
||||
face_detection_size = (40, 40)
|
||||
counter = 0
|
||||
frame_process_counter = 0
|
||||
|
||||
# starting video streaming
|
||||
cv2.namedWindow('Attendence_Tracker', cv2.WINDOW_NORMAL)
|
||||
# cv2.namedWindow('Attendence_Tracker')
|
||||
# file_name = '../top10/person1.mp4'
|
||||
video_capture = cv2.VideoCapture(file_name)
|
||||
|
||||
time.sleep(1.0)
|
||||
|
||||
while (video_capture.isOpened()):
|
||||
ret, bgr_image = video_capture.read()
|
||||
if ret == False:
|
||||
break
|
||||
counter += 1
|
||||
if counter % 1 == 0:
|
||||
frame_process_counter += 1
|
||||
gray_image = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
|
||||
rgb_image = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2RGB)
|
||||
faces = detect_faces(face_detection, bgr_image,confidence)
|
||||
count = 0
|
||||
for face_coordinates in faces:
|
||||
x1, x2, y1, y2 = apply_offsets(face_coordinates, face_offsets)
|
||||
rgb_face = rgb_image[y1:y2, x1:x2]
|
||||
|
||||
print("len", len(rgb_face))
|
||||
# print(rgb_face)
|
||||
if len(rgb_face) != 0 and counter % 1 ==0:
|
||||
print(detector.detect_faces(rgb_face))
|
||||
dict_mtcnn = detector.detect_faces(rgb_face)
|
||||
if len(dict_mtcnn) != 0:
|
||||
bounding_box = dict_mtcnn[0]['box']
|
||||
new_image = rgb_image[bounding_box[2]:bounding_box[3], bounding_box[0]:bounding_box[1]]
|
||||
cv2.rectangle(new_image,
|
||||
(bounding_box[0], bounding_box[1]),
|
||||
(bounding_box[0]+bounding_box[2], bounding_box[1] + bounding_box[3]),
|
||||
(0,155,255), 2)
|
||||
|
||||
# cv2.imwrite("align/align_{}/align_{}_{}".format(name_count, name_count,counter) + ".jpg", cv2.cvtColor(img, cv2.COLOR_RGB2BGR))
|
||||
cv2.imwrite("align/emp_{}/emp_{}_{}".format(name_count, name_count,counter) + ".jpg", cv2.cvtColor(rgb_face, cv2.COLOR_RGB2BGR))
|
||||
print("image saved-------------------", counter)
|
||||
count += 1
|
||||
try:
|
||||
rgb_face = cv2.resize(rgb_face, (face_detection_size))
|
||||
except:
|
||||
continue
|
||||
rgb_face = np.expand_dims(rgb_face, 0)
|
||||
rgb_face = preprocess_input(rgb_face, False)
|
||||
|
||||
# Bounding box color
|
||||
color = (255, 0, 0)
|
||||
identity = "this is me"
|
||||
draw_bounding_box(face_coordinates, rgb_image, color, identity)
|
||||
bgr_image = cv2.cvtColor(rgb_image, cv2.COLOR_RGB2BGR)
|
||||
cv2.imshow('Attendence_Tracker', bgr_image)
|
||||
|
||||
if cv2.waitKey(1) & 0xFF == ord('q'):
|
||||
print('Total frames processed:', counter, frame_process_counter)
|
||||
break
|
||||
video_capture.release()
|
||||
# out.release()
|
||||
cv2.destroyAllWindows()
|
||||
|
||||
return "successful"
|
||||
|
||||
|
||||
1789
modules/Face-Detection-SSD-master/ckpt_/deploy.prototxt.txt
Normal file
BIN
modules/Face-Detection-SSD-master/ckpt_/simple_CNN.81-0.96.hdf5
Normal file
150
modules/Face-Detection-SSD-master/crop_face.py
Normal file
@ -0,0 +1,150 @@
|
||||
from statistics import mode
|
||||
import imutils
|
||||
import cv2
|
||||
import numpy as np
|
||||
from imutils.video import VideoStream
|
||||
import time
|
||||
import os
|
||||
from preprocessor import preprocess_input
|
||||
|
||||
import pickle as pkl
|
||||
|
||||
# Support functions
|
||||
|
||||
def get_labels(dataset_name):
|
||||
if dataset_name == 'KDEF':
|
||||
return {0: 'AN', 1: 'DI', 2: 'AF', 3: 'HA', 4: 'SA', 5: 'SU', 6: 'NE'}
|
||||
else:
|
||||
raise Exception('Invalid dataset name')
|
||||
|
||||
|
||||
def detect_faces(detection_model, gray_image_array, conf):
|
||||
frame = gray_image_array
|
||||
# Grab frame dimention and convert to blob
|
||||
(h,w) = frame.shape[:2]
|
||||
# Preprocess input image: mean subtraction, normalization
|
||||
blob = cv2.dnn.blobFromImage(cv2.resize(frame, (300, 300)), 1.0,
|
||||
(300, 300), (104.0, 177.0, 123.0))
|
||||
# Set read image as input to model
|
||||
detection_model.setInput(blob)
|
||||
|
||||
# Run forward pass on model. Receive output of shape (1,1,no_of_predictions, 7)
|
||||
predictions = detection_model.forward()
|
||||
coord_list = []
|
||||
count = 0
|
||||
for i in range(0, predictions.shape[2]):
|
||||
confidence = predictions[0,0,i,2]
|
||||
if confidence > conf:
|
||||
# Find box coordinates rescaled to original image
|
||||
box_coord = predictions[0,0,i,3:7] * np.array([w,h,w,h])
|
||||
conf_text = '{:.2f}'.format(confidence)
|
||||
# Find output coordinates
|
||||
xmin, ymin, xmax, ymax = box_coord.astype('int')
|
||||
coord_list.append([xmin, ymin, (xmax-xmin), (ymax-ymin)])
|
||||
|
||||
print('Coordinate list:', coord_list)
|
||||
|
||||
return coord_list
|
||||
|
||||
|
||||
def draw_text(coordinates, image_array, text, color, x_offset=0, y_offset=0,
|
||||
font_scale=2, thickness=2):
|
||||
x, y = coordinates[:2]
|
||||
cv2.putText(image_array, text, (x + x_offset, y + y_offset),
|
||||
cv2.FONT_HERSHEY_SIMPLEX,
|
||||
font_scale, color, thickness, cv2.LINE_AA)
|
||||
|
||||
|
||||
def draw_bounding_box(face_coordinates, image_array, color, identity):
|
||||
x, y, w, h = face_coordinates
|
||||
cv2.rectangle(image_array, (x, y), (x + w, y + h), color, 2)
|
||||
cv2.putText(image_array, str(identity), (x+5,y-5), font, 1, (255,255,255), 2)
|
||||
|
||||
|
||||
def apply_offsets(face_coordinates, offsets):
|
||||
x, y, width, height = face_coordinates
|
||||
x_off, y_off = offsets
|
||||
return (x - x_off, x + width + x_off, y - y_off, y + height + y_off)
|
||||
|
||||
|
||||
def load_detection_model(prototxt, weights):
|
||||
detection_model = cv2.dnn.readNetFromCaffe(prototxt, weights)
|
||||
return detection_model
|
||||
|
||||
font = cv2.FONT_HERSHEY_SIMPLEX
|
||||
|
||||
frame_window = 10
|
||||
face_offsets = (30, 40)
|
||||
emotion_offsets = (20, 40)
|
||||
confidence = 0.6
|
||||
|
||||
# face_detection_size = (40, 40)
|
||||
counter = 0
|
||||
# frame_process_counter = 0
|
||||
|
||||
def crop_face(file_name, face_detection, dirName):
|
||||
dire = "cropped_faces/" + dirName
|
||||
try:
|
||||
os.makedirs(dire)
|
||||
print("Directory " , dire , " Created ")
|
||||
except FileExistsError:
|
||||
print("Directory " , dire , " already exists")
|
||||
|
||||
|
||||
face_detection_size = (40, 40)
|
||||
counter = 0
|
||||
frame_process_counter = 0
|
||||
|
||||
# starting video streaming
|
||||
cv2.namedWindow('Attendence_Tracker', cv2.WINDOW_NORMAL)
|
||||
# cv2.namedWindow('Attendence_Tracker')
|
||||
# file_name = '../top10/person1.mp4'
|
||||
video_capture = cv2.VideoCapture(file_name)
|
||||
|
||||
time.sleep(1.0)
|
||||
|
||||
while (video_capture.isOpened()):
|
||||
ret, bgr_image = video_capture.read()
|
||||
if ret == False:
|
||||
break
|
||||
counter += 1
|
||||
if counter % 1 == 0:
|
||||
frame_process_counter += 1
|
||||
gray_image = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
|
||||
rgb_image = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2RGB)
|
||||
faces = detect_faces(face_detection, bgr_image,confidence)
|
||||
count = 0
|
||||
for face_coordinates in faces:
|
||||
x1, x2, y1, y2 = apply_offsets(face_coordinates, face_offsets)
|
||||
rgb_face = rgb_image[y1:y2, x1:x2]
|
||||
|
||||
print("len", len(rgb_face))
|
||||
# print(rgb_face)
|
||||
if len(rgb_face) != 0 and counter % 1 ==0:
|
||||
cv2.imwrite(dire +"/"+dirName+"_{}".format(counter) + ".jpg", cv2.cvtColor(rgb_face, cv2.COLOR_RGB2BGR))
|
||||
print("image saved-------------------", counter)
|
||||
count += 1
|
||||
try:
|
||||
rgb_face = cv2.resize(rgb_face, (face_detection_size))
|
||||
except:
|
||||
continue
|
||||
rgb_face = np.expand_dims(rgb_face, 0)
|
||||
rgb_face = preprocess_input(rgb_face, False)
|
||||
|
||||
# Bounding box color
|
||||
color = (255, 0, 0)
|
||||
identity = "this is me"
|
||||
draw_bounding_box(face_coordinates, rgb_image, color, identity)
|
||||
bgr_image = cv2.cvtColor(rgb_image, cv2.COLOR_RGB2BGR)
|
||||
cv2.imshow('Attendence_Tracker', bgr_image)
|
||||
|
||||
if cv2.waitKey(1) & 0xFF == ord('q'):
|
||||
print('Total frames processed:', counter, frame_process_counter)
|
||||
break
|
||||
video_capture.release()
|
||||
# out.release()
|
||||
cv2.destroyAllWindows()
|
||||
|
||||
return "successful"
|
||||
|
||||
|
||||
7
modules/Face-Detection-SSD-master/data/README.md
Normal file
@ -0,0 +1,7 @@
|
||||
|
||||
## Face-Detection-SSD
|
||||
This repository contains the code for face detection using SSD. It detects faces in a video, crops them, and saves the cropped faces to the given folder.
|
||||
|
||||
### Input Data
|
||||
|
||||
Put the video file here.
|
||||
69
modules/Face-Detection-SSD-master/helper.py
Normal file
@ -0,0 +1,69 @@
|
||||
import imutils
|
||||
import cv2
|
||||
from keras.models import load_model
|
||||
import numpy as np
|
||||
from imutils.video import VideoStream
|
||||
import time
|
||||
|
||||
def get_labels(dataset_name):
|
||||
if dataset_name == 'fer2013':
|
||||
return {0: 'angry', 1: 'disgust', 2: 'fear', 3: 'happy',
|
||||
4: 'sad', 5: 'surprise', 6: 'neutral'}
|
||||
elif dataset_name == 'imdb':
|
||||
return {0: 'woman', 1: 'man'}
|
||||
elif dataset_name == 'KDEF':
|
||||
return {0: 'AN', 1: 'DI', 2: 'AF', 3: 'HA', 4: 'SA', 5: 'SU', 6: 'NE'}
|
||||
else:
|
||||
raise Exception('Invalid dataset name')
|
||||
|
||||
|
||||
|
||||
def draw_text(coordinates, image_array, text, color, x_offset=0, y_offset=0,
|
||||
font_scale=2, thickness=2):
|
||||
x, y = coordinates[:2]
|
||||
cv2.putText(image_array, text, (x + x_offset, y + y_offset),
|
||||
cv2.FONT_HERSHEY_SIMPLEX,
|
||||
font_scale, color, thickness, cv2.LINE_AA)
|
||||
|
||||
|
||||
def draw_bounding_box(face_coordinates, image_array, color):
|
||||
x, y, w, h = face_coordinates
|
||||
cv2.rectangle(image_array, (x, y), (x + w, y + h), color, 2)
|
||||
|
||||
|
||||
def apply_offsets(face_coordinates, offsets):
|
||||
x, y, width, height = face_coordinates
|
||||
x_off, y_off = offsets
|
||||
return (x - x_off, x + width + x_off, y - y_off, y + height + y_off)
|
||||
|
||||
|
||||
def load_detection_model(prototxt, weights):
|
||||
detection_model = cv2.dnn.readNetFromCaffe(prototxt, weights)
|
||||
return detection_model
|
||||
|
||||
|
||||
|
||||
def detect_faces(detection_model, gray_image_array, conf):
|
||||
frame = gray_image_array
|
||||
# Grab frame dimention and convert to blob
|
||||
(h,w) = frame.shape[:2]
|
||||
# Preprocess input image: mean subtraction, normalization
|
||||
blob = cv2.dnn.blobFromImage(cv2.resize(frame, (300, 300)), 1.0,
|
||||
(300, 300), (104.0, 177.0, 123.0))
|
||||
# Set read image as input to model
|
||||
detection_model.setInput(blob)
|
||||
|
||||
# Run forward pass on model. Receive output of shape (1,1,no_of_predictions, 7)
|
||||
predictions = detection_model.forward()
|
||||
coord_list = []
|
||||
for i in range(0, predictions.shape[2]):
|
||||
confidence = predictions[0,0,i,2]
|
||||
if confidence > conf:
|
||||
# Find box coordinates rescaled to original image
|
||||
box_coord = predictions[0,0,i,3:7] * np.array([w,h,w,h])
|
||||
conf_text = '{:.2f}'.format(confidence)
|
||||
# Find output coordinates
|
||||
xmin, ymin, xmax, ymax = box_coord.astype('int')
|
||||
coord_list.append([xmin, ymin, (xmax-xmin), (ymax-ymin)])
|
||||
print('Coordinate list:', coord_list)
|
||||
return coord_list
|
||||
BIN
modules/Face-Detection-SSD-master/images/arch1.png
Normal file
|
After Width: | Height: | Size: 152 KiB |
BIN
modules/Face-Detection-SSD-master/images/r1.png
Normal file
|
After Width: | Height: | Size: 240 KiB |
BIN
modules/Face-Detection-SSD-master/images/r2.png
Normal file
|
After Width: | Height: | Size: 356 KiB |
BIN
modules/Face-Detection-SSD-master/images/r3.png
Normal file
|
After Width: | Height: | Size: 295 KiB |
BIN
modules/Face-Detection-SSD-master/images/ssd_arch.png
Normal file
|
After Width: | Height: | Size: 46 KiB |
28
modules/Face-Detection-SSD-master/preprocessor.py
Normal file
@ -0,0 +1,28 @@
|
||||
import numpy as np
|
||||
#from scipy.misc import imresize
|
||||
import matplotlib as mp
|
||||
|
||||
|
||||
def preprocess_input(x, v2=True):
|
||||
x = x.astype('float32')
|
||||
x = x / 255.0
|
||||
if v2:
|
||||
x = x - 0.5
|
||||
x = x * 2.0
|
||||
return x
|
||||
|
||||
|
||||
def _imread(image_name):
|
||||
return mp.pyplot.imread(image_name)
|
||||
|
||||
|
||||
def _imresize(image_array, size):
|
||||
return mp.pyplot.imresize(image_array, size)
|
||||
|
||||
|
||||
def to_categorical(integer_classes, num_classes=2):
|
||||
integer_classes = np.asarray(integer_classes, dtype='int')
|
||||
num_samples = integer_classes.shape[0]
|
||||
categorical = np.zeros((num_samples, num_classes))
|
||||
categorical[np.arange(num_samples), integer_classes] = 1
|
||||
return categorical
|
||||
6
modules/Face-Detection-SSD-master/requirements.txt
Normal file
@ -0,0 +1,6 @@
|
||||
tensorflow
|
||||
opencv-python
|
||||
numpy
|
||||
Pillow
|
||||
opencv-contrib-python
|
||||
imutils
|
||||
17
modules/Face-Detection-SSD-master/result/README.md
Normal file
@ -0,0 +1,17 @@
|
||||
|
||||
## Face-Detection-SSD
|
||||
This repository contains the code for face detection using SSD. It detects faces in a video, crops them, and saves the cropped faces to the given folder.
|
||||
|
||||
|
||||
#### Result of face detection SSD :
|
||||
|
||||

|
||||

|
||||

|
||||
|
||||
|
||||
#### Cropped faces :
|
||||
   
|
||||
|
||||
   
|
||||
|
||||
BIN
modules/Face-Detection-SSD-master/result/f11.jpg
Normal file
|
After Width: | Height: | Size: 55 KiB |
BIN
modules/Face-Detection-SSD-master/result/f12.jpg
Normal file
|
After Width: | Height: | Size: 54 KiB |
BIN
modules/Face-Detection-SSD-master/result/f13.jpg
Normal file
|
After Width: | Height: | Size: 48 KiB |
BIN
modules/Face-Detection-SSD-master/result/f14.jpg
Normal file
|
After Width: | Height: | Size: 41 KiB |
BIN
modules/Face-Detection-SSD-master/result/f21.jpg
Normal file
|
After Width: | Height: | Size: 18 KiB |
BIN
modules/Face-Detection-SSD-master/result/f22.jpg
Normal file
|
After Width: | Height: | Size: 18 KiB |
BIN
modules/Face-Detection-SSD-master/result/f23.jpg
Normal file
|
After Width: | Height: | Size: 18 KiB |
BIN
modules/Face-Detection-SSD-master/result/f24.jpg
Normal file
|
After Width: | Height: | Size: 17 KiB |
58
modules/Face-Detection-SSD-master/save.py
Normal file
@ -0,0 +1,58 @@
|
||||
import cv2
|
||||
import numpy as np
|
||||
import os
|
||||
|
||||
from os.path import isfile, join
|
||||
|
||||
|
||||
def sort_fun(list_data):
|
||||
li_data = []
|
||||
for i in list_data:
|
||||
z = i.split(".")
|
||||
li_data.append(int(z[0]))
|
||||
# print(li_data)
|
||||
sort_list = sorted(li_data)
|
||||
# print(sort_list)
|
||||
final_list = []
|
||||
for i in range(len(list_data)):
|
||||
final_list.append(list_data[li_data.index(sort_list[i])])
|
||||
return final_list
|
||||
|
||||
def convert_frames_to_video(pathIn,pathOut,fps):
|
||||
frame_array = []
|
||||
files = [f for f in os.listdir(pathIn) if isfile(join(pathIn, f))]
|
||||
|
||||
files = sort_fun(files)
|
||||
for i in range(len(files)):
|
||||
filename=pathIn + files[i]
|
||||
#reading each files
|
||||
img = cv2.imread(filename)
|
||||
height, width, layers = img.shape
|
||||
print(img.shape)
|
||||
size = (width,height)
|
||||
# print(filename)
|
||||
#inserting the frames into an image array
|
||||
frame_array.append(img)
|
||||
print("processed files", i)
|
||||
|
||||
# out = cv2.VideoWriter(pathOut,cv2.VideoWriter_fourcc(*'DIVX'), fps, size)
|
||||
out = cv2.VideoWriter(pathOut,cv2.VideoWriter_fourcc(*'DIVX'), fps, size)
|
||||
|
||||
for i in range(len(frame_array)):
|
||||
# writing to a image array
|
||||
out.write(frame_array[i])
|
||||
out.release()
|
||||
return "success"
|
||||
|
||||
def save_video(pathIn, pathOut, fps):
|
||||
|
||||
convert_frames_to_video(pathIn, pathOut, fps)
|
||||
|
||||
|
||||
|
||||
pathIn = "15/"
|
||||
pathOut = "video/15.avi"
|
||||
fps = 32
|
||||
|
||||
save_video(pathIn, pathOut, fps)
|
||||
|
||||
56
modules/Face-Detection-SSD-master/test_cam.py
Normal file
@ -0,0 +1,56 @@
|
||||
import cv2
|
||||
import numpy as np
|
||||
from mtcnn.mtcnn import MTCNN
|
||||
detector = MTCNN()
|
||||
|
||||
filename = "video/video10.mp4"
|
||||
|
||||
# cv2.namedWindow('Attendence_Tracker', cv2.WINDOW_NORMAL)
|
||||
|
||||
# cap = cv2.VideoCapture(filename)
|
||||
# # cap = cv2.VideoCapture(0)
|
||||
# while True:
|
||||
# #Capture frame-by-frame
|
||||
# __, frame = cap.read()
|
||||
|
||||
# #Use MTCNN to detect faces
|
||||
# result = detector.detect_faces(frame)
|
||||
# if result != []:
|
||||
# for person in result:
|
||||
# bounding_box = person['box']
|
||||
# keypoints = person['keypoints']
|
||||
|
||||
# cv2.rectangle(frame,
|
||||
# (bounding_box[0], bounding_box[1]),
|
||||
# (bounding_box[0]+bounding_box[2], bounding_box[1] + bounding_box[3]),
|
||||
# (0,155,255),
|
||||
# 2)
|
||||
|
||||
# cv2.circle(frame,(keypoints['left_eye']), 2, (0,155,255), 2)
|
||||
# cv2.circle(frame,(keypoints['right_eye']), 2, (0,155,255), 2)
|
||||
# cv2.circle(frame,(keypoints['nose']), 2, (0,155,255), 2)
|
||||
# cv2.circle(frame,(keypoints['mouth_left']), 2, (0,155,255), 2)
|
||||
# cv2.circle(frame,(keypoints['mouth_right']), 2, (0,155,255), 2)
|
||||
# #display resulting frame
|
||||
# cv2.imshow('Attendence_Tracker',frame)
|
||||
# if cv2.waitKey(1) &0xFF == ord('q'):
|
||||
# break
|
||||
# #When everything's done, release capture
|
||||
# cap.release()
|
||||
# cv2.destroyAllWindows()
|
||||
|
||||
img = "align/3.jpg"
|
||||
|
||||
def adjust_gamma(image, gamma=1.5):
|
||||
|
||||
invGamma = 1.0 / gamma
|
||||
table = np.array([((i / 255.0) ** invGamma) * 255
|
||||
for i in np.arange(0, 256)]).astype("uint8")
|
||||
|
||||
return cv2.LUT(image, table)
|
||||
|
||||
image = cv2.imread(img)
|
||||
img = adjust_gamma(image)
|
||||
result = detector.detect_faces(image)
|
||||
|
||||
print(result)
|
||||
158
modules/Face-Detection-SSD-master/utils.py
Normal file
@ -0,0 +1,158 @@
|
||||
import tensorflow as tf
|
||||
import numpy as np
|
||||
import os
|
||||
from numpy import genfromtxt
|
||||
from keras.layers import Conv2D, ZeroPadding2D, Activation, Input, concatenate
|
||||
from keras.models import Model
|
||||
from keras.layers.normalization import BatchNormalization
|
||||
from keras.layers.pooling import MaxPooling2D, AveragePooling2D
|
||||
|
||||
|
||||
_FLOATX = 'float32'
|
||||
|
||||
def variable(value, dtype=_FLOATX, name=None):
|
||||
v = tf.Variable(np.asarray(value, dtype=dtype), name=name)
|
||||
_get_session().run(v.initializer)
|
||||
return v
|
||||
|
||||
def shape(x):
|
||||
return x.get_shape()
|
||||
|
||||
def square(x):
|
||||
return tf.square(x)
|
||||
|
||||
def zeros(shape, dtype=_FLOATX, name=None):
|
||||
return variable(np.zeros(shape), dtype, name)
|
||||
|
||||
def concatenate(tensors, axis=-1):
|
||||
if axis < 0:
|
||||
axis = axis % len(tensors[0].get_shape())
|
||||
return tf.concat(axis, tensors)
|
||||
|
||||
def LRN2D(x):
|
||||
return tf.nn.lrn(x, alpha=1e-4, beta=0.75)
|
||||
|
||||
def conv2d_bn(
|
||||
x,
|
||||
layer=None,
|
||||
cv1_out=None,
|
||||
cv1_filter=(1, 1),
|
||||
cv1_strides=(1, 1),
|
||||
cv2_out=None,
|
||||
cv2_filter=(3, 3),
|
||||
cv2_strides=(1, 1),
|
||||
padding=None,
|
||||
):
|
||||
num = '' if cv2_out == None else '1'
|
||||
tensor = Conv2D(cv1_out, cv1_filter, strides=cv1_strides, name=layer+'_conv'+num)(x)
|
||||
tensor = BatchNormalization(axis=3, epsilon=0.00001, name=layer+'_bn'+num)(tensor)
|
||||
tensor = Activation('relu')(tensor)
|
||||
if padding == None:
|
||||
return tensor
|
||||
tensor = ZeroPadding2D(padding=padding)(tensor)
|
||||
if cv2_out == None:
|
||||
return tensor
|
||||
tensor = Conv2D(cv2_out, cv2_filter, strides=cv2_strides, name=layer+'_conv'+'2')(tensor)
|
||||
tensor = BatchNormalization(axis=3, epsilon=0.00001, name=layer+'_bn'+'2')(tensor)
|
||||
tensor = Activation('relu')(tensor)
|
||||
return tensor
|
||||
|
||||
weights = [
|
||||
'conv1', 'bn1', 'conv2', 'bn2', 'conv3', 'bn3',
|
||||
'inception_3a_1x1_conv', 'inception_3a_1x1_bn',
|
||||
'inception_3a_pool_conv', 'inception_3a_pool_bn',
|
||||
'inception_3a_5x5_conv1', 'inception_3a_5x5_conv2', 'inception_3a_5x5_bn1', 'inception_3a_5x5_bn2',
|
||||
'inception_3a_3x3_conv1', 'inception_3a_3x3_conv2', 'inception_3a_3x3_bn1', 'inception_3a_3x3_bn2',
|
||||
'inception_3b_3x3_conv1', 'inception_3b_3x3_conv2', 'inception_3b_3x3_bn1', 'inception_3b_3x3_bn2',
|
||||
'inception_3b_5x5_conv1', 'inception_3b_5x5_conv2', 'inception_3b_5x5_bn1', 'inception_3b_5x5_bn2',
|
||||
'inception_3b_pool_conv', 'inception_3b_pool_bn',
|
||||
'inception_3b_1x1_conv', 'inception_3b_1x1_bn',
|
||||
'inception_3c_3x3_conv1', 'inception_3c_3x3_conv2', 'inception_3c_3x3_bn1', 'inception_3c_3x3_bn2',
|
||||
'inception_3c_5x5_conv1', 'inception_3c_5x5_conv2', 'inception_3c_5x5_bn1', 'inception_3c_5x5_bn2',
|
||||
'inception_4a_3x3_conv1', 'inception_4a_3x3_conv2', 'inception_4a_3x3_bn1', 'inception_4a_3x3_bn2',
|
||||
'inception_4a_5x5_conv1', 'inception_4a_5x5_conv2', 'inception_4a_5x5_bn1', 'inception_4a_5x5_bn2',
|
||||
'inception_4a_pool_conv', 'inception_4a_pool_bn',
|
||||
'inception_4a_1x1_conv', 'inception_4a_1x1_bn',
|
||||
'inception_4e_3x3_conv1', 'inception_4e_3x3_conv2', 'inception_4e_3x3_bn1', 'inception_4e_3x3_bn2',
|
||||
'inception_4e_5x5_conv1', 'inception_4e_5x5_conv2', 'inception_4e_5x5_bn1', 'inception_4e_5x5_bn2',
|
||||
'inception_5a_3x3_conv1', 'inception_5a_3x3_conv2', 'inception_5a_3x3_bn1', 'inception_5a_3x3_bn2',
|
||||
'inception_5a_pool_conv', 'inception_5a_pool_bn',
|
||||
'inception_5a_1x1_conv', 'inception_5a_1x1_bn',
|
||||
'inception_5b_3x3_conv1', 'inception_5b_3x3_conv2', 'inception_5b_3x3_bn1', 'inception_5b_3x3_bn2',
|
||||
'inception_5b_pool_conv', 'inception_5b_pool_bn',
|
||||
'inception_5b_1x1_conv', 'inception_5b_1x1_bn',
|
||||
'dense_layer'
|
||||
]
|
||||
|
||||
conv_shape = {
|
||||
'conv1': [64, 3, 7, 7],
|
||||
'conv2': [64, 64, 1, 1],
|
||||
'conv3': [192, 64, 3, 3],
|
||||
'inception_3a_1x1_conv': [64, 192, 1, 1],
|
||||
'inception_3a_pool_conv': [32, 192, 1, 1],
|
||||
'inception_3a_5x5_conv1': [16, 192, 1, 1],
|
||||
'inception_3a_5x5_conv2': [32, 16, 5, 5],
|
||||
'inception_3a_3x3_conv1': [96, 192, 1, 1],
|
||||
'inception_3a_3x3_conv2': [128, 96, 3, 3],
|
||||
'inception_3b_3x3_conv1': [96, 256, 1, 1],
|
||||
'inception_3b_3x3_conv2': [128, 96, 3, 3],
|
||||
'inception_3b_5x5_conv1': [32, 256, 1, 1],
|
||||
'inception_3b_5x5_conv2': [64, 32, 5, 5],
|
||||
'inception_3b_pool_conv': [64, 256, 1, 1],
|
||||
'inception_3b_1x1_conv': [64, 256, 1, 1],
|
||||
'inception_3c_3x3_conv1': [128, 320, 1, 1],
|
||||
'inception_3c_3x3_conv2': [256, 128, 3, 3],
|
||||
'inception_3c_5x5_conv1': [32, 320, 1, 1],
|
||||
'inception_3c_5x5_conv2': [64, 32, 5, 5],
|
||||
'inception_4a_3x3_conv1': [96, 640, 1, 1],
|
||||
'inception_4a_3x3_conv2': [192, 96, 3, 3],
|
||||
'inception_4a_5x5_conv1': [32, 640, 1, 1,],
|
||||
'inception_4a_5x5_conv2': [64, 32, 5, 5],
|
||||
'inception_4a_pool_conv': [128, 640, 1, 1],
|
||||
'inception_4a_1x1_conv': [256, 640, 1, 1],
|
||||
'inception_4e_3x3_conv1': [160, 640, 1, 1],
|
||||
'inception_4e_3x3_conv2': [256, 160, 3, 3],
|
||||
'inception_4e_5x5_conv1': [64, 640, 1, 1],
|
||||
'inception_4e_5x5_conv2': [128, 64, 5, 5],
|
||||
'inception_5a_3x3_conv1': [96, 1024, 1, 1],
|
||||
'inception_5a_3x3_conv2': [384, 96, 3, 3],
|
||||
'inception_5a_pool_conv': [96, 1024, 1, 1],
|
||||
'inception_5a_1x1_conv': [256, 1024, 1, 1],
|
||||
'inception_5b_3x3_conv1': [96, 736, 1, 1],
|
||||
'inception_5b_3x3_conv2': [384, 96, 3, 3],
|
||||
'inception_5b_pool_conv': [96, 736, 1, 1],
|
||||
'inception_5b_1x1_conv': [256, 736, 1, 1],
|
||||
}
|
||||
|
||||
def load_weights():
|
||||
# Set weights path
|
||||
dirPath = './weights'
|
||||
fileNames = filter(lambda f: not f.startswith('.'), os.listdir(dirPath))
|
||||
paths = {}
|
||||
weights_dict = {}
|
||||
|
||||
for n in fileNames:
|
||||
paths[n.replace('.csv', '')] = dirPath + '/' + n
|
||||
|
||||
for name in weights:
|
||||
if 'conv' in name:
|
||||
conv_w = genfromtxt(paths[name + '_w'], delimiter=',', dtype=None)
|
||||
conv_w = np.reshape(conv_w, conv_shape[name])
|
||||
conv_w = np.transpose(conv_w, (2, 3, 1, 0))
|
||||
conv_b = genfromtxt(paths[name + '_b'], delimiter=',', dtype=None)
|
||||
weights_dict[name] = [conv_w, conv_b]
|
||||
elif 'bn' in name:
|
||||
bn_w = genfromtxt(paths[name + '_w'], delimiter=',', dtype=None)
|
||||
bn_b = genfromtxt(paths[name + '_b'], delimiter=',', dtype=None)
|
||||
bn_m = genfromtxt(paths[name + '_m'], delimiter=',', dtype=None)
|
||||
bn_v = genfromtxt(paths[name + '_v'], delimiter=',', dtype=None)
|
||||
weights_dict[name] = [bn_w, bn_b, bn_m, bn_v]
|
||||
elif 'dense' in name:
|
||||
dense_w = genfromtxt(dirPath+'/dense_w.csv', delimiter=',', dtype=None)
|
||||
dense_w = np.reshape(dense_w, (128, 736))
|
||||
dense_w = np.transpose(dense_w, (1, 0))
|
||||
dense_b = genfromtxt(dirPath+'/dense_b.csv', delimiter=',', dtype=None)
|
||||
weights_dict[name] = [dense_w, dense_b]
|
||||
|
||||
return weights_dict
|
||||
|
||||
BIN
modules/facenet512/facenet_weights.h5
Normal file
BIN
modules/openvino-master.zip
Normal file
53
modules/openvino-master/.ci/azure/analyze_gtest_log.py
Normal file
@ -0,0 +1,53 @@
|
||||
# Copyright (C) 2018-2021 Intel Corporation
|
||||
# SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
"""
|
||||
Analyze GTest logs
|
||||
"""
|
||||
|
||||
import re
|
||||
from argparse import ArgumentParser
|
||||
|
||||
|
||||
def get_passed_tests(log_file_path):
|
||||
"""Gets passed tests with OK status"""
|
||||
ok_test_line_pattern = "[ OK ] "
|
||||
ok_tests = []
|
||||
with open(log_file_path) as log_file_obj:
|
||||
for line in log_file_obj.readlines():
|
||||
if ok_test_line_pattern in line:
|
||||
ok_tests.append(line.split(ok_test_line_pattern)[1])
|
||||
return ok_tests
|
||||
|
||||
|
||||
def get_total_time(tests):
|
||||
"""Gets total execution time (sec)"""
|
||||
re_compile_time = re.compile(r".+ \(([0-9]+) ms\)")
|
||||
total_time = 0.0
|
||||
for test in tests:
|
||||
re_time = re_compile_time.match(test)
|
||||
if re_time:
|
||||
total_time += int(re_time.group(1)) / 1000
|
||||
else:
|
||||
print("No time in the test line:", test)
|
||||
return total_time
|
||||
|
||||
|
||||
def main():
|
||||
"""The main entry point function"""
|
||||
arg_parser = ArgumentParser()
|
||||
arg_parser.add_argument(
|
||||
"--log-file", metavar="PATH", default="gtest.log", help="Path to GTest log file"
|
||||
)
|
||||
args = arg_parser.parse_args()
|
||||
|
||||
passed_tests = get_passed_tests(args.log_file)
|
||||
print("PASSED tests count:", len(passed_tests))
|
||||
print("Total execution time of passed tests (sec):", get_total_time(passed_tests))
|
||||
|
||||
print("\nPASSED tests:")
|
||||
print("".join(sorted(passed_tests)))
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
196
modules/openvino-master/.ci/azure/android_arm64.yml
Normal file
@ -0,0 +1,196 @@
|
||||
trigger:
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
- 'tools/*'
|
||||
- 'tests/layer_tests/*'
|
||||
|
||||
pr:
|
||||
drafts: 'false'
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
- 'tools/*'
|
||||
- 'tests/layer_tests/*'
|
||||
|
||||
resources:
|
||||
repositories:
|
||||
- repository: vcpkg
|
||||
type: github
|
||||
endpoint: openvinotoolkit
|
||||
name: microsoft/vcpkg
|
||||
|
||||
variables:
|
||||
- group: github
|
||||
|
||||
jobs:
|
||||
- job: android_arm64
|
||||
# About 150% of total time
|
||||
timeoutInMinutes: '120'
|
||||
|
||||
pool:
|
||||
name: LIN_VMSS_VENV_F16S_U20_WU2
|
||||
|
||||
variables:
|
||||
system.debug: true
|
||||
VSTS_HTTP_RETRY: 5
|
||||
VSTS_HTTP_TIMEOUT: 200
|
||||
BUILD_TYPE: Debug
|
||||
OPENVINO_REPO_DIR: $(Build.Repository.LocalPath)
|
||||
VCPKG_ROOT: $(OPENVINO_REPO_DIR)/../vcpkg
|
||||
WORK_DIR: $(Pipeline.Workspace)/_w
|
||||
BUILD_DIR: $(WORK_DIR)/build
|
||||
ANDROID_TOOLS: $(WORK_DIR)/android_tools
|
||||
ANDROID_NDK_HOME: $(WORK_DIR)/android_tools/ndk-bundle
|
||||
ANDROID_SDK_VERSION: 29
|
||||
ANDROID_ABI_CONFIG: arm64-v8a
|
||||
TMP_DIR: /mnt/tmp
|
||||
SHARE_DIR: /mount/cinfsshare/onnxtestdata
|
||||
CCACHE_DIR: $(SHARE_DIR)/ccache/master/android_arm64
|
||||
LD_LIBRARY_PATH: $(Agent.ToolsDirectory)/Python/$(OV_PYTHON_VERSION)/x64/lib
|
||||
OV_PYTHON_VERSION: 3.11.2 # Full version of Python its required for LD_LIBRARY_PATH. More details https://github.com/microsoft/azure-pipelines-tool-lib/blob/master/docs/overview.md#tool-cache
|
||||
|
||||
steps:
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: '$(OV_PYTHON_VERSION)' # Setting only major & minor version will download latest release from GH repo example 3.10 will be 3.10.10.
|
||||
addToPath: true
|
||||
disableDownloadFromRegistry: false
|
||||
architecture: 'x64'
|
||||
githubToken: $(auth_token)
|
||||
displayName: Setup Python 3.11
|
||||
name: setupPython
|
||||
- bash: |
|
||||
#!/bin/bash
|
||||
python -V
|
||||
|
||||
- script: |
|
||||
curl -H Metadata:true --noproxy "*" "http://169.254.169.254/metadata/instance?api-version=2019-06-01"
|
||||
whoami
|
||||
uname -a
|
||||
echo ls /usr/bin/python3.10
|
||||
rm -rf /usr/bin/python3
|
||||
sudo ln -s /usr/bin/python3.10 /usr/bin/python3
|
||||
echo Python3 info ; which python3 ; python3 --version
|
||||
echo Python info ; which python ; python --version
|
||||
echo Java info ; which java ; java -version
|
||||
echo gcc info ; which gcc ; gcc --version
|
||||
echo cmake info ; which cmake ; cmake --version
|
||||
lsb_release
|
||||
env
|
||||
cat /proc/cpuinfo
|
||||
cat /proc/meminfo
|
||||
cat /etc/fstab
|
||||
vmstat -s
|
||||
df
|
||||
lsblk -o NAME,HCTL,SIZE,MOUNTPOINT | grep -i "sd"
|
||||
free -h
|
||||
displayName: 'System information'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
rm -rf $(WORK_DIR) ; mkdir $(WORK_DIR)
|
||||
rm -rf $(BUILD_DIR) ; mkdir $(BUILD_DIR)
|
||||
rm -rf $(ANDROID_TOOLS) ; mkdir $(ANDROID_TOOLS)
|
||||
sudo rm -rf $(TMP_DIR) ; sudo mkdir $(TMP_DIR) ; sudo chmod 777 -R $(TMP_DIR)
|
||||
sudo mkdir -p $(SHARE_DIR)
|
||||
sudo apt --assume-yes update && sudo apt --assume-yes install nfs-common
|
||||
sudo mount -vvv -t nfs cinfsshare.file.core.windows.net:/cinfsshare/onnxtestdata $(SHARE_DIR) -o vers=4,minorversion=1,sec=sys
|
||||
mkdir -p $(CCACHE_DIR)
|
||||
displayName: 'Make dir'
|
||||
|
||||
- checkout: self
|
||||
submodules: 'true'
|
||||
clean: 'true'
|
||||
path: openvino
|
||||
|
||||
- checkout: vcpkg
|
||||
clean: 'true'
|
||||
path: vcpkg
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
# generic dependencies
|
||||
sudo -E apt --assume-yes install ccache scons default-jdk python3-pip ninja-build
|
||||
# vcpkg requires cmake 3.19 or later
|
||||
python3 -m pip install -U pip cmake
|
||||
# vcpkg's tool dependencies
|
||||
sudo -E apt --assume-yes install curl zip unzip tar
|
||||
# vcpkg tree of dependencies require extra packages
|
||||
sudo -E apt --assume-yes install pkg-config linux-libc-dev
|
||||
# Install Android SDK, NDK and Tools
|
||||
sudo apt -y --no-install-recommends install unzip
|
||||
wget https://dl.google.com/android/repository/commandlinetools-linux-7583922_latest.zip
|
||||
unzip commandlinetools-linux-7583922_latest.zip
|
||||
yes | ./cmdline-tools/bin/sdkmanager --sdk_root=$(ANDROID_TOOLS) --licenses
|
||||
./cmdline-tools/bin/sdkmanager --sdk_root=$(ANDROID_TOOLS) --install "ndk-bundle" "platform-tools" "platforms;android-$(ANDROID_SDK_VERSION)"
|
||||
displayName: 'Install dependencies'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
$(VCPKG_ROOT)/bootstrap-vcpkg.sh --disableMetrics
|
||||
# patch vcpkg default (community) toolchain to build only Release configuration
|
||||
echo "set(VCPKG_BUILD_TYPE release)" >> $(VCPKG_ROOT)/triplets/community/arm64-android.cmake
|
||||
displayName: 'Build vcpkg'
|
||||
|
||||
- task: CMake@1
|
||||
inputs:
|
||||
cmakeArgs: >
|
||||
-G Ninja
|
||||
-DCMAKE_VERBOSE_MAKEFILE=ON
|
||||
-DCMAKE_BUILD_TYPE=$(BUILD_TYPE)
|
||||
-DVCPKG_TARGET_TRIPLET=arm64-android
|
||||
-DVCPKG_HOST_TRIPLET=x64-linux-release
|
||||
-DCMAKE_TOOLCHAIN_FILE=$(VCPKG_ROOT)/scripts/buildsystems/vcpkg.cmake
|
||||
-DVCPKG_CHAINLOAD_TOOLCHAIN_FILE=$(ANDROID_NDK_HOME)/build/cmake/android.toolchain.cmake
|
||||
-DCMAKE_COMPILE_WARNING_AS_ERROR=ON
|
||||
-DANDROID_ABI=$(ANDROID_ABI_CONFIG)
|
||||
-DANDROID_PLATFORM=$(ANDROID_SDK_VERSION)
|
||||
-DENABLE_PYTHON=OFF
|
||||
-DENABLE_SYSTEM_OPENCL=ON
|
||||
-DENABLE_SYSTEM_PROTOBUF=ON
|
||||
-DENABLE_SYSTEM_PUGIXML=ON
|
||||
-DENABLE_SYSTEM_SNAPPY=ON
|
||||
-DENABLE_SYSTEM_TBB=ON
|
||||
-DENABLE_SYSTEM_FLATBUFFERS=ON
|
||||
-DENABLE_INTEL_GPU=ON
|
||||
-DENABLE_TESTS=ON
|
||||
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache
|
||||
-DCMAKE_C_COMPILER_LAUNCHER=ccache
|
||||
-S $(OPENVINO_REPO_DIR)
|
||||
-B $(BUILD_DIR)
|
||||
|
||||
- script: ccache --zero-stats --max-size=50G --show-config
|
||||
displayName: 'Clean ccache stats'
|
||||
|
||||
- script: cmake --build $(BUILD_DIR) --parallel --config $(BUILD_TYPE)
|
||||
env:
|
||||
CCACHE_DIR: $(CCACHE_DIR)
|
||||
CCACHE_TEMPDIR: $(TMP_DIR)/ccache
|
||||
CCACHE_BASEDIR: $(Pipeline.Workspace)
|
||||
CCACHE_MAXSIZE: 50G
|
||||
displayName: 'Build Android ARM64'
|
||||
|
||||
- script: ccache --show-stats
|
||||
displayName: 'Show ccache stats'
|
||||
|
||||
- script: ls -alR $(OPENVINO_REPO_DIR)/bin/
|
||||
displayName: 'List binary files'
|
||||
@ -0,0 +1,6 @@
|
||||
TransposeOpTest.NHWC2NCHW
|
||||
TransposeOpTest.NCHW2NHWC
|
||||
TransposeOpTest.TwoDim_int16
|
||||
GatherOpTest.Gather_axis1_indices2d_int16
|
||||
SoftmaxOperator.ThreeDimsAxis1
|
||||
SoftmaxOperator.ThreeDimsAxis0
|
||||
@ -0,0 +1 @@
|
||||
rel-1.14.0
|
||||
598
modules/openvino-master/.ci/azure/linux.yml
Normal file
@ -0,0 +1,598 @@
|
||||
trigger:
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
|
||||
pr:
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
|
||||
resources:
|
||||
repositories:
|
||||
- repository: openvino_contrib
|
||||
type: github
|
||||
endpoint: openvinotoolkit
|
||||
name: openvinotoolkit/openvino_contrib
|
||||
ref: master
|
||||
|
||||
- repository: testdata
|
||||
type: github
|
||||
endpoint: openvinotoolkit
|
||||
name: openvinotoolkit/testdata
|
||||
ref: master
|
||||
|
||||
variables:
|
||||
- group: github
|
||||
|
||||
jobs:
|
||||
- job: Lin
|
||||
strategy:
|
||||
matrix:
|
||||
# Dynamic:
|
||||
# CMAKE_BUILD_SHARED_LIBS: 'ON'
|
||||
# PYTHON_STATIC_ARGS:
|
||||
# CMAKE_CPACK_GENERATOR:
|
||||
# SAMPLES_INSTALL_DIR: $(INSTALL_DIR)/samples
|
||||
# PYTHON_SAMPLES_INSTALL_DIR: $(SAMPLES_INSTALL_DIR)/python
|
||||
# RUN_PREFIX: . $(SETUPVARS) -pyver 3.8 &&
|
||||
# Debian:
|
||||
# CMAKE_BUILD_SHARED_LIBS: 'ON'
|
||||
# PYTHON_STATIC_ARGS:
|
||||
# CMAKE_CPACK_GENERATOR: 'DEB'
|
||||
# SAMPLES_INSTALL_DIR: /usr/share/openvino/samples
|
||||
# PYTHON_SAMPLES_INSTALL_DIR: $(INSTALL_DIR)/share/openvino/samples/python
|
||||
# RUN_PREFIX: LD_LIBRARY_PATH=$(INSTALL_TEST_DIR):$(INSTALL_DIR)/opencv/lib:$LD_LIBRARY_PATH
|
||||
Static:
|
||||
CMAKE_BUILD_SHARED_LIBS: 'OFF'
|
||||
PYTHON_STATIC_ARGS: -m "not dynamic_library"
|
||||
CMAKE_CPACK_GENERATOR: "TGZ"
|
||||
SAMPLES_INSTALL_DIR: $(INSTALL_DIR)/samples
|
||||
PYTHON_SAMPLES_INSTALL_DIR: $(SAMPLES_INSTALL_DIR)/python
|
||||
RUN_PREFIX: . $(SETUPVARS) &&
|
||||
maxParallel: '2'
|
||||
|
||||
# About 150% of total time
|
||||
timeoutInMinutes: '180'
|
||||
|
||||
pool:
|
||||
name: LIN_VMSS_VENV_F16S_U20_WU2
|
||||
|
||||
variables:
|
||||
system.debug: true
|
||||
VSTS_HTTP_RETRY: 5
|
||||
VSTS_HTTP_TIMEOUT: 200
|
||||
BUILD_TYPE: Release
|
||||
REPO_DIR: $(Build.Repository.LocalPath)
|
||||
OPENVINO_CONTRIB_REPO_DIR: $(REPO_DIR)/../openvino_contrib
|
||||
MODELS_PATH: $(REPO_DIR)/../testdata
|
||||
WORK_DIR: $(Pipeline.Workspace)/_w
|
||||
BUILD_DIR: $(WORK_DIR)/build
|
||||
BUILD_SAMPLES_DIR: $(WORK_DIR)/build_samples
|
||||
BUILD_LAYER_TESTS_DIR: $(WORK_DIR)/build_layer_tests
|
||||
BUILD_SAMPLES_TESTS_DIR: $(WORK_DIR)/build_samples_tests
|
||||
INSTALL_DIR: $(WORK_DIR)/install_pkg
|
||||
INSTALL_TEST_DIR: $(INSTALL_DIR)/tests
|
||||
LAYER_TESTS_DIR: $(INSTALL_TEST_DIR)/layer_tests
|
||||
SETUPVARS: $(INSTALL_DIR)/setupvars.sh
|
||||
TMP_DIR: /mnt/tmp
|
||||
SHARE_DIR: /mount/cinfsshare/onnxtestdata
|
||||
CCACHE_DIR: $(SHARE_DIR)/ccache/master/linux
|
||||
CMAKE_VERSION: 3.24.0
|
||||
BUILD_PYTHON: $(WORK_DIR)/build_python
|
||||
INSTALL_PYTHON: $(INSTALL_OPENVINO)/extras/python
|
||||
LD_LIBRARY_PATH: $(Agent.ToolsDirectory)/Python/$(OV_PYTHON_VERSION)/x64/lib
|
||||
OV_PYTHON_VERSION: 3.11.2 # Full version of Python its required for LD_LIBRARY_PATH. More details https://github.com/microsoft/azure-pipelines-tool-lib/blob/master/docs/overview.md#tool-cache
|
||||
|
||||
steps:
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: '$(OV_PYTHON_VERSION)' # Setting only major & minor version will download latest release from GH repo example 3.10 will be 3.10.10.
|
||||
addToPath: true
|
||||
disableDownloadFromRegistry: false
|
||||
architecture: 'x64'
|
||||
githubToken: $(auth_token)
|
||||
displayName: Setup Python 3.11
|
||||
name: setupPython
|
||||
- bash: |
|
||||
#!/bin/bash
|
||||
python -V
|
||||
|
||||
- script: |
|
||||
curl -H Metadata:true --noproxy "*" "http://169.254.169.254/metadata/instance?api-version=2019-06-01"
|
||||
whoami
|
||||
uname -a
|
||||
echo Python3 info ; which python3 ; python3 --version
|
||||
echo Java info ; which java ; java -version
|
||||
echo gcc info ; which gcc ; gcc --version
|
||||
echo cmake info ; which cmake ; cmake --version
|
||||
lsb_release
|
||||
env
|
||||
cat /proc/cpuinfo
|
||||
cat /proc/meminfo
|
||||
cat /etc/fstab
|
||||
vmstat -s
|
||||
df
|
||||
lsblk -o NAME,HCTL,SIZE,MOUNTPOINT | grep -i "sd"
|
||||
free -h
|
||||
echo TargetBranch: $(System.PullRequest.TargetBranch)
|
||||
echo SourceBranch: $(Build.SourceBranch)
|
||||
displayName: 'System info'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
rm -rf $(WORK_DIR) ; mkdir $(WORK_DIR)
|
||||
rm -rf $(BUILD_DIR) ; mkdir $(BUILD_DIR)
|
||||
rm -rf $(BUILD_SAMPLES_DIR) ; mkdir $(BUILD_SAMPLES_DIR)
|
||||
sudo rm -rf $(TMP_DIR) ; sudo mkdir $(TMP_DIR) ; sudo chmod 777 -R $(TMP_DIR)
|
||||
sudo mkdir -p $(SHARE_DIR)
|
||||
sudo apt --assume-yes update && sudo apt --assume-yes install nfs-common
|
||||
sudo mount -vvv -t nfs cinfsshare.file.core.windows.net:/cinfsshare/onnxtestdata $(SHARE_DIR) -o vers=4,minorversion=1,sec=sys
|
||||
mkdir -p $(CCACHE_DIR)
|
||||
displayName: 'Make dir'
|
||||
|
||||
- checkout: self
|
||||
clean: 'true'
|
||||
submodules: 'true'
|
||||
path: openvino
|
||||
|
||||
- checkout: openvino_contrib
|
||||
clean: 'true'
|
||||
submodules: 'true'
|
||||
path: openvino_contrib
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
sudo -E $(REPO_DIR)/install_build_dependencies.sh
|
||||
# Move jdk into contrib
|
||||
# 'clang' compiler is used as a default compiler
|
||||
sudo apt --assume-yes install openjdk-11-jdk libbz2-dev clang
|
||||
# For Python API
|
||||
python3 -m pip install --upgrade pip
|
||||
python3 -m pip install -r $(REPO_DIR)/src/bindings/python/wheel/requirements-dev.txt
|
||||
python3 -m pip install -r $(REPO_DIR)/src/bindings/python/requirements.txt
|
||||
# For running Python API tests
|
||||
python3 -m pip install -r $(REPO_DIR)/src/bindings/python/src/compatibility/openvino/requirements-dev.txt
|
||||
# For running Paddle frontend unit tests
|
||||
# TODO Reenable PDPD after paddlepaddle==2.5.0 with compliant protobuf is released (ticket 95904)
|
||||
#python3 -m pip install -r $(REPO_DIR)/src/frontends/paddle/tests/requirements.txt
|
||||
# For running ONNX frontend unit tests
|
||||
python3 -m pip install -r $(REPO_DIR)/src/frontends/onnx/tests/requirements.txt
|
||||
# For running TensorFlow frontend unit tests
|
||||
python3 -m pip install -r $(REPO_DIR)/src/frontends/tensorflow/tests/requirements.txt
|
||||
# For running torchvision -> OpenVINO preprocess converter
|
||||
python3 -m pip install -r $(REPO_DIR)/src/bindings/python/src/openvino/preprocess/torchvision/requirements.txt
|
||||
# For MO unit tests
|
||||
python3 -m pip install -r $(REPO_DIR)/tools/mo/requirements_mxnet.txt
|
||||
python3 -m pip install -r $(REPO_DIR)/tools/mo/requirements_caffe.txt
|
||||
python3 -m pip install -r $(REPO_DIR)/tools/mo/requirements_kaldi.txt
|
||||
python3 -m pip install -r $(REPO_DIR)/tools/mo/requirements_onnx.txt
|
||||
python3 -m pip install -r $(REPO_DIR)/tools/mo/requirements_tf2.txt
|
||||
python3 -m pip install -r $(REPO_DIR)/tools/mo/requirements_dev.txt
|
||||
# Speed up build
|
||||
sudo apt -y --no-install-recommends install unzip
|
||||
wget https://github.com/ninja-build/ninja/releases/download/v1.10.2/ninja-linux.zip
|
||||
unzip ninja-linux.zip
|
||||
sudo cp -v ninja /usr/local/bin/
|
||||
displayName: 'Install dependencies'
|
||||
|
||||
- script: |
|
||||
curl -H Metadata:true --noproxy "*" "http://169.254.169.254/metadata/instance?api-version=2019-06-01"
|
||||
whoami
|
||||
uname -a
|
||||
echo Python3 info ; which python3 ; python3 --version
|
||||
echo Python info ; which python ; python --version
|
||||
echo Java info ; which java ; java -version
|
||||
echo gcc info ; which gcc ; gcc --version
|
||||
echo cmake info ; which cmake ; cmake --version
|
||||
lsb_release
|
||||
env
|
||||
cat /proc/cpuinfo
|
||||
cat /proc/meminfo
|
||||
cat /etc/fstab
|
||||
vmstat -s
|
||||
df
|
||||
lsblk -o NAME,HCTL,SIZE,MOUNTPOINT | grep -i "sd"
|
||||
free -h
|
||||
echo TargetBranch: $(System.PullRequest.TargetBranch)
|
||||
echo SourceBranch: $(Build.SourceBranch)
|
||||
displayName: 'System info'
|
||||
|
||||
|
||||
# Should be after 'Install dependencies' because Git lfs is not installed
|
||||
- checkout: testdata
|
||||
clean: 'true'
|
||||
lfs: 'true'
|
||||
path: testdata
|
||||
|
||||
- task: CMake@1
|
||||
inputs:
|
||||
# CMake must get Python 3.x version by default
|
||||
cmakeArgs: >
|
||||
-GNinja
|
||||
-DCMAKE_VERBOSE_MAKEFILE=ON
|
||||
-DCMAKE_BUILD_TYPE=$(BUILD_TYPE)
|
||||
-DCMAKE_COMPILE_WARNING_AS_ERROR=ON
|
||||
-DENABLE_PYTHON=ON
|
||||
-DBUILD_SHARED_LIBS=$(CMAKE_BUILD_SHARED_LIBS)
|
||||
-DENABLE_ONEDNN_FOR_GPU=$(CMAKE_BUILD_SHARED_LIBS)
|
||||
-DENABLE_TESTS=ON
|
||||
-DENABLE_OV_ONNX_FRONTEND=ON
|
||||
-DENABLE_FASTER_BUILD=ON
|
||||
-DENABLE_STRICT_DEPENDENCIES=OFF
|
||||
-DOPENVINO_EXTRA_MODULES=$(OPENVINO_CONTRIB_REPO_DIR)/modules
|
||||
-DCUSTOM_OPERATIONS="calculate_grid;complex_mul;fft;grid_sample;sparse_conv;sparse_conv_transpose"
|
||||
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache
|
||||
-DCMAKE_C_COMPILER_LAUNCHER=ccache
|
||||
-DCMAKE_CXX_LINKER_LAUNCHER=ccache
|
||||
-DCMAKE_C_LINKER_LAUNCHER=ccache
|
||||
-DCMAKE_CXX_COMPILER=clang++
|
||||
-DCMAKE_C_COMPILER=clang
|
||||
-DENABLE_SYSTEM_SNAPPY=ON
|
||||
-DENABLE_SYSTEM_TBB=ON
|
||||
-DCPACK_GENERATOR=$(CMAKE_CPACK_GENERATOR)
|
||||
-DBUILD_nvidia_plugin=OFF
|
||||
-S $(REPO_DIR)
|
||||
-B $(BUILD_DIR)
|
||||
displayName: 'Cmake OpenVINO'
|
||||
|
||||
- script: ls -alR $(REPO_DIR)/temp/
|
||||
displayName: 'List temp SDKs'
|
||||
|
||||
- script: ccache --zero-stats --max-size=50G --show-config
|
||||
displayName: 'Clean ccache stats'
|
||||
|
||||
- script: cmake --build $(BUILD_DIR) --parallel --config $(BUILD_TYPE)
|
||||
env:
|
||||
CCACHE_DIR: $(CCACHE_DIR)
|
||||
CCACHE_TEMPDIR: $(TMP_DIR)/ccache
|
||||
CCACHE_BASEDIR: $(Pipeline.Workspace)
|
||||
CCACHE_MAXSIZE: 50G
|
||||
displayName: 'Build Lin'
|
||||
|
||||
- script: ccache --show-stats
|
||||
displayName: 'Show ccache stats'
|
||||
|
||||
- script: ls -alR $(REPO_DIR)/bin/
|
||||
displayName: 'List bin files'
|
||||
|
||||
- task: CMake@1
|
||||
inputs:
|
||||
cmakeArgs: >
|
||||
-GNinja
|
||||
-S $(REPO_DIR)/tests/layer_tests
|
||||
-B $(BUILD_LAYER_TESTS_DIR)
|
||||
displayName: 'Cmake Layer Tests'
|
||||
|
||||
- script: cmake --build $(BUILD_LAYER_TESTS_DIR) --parallel --config $(BUILD_TYPE)
|
||||
displayName: 'Build Layer Tests'
|
||||
|
||||
- script: sudo apt-get remove libtbb2 -y
|
||||
displayName: 'Remove debian dependencies'
|
||||
condition: eq(variables['CMAKE_CPACK_GENERATOR'], 'DEB')
|
||||
|
||||
- script: cmake -DCOMPONENT=python_wheels -DCMAKE_INSTALL_PREFIX=$(INSTALL_DIR) -P $(BUILD_DIR)/cmake_install.cmake
|
||||
displayName: 'Install wheel packages'
|
||||
|
||||
- script: cmake -DCOMPONENT=tests -DCMAKE_INSTALL_PREFIX=$(INSTALL_DIR) -P $(BUILD_LAYER_TESTS_DIR)/cmake_install.cmake
|
||||
displayName: 'Install Layer Tests'
|
||||
|
||||
- script: python3 -m pip install openvino-dev --find-links=$(INSTALL_DIR)/tools
|
||||
displayName: 'Install python wheels'
|
||||
|
||||
- script: cmake -DCMAKE_INSTALL_PREFIX=$(INSTALL_DIR) -DCOMPONENT=tests -P $(BUILD_DIR)/cmake_install.cmake
|
||||
displayName: 'Install tests'
|
||||
|
||||
- script: ls -alR $(INSTALL_DIR)
|
||||
displayName: 'List install test files'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
sudo apt-get install libtbb-dev libpugixml-dev -y
|
||||
cmake --build $(BUILD_DIR) --target package --parallel
|
||||
condition: eq(variables['CMAKE_CPACK_GENERATOR'], 'DEB')
|
||||
displayName: 'Build Debian packages'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
# install debian packages from previous release
|
||||
sudo apt-get install --no-install-recommends gnupg wget -y
|
||||
wget https://apt.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB
|
||||
sudo apt-key add GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB
|
||||
echo "deb https://apt.repos.intel.com/openvino/2022 focal main" | sudo tee /etc/apt/sources.list.d/intel-openvino-2022.list
|
||||
sudo apt-get update -o Dir::Etc::sourcelist=/etc/apt/sources.list.d/intel-openvino-2022.list
|
||||
sudo apt-get install openvino -y
|
||||
# install our local one and make sure the conflicts are resolved
|
||||
sudo apt-get install --no-install-recommends dpkg-dev -y
|
||||
rm -r _CPack_Packages
|
||||
dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz
|
||||
echo "deb [trusted=yes] file:$(BUILD_DIR) ./" | sudo tee /etc/apt/sources.list.d/openvino-local.list
|
||||
sudo apt-get update -o Dir::Etc::sourcelist=/etc/apt/sources.list.d/openvino-local.list
|
||||
sudo apt-get install openvino -y
|
||||
workingDirectory: $(BUILD_DIR)
|
||||
condition: eq(variables['CMAKE_CPACK_GENERATOR'], 'DEB')
|
||||
displayName: 'Install Debian packages'
|
||||
|
||||
- script: cmake -DCMAKE_INSTALL_PREFIX=$(INSTALL_DIR) -P $(BUILD_DIR)/cmake_install.cmake
|
||||
condition: ne(variables['CMAKE_CPACK_GENERATOR'], 'DEB')
|
||||
displayName: 'Install openvino'
|
||||
|
||||
- script: ls -alR $(INSTALL_DIR)
|
||||
condition: ne(variables['CMAKE_CPACK_GENERATOR'], 'DEB')
|
||||
displayName: 'List install files'
|
||||
|
||||
- script: $(SAMPLES_INSTALL_DIR)/cpp/build_samples.sh -i $(INSTALL_DIR) -b $(BUILD_DIR)/cpp_samples
|
||||
displayName: 'Build cpp samples - gcc'
|
||||
|
||||
- script: $(SAMPLES_INSTALL_DIR)/cpp/build_samples.sh -b $(BUILD_DIR)/cpp_samples_clang
|
||||
env:
|
||||
CC: clang
|
||||
CXX: clang++
|
||||
displayName: 'Build cpp samples - clang'
|
||||
|
||||
- script: $(SAMPLES_INSTALL_DIR)/c/build_samples.sh -i $(INSTALL_DIR) -b $(BUILD_DIR)/c_samples
|
||||
env:
|
||||
VERBOSE: 1
|
||||
displayName: 'Build c samples'
|
||||
|
||||
- script: rm -fr $(BUILD_DIR)
|
||||
displayName: 'Clean build dir'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_core_unit_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-OVCoreUT.xml
|
||||
displayName: 'OV Core UT'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_inference_functional_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-InferenceFunc.xml
|
||||
displayName: 'Inference Func Tests'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_inference_unit_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-InferenceUnit.xml
|
||||
displayName: 'Inference Unit Tests'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_proxy_plugin_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-OVProxyTests.xml
|
||||
displayName: 'OV Proxy Plugin Tests'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_hetero_func_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-OVHeteroFuncTests.xml
|
||||
displayName: 'OV Hetero Func Tests'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_conditional_compilation_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ConditionalCompilation.xml
|
||||
displayName: 'Conditional Compilation Tests'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_ir_frontend_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-IRFrontend.xml
|
||||
displayName: 'IR Frontend Tests'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_onnx_frontend_tests --gtest_print_time=1 --gtest_filter=-*IE_GPU* --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ONNXFrontend.xml
|
||||
displayName: 'ONNX Frontend Tests'
|
||||
|
||||
# TODO Reenable PDPD after paddlepaddle==2.5.0 with compliant protobuf is released (ticket 95904)
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/paddle_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-Paddle.xml
|
||||
displayName: 'Paddle Frontend UT'
|
||||
enabled: 'false'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_tensorflow_frontend_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-Tensorflow.xml
|
||||
displayName: 'TensorFlow Frontend Unit Tests'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_tensorflow_common_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-TensorflowCommon.xml
|
||||
displayName: 'TensorFlow Common Unit Tests'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_tensorflow_lite_frontend_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-TensorflowLite.xml
|
||||
displayName: 'TensorFlow Lite Frontend Unit Tests'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_lp_transformations_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-LpTransformations.xml
|
||||
displayName: 'Low Precision Transformations Tests'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_transformations_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-Transformations.xml
|
||||
displayName: 'Transformations Tests'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_legacy_transformations_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-LegacyTransformations.xml
|
||||
displayName: 'Legacy Transformations Tests'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_util_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-CommonUtilTests.xml
|
||||
displayName: 'Common Utils Tests'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/InferenceEngineUnitTests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-InferenceEngineUnitTests.xml
|
||||
displayName: 'IE UT old'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_snippets_func_tests --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ov_snippets_func_tests.xml
|
||||
displayName: 'Snippets Func Tests'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_cpu_unit_tests --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ov_cpu_unit_tests.xml
|
||||
displayName: 'Intel CPU Unit Tests'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_gna_unit_tests --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ov_gna_unit_tests.xml
|
||||
displayName: 'GNA UT'
|
||||
enabled: 'false' # TODO: fix
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_auto_unit_tests --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ov_auto_unit_tests.xml
|
||||
displayName: 'AUTO UT'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_auto_batch_unit_tests --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ov_auto_batch_unit_tests.xml
|
||||
displayName: 'AutoBatch UT'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_template_func_tests --gtest_filter=*smoke* --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-templateFuncTests.xml
|
||||
displayName: 'TEMPLATE FuncTests'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/InferenceEngineCAPITests --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-InferenceEngineCAPITests.xml
|
||||
displayName: 'IE CAPITests'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_capi_test --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ov_capi_test.xml
|
||||
displayName: 'OV CAPITests'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_auto_batch_func_tests --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ov_auto_batch_func_tests.xml
|
||||
displayName: 'AutoBatch FuncTests'
|
||||
|
||||
# Skip test_onnx/test_zoo_models and test_onnx/test_backend due to long execution time
|
||||
- script: |
|
||||
$(RUN_PREFIX) python3 -m pytest -s $(INSTALL_TEST_DIR)/pyngraph $(PYTHON_STATIC_ARGS) \
|
||||
--junitxml=$(INSTALL_TEST_DIR)/TEST-Pyngraph.xml \
|
||||
--ignore=$(INSTALL_TEST_DIR)/pyngraph/tests/test_onnx/test_zoo_models.py \
|
||||
--ignore=$(INSTALL_TEST_DIR)/pyngraph/tests/test_onnx/test_backend.py
|
||||
displayName: 'nGraph and IE Python Bindings Tests'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
export LD_LIBRARY_PATH=$INSTALL_TEST_DIR:$LD_LIBRARY_PATH
|
||||
$(RUN_PREFIX) python3 -m pytest -sv $(INSTALL_TEST_DIR)/pyopenvino $(PYTHON_STATIC_ARGS) \
|
||||
--junitxml=$(INSTALL_TEST_DIR)/TEST-Pyngraph.xml \
|
||||
--ignore=$(INSTALL_TEST_DIR)/pyopenvino/tests/test_utils/test_utils.py
|
||||
displayName: 'Python API 2.0 Tests'
|
||||
|
||||
# Skip test_onnx/test_zoo_models and test_onnx/test_backend due to long execution time
|
||||
- script: |
|
||||
python3 -m pytest -sv $(REPO_DIR)/src/frontends/onnx/tests $(PYTHON_STATIC_ARGS) \
|
||||
--ignore=$(REPO_DIR)/src/frontends/onnx/tests/test_python/test_zoo_models.py \
|
||||
--ignore=$(REPO_DIR)/src/frontends/onnx/tests/test_python/test_backend.py -v
|
||||
displayName: 'ONNX Frontend Python Tests'
|
||||
|
||||
- script: python3 -m pytest -s $(INSTALL_TEST_DIR)/mo/unit_tests --junitxml=$(INSTALL_TEST_DIR)/TEST-ModelOptimizer.xml
|
||||
displayName: 'Model Optimizer UT'
|
||||
|
||||
- script: python3 -m pytest -s $(REPO_DIR)/tools/ovc/unit_tests --junitxml=$(INSTALL_TEST_DIR)/TEST-OpenVinoConversion.xml
|
||||
displayName: 'OpenVino Conversion UT'
|
||||
|
||||
- script: $(RUN_PREFIX) $(INSTALL_TEST_DIR)/ov_cpu_func_tests --gtest_filter=*smoke* --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ov_cpu_func_tests.xml
|
||||
displayName: 'CPU FuncTests'
|
||||
condition: and(succeeded(), eq(variables['CMAKE_BUILD_SHARED_LIBS'], 'OFF'))
|
||||
|
||||
- task: CMake@1
|
||||
inputs:
|
||||
cmakeArgs: >
|
||||
-GNinja
|
||||
-S $(REPO_DIR)/tests/samples_tests
|
||||
-B $(BUILD_SAMPLES_TESTS_DIR)
|
||||
displayName: 'CMake Samples Tests'
|
||||
|
||||
- script: cmake -DCOMPONENT=tests -DCMAKE_INSTALL_PREFIX=$(INSTALL_DIR) -P $(BUILD_SAMPLES_TESTS_DIR)/cmake_install.cmake
|
||||
displayName: 'Install Samples Tests'
|
||||
|
||||
- script: python3 -m pip install -r $(INSTALL_TEST_DIR)/smoke_tests/requirements.txt
|
||||
displayName: 'Install dependencies for samples smoke tests'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
export PATH=$HOME/.local/bin:$PATH
|
||||
export LD_LIBRARY_PATH=$IE_APP_PATH:$LD_LIBRARY_PATH
|
||||
$(RUN_PREFIX) python3 -m pytest $(INSTALL_TEST_DIR)/smoke_tests/ \
|
||||
--env_conf $(INSTALL_TEST_DIR)/smoke_tests/env_config.yml \
|
||||
-s --junitxml=$(INSTALL_TEST_DIR)/TEST-SamplesSmokeTests.xml
|
||||
env:
|
||||
IE_APP_PATH: $(INSTALL_DIR)/samples_bin
|
||||
IE_APP_PYTHON_PATH: $(PYTHON_SAMPLES_INSTALL_DIR)/
|
||||
SHARE: $(INSTALL_TEST_DIR)/smoke_tests/samples_smoke_tests_data/
|
||||
WORKSPACE: $(INSTALL_DIR)
|
||||
displayName: 'Samples Smoke Tests'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
python3 -m pip install -r $(LAYER_TESTS_DIR)/requirements.txt
|
||||
$(RUN_PREFIX) python3 -m pytest $(LAYER_TESTS_DIR)/pytorch_tests/ -m precommit --junitxml=$(INSTALL_TEST_DIR)/TEST-pytorch.xmlTEST
|
||||
env:
|
||||
PYTHONPATH: $(REPO_DIR)/tools/mo/:$(LAYER_TESTS_DIR)
|
||||
TEST_DEVICE: CPU
|
||||
displayName: 'PyTorch Layer Tests'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
python3 -m pip install -r $(LAYER_TESTS_DIR)/requirements.txt
|
||||
$(RUN_PREFIX) python3 -m pytest $(LAYER_TESTS_DIR)/tensorflow_tests/ --use_new_frontend -m precommit_tf_fe --junitxml=$(INSTALL_TEST_DIR)/TEST-tf_fe.xmlTEST
|
||||
env:
|
||||
PYTHONPATH: $(REPO_DIR)/tools/mo/:$(LAYER_TESTS_DIR)
|
||||
TEST_DEVICE: CPU
|
||||
displayName: 'TensorFlow 1 Layer Tests - TF FE'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
python3 -m pip install -r $(LAYER_TESTS_DIR)/requirements.txt
|
||||
$(RUN_PREFIX) python3 -m pytest $(LAYER_TESTS_DIR)/tensorflow2_keras_tests/ --use_new_frontend -m precommit_tf_fe --junitxml=$(INSTALL_TEST_DIR)/TEST-tf2_fe.xmlTEST
|
||||
env:
|
||||
PYTHONPATH: $(REPO_DIR)/tools/mo/:$(LAYER_TESTS_DIR)
|
||||
TEST_DEVICE: CPU
|
||||
displayName: 'TensorFlow 2 Layer Tests - TF FE'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
python3 -m pip install -r $(LAYER_TESTS_DIR)/requirements.txt
|
||||
$(RUN_PREFIX) python3 -m pytest $(LAYER_TESTS_DIR)/jax_tests/ -m precommit --junitxml=$(INSTALL_TEST_DIR)/TEST-jax.xmlTEST
|
||||
env:
|
||||
PYTHONPATH: $(LAYER_TESTS_DIR)
|
||||
TEST_DEVICE: CPU
|
||||
displayName: 'JAX Layer Tests - TF FE'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
python3 -m pip install -r $(LAYER_TESTS_DIR)/requirements.txt
|
||||
$(RUN_PREFIX) python3 -m pytest $(LAYER_TESTS_DIR)/tensorflow_tests/test_tf_Roll.py --ir_version=10 --junitxml=$(INSTALL_TEST_DIR)/TEST-tf_Roll.xmlTEST
|
||||
env:
|
||||
PYTHONPATH: $(LAYER_TESTS_DIR)
|
||||
displayName: 'TensorFlow 1 Layer Tests - Legacy FE'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
python3 -m pip install -r $(LAYER_TESTS_DIR)/requirements.txt
|
||||
$(RUN_PREFIX) python3 -m pytest $(LAYER_TESTS_DIR)/tensorflow2_keras_tests/test_tf2_keras_activation.py --ir_version=11 --junitxml=./TEST-tf2_Activation.xmlTEST -k "sigmoid"
|
||||
env:
|
||||
PYTHONPATH: $(LAYER_TESTS_DIR)
|
||||
TEST_DEVICE: CPU
|
||||
displayName: 'TensorFlow 2 Layer Tests - Legacy FE'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
python3 -m pip install -r $(LAYER_TESTS_DIR)/requirements.txt
|
||||
$(RUN_PREFIX) python3 -m pytest $(LAYER_TESTS_DIR)/tensorflow_lite_tests/ --junitxml=$(INSTALL_TEST_DIR)/TEST-tfl_fe.xmlTEST
|
||||
env:
|
||||
PYTHONPATH: $(REPO_DIR)/tools/mo/:$(LAYER_TESTS_DIR)
|
||||
TEST_DEVICE: CPU
|
||||
displayName: 'TensorFlow Lite Layer Tests - TFL FE'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
python3 -m pip install -r $(LAYER_TESTS_DIR)/requirements.txt
|
||||
$(RUN_PREFIX) python3 -m pytest $(LAYER_TESTS_DIR)/ovc_python_api_tests/ --junitxml=./TEST-test_ovc_convert.xmlTEST
|
||||
env:
|
||||
PYTHONPATH: $(LAYER_TESTS_DIR)
|
||||
TEST_DEVICE: CPU
|
||||
displayName: 'OVC Python API Tests'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
python3 -m pip install -r $(LAYER_TESTS_DIR)/requirements.txt
|
||||
$(RUN_PREFIX) python3 -m pytest $(LAYER_TESTS_DIR)/mo_python_api_tests/ --junitxml=./TEST-test_mo_convert.xmlTEST
|
||||
env:
|
||||
PYTHONPATH: $(LAYER_TESTS_DIR)
|
||||
TEST_DEVICE: CPU
|
||||
displayName: 'MO Python API Tests'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
python3 -m pip install -r $(LAYER_TESTS_DIR)/requirements.txt
|
||||
$(RUN_PREFIX) python3 -m pytest $(LAYER_TESTS_DIR)/py_frontend_tests --junitxml=./TEST-test_py_fontend.xml
|
||||
displayName: 'Python Frontend tests'
|
||||
|
||||
- task: PublishTestResults@2
|
||||
condition: always()
|
||||
inputs:
|
||||
testResultsFormat: 'JUnit' # Options: JUnit, NUnit, VSTest, xUnit, cTest
|
||||
testResultsFiles: '**/TEST-*.xml'
|
||||
#searchFolder: '$(BUILD_DIR)'
|
||||
mergeTestResults: false # Optional
|
||||
#failTaskOnFailedTests: false # Optional
|
||||
#testRunTitle: 'Pre/Post-Commit' # Optional
|
||||
buildPlatform: 'x64' # Optional
|
||||
buildConfiguration: 'Linux' # Optional
|
||||
#publishRunAttachments: true # Optional
|
||||
237
modules/openvino-master/.ci/azure/linux_arm64.yml
Normal file
@ -0,0 +1,237 @@
|
||||
trigger:
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
- 'tools/*'
|
||||
- 'tests/layer_tests/*'
|
||||
|
||||
pr:
|
||||
drafts: 'false'
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
- 'tools/*'
|
||||
- 'tests/layer_tests/*'
|
||||
|
||||
variables:
|
||||
- group: github
|
||||
|
||||
jobs:
|
||||
- job: linux_arm64
|
||||
# About 150% of total time
|
||||
timeoutInMinutes: '120'
|
||||
|
||||
pool:
|
||||
name: LIN_VMSS_VENV_F16S_U20_WU2
|
||||
|
||||
variables:
|
||||
system.debug: true
|
||||
VSTS_HTTP_RETRY: 5
|
||||
VSTS_HTTP_TIMEOUT: 200
|
||||
NUM_PROC: 2
|
||||
BUILD_TYPE: Release
|
||||
OPENVINO_REPO_DIR: $(Build.Repository.LocalPath)
|
||||
BUILD_OPENVINO: $(WORK_DIR)/build
|
||||
INSTALL_OPENVINO: $(WORK_DIR)/install_openvino
|
||||
WORK_DIR: $(Pipeline.Workspace)/_w
|
||||
SHARE_DIR: /mount/cinfsshare/onnxtestdata
|
||||
TMP_DIR: /mnt/tmp
|
||||
OPENVINO_CCACHE_DIR: $(SHARE_DIR)/ccache/master/linux_arm64
|
||||
LD_LIBRARY_PATH: $(Agent.ToolsDirectory)/Python/$(OV_PYTHON_VERSION)/x64/lib
|
||||
OV_PYTHON_VERSION_MAJOR_MINOR: 3.11
|
||||
OV_PYTHON_VERSION: $(OV_PYTHON_VERSION_MAJOR_MINOR).2 # Full version of Python its required for LD_LIBRARY_PATH. More details https://github.com/microsoft/azure-pipelines-tool-lib/blob/master/docs/overview.md#tool-cache
|
||||
|
||||
steps:
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: '$(OV_PYTHON_VERSION)' # Setting only major & minor version will download latest release from GH repo example 3.10 will be 3.10.10.
|
||||
addToPath: true
|
||||
disableDownloadFromRegistry: false
|
||||
architecture: 'x64'
|
||||
githubToken: $(auth_token)
|
||||
displayName: Setup Python 3.11
|
||||
name: setupPython
|
||||
- bash: |
|
||||
#!/bin/bash
|
||||
python -V
|
||||
|
||||
- script: |
|
||||
curl -H Metadata:true --noproxy "*" "http://169.254.169.254/metadata/instance?api-version=2019-06-01"
|
||||
whoami
|
||||
uname -a
|
||||
echo Python3 info ; which python3 ; python3 --version
|
||||
echo Python info ; which python ; python --version
|
||||
echo Java info ; which java ; java -version
|
||||
echo gcc info ; which gcc ; gcc --version
|
||||
echo cmake info ; which cmake ; cmake --version
|
||||
lsb_release
|
||||
env
|
||||
cat /proc/cpuinfo
|
||||
cat /proc/meminfo
|
||||
cat /etc/fstab
|
||||
vmstat -s
|
||||
df
|
||||
lsblk -o NAME,HCTL,SIZE,MOUNTPOINT | grep -i "sd"
|
||||
free -h
|
||||
echo "##vso[task.setvariable variable=NUM_PROC]$(nproc --all)"
|
||||
echo "NUM_PROC=$(NUM_PROC)"
|
||||
displayName: 'System information'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
rm -rf $(WORK_DIR) ; mkdir $(WORK_DIR)
|
||||
mkdir -p $(BUILD_OPENVINO)
|
||||
mkdir -p $(INSTALL_OPENVINO)
|
||||
sudo rm -rf $(TMP_DIR) ; sudo mkdir $(TMP_DIR) ; sudo chmod 777 -R $(TMP_DIR)
|
||||
sudo mkdir -p $(SHARE_DIR)
|
||||
sudo apt --assume-yes update && sudo apt --assume-yes install nfs-common
|
||||
sudo mount -vvv -t nfs cinfsshare.file.core.windows.net:/cinfsshare/onnxtestdata $(SHARE_DIR) -o vers=4,minorversion=1,sec=sys
|
||||
mkdir -p $(OPENVINO_CCACHE_DIR)
|
||||
displayName: 'Make directories'
|
||||
|
||||
- checkout: self
|
||||
clean: 'true'
|
||||
path: openvino
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
python3 -m pip install --upgrade pip
|
||||
python3 -m pip install cmake
|
||||
python3 -m pip install -r $(OPENVINO_REPO_DIR)/src/bindings/python/requirements.txt
|
||||
python3 -m pip install -r $(OPENVINO_REPO_DIR)/src/bindings/python/wheel/requirements-dev.txt
|
||||
python3 -m pip install -r $(OPENVINO_REPO_DIR)/src/bindings/python/src/compatibility/openvino/requirements-dev.txt
|
||||
# install dependencies needed to build CPU plugin for ARM
|
||||
sudo -E apt --assume-yes install scons gcc-10-aarch64-linux-gnu g++-10-aarch64-linux-gnu
|
||||
# generic dependencies
|
||||
sudo -E apt --assume-yes install cmake ccache ninja-build unzip fdupes
|
||||
displayName: 'Install build dependencies'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
echo deb [arch=amd64] http://archive.ubuntu.com/ubuntu/ focal main restricted > arm64-sources.list
|
||||
echo deb [arch=amd64] http://archive.ubuntu.com/ubuntu/ focal-updates main restricted >> arm64-sources.list
|
||||
echo deb [arch=amd64] http://archive.ubuntu.com/ubuntu/ focal universe >> arm64-sources.list
|
||||
echo deb [arch=amd64] http://archive.ubuntu.com/ubuntu/ focal-updates universe >> arm64-sources.list
|
||||
echo deb [arch=amd64] http://archive.ubuntu.com/ubuntu/ focal multiverse >> arm64-sources.list
|
||||
echo deb [arch=amd64] http://archive.ubuntu.com/ubuntu/ focal-updates multiverse >> arm64-sources.list
|
||||
echo deb [arch=amd64] http://archive.ubuntu.com/ubuntu/ focal-backports main restricted universe multiverse >> arm64-sources.list
|
||||
echo deb [arch=amd64] http://security.ubuntu.com/ubuntu/ focal-security main restricted >> arm64-sources.list
|
||||
echo deb [arch=amd64] http://security.ubuntu.com/ubuntu/ focal-security universe >> arm64-sources.list
|
||||
echo deb [arch=amd64] http://security.ubuntu.com/ubuntu/ focal-security multiverse >> arm64-sources.list
|
||||
echo deb [arch=arm64] http://ports.ubuntu.com/ubuntu-ports/ focal main >> arm64-sources.list
|
||||
echo deb [arch=arm64] http://ports.ubuntu.com/ubuntu-ports/ focal universe >> arm64-sources.list
|
||||
echo deb [arch=arm64] http://ports.ubuntu.com/ubuntu-ports/ focal-updates main >> arm64-sources.list
|
||||
echo deb [arch=arm64] http://ports.ubuntu.com/ubuntu-ports/ focal-security main >> arm64-sources.list
|
||||
sudo mv arm64-sources.list /etc/apt/sources.list.d/
|
||||
sudo -E dpkg --add-architecture arm64
|
||||
sudo -E apt-get update -o Dir::Etc::sourcelist=/etc/apt/sources.list.d/arm64-sources.list
|
||||
sudo -E apt-get install -y --no-install-recommends libpython3-dev:arm64
|
||||
displayName: 'Install arm64 libraries'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
git submodule update --init -- $(OPENVINO_REPO_DIR)/src/plugins
|
||||
git submodule update --init -- $(OPENVINO_REPO_DIR)/thirdparty/gtest
|
||||
git submodule update --init -- $(OPENVINO_REPO_DIR)/thirdparty/open_model_zoo
|
||||
displayName: 'Init submodules for non Conan dependencies'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
python3 -m pip install conan
|
||||
# install build profile compilers
|
||||
sudo -E apt --assume-yes install gcc g++
|
||||
# generate build profile
|
||||
conan profile detect
|
||||
# generate host profile for linux_arm64
|
||||
echo "include(default)" > $(BUILD_OPENVINO)/linux_arm64
|
||||
echo "[buildenv]" >> $(BUILD_OPENVINO)/linux_arm64
|
||||
echo "CC=aarch64-linux-gnu-gcc-10" >> $(BUILD_OPENVINO)/linux_arm64
|
||||
echo "CXX=aarch64-linux-gnu-g++-10" >> $(BUILD_OPENVINO)/linux_arm64
|
||||
# install OpenVINO dependencies
|
||||
conan install $(OPENVINO_REPO_DIR)/conanfile.txt \
|
||||
-pr:h $(BUILD_OPENVINO)/linux_arm64 \
|
||||
-s:h arch=armv8 \
|
||||
-of $(BUILD_OPENVINO)/dependencies \
|
||||
-b missing
|
||||
env:
|
||||
CMAKE_CXX_COMPILER_LAUNCHER: ccache
|
||||
CMAKE_C_COMPILER_LAUNCHER: ccache
|
||||
CCACHE_DIR: $(OPENVINO_CCACHE_DIR)
|
||||
CCACHE_TEMPDIR: $(TMP_DIR)/ccache
|
||||
CCACHE_BASEDIR: $(Pipeline.Workspace)
|
||||
CCACHE_MAXSIZE: 50G
|
||||
displayName: 'Install conan and dependencies'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
source $(BUILD_OPENVINO)/dependencies/conanbuild.sh
|
||||
# TODO: return tests building once GPU plugin migrates to Plugin API 2.0
|
||||
cmake \
|
||||
-DCMAKE_VERBOSE_MAKEFILE=ON \
|
||||
-DBUILD_SHARED_LIBS=OFF \
|
||||
-DCMAKE_COMPILE_WARNING_AS_ERROR=ON \
|
||||
-DENABLE_CPPLINT=ON \
|
||||
-DENABLE_INTEL_GPU=ON \
|
||||
-DENABLE_PYTHON=ON \
|
||||
-DENABLE_WHEEL=ON \
|
||||
-DPYBIND11_PYTHONLIBS_OVERWRITE=OFF \
|
||||
-DPYTHON_MODULE_EXTENSION=$(aarch64-linux-gnu-python3-config --extension-suffix) \
|
||||
-DPYTHON_LIBRARY=/usr/lib/aarch64-linux-gnu/libc-2.31.so \
|
||||
-DPYTHON_INCLUDE_DIR=$(Agent.ToolsDirectory)/Python/$(OV_PYTHON_VERSION)/x64/include/python$(OV_PYTHON_VERSION_MAJOR_MINOR) \
|
||||
-DENABLE_DATA=OFF \
|
||||
-DENABLE_SYSTEM_TBB=ON \
|
||||
-DENABLE_SYSTEM_PROTOBUF=ON \
|
||||
-DENABLE_SYSTEM_SNAPPY=ON \
|
||||
-DENABLE_SYSTEM_PUGIXML=ON \
|
||||
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache \
|
||||
-DCMAKE_C_COMPILER_LAUNCHER=ccache \
|
||||
-DARM_COMPUTE_SCONS_JOBS=$(NUM_PROC) \
|
||||
-DCMAKE_INSTALL_PREFIX=$(INSTALL_OPENVINO) \
|
||||
-DCMAKE_BUILD_TYPE=$(BUILD_TYPE) \
|
||||
-DENABLE_PYTHON_PACKAGING=ON \
|
||||
-S $(OPENVINO_REPO_DIR) \
|
||||
-B $(BUILD_OPENVINO)
|
||||
source $(BUILD_OPENVINO)/dependencies/deactivate_conanbuild.sh
|
||||
env:
|
||||
CMAKE_GENERATOR: Ninja
|
||||
CMAKE_TOOLCHAIN_FILE: $(BUILD_OPENVINO)/dependencies/conan_toolchain.cmake
|
||||
displayName: 'CMake configure'
|
||||
|
||||
- script: cmake --build $(BUILD_OPENVINO) --parallel --config $(BUILD_TYPE)
|
||||
env:
|
||||
CCACHE_DIR: $(OPENVINO_CCACHE_DIR)
|
||||
CCACHE_TEMPDIR: $(TMP_DIR)/ccache
|
||||
CCACHE_BASEDIR: $(Pipeline.Workspace)
|
||||
CCACHE_MAXSIZE: 50G
|
||||
displayName: 'Build OpenVINO Runtime'
|
||||
|
||||
- script: cmake --build $(BUILD_OPENVINO) --parallel --config $(BUILD_TYPE) --target install
|
||||
displayName: 'Install OpenVINO Runtime'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
source $(BUILD_OPENVINO)/dependencies/conanbuild.sh
|
||||
$(INSTALL_OPENVINO)/samples/cpp/build_samples.sh
|
||||
source $(BUILD_OPENVINO)/dependencies/deactivate_conanbuild.sh
|
||||
env:
|
||||
CMAKE_GENERATOR: Ninja
|
||||
CMAKE_TOOLCHAIN_FILE: $(BUILD_OPENVINO)/dependencies/conan_toolchain.cmake
|
||||
displayName: 'Build OpenVINO C++ samples'
|
||||
@ -0,0 +1,172 @@
|
||||
trigger:
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
- 'tools/*'
|
||||
|
||||
pr:
|
||||
drafts: 'false'
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
- 'tools/*'
|
||||
|
||||
resources:
|
||||
repositories:
|
||||
- repository: testdata
|
||||
type: github
|
||||
endpoint: openvinotoolkit
|
||||
name: openvinotoolkit/testdata
|
||||
ref: master
|
||||
|
||||
variables:
|
||||
- group: github
|
||||
|
||||
jobs:
|
||||
- job: LinCC
|
||||
# About 150% of total time
|
||||
timeoutInMinutes: '90'
|
||||
|
||||
pool:
|
||||
name: LIN_VMSS_VENV_F16S_U20_WU2
|
||||
|
||||
variables:
|
||||
system.debug: true
|
||||
VSTS_HTTP_RETRY: 5
|
||||
VSTS_HTTP_TIMEOUT: 200
|
||||
BUILD_TYPE: Release
|
||||
REPO_DIR: $(Build.Repository.LocalPath)
|
||||
MODELS_PATH: $(REPO_DIR)/../testdata
|
||||
WORK_DIR: $(Pipeline.Workspace)/_w
|
||||
BUILD_DIR: $(WORK_DIR)/build
|
||||
INSTALL_DIR: $(WORK_DIR)/install_pkg
|
||||
SETUPVARS: $(INSTALL_DIR)/setupvars.sh
|
||||
LD_LIBRARY_PATH: $(Agent.ToolsDirectory)/Python/$(OV_PYTHON_VERSION)/x64/lib
|
||||
OV_PYTHON_VERSION: 3.11.2 # Full version of Python its required for LD_LIBRARY_PATH. More details https://github.com/microsoft/azure-pipelines-tool-lib/blob/master/docs/overview.md#tool-cache
|
||||
|
||||
steps:
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: '$(OV_PYTHON_VERSION)' # Setting only major & minor version will download latest release from GH repo example 3.10 will be 3.10.10.
|
||||
addToPath: true
|
||||
disableDownloadFromRegistry: false
|
||||
architecture: 'x64'
|
||||
githubToken: $(auth_token)
|
||||
displayName: Setup Python 3.11
|
||||
name: setupPython
|
||||
- bash: |
|
||||
#!/bin/bash
|
||||
python -V
|
||||
|
||||
- script: |
|
||||
curl -H Metadata:true --noproxy "*" "http://169.254.169.254/metadata/instance?api-version=2019-06-01"
|
||||
whoami
|
||||
uname -a
|
||||
echo Python3 info ; which python3 ; python3 --version
|
||||
echo Python info ; which python ; python --version
|
||||
echo Java info ; which java ; java -version
|
||||
echo gcc info ; which gcc ; gcc --version
|
||||
echo cmake info ; which cmake ; cmake --version
|
||||
lsb_release
|
||||
env
|
||||
cat /proc/cpuinfo
|
||||
cat /proc/meminfo
|
||||
cat /etc/fstab
|
||||
vmstat -s
|
||||
df
|
||||
lsblk -o NAME,HCTL,SIZE,MOUNTPOINT | grep -i "sd"
|
||||
free -h
|
||||
displayName: 'System info'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
rm -rf $(WORK_DIR) ; mkdir $(WORK_DIR)
|
||||
rm -rf $(BUILD_DIR) ; mkdir $(BUILD_DIR)
|
||||
displayName: 'Make dir'
|
||||
|
||||
- checkout: self
|
||||
clean: 'true'
|
||||
submodules: 'true'
|
||||
path: openvino
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
sudo -E $(REPO_DIR)/install_build_dependencies.sh
|
||||
# Speed up build
|
||||
sudo apt -y --no-install-recommends install unzip
|
||||
wget https://github.com/ninja-build/ninja/releases/download/v1.10.2/ninja-linux.zip
|
||||
unzip ninja-linux.zip
|
||||
sudo cp -v ninja /usr/local/bin/
|
||||
displayName: 'Install dependencies'
|
||||
|
||||
- checkout: testdata
|
||||
clean: 'true'
|
||||
lfs: 'true'
|
||||
path: testdata
|
||||
|
||||
- task: CMake@1
|
||||
inputs:
|
||||
cmakeArgs: >
|
||||
-G "Ninja Multi-Config"
|
||||
-DENABLE_CPPLINT=OFF
|
||||
-DENABLE_GAPI_PREPROCESSING=OFF
|
||||
-DCMAKE_VERBOSE_MAKEFILE=ON
|
||||
-DCMAKE_COMPILE_WARNING_AS_ERROR=ON
|
||||
-DENABLE_FASTER_BUILD=ON
|
||||
-DENABLE_PROFILING_ITT=ON
|
||||
-DSELECTIVE_BUILD=COLLECT
|
||||
-S $(REPO_DIR)
|
||||
-B $(BUILD_DIR)
|
||||
displayName: 'Cmake CC COLLECT'
|
||||
|
||||
- script: cmake --build $(BUILD_DIR) --parallel --config $(BUILD_TYPE) --target openvino_intel_cpu_plugin openvino_ir_frontend benchmark_app sea_itt_lib
|
||||
displayName: 'Build CC COLLECT'
|
||||
|
||||
- script: ls -alR $(REPO_DIR)/bin/
|
||||
displayName: 'List bin files'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
python3 $(REPO_DIR)/thirdparty/itt_collector/runtool/sea_runtool.py \
|
||||
--bindir $(REPO_DIR)/bin/intel64/Release -o $(BUILD_DIR)/itt_stat ! \
|
||||
$(REPO_DIR)/bin/intel64/Release/benchmark_app -niter 1 -nireq 1 \
|
||||
-m $(MODELS_PATH)/models/test_model/test_model_fp32.xml -d CPU
|
||||
displayName: 'Code usage analysis'
|
||||
|
||||
- task: CMake@1
|
||||
inputs:
|
||||
cmakeArgs: >
|
||||
-DSELECTIVE_BUILD=ON
|
||||
-DSELECTIVE_BUILD_STAT=$(BUILD_DIR)/*.csv
|
||||
-B $(BUILD_DIR)
|
||||
-S $(REPO_DIR)
|
||||
displayName: 'CMake CC ON'
|
||||
|
||||
- script: cmake --build $(BUILD_DIR) --parallel --config $(BUILD_TYPE) --target openvino_intel_cpu_plugin openvino_ir_frontend
|
||||
displayName: 'Build CC ON'
|
||||
|
||||
- script: ls -alR $(REPO_DIR)/bin/
|
||||
displayName: 'List bin files ON'
|
||||
|
||||
- script: |
|
||||
$(REPO_DIR)/bin/intel64/Release/benchmark_app -niter 1 -nireq 1 \
|
||||
-m $(MODELS_PATH)/models/test_model/test_model_fp32.xml -d CPU
|
||||
displayName: 'Use OpenVINO after CC'
|
||||
165
modules/openvino-master/.ci/azure/linux_coverity.yml
Normal file
@ -0,0 +1,165 @@
|
||||
resources:
|
||||
repositories:
|
||||
- repository: openvino_contrib
|
||||
type: github
|
||||
endpoint: openvinotoolkit
|
||||
name: openvinotoolkit/openvino_contrib
|
||||
ref: master
|
||||
|
||||
variables:
|
||||
- group: github
|
||||
|
||||
jobs:
|
||||
- job: Lin
|
||||
# About 150% of total time
|
||||
timeoutInMinutes: '90'
|
||||
|
||||
pool:
|
||||
name: LIN_VMSS_VENV_F16S_U20_WU2
|
||||
|
||||
variables:
|
||||
system.debug: true
|
||||
VSTS_HTTP_RETRY: 5
|
||||
VSTS_HTTP_TIMEOUT: 200
|
||||
BUILD_TYPE: Release
|
||||
REPO_DIR: $(Build.Repository.LocalPath)
|
||||
OPENVINO_CONTRIB_REPO_DIR: $(REPO_DIR)/../openvino_contrib
|
||||
WORK_DIR: $(Pipeline.Workspace)/_w
|
||||
BUILD_DIR: $(WORK_DIR)/build
|
||||
BUILD_SAMPLES_DIR: $(WORK_DIR)/build_samples
|
||||
INSTALL_DIR: $(WORK_DIR)/install_pkg
|
||||
SETUPVARS: $(INSTALL_DIR)/setupvars.sh
|
||||
TMP_DIR: /mnt/tmp
|
||||
SHARE_DIR: /mount/cinfsshare/onnxtestdata
|
||||
CCACHE_DIR: $(SHARE_DIR)/ccache/master/linux_coverity
|
||||
LD_LIBRARY_PATH: $(Agent.ToolsDirectory)/Python/$(OV_PYTHON_VERSION)/x64/lib
|
||||
OV_PYTHON_VERSION: 3.11.2 # Full version of Python its required for LD_LIBRARY_PATH. More details https://github.com/microsoft/azure-pipelines-tool-lib/blob/master/docs/overview.md#tool-cache
|
||||
|
||||
steps:
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: '$(OV_PYTHON_VERSION)' # Setting only major & minor version will download latest release from GH repo example 3.10 will be 3.10.10.
|
||||
addToPath: true
|
||||
disableDownloadFromRegistry: false
|
||||
architecture: 'x64'
|
||||
githubToken: $(auth_token)
|
||||
displayName: Setup Python 3.11
|
||||
name: setupPython
|
||||
- bash: |
|
||||
#!/bin/bash
|
||||
python -V
|
||||
|
||||
- script: |
|
||||
curl -H Metadata:true --noproxy "*" "http://169.254.169.254/metadata/instance?api-version=2019-06-01"
|
||||
whoami
|
||||
uname -a
|
||||
echo Python3 info ; which python3 ; python3 --version
|
||||
echo Python info ; which python ; python --version
|
||||
echo Java info ; which java ; java -version
|
||||
echo gcc info ; which gcc ; gcc --version
|
||||
echo cmake info ; which cmake ; cmake --version
|
||||
lsb_release
|
||||
env
|
||||
cat /proc/cpuinfo
|
||||
cat /proc/meminfo
|
||||
cat /etc/fstab
|
||||
vmstat -s
|
||||
df
|
||||
lsblk -o NAME,HCTL,SIZE,MOUNTPOINT | grep -i "sd"
|
||||
free -h
|
||||
displayName: 'System info'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
rm -rf $(WORK_DIR) ; mkdir $(WORK_DIR)
|
||||
rm -rf $(BUILD_DIR) ; mkdir $(BUILD_DIR)
|
||||
rm -rf $(BUILD_SAMPLES_DIR) ; mkdir $(BUILD_SAMPLES_DIR)
|
||||
sudo rm -rf $(TMP_DIR) ; sudo mkdir $(TMP_DIR) ; sudo chmod 777 -R $(TMP_DIR)
|
||||
sudo mkdir -p $(SHARE_DIR)
|
||||
sudo apt --assume-yes update && sudo apt --assume-yes install nfs-common
|
||||
sudo mount -vvv -t nfs cinfsshare.file.core.windows.net:/cinfsshare/onnxtestdata $(SHARE_DIR) -o vers=4,minorversion=1,sec=sys
|
||||
mkdir -p $(CCACHE_DIR)
|
||||
displayName: 'Make dir'
|
||||
|
||||
- checkout: self
|
||||
clean: 'true'
|
||||
submodules: 'true'
|
||||
path: openvino
|
||||
|
||||
- checkout: openvino_contrib
|
||||
clean: 'true'
|
||||
submodules: 'true'
|
||||
path: openvino_contrib
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
sudo -E $(REPO_DIR)/install_build_dependencies.sh
|
||||
# Move jdk into contrib
|
||||
sudo apt --assume-yes install openjdk-11-jdk
|
||||
# Speed up build
|
||||
sudo apt -y --no-install-recommends install unzip
|
||||
wget https://github.com/ninja-build/ninja/releases/download/v1.10.2/ninja-linux.zip
|
||||
unzip ninja-linux.zip
|
||||
sudo cp -v ninja /usr/local/bin/
|
||||
displayName: 'Install dependencies'
|
||||
|
||||
- task: CMake@1
|
||||
inputs:
|
||||
# Coverity has too many PARSE_ERROR errors with ENABLE_FASTER_BUILD=ON. Disabling FASTER_BUILD.
|
||||
cmakeArgs: >
|
||||
-G "Ninja Multi-Config"
|
||||
-DENABLE_CPPLINT=OFF
|
||||
-DCMAKE_VERBOSE_MAKEFILE=ON
|
||||
-DENABLE_FASTER_BUILD=OFF
|
||||
-DENABLE_STRICT_DEPENDENCIES=OFF
|
||||
-DBUILD_nvidia_plugin=OFF
|
||||
-DOPENVINO_EXTRA_MODULES=$(OPENVINO_CONTRIB_REPO_DIR)/modules
|
||||
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache
|
||||
-DCMAKE_C_COMPILER_LAUNCHER=ccache
|
||||
-S $(REPO_DIR)
|
||||
-B $(BUILD_DIR)
|
||||
displayName: "Cmake configure"
|
||||
|
||||
- script: ls -alR $(REPO_DIR)/temp/
|
||||
displayName: 'List temp SDKs'
|
||||
|
||||
- script: ccache --zero-stats --max-size=50G --show-config
|
||||
displayName: 'Clean ccache stats'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
wget https://scan.coverity.com/download/linux64 --post-data "token=$(COVERITY_TOKEN)&project=openvino" -O coverity_tool.tgz
|
||||
tar xvf coverity_tool.tgz
|
||||
rm coverity_tool.tgz
|
||||
workingDirectory: $(WORK_DIR)
|
||||
displayName: 'Install coverity tool'
|
||||
|
||||
- script: |
|
||||
$(WORK_DIR)/cov-analysis*/bin/cov-build --dir $(BUILD_DIR)/cov-int \
|
||||
cmake --build $(BUILD_DIR) --parallel --config $(BUILD_TYPE)
|
||||
env:
|
||||
CCACHE_DIR: $(CCACHE_DIR)
|
||||
CCACHE_TEMPDIR: $(TMP_DIR)/ccache
|
||||
CCACHE_BASEDIR: $(Pipeline.Workspace)
|
||||
CCACHE_MAXSIZE: 50G
|
||||
displayName: 'Build Lin with Coverity'
|
||||
|
||||
- script: ccache --show-stats
|
||||
displayName: 'Show ccache stats'
|
||||
|
||||
- script: ls -alR $(REPO_DIR)/bin/
|
||||
displayName: 'List bin files'
|
||||
|
||||
- script: tar -C $(BUILD_DIR) -czvf openvino.tgz cov-int
|
||||
workingDirectory: $(BUILD_DIR)
|
||||
displayName: 'Pack cov-int folder for submission'
|
||||
|
||||
- script: |
|
||||
curl --form token=$(COVERITY_TOKEN) \
|
||||
--form email=$(COVERITY_USER) \
|
||||
--form file=@openvino.tgz \
|
||||
--form version="$(Build.SourceVersion)" \
|
||||
--form description="https://github.com/openvinotoolkit/openvino/runs/$(Build.BuildNumber)" \
|
||||
https://scan.coverity.com/builds?project=openvino
|
||||
workingDirectory: $(BUILD_DIR)
|
||||
displayName: 'Submit for analysis'
|
||||
147
modules/openvino-master/.ci/azure/linux_cuda.yml
Normal file
@ -0,0 +1,147 @@
|
||||
trigger:
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
- 'tools/*'
|
||||
- 'tests/layer_tests/*'
|
||||
|
||||
pr:
|
||||
drafts: 'false'
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
- 'tools/*'
|
||||
- 'tests/layer_tests/*'
|
||||
|
||||
resources:
|
||||
repositories:
|
||||
- repository: openvino_contrib
|
||||
type: github
|
||||
endpoint: openvinotoolkit
|
||||
name: openvinotoolkit/openvino_contrib
|
||||
ref: master
|
||||
|
||||
- repository: testdata
|
||||
type: github
|
||||
endpoint: openvinotoolkit
|
||||
name: openvinotoolkit/testdata
|
||||
ref: master
|
||||
|
||||
jobs:
|
||||
- job: CUDAPlugin_Lin
|
||||
timeoutInMinutes: '60'
|
||||
|
||||
pool:
|
||||
name: LIN_VMSS_VENV_F16S_U20_WU2
|
||||
|
||||
variables:
|
||||
system.debug: true
|
||||
VSTS_HTTP_RETRY: 5
|
||||
VSTS_HTTP_TIMEOUT: 200
|
||||
BUILD_TYPE: Release
|
||||
HOME_DIR: $(Agent.HomeDirectory)
|
||||
REPO_DIR: $(Build.Repository.LocalPath)
|
||||
OPENVINO_REPO_DIR: $(REPO_DIR)/../openvino
|
||||
MODELS_PATH: $(REPO_DIR)/../testdata
|
||||
WORK_DIR: $(Pipeline.Workspace)/_w
|
||||
BUILD_DIR: $(WORK_DIR)/build
|
||||
BIN_DIR: $(OPENVINO_REPO_DIR)/bin/intel64/$(BUILD_TYPE)
|
||||
INSTALL_DIR: $(WORK_DIR)/install_pkg
|
||||
SETUPVARS: $(INSTALL_DIR)/setupvars.sh
|
||||
GRADLE_VER: 7.1.1
|
||||
|
||||
steps:
|
||||
- script: |
|
||||
curl -H Metadata:true --noproxy "*" "http://169.254.169.254/metadata/instance?api-version=2019-06-01"
|
||||
echo # prev line output doesn't end with eol
|
||||
whoami
|
||||
uname -a
|
||||
echo Python3 info ; which python3 ; python3 --version
|
||||
echo Python info ; which python ; python --version
|
||||
echo Java info ; which java ; java -version
|
||||
echo gcc info ; which gcc ; gcc --version
|
||||
echo cmake info ; which cmake ; cmake --version
|
||||
lsb_release
|
||||
env
|
||||
cat /proc/cpuinfo
|
||||
cat /proc/meminfo
|
||||
cat /etc/fstab
|
||||
vmstat -s
|
||||
df
|
||||
lsblk -o NAME,HCTL,SIZE,MOUNTPOINT | grep -i "sd"
|
||||
free -h
|
||||
displayName: 'System info'
|
||||
|
||||
- script: |
|
||||
rm -rf $(WORK_DIR) ; mkdir $(WORK_DIR)
|
||||
rm -rf $(BUILD_DIR) ; mkdir $(BUILD_DIR)
|
||||
displayName: 'Make dir'
|
||||
|
||||
- checkout: self
|
||||
clean: 'true'
|
||||
submodules: 'true'
|
||||
path: openvino
|
||||
|
||||
- checkout: openvino_contrib
|
||||
clean: 'true'
|
||||
submodules: 'true'
|
||||
path: openvino_contrib
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
curl -fsSL https://get.docker.com -o get-docker.sh
|
||||
sudo sh get-docker.sh
|
||||
# Speed up build
|
||||
sudo apt --assume-yes install unzip
|
||||
wget https://github.com/ninja-build/ninja/releases/download/v1.10.2/ninja-linux.zip
|
||||
unzip ninja-linux.zip
|
||||
displayName: 'Install dependencies'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
sudo docker pull openvino.azurecr.io/openvino_ci/cuda-ubuntu2004:2022.1
|
||||
sudo docker run --volume $(REPO_DIR)/../:/root/repos --volume $(WORK_DIR):/root/w \
|
||||
openvino.azurecr.io/openvino_ci/cuda-ubuntu2004:2022.1 \
|
||||
bash -c "
|
||||
sudo -E /root/repos/openvino/install_build_dependencies.sh &&
|
||||
python3 -m pip install -r /root/repos/openvino/src/bindings/python/requirements.txt &&
|
||||
cmake -GNinja \
|
||||
-DCMAKE_VERBOSE_MAKEFILE=ON \
|
||||
-DENABLE_CPPLINT=OFF \
|
||||
-DCMAKE_BUILD_TYPE=$(BUILD_TYPE) \
|
||||
-DOPENVINO_EXTRA_MODULES=/root/repos/openvino_contrib/modules/nvidia_plugin \
|
||||
-DENABLE_INTEL_CPU=OFF \
|
||||
-DENABLE_INTEL_GPU=OFF \
|
||||
-DENABLE_INTEL_GNA=OFF \
|
||||
-DENABLE_OV_TF_FRONTEND=OFF \
|
||||
-DENABLE_OV_PADDLE_FRONTEND=OFF \
|
||||
-DENABLE_OV_PYTORCH_FRONTEND=OFF \
|
||||
-DENABLE_OV_ONNX_FRONTEND=OFF \
|
||||
-DENABLE_PYTHON=OFF \
|
||||
-DENABLE_TESTS=ON \
|
||||
-DENABLE_DATA=OFF \
|
||||
-S /root/repos/openvino \
|
||||
-B /root/w/build &&
|
||||
cmake --build /root/w/build --parallel --config Release --verbose -- ov_nvidia_func_tests ov_nvidia_unit_tests"
|
||||
displayName: 'Docker build Lin'
|
||||
|
||||
- script: ls -alR $(OPENVINO_REPO_DIR)/bin/
|
||||
displayName: 'List bin files'
|
||||
443
modules/openvino-master/.ci/azure/linux_debian.yml
Normal file
@ -0,0 +1,443 @@
|
||||
trigger:
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
- 'tests/layer_tests/*'
|
||||
|
||||
pr:
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
- 'tests/layer_tests/*'
|
||||
|
||||
resources:
|
||||
repositories:
|
||||
- repository: testdata
|
||||
type: github
|
||||
endpoint: openvinotoolkit
|
||||
name: openvinotoolkit/testdata
|
||||
ref: master
|
||||
|
||||
jobs:
|
||||
- job: Lin_Debian
|
||||
# About 150% of total time
|
||||
timeoutInMinutes: '120'
|
||||
|
||||
pool:
|
||||
name: LIN_VMSS_VENV_F16S_U20_WU2
|
||||
|
||||
variables:
|
||||
system.debug: true
|
||||
VSTS_HTTP_RETRY: 5
|
||||
VSTS_HTTP_TIMEOUT: 200
|
||||
BUILD_TYPE: Release
|
||||
REPO_DIR: $(Build.Repository.LocalPath)
|
||||
MODELS_PATH: $(REPO_DIR)/../testdata
|
||||
WORK_DIR: $(Pipeline.Workspace)/_w
|
||||
BUILD_DIR: $(WORK_DIR)/build
|
||||
BUILD_SAMPLES_DIR: $(WORK_DIR)/build_samples
|
||||
BUILD_LAYER_TESTS_DIR: $(WORK_DIR)/build_layer_tests
|
||||
BUILD_SAMPLES_TESTS_DIR: $(WORK_DIR)/build_samples_tests
|
||||
INSTALL_DIR: $(WORK_DIR)/install_pkg
|
||||
INSTALL_TEST_DIR: $(INSTALL_DIR)/tests
|
||||
LAYER_TESTS_DIR: $(INSTALL_TEST_DIR)/layer_tests
|
||||
SAMPLES_INSTALL_DIR: /usr/share/openvino/samples
|
||||
PYTHON_SAMPLES_INSTALL_DIR: $(INSTALL_DIR)/share/openvino/samples/python
|
||||
PYTHON_WHEEL_INSTALL_DIR: $HOME/.local/lib/python3.8/site-packages
|
||||
BUILD_VENV: $(WORK_DIR)/build_venv
|
||||
TEST_VENV: $(WORK_DIR)/test_venv
|
||||
TMP_DIR: /mnt/tmp
|
||||
SHARE_DIR: /mount/cinfsshare/onnxtestdata
|
||||
CCACHE_DIR: $(SHARE_DIR)/ccache/master/linux
|
||||
|
||||
steps:
|
||||
- script: |
|
||||
curl -H Metadata:true --noproxy "*" "http://169.254.169.254/metadata/instance?api-version=2019-06-01"
|
||||
whoami
|
||||
uname -a
|
||||
echo Python3 info ; which python3 ; python3 --version
|
||||
echo Python info ; which python ; python --version
|
||||
echo gcc info ; which gcc ; gcc --version
|
||||
echo cmake info ; which cmake ; cmake --version
|
||||
lsb_release
|
||||
env
|
||||
cat /proc/cpuinfo
|
||||
cat /proc/meminfo
|
||||
cat /etc/fstab
|
||||
vmstat -s
|
||||
df
|
||||
lsblk -o NAME,HCTL,SIZE,MOUNTPOINT | grep -i "sd"
|
||||
free -h
|
||||
echo TargetBranch: $(System.PullRequest.TargetBranch)
|
||||
echo SourceBranch: $(Build.SourceBranch)
|
||||
displayName: 'System info'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
rm -rf $(WORK_DIR) ; mkdir $(WORK_DIR)
|
||||
rm -rf $(BUILD_DIR) ; mkdir $(BUILD_DIR)
|
||||
rm -rf $(BUILD_SAMPLES_DIR) ; mkdir $(BUILD_SAMPLES_DIR)
|
||||
sudo rm -rf $(TMP_DIR) ; sudo mkdir $(TMP_DIR) ; sudo chmod 777 -R $(TMP_DIR)
|
||||
sudo mkdir -p $(SHARE_DIR)
|
||||
sudo apt --assume-yes update && sudo apt --assume-yes install nfs-common
|
||||
sudo mount -vvv -t nfs cinfsshare.file.core.windows.net:/cinfsshare/onnxtestdata $(SHARE_DIR) -o vers=4,minorversion=1,sec=sys
|
||||
mkdir -p $(CCACHE_DIR)
|
||||
displayName: 'Make dir'
|
||||
|
||||
- checkout: self
|
||||
clean: 'true'
|
||||
submodules: 'true'
|
||||
path: openvino
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
sudo -E $(REPO_DIR)/install_build_dependencies.sh
|
||||
# 'clang' is used as a default compiler
|
||||
sudo apt --assume-yes install clang
|
||||
sudo apt --assume-yes install --no-install-recommends libopencv-imgproc-dev libopencv-imgcodecs-dev
|
||||
# install build dependencies
|
||||
(cd $(WORK_DIR) && python3 -m venv build_venv)
|
||||
$(BUILD_VENV)/bin/python3 -m pip install -U pip
|
||||
$(BUILD_VENV)/bin/python3 -m pip install -r $(REPO_DIR)/src/bindings/python/wheel/requirements-dev.txt
|
||||
$(BUILD_VENV)/bin/python3 -m pip install -r $(REPO_DIR)/src/bindings/python/requirements.txt
|
||||
# For running Python API tests
|
||||
$(BUILD_VENV)/bin/python3 -m pip install -r $(REPO_DIR)/src/bindings/python/src/compatibility/openvino/requirements-dev.txt
|
||||
# For running Paddle frontend unit tests
|
||||
$(BUILD_VENV)/bin/python3 -m pip install -r $(REPO_DIR)/src/frontends/paddle/tests/requirements.txt
|
||||
# For running ONNX frontend unit tests
|
||||
$(BUILD_VENV)/bin/python3 -m pip install -r $(REPO_DIR)/src/frontends/onnx/tests/requirements.txt
|
||||
# For running TensorFlow frontend unit tests
|
||||
$(BUILD_VENV)/bin/python3 -m pip install -r $(REPO_DIR)/src/frontends/tensorflow/tests/requirements.txt
|
||||
# For MO unit tests
|
||||
(cd $(WORK_DIR) && python3 -m venv test_venv)
|
||||
$(TEST_VENV)/bin/python3 -m pip install -U pip
|
||||
$(TEST_VENV)/bin/python3 -m pip install -r $(REPO_DIR)/tools/mo/requirements_mxnet.txt
|
||||
$(TEST_VENV)/bin/python3 -m pip install -r $(REPO_DIR)/tools/mo/requirements_caffe.txt
|
||||
$(TEST_VENV)/bin/python3 -m pip install -r $(REPO_DIR)/tools/mo/requirements_kaldi.txt
|
||||
$(TEST_VENV)/bin/python3 -m pip install -r $(REPO_DIR)/tools/mo/requirements_onnx.txt
|
||||
$(TEST_VENV)/bin/python3 -m pip install -r $(REPO_DIR)/tools/mo/requirements_tf2.txt
|
||||
$(TEST_VENV)/bin/python3 -m pip install -r $(REPO_DIR)/tools/mo/requirements_dev.txt
|
||||
$(TEST_VENV)/bin/python3 -m pip install -r $(REPO_DIR)/src/frontends/paddle/tests/requirements.txt
|
||||
# for Python API tests
|
||||
/usr/bin/python3 -m pip install -r $(REPO_DIR)/src/bindings/python/requirements_test.txt
|
||||
/usr/bin/python3 -m pip install -r $(REPO_DIR)/tools/mo/requirements.txt
|
||||
/usr/bin/python3 -m pip uninstall -y numpy # apt-get install python3-numpy will be used
|
||||
# Speed up build
|
||||
sudo apt -y --no-install-recommends install unzip
|
||||
wget https://github.com/ninja-build/ninja/releases/download/v1.10.2/ninja-linux.zip
|
||||
unzip ninja-linux.zip
|
||||
sudo cp -v ninja /usr/local/bin/
|
||||
# Speed up tests
|
||||
git clone https://github.com/google/gtest-parallel.git
|
||||
displayName: 'Install build dependencies'
|
||||
|
||||
# Should be after 'Install dependencies' because Git lfs is not installed
|
||||
- checkout: testdata
|
||||
clean: 'true'
|
||||
lfs: 'true'
|
||||
path: testdata
|
||||
|
||||
- task: CMake@1
|
||||
inputs:
|
||||
# CMake must get Python 3.x version by default
|
||||
cmakeArgs: >
|
||||
-GNinja
|
||||
-DENABLE_CPPLINT=OFF
|
||||
-DCMAKE_BUILD_TYPE=$(BUILD_TYPE)
|
||||
-DCMAKE_COMPILE_WARNING_AS_ERROR=ON
|
||||
-DENABLE_PYTHON=ON
|
||||
-DENABLE_INTEL_GNA=OFF
|
||||
-DPYTHON_EXECUTABLE=$(BUILD_VENV)/bin/python3
|
||||
-DENABLE_TESTS=ON
|
||||
-DENABLE_FASTER_BUILD=ON
|
||||
-DENABLE_STRICT_DEPENDENCIES=OFF
|
||||
-DENABLE_SYSTEM_SNAPPY=ON
|
||||
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache
|
||||
-DCMAKE_C_COMPILER_LAUNCHER=ccache
|
||||
-DCMAKE_CXX_LINKER_LAUNCHER=ccache
|
||||
-DCMAKE_C_LINKER_LAUNCHER=ccache
|
||||
-DENABLE_PYTHON_PACKAGING=ON
|
||||
-DCPACK_GENERATOR=DEB
|
||||
-S $(REPO_DIR)
|
||||
-B $(BUILD_DIR)
|
||||
displayName: 'CMake OpenVINO'
|
||||
|
||||
- script: ls -alR $(REPO_DIR)/temp/
|
||||
displayName: 'List temp SDKs'
|
||||
|
||||
- script: ccache --zero-stats --max-size=50G --show-config
|
||||
displayName: 'Clean ccache stats'
|
||||
|
||||
- script: cmake --build $(BUILD_DIR) --parallel --config $(BUILD_TYPE)
|
||||
env:
|
||||
CCACHE_DIR: $(CCACHE_DIR)
|
||||
CCACHE_TEMPDIR: $(TMP_DIR)/ccache
|
||||
CCACHE_BASEDIR: $(Pipeline.Workspace)
|
||||
CCACHE_MAXSIZE: 50G
|
||||
displayName: 'Build Lin'
|
||||
|
||||
- script: ccache --show-stats
|
||||
displayName: 'Show ccache stats'
|
||||
|
||||
- script: ls -alR $(REPO_DIR)/bin/
|
||||
displayName: 'List bin files'
|
||||
|
||||
- task: CMake@1
|
||||
inputs:
|
||||
cmakeArgs: >
|
||||
-GNinja
|
||||
-S $(REPO_DIR)/tests/layer_tests
|
||||
-B $(BUILD_LAYER_TESTS_DIR)
|
||||
displayName: 'CMake Layer Tests'
|
||||
|
||||
- script: cmake --build $(BUILD_LAYER_TESTS_DIR) --parallel --config $(BUILD_TYPE)
|
||||
displayName: 'Build Layer Tests'
|
||||
|
||||
# to check that wheel packages tested later, contain all all the dependencies like TBB or pugixml
|
||||
- script: sudo apt-get remove libtbb2 libpugixml1v5 -y
|
||||
displayName: 'Remove debian dependencies'
|
||||
|
||||
- script: cmake -DCOMPONENT=python_wheels -DCMAKE_INSTALL_PREFIX=$(INSTALL_DIR) -P $(BUILD_DIR)/cmake_install.cmake
|
||||
displayName: 'Install wheel packages'
|
||||
|
||||
- script: cmake -DCOMPONENT=python_samples -DCMAKE_INSTALL_PREFIX=$(INSTALL_DIR) -P $(BUILD_DIR)/cmake_install.cmake
|
||||
displayName: 'Install Python Samples'
|
||||
|
||||
- script: cmake -DCOMPONENT=tests -DCMAKE_INSTALL_PREFIX=$(INSTALL_DIR) -P $(BUILD_LAYER_TESTS_DIR)/cmake_install.cmake
|
||||
displayName: 'Install Layer Tests'
|
||||
|
||||
- script: cmake -DCMAKE_INSTALL_PREFIX=$(INSTALL_DIR) -DCOMPONENT=tests -P $(BUILD_DIR)/cmake_install.cmake
|
||||
displayName: 'Install tests'
|
||||
|
||||
- script: ls -alR $(INSTALL_DIR)
|
||||
displayName: 'List install test files'
|
||||
|
||||
- script: |
|
||||
sudo apt-get install libtbb-dev libpugixml-dev -y
|
||||
cmake --build $(BUILD_DIR) --config $(BUILD_TYPE) --target package --parallel
|
||||
displayName: 'Build Debian packages'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
# install debian packages from previous release
|
||||
sudo apt-get -y update
|
||||
sudo apt-get install --no-install-recommends gnupg wget -y
|
||||
wget https://apt.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB
|
||||
sudo apt-key add GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB
|
||||
echo "deb https://apt.repos.intel.com/openvino/2023 ubuntu20 main" | sudo tee /etc/apt/sources.list.d/intel-openvino-2023.list
|
||||
sudo apt-get update -o Dir::Etc::sourcelist=/etc/apt/sources.list.d/intel-openvino-2023.list
|
||||
sudo apt-get install openvino -y
|
||||
# install our local one and make sure the conflicts are resolved
|
||||
sudo apt-get install --no-install-recommends dpkg-dev -y
|
||||
rm -r _CPack_Packages
|
||||
dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz
|
||||
echo "deb [trusted=yes] file:$(BUILD_DIR) ./" | sudo tee /etc/apt/sources.list.d/openvino-local.list
|
||||
sudo apt-get update
|
||||
sudo apt-get install openvino -y
|
||||
workingDirectory: $(BUILD_DIR)
|
||||
displayName: 'Install Debian packages'
|
||||
|
||||
- script: ls -alR $(INSTALL_DIR)
|
||||
displayName: 'List install files'
|
||||
|
||||
- script: rm -fr $(BUILD_DIR)
|
||||
displayName: 'Clean build dir'
|
||||
|
||||
- script: $(SAMPLES_INSTALL_DIR)/cpp/build_samples.sh -i $(INSTALL_DIR)
|
||||
displayName: 'Build cpp samples - gcc'
|
||||
|
||||
- script: $(SAMPLES_INSTALL_DIR)/cpp/build_samples.sh -i $(INSTALL_DIR)
|
||||
displayName: 'Build cpp samples - clang'
|
||||
env:
|
||||
CC: clang
|
||||
CXX: clang++
|
||||
|
||||
- script: $(SAMPLES_INSTALL_DIR)/c/build_samples.sh -i $(INSTALL_DIR)
|
||||
displayName: 'Build c samples'
|
||||
|
||||
- script: $(INSTALL_TEST_DIR)/ov_core_unit_tests --gtest_print_time=1 --gtest_filter=-*IE_GPU* --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-NGraphUT.xml
|
||||
env:
|
||||
LD_LIBRARY_PATH: $(INSTALL_TEST_DIR)
|
||||
displayName: 'OV Core UT'
|
||||
|
||||
- script: |
|
||||
$(INSTALL_TEST_DIR)/ov_proxy_plugin_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-OVProxyTests.xml
|
||||
env:
|
||||
LD_LIBRARY_PATH: $(INSTALL_TEST_DIR)
|
||||
displayName: 'OV Proxy Tests'
|
||||
|
||||
- script: |
|
||||
$(INSTALL_TEST_DIR)/ov_hetero_func_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-OVHeteroFuncTests.xml
|
||||
env:
|
||||
LD_LIBRARY_PATH: $(INSTALL_TEST_DIR)
|
||||
displayName: 'OV Hetero Func Tests'
|
||||
|
||||
- script: $(INSTALL_TEST_DIR)/ov_onnx_frontend_tests --gtest_print_time=1 --gtest_filter=-*IE_GPU* --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ONNXFrontend.xml
|
||||
env:
|
||||
LD_LIBRARY_PATH: $(INSTALL_TEST_DIR)
|
||||
displayName: 'ONNX Frontend Tests'
|
||||
|
||||
# TODO Reenable PDPD after paddlepaddle==2.5.0 with compliant protobuf is released (ticket 95904)
|
||||
- script: $(INSTALL_TEST_DIR)/paddle_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-Paddle.xml
|
||||
env:
|
||||
LD_LIBRARY_PATH: $(INSTALL_TEST_DIR)
|
||||
displayName: 'Paddle Frontend UT'
|
||||
enabled: 'false'
|
||||
|
||||
- script: $(INSTALL_TEST_DIR)/ov_tensorflow_frontend_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-Tensorflow.xml
|
||||
env:
|
||||
LD_LIBRARY_PATH: $(INSTALL_TEST_DIR)
|
||||
displayName: 'TensorFlow Frontend Unit Tests'
|
||||
|
||||
- script: $(INSTALL_TEST_DIR)/ov_tensorflow_common_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-TensorflowCommon.xml
|
||||
env:
|
||||
LD_LIBRARY_PATH: $(INSTALL_TEST_DIR)
|
||||
displayName: 'TensorFlow Common Unit Tests'
|
||||
|
||||
- script: $(INSTALL_TEST_DIR)/ov_tensorflow_lite_frontend_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-TensorflowLite.xml
|
||||
env:
|
||||
LD_LIBRARY_PATH: $(INSTALL_TEST_DIR)
|
||||
displayName: 'TensorFlow Lite Frontend Unit Tests'
|
||||
|
||||
- script: $(INSTALL_TEST_DIR)/ov_snippets_func_tests --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ov_snippets_func_tests.xml
|
||||
displayName: 'Snippets Func Tests'
|
||||
|
||||
- script: $(INSTALL_TEST_DIR)/ov_cpu_unit_tests --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ov_cpu_unit_tests.xml
|
||||
displayName: 'Intel CPU Unit Tests'
|
||||
|
||||
- script: $(INSTALL_TEST_DIR)/ov_auto_unit_tests --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ov_auto_unit_tests.xml
|
||||
displayName: 'AUTO UT'
|
||||
|
||||
- script: $(INSTALL_TEST_DIR)/ov_template_func_tests --gtest_filter=*smoke* --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-templateFuncTests.xml
|
||||
env:
|
||||
LD_LIBRARY_PATH: $(INSTALL_TEST_DIR)
|
||||
displayName: 'TEMPLATE FuncTests'
|
||||
|
||||
- script: $(INSTALL_TEST_DIR)/InferenceEngineCAPITests --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-InferenceEngineCAPITests.xml
|
||||
env:
|
||||
DATA_PATH: $(MODELS_PATH)
|
||||
MODELS_PATH: $(MODELS_PATH)
|
||||
displayName: 'IE CAPITests'
|
||||
|
||||
- script: $(INSTALL_TEST_DIR)/ov_capi_test --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ov_capi_test.xml
|
||||
env:
|
||||
DATA_PATH: $(MODELS_PATH)
|
||||
MODELS_PATH: $(MODELS_PATH)
|
||||
displayName: 'OV CAPITests'
|
||||
|
||||
# Skip test_onnx/test_zoo_models and test_onnx/test_backend due to long execution time
|
||||
- script: |
|
||||
/usr/bin/python3 -m pytest -s $(INSTALL_TEST_DIR)/pyngraph \
|
||||
--junitxml=$(INSTALL_TEST_DIR)/TEST-Pyngraph.xml \
|
||||
--ignore=$(INSTALL_TEST_DIR)/pyngraph/tests/test_onnx/test_zoo_models.py \
|
||||
--ignore=$(INSTALL_TEST_DIR)/pyngraph/tests/test_onnx/test_backend.py
|
||||
env:
|
||||
LD_LIBRARY_PATH: $(INSTALL_TEST_DIR)
|
||||
displayName: 'nGraph and IE Python Bindings Tests'
|
||||
|
||||
- script: |
|
||||
/usr/bin/python3 -m pytest -s $(INSTALL_TEST_DIR)/pyopenvino \
|
||||
--junitxml=$(INSTALL_TEST_DIR)/TEST-Pyngraph.xml \
|
||||
--ignore=$(INSTALL_TEST_DIR)/pyopenvino/tests/test_utils/test_utils.py -v
|
||||
env:
|
||||
# Required by python imports to load requires libraries
|
||||
# - tests install dir for mock_py
|
||||
LD_LIBRARY_PATH: $(INSTALL_TEST_DIR)
|
||||
# For python imports to import pybind_mock_frontend
|
||||
PYTHONPATH: $(INSTALL_TEST_DIR):$(REPO_DIR)/tools/mo/
|
||||
displayName: 'Python API 2.0 Tests'
|
||||
|
||||
# Skip test_onnx/test_zoo_models and test_onnx/test_backend due to long execution time
|
||||
- script: |
|
||||
/usr/bin/python3 -m pytest -s $(REPO_DIR)/src/frontends/onnx/tests \
|
||||
--ignore=$(REPO_DIR)/src/frontends/onnx/tests/test_python/test_zoo_models.py \
|
||||
--ignore=$(REPO_DIR)/src/frontends/onnx/tests/test_python/test_backend.py -v
|
||||
env:
|
||||
LD_LIBRARY_PATH: $(INSTALL_TEST_DIR)
|
||||
PYTHONPATH: $(INSTALL_TEST_DIR)
|
||||
displayName: 'ONNX Frontend Python Tests'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
# TODO: fix 'No mock frontend API available'
|
||||
$(TEST_VENV)/bin/python3 -m pip install openvino-dev --find-links=$(INSTALL_DIR)/tools
|
||||
$(TEST_VENV)/bin/python3 -m pytest -s $(INSTALL_TEST_DIR)/mo/unit_tests --junitxml=$(INSTALL_TEST_DIR)/TEST-ModelOptimizer.xml
|
||||
env:
|
||||
PYTHONPATH: $(REPO_DIR)/tools/ovc/
|
||||
displayName: 'Model Optimizer UT'
|
||||
|
||||
# run not all smoke filter to save time in post-commit
|
||||
- script: $(INSTALL_TEST_DIR)/ov_cpu_func_tests --gtest_filter=*OVCLass*:*CoreThreadingTests* --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ov_cpu_func_tests.xml
|
||||
displayName: 'CPU FuncTests'
|
||||
|
||||
- task: CMake@1
|
||||
inputs:
|
||||
cmakeArgs: >
|
||||
-GNinja
|
||||
-S $(REPO_DIR)/tests/samples_tests
|
||||
-B $(BUILD_SAMPLES_TESTS_DIR)
|
||||
displayName: 'CMake Samples Tests'
|
||||
|
||||
- script: cmake -DCOMPONENT=tests -DCMAKE_INSTALL_PREFIX=$(INSTALL_DIR) -P $(BUILD_SAMPLES_TESTS_DIR)/cmake_install.cmake
|
||||
displayName: 'Install Samples Tests'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
/usr/bin/python3 -m pip install -r $(INSTALL_TEST_DIR)/smoke_tests/requirements.txt
|
||||
# GNA isn't a part of Debian package, so filter out that tests
|
||||
/usr/bin/python3 -m pytest $(INSTALL_TEST_DIR)/smoke_tests/ -k "not GNA" --env_conf $(INSTALL_TEST_DIR)/smoke_tests/env_config.yml -s --junitxml=$(INSTALL_TEST_DIR)/TEST-SamplesSmokeTests.xml
|
||||
env:
|
||||
IE_APP_PATH: $(INSTALL_DIR)/samples_bin
|
||||
LD_LIBRARY_PATH: $(INSTALL_DIR)/samples_bin
|
||||
IE_APP_PYTHON_PATH: $(PYTHON_SAMPLES_INSTALL_DIR)/
|
||||
SHARE: $(INSTALL_TEST_DIR)/smoke_tests/samples_smoke_tests_data/
|
||||
WORKSPACE: $(INSTALL_DIR)
|
||||
displayName: 'Samples Smoke Tests'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
$(TEST_VENV)/bin/python3 -m pip install -r $(LAYER_TESTS_DIR)/requirements.txt
|
||||
$(TEST_VENV)/bin/python3 -m pytest $(LAYER_TESTS_DIR)/tensorflow_tests/test_tf_Roll.py --ir_version=10 --junitxml=$(INSTALL_TEST_DIR)/TEST-tf_Roll.xmlTEST
|
||||
env:
|
||||
PYTHONPATH: $(REPO_DIR)/tools/ovc/:$(LAYER_TESTS_DIR)
|
||||
displayName: 'TensorFlow 1 Layer Tests - Legacy FE'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
$(TEST_VENV)/bin/python3 -m pip install -r $(LAYER_TESTS_DIR)/requirements.txt
|
||||
$(RUN_PREFIX) $(TEST_VENV)/bin/python3 -m pytest $(LAYER_TESTS_DIR)/tensorflow_lite_tests/ --junitxml=$(INSTALL_TEST_DIR)/TEST-tfl_fe.xmlTEST
|
||||
env:
|
||||
PYTHONPATH: $(REPO_DIR)/tools/ovc/:$(REPO_DIR)/tools/mo/:$(LAYER_TESTS_DIR)
|
||||
TEST_DEVICE: CPU
|
||||
displayName: 'TensorFlow Lite Layer Tests - TFL FE'
|
||||
|
||||
- task: PublishTestResults@2
|
||||
condition: always()
|
||||
inputs:
|
||||
testResultsFormat: 'JUnit' # Options: JUnit, NUnit, VSTest, xUnit, cTest
|
||||
testResultsFiles: '**/TEST-*.xml'
|
||||
#searchFolder: '$(BUILD_DIR)'
|
||||
mergeTestResults: false # Optional
|
||||
#failTaskOnFailedTests: false # Optional
|
||||
#testRunTitle: 'Pre/Post-Commit' # Optional
|
||||
buildPlatform: 'x64' # Optional
|
||||
buildConfiguration: 'Linux' # Optional
|
||||
#publishRunAttachments: true # Optional
|
||||
80
modules/openvino-master/.ci/azure/linux_lohika.yml
Normal file
@ -0,0 +1,80 @@
|
||||
#resources:
|
||||
# repositories:
|
||||
# - repository: testdata
|
||||
# type: github
|
||||
# endpoint: openvinotoolkit
|
||||
# name: openvinotoolkit/testdata
|
||||
# ref: master
|
||||
|
||||
jobs:
|
||||
- job: Lin_lohika
|
||||
# About 150% of total time
|
||||
timeoutInMinutes: '90'
|
||||
|
||||
pool:
|
||||
name: LIN_LOHIKA
|
||||
|
||||
variables:
|
||||
system.debug: true
|
||||
# VSTS_HTTP_RETRY: 5
|
||||
# VSTS_HTTP_TIMEOUT: 200
|
||||
# BUILD_TYPE: Release
|
||||
# REPO_DIR: $(Build.Repository.LocalPath)
|
||||
# MODELS_PATH: $(REPO_DIR)/../testdata
|
||||
# WORK_DIR: $(Pipeline.Workspace)/_w
|
||||
# BUILD_DIR: $(WORK_DIR)/build
|
||||
|
||||
steps:
|
||||
- script: git -C ~/work/openvino fetch origin $(Build.SourceBranch)
|
||||
displayName: fetch
|
||||
|
||||
# - checkout: self
|
||||
# clean: 'true'
|
||||
# submodules: 'true'
|
||||
# path: openvino
|
||||
|
||||
- checkout: none
|
||||
|
||||
- script: git -C ~/work/openvino checkout -m $(Build.SourceVersion) && git -C ~/work/openvino submodule update --init --recursive
|
||||
displayName: checkout
|
||||
|
||||
# Should be after 'Install dependencies' because Git lfs is not installed
|
||||
# - checkout: testdata
|
||||
# clean: 'true'
|
||||
# submodules: 'true'
|
||||
# lfs: 'true'
|
||||
# path: testdata
|
||||
|
||||
- script: env -C ~/work ./configreleasenolto.sh
|
||||
displayName: CMake
|
||||
|
||||
# - task: CMake@1
|
||||
# inputs:
|
||||
# # CMake must get Python 3.x version by default
|
||||
# cmakeArgs: >
|
||||
# -GNinja
|
||||
# -DENABLE_CPPLINT=OFF
|
||||
# -DCMAKE_VERBOSE_MAKEFILE=ON
|
||||
# -DCMAKE_BUILD_TYPE=$(BUILD_TYPE)
|
||||
# -DENABLE_PYTHON=ON
|
||||
# -DPYTHON_EXECUTABLE=/usr/bin/python3.8
|
||||
# -DENABLE_TESTS=ON
|
||||
# -DENABLE_OV_ONNX_FRONTEND=ON
|
||||
# -DENABLE_FASTER_BUILD=ON
|
||||
# -DENABLE_STRICT_DEPENDENCIES=OFF
|
||||
# -DOPENVINO_EXTRA_MODULES=$(OPENVINO_CONTRIB_REPO_DIR)/modules
|
||||
# -S $(REPO_DIR)
|
||||
# -B $(BUILD_DIR)
|
||||
|
||||
- script: |
|
||||
env -C ~/work
|
||||
./buildreleasenolto.sh
|
||||
libopenvino_gapi_preproc.so
|
||||
openvino_intel_cpu_plugin
|
||||
openvino_intel_gpu_plugin
|
||||
ov_gpu_unit_tests
|
||||
gpuFuncTests
|
||||
displayName: Build Lin
|
||||
|
||||
- script: ~/work/testreleasenolto.sh
|
||||
displayName: cldnn tests
|
||||
126
modules/openvino-master/.ci/azure/linux_ngraph_onnx.yml
Normal file
@ -0,0 +1,126 @@
|
||||
trigger:
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
- 'tools/*'
|
||||
- 'tests/layer_tests/*'
|
||||
|
||||
pr:
|
||||
drafts: 'false'
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- releases/*
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
- 'tools/*'
|
||||
- 'tests/layer_tests/*'
|
||||
|
||||
jobs:
|
||||
- job: OpenVINO_ONNX_CI
|
||||
strategy:
|
||||
matrix:
|
||||
Release:
|
||||
BUILD_TYPE: 'Release'
|
||||
TOX_COMMAND: 'tox && tox -e zoo_models'
|
||||
Debug:
|
||||
BUILD_TYPE: 'Debug'
|
||||
TOX_COMMAND: 'tox'
|
||||
maxParallel: '2'
|
||||
|
||||
# About 300% of total time
|
||||
timeoutInMinutes: '90'
|
||||
|
||||
pool:
|
||||
name: LIN_VMSS_VENV_ONNX_U20_WU2
|
||||
|
||||
variables:
|
||||
system.debug: true
|
||||
VSTS_HTTP_RETRY: 5
|
||||
VSTS_HTTP_TIMEOUT: 200
|
||||
REPO_DIR: $(Build.Repository.LocalPath)
|
||||
WORK_DIR: $(Pipeline.Workspace)/_w
|
||||
MODELS_DIR: /mount/cinfsshare/onnxtestdata
|
||||
TMP_DIR: /mnt/tmp
|
||||
ONNX_MODEL_ZOO_SHA: "d58213534f2a4d1c4b19ba62b3bb5f544353256e"
|
||||
|
||||
|
||||
steps:
|
||||
- script: |
|
||||
curl -H Metadata:true --noproxy "*" "http://169.254.169.254/metadata/instance?api-version=2019-06-01"
|
||||
whoami
|
||||
uname -a
|
||||
echo Python3 info ; which python3 ; python3 --version
|
||||
echo Python info ; which python ; python --version
|
||||
echo gcc info ; which gcc ; gcc --version
|
||||
echo cmake info ; which cmake ; cmake --version
|
||||
lsb_release
|
||||
env
|
||||
cat /proc/cpuinfo
|
||||
cat /proc/meminfo
|
||||
cat /etc/fstab
|
||||
vmstat -s
|
||||
df
|
||||
lsblk -o NAME,HCTL,SIZE,MOUNTPOINT | grep -i "sd"
|
||||
free -h
|
||||
displayName: 'System info'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
rm -rf $(WORK_DIR) ; mkdir $(WORK_DIR)
|
||||
sudo mkdir -p $(MODELS_DIR)
|
||||
sudo apt --assume-yes update && sudo apt --assume-yes install nfs-common
|
||||
sudo apt install nfs-common -y
|
||||
sudo mount -vvv -t nfs cinfsshare.file.core.windows.net:/cinfsshare/onnxtestdata $(MODELS_DIR) -o vers=4,minorversion=1,sec=sys
|
||||
mkdir -p $(MODELS_DIR)/models_data
|
||||
displayName: 'Make dirs'
|
||||
|
||||
- checkout: self
|
||||
clean: 'true'
|
||||
submodules: 'true'
|
||||
path: openvino
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
apt-get update && apt-get install -y lsb-release && apt-get clean all
|
||||
curl -fsSL https://get.docker.com -o get-docker.sh
|
||||
sudo sh get-docker.sh
|
||||
displayName: 'Install dependencies'
|
||||
|
||||
- script:
|
||||
src/frontends/onnx/tests/tests_python/model_zoo_preprocess.sh -d $(MODELS_DIR)/models_data -o -s "$(ONNX_MODEL_ZOO_SHA)"
|
||||
displayName: 'Update models'
|
||||
condition: ne(variables['BUILD_TYPE'], 'Debug')
|
||||
|
||||
- script: |
|
||||
sudo docker build \
|
||||
--tag=openvino-onnx-ci-image \
|
||||
--file=.ci/openvino-onnx/Dockerfile \
|
||||
--build-arg BUILD_TYPE=$(BUILD_TYPE) .
|
||||
displayName: 'Docker build $(BUILD_TYPE)'
|
||||
|
||||
- script: sudo fallocate -l 64G /swapfile ; sudo mkswap /swapfile ; sudo swapon /swapfile ; df ; free -h
|
||||
displayName: 'Create swap'
|
||||
|
||||
- script: |
|
||||
sudo docker run \
|
||||
--name openvino-onnx-ci-container \
|
||||
--volume $(MODELS_DIR)/models_data/model_zoo/onnx_model_zoo_$(ONNX_MODEL_ZOO_SHA):/root/.onnx/model_zoo/onnx_model_zoo \
|
||||
--volume $(MODELS_DIR)/msft:/root/.onnx/model_zoo/MSFT openvino-onnx-ci-image \
|
||||
/bin/bash -c "$(TOX_COMMAND)"
|
||||
displayName: 'Docker run $(BUILD_TYPE)'
|
||||
207
modules/openvino-master/.ci/azure/linux_onnxruntime.yml
Normal file
@ -0,0 +1,207 @@
|
||||
trigger:
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
- 'tools/*'
|
||||
- 'tests/layer_tests/*'
|
||||
|
||||
pr:
|
||||
drafts: 'false'
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
- 'tools/*'
|
||||
- 'tests/layer_tests/*'
|
||||
|
||||
variables:
|
||||
- group: github
|
||||
|
||||
jobs:
|
||||
- job: onnxruntime
|
||||
timeoutInMinutes: '90'
|
||||
|
||||
pool:
|
||||
name: LIN_VMSS_VENV_ONNX_U20_WU2
|
||||
|
||||
variables:
|
||||
system.debug: true
|
||||
VSTS_HTTP_RETRY: 5
|
||||
VSTS_HTTP_TIMEOUT: 200
|
||||
BUILD_TYPE: Release
|
||||
REPO_DIR: $(Build.Repository.LocalPath)
|
||||
ONNXRUNTIME_REPO_DIR: $(REPO_DIR)/../onnxruntime
|
||||
WORK_DIR: $(Pipeline.Workspace)/_w
|
||||
MODELS_DIR: /mount/cinfsshare/onnxtestdata
|
||||
TMP_DIR: /mnt/tmp
|
||||
INSTALL_DIR: $(WORK_DIR)/install_pkg/openvino
|
||||
BUILD_DIR: $(WORK_DIR)/build
|
||||
ONNXRUNTIME_UTILS: $(REPO_DIR)/.ci/azure/ci_utils/onnxruntime
|
||||
ONNXRUNTIME_BUILD_DIR: $(ONNXRUNTIME_REPO_DIR)/build
|
||||
LD_LIBRARY_PATH: $(Agent.ToolsDirectory)/Python/$(OV_PYTHON_VERSION)/x64/lib
|
||||
OV_PYTHON_VERSION: 3.11.2 # The full Python version is required for LD_LIBRARY_PATH. More details: https://github.com/microsoft/azure-pipelines-tool-lib/blob/master/docs/overview.md#tool-cache
|
||||
|
||||
steps:
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: '$(OV_PYTHON_VERSION)' # Setting only the major & minor version downloads the latest release from the GH repo, e.g. 3.10 resolves to 3.10.10.
|
||||
addToPath: true
|
||||
disableDownloadFromRegistry: false
|
||||
architecture: 'x64'
|
||||
githubToken: $(auth_token)
|
||||
displayName: Setup Python 3.11
|
||||
name: setupPython
|
||||
- bash: |
|
||||
#!/bin/bash
|
||||
python -V
|
||||
|
||||
- script: |
|
||||
curl -H Metadata:true --noproxy "*" "http://169.254.169.254/metadata/instance?api-version=2019-06-01"
|
||||
whoami
|
||||
uname -a
|
||||
echo Python3 info ; which python3 ; python3 --version
|
||||
echo Python info ; which python ; python --version
|
||||
echo gcc info ; which gcc ; gcc --version
|
||||
echo cmake info ; which cmake ; cmake --version
|
||||
lsb_release
|
||||
env
|
||||
cat /proc/cpuinfo
|
||||
cat /proc/meminfo
|
||||
cat /etc/fstab
|
||||
vmstat -s
|
||||
df
|
||||
lsblk -o NAME,HCTL,SIZE,MOUNTPOINT | grep -i "sd"
|
||||
free -h
|
||||
displayName: 'System info'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
rm -rf $(WORK_DIR) ; mkdir $(WORK_DIR)
|
||||
sudo rm -rf $(TMP_DIR) ; sudo mkdir $(TMP_DIR) ; sudo chmod 777 -R $(TMP_DIR)
|
||||
sudo mkdir -p $(MODELS_DIR)
|
||||
sudo apt --assume-yes update && sudo apt --assume-yes install nfs-common
|
||||
sudo mount -vvv -t nfs cinfsshare.file.core.windows.net:/cinfsshare/onnxtestdata $(MODELS_DIR) -o vers=4,minorversion=1,sec=sys
|
||||
displayName: 'Make dirs'
|
||||
|
||||
- checkout: self
|
||||
clean: 'true'
|
||||
submodules: 'true'
|
||||
path: openvino
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
branch=`tr -s '\n ' < $(ONNXRUNTIME_UTILS)/version`
|
||||
git clone --branch $branch --single-branch --recursive https://github.com/microsoft/onnxruntime.git $(ONNXRUNTIME_REPO_DIR)
|
||||
displayName: 'Clone onnxruntime'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
sudo -E $(REPO_DIR)/install_build_dependencies.sh
|
||||
# Speed up build
|
||||
sudo apt -y --no-install-recommends install unzip
|
||||
wget https://github.com/ninja-build/ninja/releases/download/v1.10.2/ninja-linux.zip
|
||||
unzip ninja-linux.zip
|
||||
sudo cp -v ninja /usr/local/bin/
|
||||
displayName: 'Install dependencies'
|
||||
|
||||
- task: CMake@1
|
||||
inputs:
|
||||
# CMake must get Python 3.x version by default
|
||||
cmakeArgs: >
|
||||
-GNinja
|
||||
-DCMAKE_BUILD_TYPE=$(BUILD_TYPE)
|
||||
-DCMAKE_COMPILE_WARNING_AS_ERROR=ON
|
||||
-DENABLE_INTEL_GNA=OFF
|
||||
-DENABLE_INTEL_GPU=OFF
|
||||
-DENABLE_CPPLINT=OFF
|
||||
-DENABLE_PROFILING_ITT=OFF
|
||||
-DENABLE_SAMPLES=OFF
|
||||
-DENABLE_OV_TF_FRONTEND=OFF
|
||||
-DENABLE_OV_PADDLE_FRONTEND=OFF
|
||||
-DENABLE_OV_PYTORCH_FRONTEND=OFF
|
||||
-DENABLE_OPENVINO_DEBUG=OFF
|
||||
-S $(REPO_DIR)
|
||||
-B $(BUILD_DIR)
|
||||
|
||||
- script: cmake --build $(BUILD_DIR) --parallel --config $(BUILD_TYPE)
|
||||
displayName: 'Build Lin ONNX'
|
||||
|
||||
- script: ls -alR $(REPO_DIR)/bin/
|
||||
displayName: 'List bin files'
|
||||
|
||||
- script: cmake -DCMAKE_INSTALL_PREFIX=$(INSTALL_DIR) -P $(BUILD_DIR)/cmake_install.cmake
|
||||
displayName: 'Install OpenVINO'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
source $(INSTALL_DIR)/setupvars.sh
|
||||
$(ONNXRUNTIME_REPO_DIR)/build.sh \
|
||||
--config RelWithDebInfo \
|
||||
--use_openvino CPU_FP32 \
|
||||
--build_shared_lib \
|
||||
--parallel \
|
||||
--skip_tests \
|
||||
--build_dir $(ONNXRUNTIME_BUILD_DIR)
|
||||
env:
|
||||
CXXFLAGS: "-Wno-error=deprecated-declarations"
|
||||
displayName: 'Build Lin ONNX Runtime'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
source $(INSTALL_DIR)/setupvars.sh
|
||||
skip_tests=$(tr -s '\n ' ':' < $(ONNXRUNTIME_UTILS)/skip_tests)
|
||||
./onnxruntime_test_all --gtest_filter=-$skip_tests
|
||||
workingDirectory: $(ONNXRUNTIME_BUILD_DIR)/RelWithDebInfo
|
||||
displayName: 'Run onnxruntime_test_all'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
source $(INSTALL_DIR)/setupvars.sh
|
||||
./onnxruntime_shared_lib_test --gtest_filter=-CApiTest.test_custom_op_openvino_wrapper_library
|
||||
workingDirectory: $(ONNXRUNTIME_BUILD_DIR)/RelWithDebInfo
|
||||
displayName: 'Run onnxruntime_shared_lib_test'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
source $(INSTALL_DIR)/setupvars.sh
|
||||
./onnxruntime_global_thread_pools_test
|
||||
workingDirectory: $(ONNXRUNTIME_BUILD_DIR)/RelWithDebInfo
|
||||
displayName: 'Run onnxruntime_global_thread_pools_test'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
source $(INSTALL_DIR)/setupvars.sh
|
||||
./onnxruntime_api_tests_without_env
|
||||
workingDirectory: $(ONNXRUNTIME_BUILD_DIR)/RelWithDebInfo
|
||||
displayName: 'Run onnxruntime_api_tests_without_env'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
source $(INSTALL_DIR)/setupvars.sh
|
||||
./onnx_test_runner "$(ONNXRUNTIME_REPO_DIR)/cmake/external/onnx/onnx/backend/test/data/pytorch-converted"
|
||||
workingDirectory: $(ONNXRUNTIME_BUILD_DIR)/RelWithDebInfo
|
||||
displayName: 'Run pytorch-converted tests'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
source $(INSTALL_DIR)/setupvars.sh
|
||||
./onnx_test_runner "$(ONNXRUNTIME_REPO_DIR)/cmake/external/onnx/onnx/backend/test/data/pytorch-operator"
|
||||
workingDirectory: $(ONNXRUNTIME_BUILD_DIR)/RelWithDebInfo
|
||||
displayName: 'Run pytorch-operator tests'
|
||||
243
modules/openvino-master/.ci/azure/mac.yml
Normal file
@ -0,0 +1,243 @@
|
||||
trigger:
|
||||
branches:
|
||||
include:
|
||||
- master
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
- 'tests/layer_tests/*'
|
||||
|
||||
pr:
|
||||
drafts: 'false'
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
- 'tests/layer_tests/*'
|
||||
|
||||
resources:
|
||||
repositories:
|
||||
- repository: openvino_contrib
|
||||
type: github
|
||||
endpoint: openvinotoolkit
|
||||
name: openvinotoolkit/openvino_contrib
|
||||
ref: master
|
||||
|
||||
- repository: testdata
|
||||
type: github
|
||||
endpoint: openvinotoolkit
|
||||
name: openvinotoolkit/testdata
|
||||
ref: master
|
||||
|
||||
variables:
|
||||
- group: github
|
||||
|
||||
jobs:
|
||||
- job: Mac
|
||||
# About 250% of total time (performance of Mac hosts is unstable, 360 is max)
|
||||
timeoutInMinutes: '360'
|
||||
|
||||
pool:
|
||||
vmImage: 'macOS-11'
|
||||
|
||||
variables:
|
||||
system.debug: true
|
||||
VSTS_HTTP_RETRY: 5
|
||||
VSTS_HTTP_TIMEOUT: 200
|
||||
BUILD_TYPE: Release
|
||||
REPO_DIR: $(Build.Repository.LocalPath)
|
||||
OPENVINO_CONTRIB_REPO_DIR: $(REPO_DIR)/../openvino_contrib
|
||||
MODELS_PATH: $(REPO_DIR)/../testdata
|
||||
WORK_DIR: $(Pipeline.Workspace)/_w
|
||||
BUILD_DIR: $(WORK_DIR)/build
|
||||
INSTALL_DIR: $(WORK_DIR)/install_pkg
|
||||
INSTALL_TEST_DIR: $(INSTALL_DIR)/tests
|
||||
SETUPVARS: . $(INSTALL_DIR)/setupvars.sh
|
||||
TMP_DIR: /tmp
|
||||
CCACHE_DIR: $(WORK_DIR)/ccache/mac
|
||||
|
||||
steps:
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: '3.11.2'
|
||||
addToPath: true
|
||||
architecture: 'x64'
|
||||
githubToken: $(auth_token)
|
||||
displayName: Setup Python 3.11
|
||||
name: setupPython
|
||||
|
||||
- script: |
|
||||
whoami
|
||||
uname -a
|
||||
echo Python3 info ; which python3 ; python3 --version
|
||||
echo Python info ; which python ; python --version
|
||||
echo Java info ; which java ; java -version
|
||||
echo gcc info ; which gcc ; gcc --version
|
||||
echo cmake info ; which cmake ; cmake --version
|
||||
xcrun --sdk macosx --show-sdk-version
|
||||
env
|
||||
sysctl -a
|
||||
displayName: 'System info'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
rm -rf $(WORK_DIR) ; mkdir $(WORK_DIR)
|
||||
rm -rf $(BUILD_DIR) ; mkdir $(BUILD_DIR)
|
||||
displayName: 'Make dir'
|
||||
|
||||
- checkout: self
|
||||
clean: 'true'
|
||||
submodules: 'true'
|
||||
path: openvino
|
||||
|
||||
- checkout: openvino_contrib
|
||||
clean: 'true'
|
||||
submodules: 'true'
|
||||
path: openvino_contrib
|
||||
|
||||
- checkout: testdata
|
||||
clean: 'true'
|
||||
lfs: 'true'
|
||||
path: testdata
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
brew install cython automake
|
||||
python3 -m pip install -r $(REPO_DIR)/src/frontends/onnx/tests/requirements.txt
|
||||
# Speed up build
|
||||
brew install ninja ccache
|
||||
displayName: 'Install dependencies'
|
||||
|
||||
- script: |
|
||||
export PATH="/usr/local/opt/cython/bin:$PATH"
|
||||
cmake \
|
||||
-G Ninja \
|
||||
-DENABLE_CPPLINT=OFF \
|
||||
-DCMAKE_VERBOSE_MAKEFILE=ON \
|
||||
-DCMAKE_BUILD_TYPE=$(BUILD_TYPE) \
|
||||
-DCMAKE_COMPILE_WARNING_AS_ERROR=ON \
|
||||
-DENABLE_PYTHON=ON \
|
||||
-DENABLE_STRICT_DEPENDENCIES=OFF \
|
||||
-DOPENVINO_EXTRA_MODULES=$(OPENVINO_CONTRIB_REPO_DIR)/modules \
|
||||
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache \
|
||||
-DCMAKE_C_COMPILER_LAUNCHER=ccache \
|
||||
-DBUILD_nvidia_plugin=OFF \
|
||||
-S $(REPO_DIR) \
|
||||
-B $(BUILD_DIR)
|
||||
displayName: 'CMake OpenVINO'
|
||||
|
||||
- script: ls -alR $(REPO_DIR)/temp/
|
||||
displayName: 'List temp SDKs'
|
||||
|
||||
- task: Cache@2
|
||||
inputs:
|
||||
key: 'ccache | "$(Agent.OS)"'
|
||||
path: $(CCACHE_DIR)
|
||||
restoreKeys: |
|
||||
ccache | "$(Agent.OS)"
|
||||
displayName: Cache
|
||||
enabled: 'false'
|
||||
|
||||
- script: ccache --zero-stats --max-size=10G --show-config
|
||||
displayName: 'Clean ccache stats'
|
||||
|
||||
- script: cmake --build $(BUILD_DIR) --parallel --config $(BUILD_TYPE)
|
||||
env:
|
||||
CCACHE_DIR: $(CCACHE_DIR)
|
||||
CCACHE_TEMPDIR: $(TMP_DIR)/ccache
|
||||
CCACHE_BASEDIR: $(Pipeline.Workspace)
|
||||
CCACHE_MAXSIZE: 10G
|
||||
displayName: 'Build Mac'
|
||||
|
||||
- script: ccache --show-stats
|
||||
displayName: 'Show ccache stats'
|
||||
|
||||
- script: ls -alR $(REPO_DIR)/bin/
|
||||
displayName: 'List bin files'
|
||||
|
||||
- script: cmake -DCMAKE_INSTALL_PREFIX=$(INSTALL_DIR) -P $(BUILD_DIR)/cmake_install.cmake
|
||||
displayName: 'Install'
|
||||
|
||||
- script: ls -alR $(INSTALL_DIR)
|
||||
displayName: 'List install files'
|
||||
|
||||
- script: cmake -DCMAKE_INSTALL_PREFIX=$(INSTALL_DIR) -DCOMPONENT=tests -P $(BUILD_DIR)/cmake_install.cmake
|
||||
displayName: 'Install tests'
|
||||
|
||||
- script: ls -alR $(INSTALL_DIR)
|
||||
displayName: 'List install files'
|
||||
|
||||
- script: $(SETUPVARS) && $(INSTALL_TEST_DIR)/ov_core_unit_tests --gtest_print_time=1 --gtest_filter=-*IE_GPU* --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-OVCoreUT.xml
|
||||
displayName: 'OV Core UT'
|
||||
enabled: 'false'
|
||||
|
||||
- script: $(SETUPVARS) && $(INSTALL_TEST_DIR)/ov_proxy_plugin_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-OVProxyTests.xml
|
||||
displayName: 'OV Proxy Plugin Tests'
|
||||
enabled: 'false'
|
||||
|
||||
- script: $(SETUPVARS) && $(INSTALL_TEST_DIR)/ov_hetero_func_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-OVHeteroFuncTests.xml
|
||||
displayName: 'OV Hetero Func Tests'
|
||||
enabled: 'false'
|
||||
|
||||
- script: $(SETUPVARS) && $(INSTALL_TEST_DIR)/ov_ir_frontend_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-IRFrontend.xml
|
||||
displayName: 'IR Frontend Tests'
|
||||
enabled: 'false'
|
||||
|
||||
- script: $(SETUPVARS) && $(INSTALL_TEST_DIR)/ov_onnx_frontend_tests --gtest_print_time=1 --gtest_filter=-*IE_GPU* --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ONNXFrontend.xml
|
||||
displayName: 'ONNX Frontend Tests'
|
||||
enabled: 'false'
|
||||
|
||||
- script: $(SETUPVARS) && $(INSTALL_TEST_DIR)/ov_cpu_unit_tests --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ov_cpu_unit_tests.xml
|
||||
displayName: 'Intel CPU Unit Tests'
|
||||
enabled: 'false'
|
||||
|
||||
- script: $(SETUPVARS) && $(INSTALL_TEST_DIR)/ov_auto_unit_tests --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ov_auto_unit_tests.xml
|
||||
displayName: 'AUTO UT'
|
||||
enabled: 'false'
|
||||
|
||||
- script: $(SETUPVARS) && $(INSTALL_TEST_DIR)/ov_cpu_func_tests --gtest_filter=*smoke* --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ov_cpu_func_tests.xml
|
||||
displayName: 'CPU FuncTests'
|
||||
enabled: 'false'
|
||||
|
||||
- script: |
|
||||
$(SETUPVARS) && $(INSTALL_TEST_DIR)/InferenceEngineCAPITests --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-InferenceEngineCAPITests.xml
|
||||
env:
|
||||
DATA_PATH: $(MODELS_PATH)
|
||||
MODELS_PATH: $(MODELS_PATH)
|
||||
displayName: 'IE CAPITests'
|
||||
enabled: 'false'
|
||||
|
||||
- script: |
|
||||
$(SETUPVARS) && $(INSTALL_TEST_DIR)/ov_capi_test --gtest_output=xml:$(INSTALL_TEST_DIR)/TEST-ov_capi_test.xml
|
||||
env:
|
||||
DATA_PATH: $(MODELS_PATH)
|
||||
MODELS_PATH: $(MODELS_PATH)
|
||||
displayName: 'IE CAPITests'
|
||||
enabled: 'false'
|
||||
|
||||
- task: PublishTestResults@2
|
||||
condition: always()
|
||||
inputs:
|
||||
testResultsFormat: 'JUnit' # Options: JUnit, NUnit, VSTest, xUnit, cTest
|
||||
testResultsFiles: '**/TEST-*.xml'
|
||||
#searchFolder: '$(BUILD_DIR)'
|
||||
mergeTestResults: false # Optional
|
||||
#failTaskOnFailedTests: false # Optional
|
||||
#testRunTitle: 'Pre/Post-Commit' # Optional
|
||||
buildPlatform: 'x64' # Optional
|
||||
buildConfiguration: 'Mac' # Optional
|
||||
#publishRunAttachments: true # Optional
|
||||
349
modules/openvino-master/.ci/azure/windows.yml
Normal file
@ -0,0 +1,349 @@
|
||||
trigger:
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
|
||||
pr:
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
|
||||
resources:
|
||||
repositories:
|
||||
- repository: openvino_contrib
|
||||
type: github
|
||||
endpoint: openvinotoolkit
|
||||
name: openvinotoolkit/openvino_contrib
|
||||
ref: master
|
||||
|
||||
- repository: testdata
|
||||
type: github
|
||||
endpoint: openvinotoolkit
|
||||
name: openvinotoolkit/testdata
|
||||
ref: master
|
||||
|
||||
jobs:
|
||||
- job: Win
|
||||
strategy:
|
||||
matrix:
|
||||
Static:
|
||||
CMAKE_BUILD_SHARED_LIBS: 'OFF'
|
||||
# Dynamic:
|
||||
# CMAKE_BUILD_SHARED_LIBS: 'ON'
|
||||
maxParallel: '2'
|
||||
|
||||
# About 150% of total time
|
||||
timeoutInMinutes: '270' #Temporary change
|
||||
|
||||
pool:
|
||||
name: WIN_VMSS_VENV_D8S_WU2
|
||||
|
||||
variables:
|
||||
system.debug: true
|
||||
VSTS_HTTP_RETRY: 5
|
||||
VSTS_HTTP_TIMEOUT: 200
|
||||
BUILD_TYPE: Release
|
||||
REPO_DIR: $(Build.Repository.LocalPath)
|
||||
OPENVINO_CONTRIB_REPO_DIR: $(REPO_DIR)\..\openvino_contrib
|
||||
MODELS_PATH: $(REPO_DIR)\..\testdata
|
||||
WORK_DIR: $(Pipeline.Workspace)\_w
|
||||
BUILD_DIR: $(WORK_DIR)\build
|
||||
BUILD_SAMPLES_DIR: $(WORK_DIR)\build_samples
|
||||
BUILD_SAMPLES_TESTS_DIR: $(WORK_DIR)\build_samples_tests
|
||||
MSVS_VARS_PATH: C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\VC\Auxiliary\Build\vcvars64.bat
|
||||
MSVC_COMPILER_PATH: C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\VC\Tools\MSVC\14.24.28314\bin\Hostx64\x64\cl.exe
|
||||
INSTALL_DIR: $(WORK_DIR)\install_pkg
|
||||
INSTALL_TEST_DIR: $(INSTALL_DIR)\tests
|
||||
SETUPVARS: $(INSTALL_DIR)\setupvars.bat
|
||||
PYTHON_DIR: C:\hostedtoolcache\windows\Python\3.11.2\x64
|
||||
CMAKE_VERSION: 3.24.0
|
||||
CMAKE_CMD: $(WORK_DIR)\cmake-$(CMAKE_VERSION)-windows-x86_64\cmake-$(CMAKE_VERSION)-windows-x86_64\bin\cmake.exe
|
||||
OV_CMAKE_TOOLCHAIN_FILE: $(REPO_DIR)\cmake\toolchains\mt.runtime.win32.toolchain.cmake
|
||||
PYTHON_VENV_DIR: $(WORK_DIR)\.venv
|
||||
|
||||
|
||||
steps:
|
||||
- script: |
|
||||
rd /Q /S $(WORK_DIR) & mkdir $(WORK_DIR)
|
||||
rd /Q /S $(BUILD_DIR) & mkdir $(BUILD_DIR)
|
||||
rd /Q /S $(WORK_DIR) & mkdir C:\hostedtoolcache\windows\Python\3.11.2
|
||||
rd /Q /S $(BUILD_DIR) & mkdir C:\hostedtoolcache\windows\Python\3.11.2\x64
|
||||
rd /Q /S $(BUILD_SAMPLES_DIR) & mkdir $(BUILD_SAMPLES_DIR)
|
||||
rd /Q /S $(BUILD_SAMPLES_TESTS_DIR) & mkdir $(BUILD_SAMPLES_TESTS_DIR)
|
||||
displayName: 'Make dir'
|
||||
|
||||
- script: curl -O https://www.python.org/ftp/python/3.11.2/python-3.11.2-amd64.exe
|
||||
displayName: 'Download Python'
|
||||
workingDirectory: $(WORK_DIR)
|
||||
|
||||
- script: |
|
||||
python-3.11.2-amd64.exe /passive InstallAllUsers=0 Include_launcher=0 TargetDir=C:\hostedtoolcache\windows\Python\3.11.2\x64 && ^
|
||||
cp C:\hostedtoolcache\windows\Python\3.8.2\x64.complete C:\hostedtoolcache\windows\Python\3.11.2\x64.complete
|
||||
displayName: 'Install Python'
|
||||
workingDirectory: $(WORK_DIR)
|
||||
|
||||
- task: UsePythonVersion@0
|
||||
displayName: 'Use Python'
|
||||
inputs:
|
||||
versionSpec: '3.11.2'
|
||||
disableDownloadFromRegistry: true
|
||||
|
||||
- script: |
|
||||
powershell -command "Invoke-RestMethod -Headers @{\"Metadata\"=\"true\"} -Method GET -Uri http://169.254.169.254/metadata/instance/compute?api-version=2019-06-01 | format-custom"
|
||||
tree C:\hostedtoolcache\windows\Python
|
||||
where python
|
||||
python --version
|
||||
where java
|
||||
java -version
|
||||
wmic computersystem get TotalPhysicalMemory
|
||||
wmic cpu list
|
||||
wmic logicaldisk get description,name
|
||||
wmic VOLUME list
|
||||
set
|
||||
displayName: 'System info'
|
||||
|
||||
- checkout: self
|
||||
clean: 'true'
|
||||
submodules: 'true'
|
||||
path: openvino
|
||||
|
||||
- checkout: openvino_contrib
|
||||
clean: 'true'
|
||||
submodules: 'true'
|
||||
path: openvino_contrib
|
||||
|
||||
- checkout: testdata
|
||||
clean: 'true'
|
||||
lfs: 'true'
|
||||
path: testdata
|
||||
|
||||
- script: |
|
||||
python -m pip install --upgrade pip
|
||||
rem For running Python API tests
|
||||
python -m pip install -r $(REPO_DIR)\src\bindings\python\src\compatibility\openvino\requirements-dev.txt
|
||||
python -m pip install -r $(REPO_DIR)\src\bindings\python\wheel\requirements-dev.txt
|
||||
python -m pip install -r $(REPO_DIR)\src\bindings\python\requirements.txt
|
||||
rem For running Paddle frontend unit tests
|
||||
# TODO Reenable PDPD after paddlepaddle==2.5.0 with compliant protobuf is released (ticket 95904)
|
||||
#python -m pip install -r $(REPO_DIR)\src\frontends\paddle\tests\requirements.txt
|
||||
rem For running ONNX frontend unit tests
|
||||
python -m pip install -r $(REPO_DIR)\src\frontends\onnx\tests\requirements.txt
|
||||
rem For running TensorFlow frontend unit tests
|
||||
python -m pip install -r $(REPO_DIR)\src\frontends\tensorflow\tests\requirements.txt
|
||||
rem For MO unit tests
|
||||
python -m pip install -r $(REPO_DIR)\tools\mo\requirements.txt
|
||||
python -m pip install -r $(REPO_DIR)\tools\mo\requirements_dev.txt
|
||||
rem Speed up build
|
||||
powershell -command "Invoke-WebRequest https://github.com/Kitware/CMake/releases/download/v$(CMAKE_VERSION)/cmake-$(CMAKE_VERSION)-windows-x86_64.zip -OutFile cmake-$(CMAKE_VERSION)-windows-x86_64.zip"
|
||||
powershell -command "Expand-Archive -Force cmake-$(CMAKE_VERSION)-windows-x86_64.zip"
|
||||
powershell -command "Invoke-WebRequest https://github.com/ninja-build/ninja/releases/download/v1.10.2/ninja-win.zip -OutFile ninja-win.zip"
|
||||
powershell -command "Expand-Archive -Force ninja-win.zip"
|
||||
workingDirectory: $(WORK_DIR)
|
||||
displayName: 'Install dependencies'
|
||||
|
||||
- powershell: |
|
||||
Write-Host "##vso[task.setvariable variable=CMAKE_TOOLCHAIN_FILE]$(OV_CMAKE_TOOLCHAIN_FILE)"
|
||||
condition: eq(variables['CMAKE_BUILD_SHARED_LIBS'], 'ON')
|
||||
displayName: "Set cmake toolchain"
|
||||
|
||||
- script: |
|
||||
set PATH=$(WORK_DIR)\ninja-win;%PATH% && ^
|
||||
call "$(MSVS_VARS_PATH)" && $(CMAKE_CMD) ^
|
||||
-G "Ninja Multi-Config" ^
|
||||
-DENABLE_CPPLINT=OFF ^
|
||||
-DENABLE_ONEDNN_FOR_GPU=$(CMAKE_BUILD_SHARED_LIBS) ^
|
||||
-DBUILD_SHARED_LIBS=$(CMAKE_BUILD_SHARED_LIBS) ^
|
||||
-DENABLE_FASTER_BUILD=ON ^
|
||||
-DENABLE_TESTS=ON ^
|
||||
-DCMAKE_COMPILE_WARNING_AS_ERROR=ON ^
|
||||
-DENABLE_STRICT_DEPENDENCIES=OFF ^
|
||||
-DENABLE_PYTHON=ON ^
|
||||
-DBUILD_nvidia_plugin=OFF ^
|
||||
-DCUSTOM_OPERATIONS="calculate_grid;complex_mul;fft;grid_sample;sparse_conv;sparse_conv_transpose" ^
|
||||
-DPYTHON_EXECUTABLE="C:\hostedtoolcache\windows\Python\3.11.2\x64\python.exe" ^
|
||||
-DPYTHON_INCLUDE_DIR="C:\hostedtoolcache\windows\Python\3.11.2\x64\include" ^
|
||||
-DPYTHON_LIBRARY="C:\hostedtoolcache\windows\Python\3.11.2\x64\libs\python311.lib" ^
|
||||
-DOPENVINO_EXTRA_MODULES=$(OPENVINO_CONTRIB_REPO_DIR)\modules ^
|
||||
-DCMAKE_C_COMPILER:PATH="$(MSVC_COMPILER_PATH)" ^
|
||||
-DCMAKE_CXX_COMPILER:PATH="$(MSVC_COMPILER_PATH)" ^
|
||||
-S $(REPO_DIR) ^
|
||||
-B $(BUILD_DIR)
|
||||
displayName: 'CMake OpenVINO'
|
||||
|
||||
- script: dir $(REPO_DIR)\temp\ /s
|
||||
displayName: 'List temp SDKs'
|
||||
|
||||
- script: |
|
||||
set PATH=$(WORK_DIR)\ninja-win;%PATH% && ^
|
||||
call "$(MSVS_VARS_PATH)" && $(CMAKE_CMD) --build $(BUILD_DIR) --parallel --config Release"
|
||||
displayName: 'Build Win'
|
||||
|
||||
- script: dir $(REPO_DIR)\bin\ /s
|
||||
displayName: 'List bin files'
|
||||
|
||||
- script: $(CMAKE_CMD) -DCMAKE_INSTALL_PREFIX=$(INSTALL_DIR) -P $(BUILD_DIR)/cmake_install.cmake
|
||||
displayName: 'Install'
|
||||
|
||||
- script: dir $(INSTALL_DIR) /s
|
||||
displayName: 'List install files'
|
||||
|
||||
- script: python -m pip install openvino-dev --find-links=$(INSTALL_DIR)\tools
|
||||
displayName: 'Install Wheels'
|
||||
|
||||
- script: |
|
||||
call "$(MSVS_VARS_PATH)" && ^
|
||||
$(CMAKE_CMD) ^
|
||||
-DCMAKE_C_COMPILER:PATH="$(MSVC_COMPILER_PATH)" ^
|
||||
-DCMAKE_CXX_COMPILER:PATH="$(MSVC_COMPILER_PATH)" ^
|
||||
-S $(REPO_DIR)\tests\samples_tests ^
|
||||
-B $(BUILD_SAMPLES_TESTS_DIR)
|
||||
displayName: 'CMake Samples Tests'
|
||||
|
||||
- script: $(CMAKE_CMD) -DCOMPONENT=tests -DCMAKE_INSTALL_PREFIX=$(INSTALL_DIR) -P $(BUILD_SAMPLES_TESTS_DIR)\cmake_install.cmake
|
||||
displayName: 'Install Samples Tests'
|
||||
|
||||
- script: |
|
||||
$(INSTALL_DIR)\samples\cpp\build_samples_msvc.bat -i $(INSTALL_DIR)
|
||||
if not exist %USERPROFILE%\Documents\Intel\OpenVINO\openvino_cpp_samples_build\ exit 1
|
||||
displayName: 'Build cpp samples'
|
||||
|
||||
- script: |
|
||||
$(INSTALL_DIR)\samples\c\build_samples_msvc.bat -i $(INSTALL_DIR)
|
||||
if not exist %USERPROFILE%\Documents\Intel\OpenVINO\openvino_c_samples_build\ exit 1
|
||||
displayName: 'Build c samples'
|
||||
|
||||
- script: python -m pip install -r $(INSTALL_TEST_DIR)\smoke_tests\requirements.txt
|
||||
displayName: 'Install dependencies for samples smoke tests'
|
||||
|
||||
- script: |
|
||||
call $(SETUPVARS) && ^
|
||||
python -m pytest $(INSTALL_DIR)\tests\smoke_tests\ --env_conf $(INSTALL_TEST_DIR)\smoke_tests\env_config.yml -s --junitxml=$(INSTALL_TEST_DIR)/TEST-SamplesSmokeTests.xml
|
||||
env:
|
||||
IE_APP_PATH: $(INSTALL_DIR)\samples_bin
|
||||
IE_APP_PYTHON_PATH: $(INSTALL_DIR)\samples\python\
|
||||
SHARE: $(INSTALL_DIR)\tests\smoke_tests\samples_smoke_tests_data\
|
||||
WORKSPACE: $(INSTALL_DIR)
|
||||
displayName: 'Samples Smoke Tests'
|
||||
|
||||
- script: $(CMAKE_CMD) -DCMAKE_INSTALL_PREFIX=$(INSTALL_DIR) -DCOMPONENT=tests -P $(BUILD_DIR)\cmake_install.cmake
|
||||
displayName: 'Install tests'
|
||||
|
||||
- script: dir $(INSTALL_DIR) /s
|
||||
displayName: 'List install files'
|
||||
|
||||
- script: rd /Q /S $(BUILD_DIR)
|
||||
displayName: 'Clean build dir'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_core_unit_tests --gtest_print_time=1 --gtest_filter=-*IE_GPU* --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-NGraphUT.xml
|
||||
displayName: 'OV Core UT'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_inference_functional_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-InferenceFunc.xml
|
||||
displayName: 'Inference Func Tests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_inference_unit_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-InferenceUnit.xml
|
||||
displayName: 'Inference Unit Tests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_proxy_plugin_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-OVProxyTests.xml
|
||||
displayName: 'OV Proxy Plugin Tests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_hetero_func_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-OVHeteroFuncTests.xml
|
||||
displayName: 'OV Hetero Func Tests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_conditional_compilation_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-ConditionalCompilation.xml
|
||||
displayName: 'Conditional Compilation Tests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_ir_frontend_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-IRFrontend.xml
|
||||
displayName: 'IR Frontend Tests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_onnx_frontend_tests --gtest_print_time=1 --gtest_filter=-*IE_GPU* --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-ONNXFrontend.xml
|
||||
displayName: 'ONNX Frontend Tests'
|
||||
|
||||
# TODO Reenable PDPD after paddlepaddle==2.5.0 with compliant protobuf is released (ticket 95904)
|
||||
#- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\paddle_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-Paddle.xml
|
||||
# displayName: 'Paddle Frontend UT'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_tensorflow_frontend_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-Tensorflow.xml
|
||||
displayName: 'TensorFlow Frontend Unit Tests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_tensorflow_common_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-TensorflowCommon.xml
|
||||
displayName: 'TensorFlow Common Unit Tests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_tensorflow_lite_frontend_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-TensorflowLite.xml
|
||||
displayName: 'TensorFlow Lite Frontend Unit Tests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_lp_transformations_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)\LpTransformations.xml
|
||||
displayName: 'Low Precision Transformations Tests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_transformations_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)\Transformations.xml
|
||||
displayName: 'Transformations Tests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_legacy_transformations_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)\LegacyTransformations.xml
|
||||
displayName: 'Legacy Transformations Tests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_util_tests --gtest_print_time=1 --gtest_output=xml:$(INSTALL_TEST_DIR)\CommonUtilTests.xml
|
||||
displayName: 'Common Utils Tests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\InferenceEngineUnitTests --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-InferenceEngineUnitTests.xml
|
||||
displayName: 'IE UT old'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_snippets_func_tests --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-ov_snippets_func_tests.xml
|
||||
displayName: 'Snippets Func Tests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_cpu_unit_tests --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-ov_cpu_unit_tests.xml
|
||||
displayName: 'Intel CPU Unit Tests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_gna_unit_tests --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-ov_gna_unit_tests.xml
|
||||
displayName: 'GNA UT'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_auto_unit_tests --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-ov_auto_unit_tests.xml
|
||||
displayName: 'AUTO UT'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_auto_batch_unit_tests --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-ov_auto_batch_unit_tests.xml
|
||||
displayName: 'AutoBatch UT'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_template_func_tests --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-templateFuncTests.xml
|
||||
displayName: 'TEMPLATE FuncTests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_auto_batch_func_tests --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-ov_auto_batch_func_tests.xml
|
||||
displayName: 'AutoBatch FuncTests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\InferenceEngineCAPITests --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-InferenceEngineCAPITests.xml
|
||||
displayName: 'IE CAPITests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_capi_test --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-ov_capi_test.xml
|
||||
displayName: 'OV CAPITests'
|
||||
|
||||
- script: call $(SETUPVARS) && $(INSTALL_TEST_DIR)\ov_cpu_func_tests --gtest_filter=*smoke* --gtest_output=xml:$(INSTALL_TEST_DIR)\TEST-ov_cpu_func_tests.xml
|
||||
displayName: 'CPU FuncTests'
|
||||
condition: and(succeeded(), eq(variables['CMAKE_BUILD_SHARED_LIBS'], 'OFF'))
|
||||
|
||||
- task: PublishTestResults@2
|
||||
condition: always()
|
||||
inputs:
|
||||
testResultsFormat: 'JUnit' # Options: JUnit, NUnit, VSTest, xUnit, cTest
|
||||
testResultsFiles: '**/TEST-*.xml'
|
||||
#searchFolder: '$(BUILD_DIR)'
|
||||
mergeTestResults: false # Optional
|
||||
#failTaskOnFailedTests: false # Optional
|
||||
#testRunTitle: 'Pre/Post-Commit' # Optional
|
||||
buildPlatform: 'x64' # Optional
|
||||
buildConfiguration: 'Windows' # Optional
|
||||
#publishRunAttachments: true # Optional
|
||||
@ -0,0 +1,181 @@
|
||||
trigger:
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
- 'tools/*'
|
||||
|
||||
pr:
|
||||
drafts: 'false'
|
||||
branches:
|
||||
include:
|
||||
- 'master'
|
||||
- 'releases/*'
|
||||
paths:
|
||||
exclude:
|
||||
- '*/docs/*'
|
||||
- 'docs/*'
|
||||
- '*/*.md'
|
||||
- '*.md'
|
||||
- '*/layer_tests_summary/*'
|
||||
- '*/conformance/*'
|
||||
- 'tools/*'
|
||||
|
||||
resources:
|
||||
repositories:
|
||||
- repository: testdata
|
||||
type: github
|
||||
endpoint: openvinotoolkit
|
||||
name: openvinotoolkit/testdata
|
||||
ref: master
|
||||
|
||||
variables:
|
||||
- group: github
|
||||
|
||||
jobs:
|
||||
- job: WinCC
|
||||
# About 150% of total time
|
||||
timeoutInMinutes: '120'
|
||||
|
||||
pool:
|
||||
name: WIN_VMSS_VENV_F8S_WU2
|
||||
|
||||
variables:
|
||||
system.debug: true
|
||||
VSTS_HTTP_RETRY: 5
|
||||
VSTS_HTTP_TIMEOUT: 200
|
||||
BUILD_TYPE: Release
|
||||
REPO_DIR: $(Build.Repository.LocalPath)
|
||||
MODELS_PATH: $(REPO_DIR)\..\testdata
|
||||
WORK_DIR: $(Pipeline.Workspace)\_w
|
||||
BUILD_DIR: $(WORK_DIR)\build
|
||||
BUILD_DIR_2: $(WORK_DIR)\build_s
|
||||
MSVS_VARS_PATH: C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\VC\Auxiliary\Build\vcvars64.bat
|
||||
MSVC_COMPILER_PATH: C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\VC\Tools\MSVC\14.24.28314\bin\Hostx64\x64\cl.exe
|
||||
INSTALL_DIR: $(WORK_DIR)\install_pkg
|
||||
SETUPVARS: $(INSTALL_DIR)\setupvars.bat
|
||||
|
||||
steps:
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: '3.11.2'
|
||||
addToPath: true
|
||||
architecture: 'x64'
|
||||
githubToken: $(auth_token)
|
||||
displayName: Setup Python 3.11
|
||||
name: setupPython
|
||||
|
||||
- script: |
|
||||
powershell -command "Invoke-RestMethod -Headers @{\"Metadata\"=\"true\"} -Method GET -Uri http://169.254.169.254/metadata/instance/compute?api-version=2019-06-01 | format-custom"
|
||||
where python
|
||||
python --version
|
||||
where java
|
||||
java -version
|
||||
where cmake
|
||||
cmake --version
|
||||
wmic computersystem get TotalPhysicalMemory
|
||||
wmic cpu list
|
||||
wmic logicaldisk get description,name
|
||||
wmic VOLUME list
|
||||
set
|
||||
displayName: 'System info'
|
||||
|
||||
- script: |
|
||||
rd /Q /S $(WORK_DIR) & mkdir $(WORK_DIR)
|
||||
rd /Q /S $(BUILD_DIR) & mkdir $(BUILD_DIR)
|
||||
rd /Q /S $(BUILD_DIR_2) & mkdir $(BUILD_DIR_2)
|
||||
displayName: 'Make dir'
|
||||
|
||||
- checkout: self
|
||||
clean: 'true'
|
||||
submodules: 'true'
|
||||
path: openvino
|
||||
|
||||
- script: |
|
||||
rem Speed up build
|
||||
powershell -command "Invoke-WebRequest https://github.com/ninja-build/ninja/releases/download/v1.10.2/ninja-win.zip -OutFile ninja-win.zip"
|
||||
powershell -command "Expand-Archive -Force ninja-win.zip"
|
||||
workingDirectory: $(WORK_DIR)
|
||||
displayName: 'Install dependencies'
|
||||
|
||||
- checkout: testdata
|
||||
clean: 'true'
|
||||
lfs: 'true'
|
||||
path: testdata
|
||||
|
||||
- script: |
|
||||
set PATH=$(WORK_DIR)\ninja-win;%PATH%
|
||||
call "$(MSVS_VARS_PATH)" && cmake ^
|
||||
-G Ninja ^
|
||||
-DENABLE_CPPLINT=OFF ^
|
||||
-DENABLE_GAPI_PREPROCESSING=OFF ^
|
||||
-DENABLE_PLUGINS_XML=ON ^
|
||||
-DCMAKE_COMPILE_WARNING_AS_ERROR=ON ^
|
||||
-DCMAKE_BUILD_TYPE=$(BUILD_TYPE) ^
|
||||
-DENABLE_PROFILING_ITT=ON ^
|
||||
-DSELECTIVE_BUILD=COLLECT ^
|
||||
-DCMAKE_C_COMPILER:PATH="$(MSVC_COMPILER_PATH)" ^
|
||||
-DCMAKE_CXX_COMPILER:PATH="$(MSVC_COMPILER_PATH)" ^
|
||||
-S $(REPO_DIR) ^
|
||||
-B $(BUILD_DIR)
|
||||
displayName: 'CMake CC COLLECT'
|
||||
|
||||
- script: dir $(REPO_DIR)\temp\ /s
|
||||
displayName: 'List temp SDKs'
|
||||
|
||||
- script: |
|
||||
call "$(MSVS_VARS_PATH)" && cmake --build $(BUILD_DIR) --config $(BUILD_TYPE) --parallel --target ^
|
||||
openvino_intel_cpu_plugin openvino_ir_frontend benchmark_app sea_itt_lib
|
||||
displayName: 'Build CC COLLECT'
|
||||
|
||||
- script: dir $(REPO_DIR)\bin\ /s
|
||||
displayName: 'List bin files'
|
||||
|
||||
- script: |
|
||||
set path=%path%;$(REPO_DIR)\temp\tbb\bin
|
||||
call "$(MSVS_VARS_PATH)" && python thirdparty\itt_collector\runtool\sea_runtool.py --bindir $(REPO_DIR)\bin\intel64\$(BUILD_TYPE) -o $(BUILD_DIR)\itt_stat ! $(REPO_DIR)\bin\intel64\$(BUILD_TYPE)\benchmark_app.exe -niter 1 -nireq 1 -m $(MODELS_PATH)\models\test_model\test_model_fp32.xml -d CPU
|
||||
workingDirectory: $(REPO_DIR)
|
||||
displayName: 'Code usage analysis'
|
||||
|
||||
- script: dir $(BUILD_DIR)\*.csv /s /p
|
||||
displayName: 'List csv files'
|
||||
|
||||
- script: |
|
||||
call "$(MSVS_VARS_PATH)" && cmake ^
|
||||
-G "Visual Studio 16 2019" ^
|
||||
-DVERBOSE_BUILD=ON ^
|
||||
-DENABLE_CPPLINT=OFF ^
|
||||
-DENABLE_GAPI_PREPROCESSING=OFF ^
|
||||
-DENABLE_PROFILING_ITT=OFF ^
|
||||
-DSELECTIVE_BUILD=ON ^
|
||||
-DCMAKE_COMPILE_WARNING_AS_ERROR=ON ^
|
||||
-DSELECTIVE_BUILD_STAT=$(BUILD_DIR)\*.csv ^
|
||||
-DCMAKE_C_COMPILER:PATH="$(MSVC_COMPILER_PATH)" ^
|
||||
-DCMAKE_CXX_COMPILER:PATH="$(MSVC_COMPILER_PATH)" ^
|
||||
-S $(REPO_DIR) ^
|
||||
-B $(BUILD_DIR_2)
|
||||
displayName: 'CMake CC ON'
|
||||
|
||||
- script: cmake --build $(BUILD_DIR_2) --config $(BUILD_TYPE) --parallel --target ^
|
||||
openvino_intel_cpu_plugin openvino_ir_frontend benchmark_app
|
||||
displayName: 'Build CC ON'
|
||||
|
||||
- script: dir $(REPO_DIR)\bin\ /s
|
||||
displayName: 'List bin files ON'
|
||||
|
||||
- script: type $(BUILD_DIR_2)\src\common\conditional_compilation\conditional_compilation_gen.h
|
||||
displayName: 'Check conditional_compilation_gen.h header'
|
||||
|
||||
- script: |
|
||||
set path=%path%;$(REPO_DIR)\temp\tbb\bin
|
||||
$(REPO_DIR)\bin\intel64\$(BUILD_TYPE)\benchmark_app.exe -niter 1 -nireq 1 -m $(MODELS_PATH)\models\test_model\test_model_fp32.xml -d CPU
|
||||
workingDirectory: $(REPO_DIR)
|
||||
displayName: 'Use OpenVINO after CC'
|
||||
76
modules/openvino-master/.ci/openvino-onnx/Dockerfile
Normal file
@ -0,0 +1,76 @@
|
||||
FROM ubuntu:23.04
|
||||
|
||||
LABEL version=2021.03.30.1
|
||||
|
||||
# Build configuration arguments
|
||||
ARG BUILD_TYPE=Release
|
||||
|
||||
ARG http_proxy
|
||||
ARG https_proxy
|
||||
ENV http_proxy ${http_proxy}
|
||||
ENV https_proxy ${https_proxy}
|
||||
|
||||
ENV CI=true
|
||||
ENV DEBIAN_FRONTEND=noninteractive
|
||||
ENV PYTHONUNBUFFERED 1
|
||||
|
||||
# Install base dependencies
|
||||
RUN apt-get update && apt-get install -y locales && apt-get clean autoclean && apt-get autoremove -y
|
||||
|
||||
# Set the locale to en_US.UTF-8
|
||||
RUN locale-gen en_US.UTF-8
|
||||
ENV LANG en_US.UTF-8
|
||||
ENV LANGUAGE en_US:en
|
||||
ENV LC_ALL en_US.UTF-8
|
||||
|
||||
RUN apt-get update && apt-get -y --no-install-recommends install \
|
||||
# OpenVINO dependencies
|
||||
build-essential \
|
||||
ninja-build \
|
||||
cmake \
|
||||
curl \
|
||||
git \
|
||||
unzip \
|
||||
libtbb-dev \
|
||||
libpugixml-dev \
|
||||
wget \
|
||||
# Python dependencies
|
||||
python3 \
|
||||
python3-pip \
|
||||
python3-dev \
|
||||
pybind11-dev \
|
||||
python3-virtualenv \
|
||||
cython3 \
|
||||
tox && \
|
||||
apt-get clean autoclean && \
|
||||
apt-get autoremove -y
|
||||
|
||||
# Build OpenVINO
|
||||
COPY . /openvino/
|
||||
WORKDIR /openvino/build
|
||||
RUN cmake .. \
|
||||
-G Ninja \
|
||||
-DCMAKE_BUILD_TYPE=${BUILD_TYPE} \
|
||||
-DENABLE_INTEL_GNA=OFF \
|
||||
-DENABLE_INTEL_GPU=OFF \
|
||||
-DENABLE_HETERO=OFF \
|
||||
-DENABLE_MULTI=OFF \
|
||||
-DENABLE_AUTO_BATCH=OFF \
|
||||
-DENABLE_GAPI_PREPROCESSING=OFF \
|
||||
-DENABLE_CPPLINT=OFF \
|
||||
-DENABLE_NCC_STYLE=OFF \
|
||||
-DENABLE_PROFILING_ITT=OFF \
|
||||
-DENABLE_SAMPLES=OFF \
|
||||
-DENABLE_OV_PADDLE_FRONTEND=OFF \
|
||||
-DENABLE_OV_PYTORCH_FRONTEND=ON \
|
||||
-DENABLE_OV_TF_FRONTEND=OFF \
|
||||
-DENABLE_OPENVINO_DEBUG=OFF \
|
||||
-DCMAKE_INSTALL_PREFIX=/openvino/dist
|
||||
RUN ninja install
|
||||
|
||||
# Run tests via tox
|
||||
WORKDIR /openvino/src/bindings/python
|
||||
ENV OpenVINO_DIR=/openvino/dist/runtime/cmake
|
||||
ENV LD_LIBRARY_PATH=/openvino/dist/runtime/lib/intel64:/openvino/dist/runtime/3rdparty/tbb/lib
|
||||
ENV PYTHONPATH=/openvino/bin/intel64/${BUILD_TYPE}/python:/openvino/tools/mo:${PYTHONPATH}
|
||||
CMD tox
|
||||
14
modules/openvino-master/.ci/pot/Jenkinsfile
vendored
Normal file
@ -0,0 +1,14 @@
|
||||
#!groovy
|
||||
|
||||
|
||||
properties([
|
||||
parameters([
|
||||
string(defaultValue: '',
|
||||
description: 'Pipeline shared library version (branch/tag/commit). Determined automatically if empty',
|
||||
name: 'library_version')
|
||||
])
|
||||
])
|
||||
|
||||
loadOpenVinoLibrary {
|
||||
potEntrypoint(this)
|
||||
}
|
||||
79
modules/openvino-master/.gitattributes
vendored
Normal file
@ -0,0 +1,79 @@
|
||||
###############################################################################
|
||||
# Set default behavior to automatically normalize line endings.
|
||||
###############################################################################
|
||||
* text=auto
|
||||
###############################################################################
|
||||
# Set default behavior for command prompt diff.
|
||||
#
|
||||
# This is needed for earlier builds of msysgit that do not have it on by
|
||||
# default for csharp files.
|
||||
# Note: This is only used by command line
|
||||
###############################################################################
|
||||
#*.cs diff=csharp
|
||||
*.py text eol=lf
|
||||
###############################################################################
|
||||
# Set the merge driver for project and solution files
|
||||
#
|
||||
# Merging from the command prompt will add diff markers to the files if there
|
||||
# are conflicts (Merging from VS is not affected by the settings below, in VS
|
||||
# the diff markers are never inserted). Diff markers may cause the following
|
||||
# file extensions to fail to load in VS. An alternative would be to treat
|
||||
# these files as binary and thus will always conflict and require user
|
||||
# intervention with every merge. To do so, just uncomment the entries below
|
||||
###############################################################################
|
||||
#*.sln merge=binary
|
||||
#*.csproj merge=binary
|
||||
#*.vbproj merge=binary
|
||||
#*.vcxproj merge=binary
|
||||
#*.vcproj merge=binary
|
||||
#*.dbproj merge=binary
|
||||
#*.fsproj merge=binary
|
||||
#*.lsproj merge=binary
|
||||
#*.wixproj merge=binary
|
||||
#*.modelproj merge=binary
|
||||
#*.sqlproj merge=binary
|
||||
#*.wwaproj merge=binary
|
||||
###############################################################################
|
||||
# behavior for image files
|
||||
#
|
||||
# image files are treated as binary by default.
|
||||
###############################################################################
|
||||
#*.jpg binary
|
||||
#*.png binary
|
||||
#*.gif binary
|
||||
###############################################################################
|
||||
# diff behavior for common document formats
|
||||
#
|
||||
# Convert binary document formats to text before diffing them. This feature
|
||||
# is only available from the command line. Turn it on by uncommenting the
|
||||
# entries below.
|
||||
###############################################################################
|
||||
#*.doc diff=astextplain
|
||||
#*.DOC diff=astextplain
|
||||
#*.docx diff=astextplain
|
||||
#*.DOCX diff=astextplain
|
||||
#*.dot diff=astextplain
|
||||
#*.DOT diff=astextplain
|
||||
#*.pdf diff=astextplain
|
||||
#*.PDF diff=astextplain
|
||||
#*.rtf diff=astextplain
|
||||
#*.RTF diff=astextplain
|
||||
*.PNG filter=lfs diff=lfs merge=lfs -text
|
||||
*.png filter=lfs diff=lfs merge=lfs -text
|
||||
*.jpg filter=lfs diff=lfs merge=lfs -text
|
||||
*.gif filter=lfs diff=lfs merge=lfs -text
|
||||
*.vsdx filter=lfs diff=lfs merge=lfs -text
|
||||
*.bmp filter=lfs diff=lfs merge=lfs -text
|
||||
*.svg filter=lfs diff=lfs merge=lfs -text
|
||||
|
||||
#POT attributes
|
||||
tools/pot/tests/data/test_cases_refs/* filter=lfs diff=lfs merge=lfs -text
|
||||
tools/pot/tests/data/models/*/* filter=lfs diff=lfs merge=lfs -text
|
||||
tools/pot/tests/data/reference_models/* filter=lfs diff=lfs merge=lfs -text
|
||||
tools/pot/tests/data/video/* filter=lfs diff=lfs merge=lfs -text
|
||||
tools/pot/tests/data/reference_fake_quantize_conf/* filter=lfs diff=lfs merge=lfs -text
|
||||
/tools/pot/tests/** -pot_package
|
||||
/tools/pot/tools/auxilary/** -pot_package
|
||||
/tools/pot/tools/run_series_experiments.py -pot_package
|
||||
/tools/pot/.pylintrc -pot_package
|
||||
/tools/pot/README_dev.md -pot_package
|
||||
131
modules/openvino-master/.github/CODEOWNERS
vendored
Normal file
@ -0,0 +1,131 @@
|
||||
# See help here: https://help.github.com/en/github/creating-cloning-and-archiving-repositories/about-code-owners
|
||||
|
||||
* @openvinotoolkit/openvino-maintainers
|
||||
|
||||
# CI
|
||||
/Jenkinsfile @openvinotoolkit/openvino-ci-maintainers
|
||||
/.github/ @openvinotoolkit/openvino-ci-maintainers
|
||||
/.ci/ @openvinotoolkit/openvino-ci-maintainers
|
||||
/.github/CODEOWNERS @openvinotoolkit/openvino-admins @openvinotoolkit/openvino-maintainers
|
||||
|
||||
# Licensing:
|
||||
/licensing/ @openvinotoolkit/openvino-legal-maintainers
|
||||
/LICENSE @openvinotoolkit/openvino-legal-maintainers
|
||||
|
||||
# OpenVINO Scripts:
|
||||
/scripts/ @openvinotoolkit/openvino-scripts-maintainers
|
||||
/scripts/install_dependencies/ @openvinotoolkit/openvino-configuration-mgmt @openvinotoolkit/openvino-scripts-maintainers
|
||||
/install_build_dependencies.sh @openvinotoolkit/openvino-scripts-maintainers @openvinotoolkit/openvino-ie-maintainers
|
||||
|
||||
# OpenVINO Core:
|
||||
/src/inference/ @openvinotoolkit/openvino-ie-maintainers
|
||||
/src/common/conditional_compilation/ @openvinotoolkit/openvino-ie-maintainers
|
||||
/src/common/itt/ @openvinotoolkit/openvino-ie-maintainers
|
||||
/src/common/preprocessing/ @openvinotoolkit/openvino-ie-maintainers
|
||||
/src/common/util/ @openvinotoolkit/openvino-ie-maintainers
|
||||
/thirdparty/ @openvinotoolkit/openvino-ie-maintainers
|
||||
/.gitmodules @openvinotoolkit/openvino-ie-maintainers
|
||||
|
||||
/src/bindings/python/ @openvinotoolkit/openvino-ie-python-api-maintainers
|
||||
/src/bindings/c/ @openvinotoolkit/openvino-c-api-maintainers
|
||||
/src/common/*transformations/ @openvinotoolkit/openvino-ie-transformations-maintainers
|
||||
/src/core/ @openvinotoolkit/openvino-ngraph-maintainers
|
||||
|
||||
# OpenVINO Samples:
|
||||
/samples/c/ @openvinotoolkit/openvino-samples-maintainers @openvinotoolkit/openvino-c-api-maintainers
|
||||
/samples/cpp/ @openvinotoolkit/openvino-samples-maintainers @openvinotoolkit/openvino-maintainers
|
||||
/samples/python/ @openvinotoolkit/openvino-samples-maintainers @openvinotoolkit/openvino-ie-python-api-maintainers
|
||||
/thirdparty/zlib/ @openvinotoolkit/openvino-samples-maintainers
|
||||
/thirdparty/json/ @openvinotoolkit/openvino-samples-maintainers
|
||||
/thirdparty/gflags/ @openvinotoolkit/openvino-samples-maintainers
|
||||
/thirdparty/cnpy/ @openvinotoolkit/openvino-samples-maintainers
|
||||
|
||||
# OpenVINO Func Tests:
|
||||
/src/tests/ @openvinotoolkit/openvino-ie-tests-maintainers @openvinotoolkit/openvino-ie-test-developers
|
||||
/src/frontends/tests/frontend/shared/ @openvinotoolkit/openvino-ie-tests-maintainers
|
||||
/thirdparty/gtest/ @openvinotoolkit/openvino-ie-tests-maintainers
|
||||
|
||||
# OpenVINO CPU:
|
||||
/src/plugins/intel_cpu/ @openvinotoolkit/openvino-ie-cpu-maintainers @openvinotoolkit/openvino-ie-cpu-developers
|
||||
/src/common/snippets/ @openvinotoolkit/openvino-ie-cpu-maintainers
|
||||
/thirdparty/xbyak/ @openvinotoolkit/openvino-ie-cpu-maintainers
|
||||
|
||||
# OpenVINO LPT
|
||||
/src/common/low_precision_transformations/ @openvinotoolkit/openvino-ie-lpt-maintainers
|
||||
|
||||
# OpenVINO GPU:
|
||||
/src/plugins/intel_gpu/ @openvinotoolkit/openvino-ie-gpu-maintainers @openvinotoolkit/openvino-ie-gpu-developers
|
||||
/src/tests/**/gpu/ @openvinotoolkit/openvino-ie-gpu-maintainers
|
||||
/thirdparty/ocl/ @openvinotoolkit/openvino-ie-gpu-maintainers @openvinotoolkit/openvino-ie-gpu-developers
|
||||
|
||||
# OpenVINO GNA:
|
||||
/src/plugins/intel_gna/ @openvinotoolkit/openvino-ie-gna-maintainers
|
||||
|
||||
# OpenVINO Auto (MULTI) plugin:
|
||||
/src/plugins/auto/ @openvinotoolkit/openvino-ie-auto-multi-maintainers
|
||||
|
||||
# OpenVINO Auto (Batch) plugin:
|
||||
/src/plugins/auto_batch/ @openvinotoolkit/openvino-auto-batch-maintainers
|
||||
|
||||
# OpenVINO Hetero plugin:
|
||||
/src/plugins/hetero/ @openvinotoolkit/openvino-hetero-maintainers
|
||||
|
||||
# OpenVINO Template plugin:
|
||||
/src/plugins/template/ @openvinotoolkit/openvino-ie-template-maintainers
|
||||
|
||||
# OpenVINO Frontends:
|
||||
/src/frontends/common/ @openvinotoolkit/openvino-common-frontend-maintainers
|
||||
/src/frontends/ir/ @openvinotoolkit/openvino-ir-frontend-maintainers
|
||||
/src/frontends/paddle/ @openvinotoolkit/openvino-ie-paddle-maintainers
|
||||
/src/frontends/tensorflow/ @openvinotoolkit/openvino-tf-frontend-maintainers
|
||||
/src/frontends/tensorflow_common/ @openvinotoolkit/openvino-tf-frontend-maintainers
|
||||
/src/frontends/tensorflow_lite/ @openvinotoolkit/openvino-tf-frontend-maintainers
|
||||
/src/frontends/pytorch/ @openvinotoolkit/openvino-pytorch-frontend-maintainers
|
||||
|
||||
# OpenVINO ONNX Frontend:
|
||||
/src/frontends/onnx/ @openvinotoolkit/openvino-onnx-frontend-maintainers
|
||||
/thirdparty/onnx/ @openvinotoolkit/openvino-onnx-frontend-maintainers @openvinotoolkit/openvino-ie-maintainers
|
||||
/thirdparty/protobuf/ @openvinotoolkit/openvino-onnx-frontend-maintainers @openvinotoolkit/openvino-tf-frontend-maintainers @openvinotoolkit/openvino-ie-maintainers
|
||||
|
||||
# QA Tests:
|
||||
/tests/ @openvinotoolkit/openvino-tests-maintainers
|
||||
/tests/layer_tests/ @openvinotoolkit/openvino-tests-maintainers @openvinotoolkit/openvino-mo-maintainers
|
||||
/tests/layer_tests/pytorch_tests/ @openvinotoolkit/openvino-pytorch-frontend-maintainers
|
||||
/tests/layer_tests/tensorflow_tests @openvinotoolkit/openvino-tf-frontend-maintainers
|
||||
/tests/layer_tests/jax_tests @openvinotoolkit/openvino-tf-frontend-maintainers
|
||||
/tests/model_hub_tests @openvinotoolkit/openvino-tf-frontend-maintainers
|
||||
|
||||
# Tools:
|
||||
/tools/ @openvinotoolkit/openvino-tools-maintainers
|
||||
/tools/benchmark_tool/ @openvinotoolkit/openvino-ie-python-api-maintainers
|
||||
/tools/legacy/ @openvinotoolkit/openvino-samples-maintainers
|
||||
/tools/openvino_dev/ @openvinotoolkit/openvino-tools-maintainers @openvinotoolkit/openvino-ie-python-api-maintainers
|
||||
/tools/mo/ @openvinotoolkit/openvino-mo-maintainers
|
||||
/tools/ovc/ @openvinotoolkit/openvino-mo-maintainers
|
||||
/tools/pot/ @openvinotoolkit/openvino-pot-maintainers
|
||||
/thirdparty/open_model_zoo/ @openvinotoolkit/omz-maintainers @openvinotoolkit/openvino-pot-maintainers
|
||||
|
||||
# Documentation
|
||||
/docs/ @openvinotoolkit/openvino-docs-maintainers
|
||||
/docs/CMakeLists.txt @openvinotoolkit/openvino-ie-maintainers
|
||||
/**/*.md @openvinotoolkit/openvino-docs-maintainers
|
||||
/**/*.svg @openvinotoolkit/openvino-docs-maintainers
|
||||
/docs/MO_DG/ @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-mo-maintainers
|
||||
/docs/OV_Runtime_UG/ @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-maintainers
|
||||
/docs/IE_PLUGIN_DG/ @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-maintainers
|
||||
/docs/Extensibility_UG/ @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-maintainers
|
||||
/docs/snippets/ @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-maintainers
|
||||
/docs/OV_Runtime_UG/supported_plugins/ARM_CPU.md @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino_contrib-arm_plugin-maintainers
|
||||
/docs/OV_Runtime_UG/supported_plugins/CPU.md @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-cpu-maintainers
|
||||
/docs/OV_Runtime_UG/supported_plugins/GNA.md @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-gna-maintainers
|
||||
/docs/OV_Runtime_UG/supported_plugins/GPU*.md @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-gpu-maintainers
|
||||
|
||||
# Configuration management
|
||||
/**/setup.py @openvinotoolkit/openvino-configuration-mgmt
|
||||
/**/requirements*.* @openvinotoolkit/openvino-configuration-mgmt
|
||||
/docs/requirements.txt @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-configuration-mgmt
|
||||
|
||||
# CMake scripts
|
||||
/**/cmake/ @openvinotoolkit/openvino-ie-maintainers
|
||||
/**/*.cmake @openvinotoolkit/openvino-ie-maintainers
|
||||
/CMakeLists.txt @openvinotoolkit/openvino-ie-maintainers
|
||||
109
modules/openvino-master/.github/ISSUE_TEMPLATE/bug.yml
vendored
Normal file
@ -0,0 +1,109 @@
|
||||
name: Bug Report
|
||||
description: Create a report to help us improve
|
||||
title: "[Bug]: "
|
||||
labels: ["bug", "support_request"]
|
||||
body:
|
||||
- type: markdown
|
||||
attributes:
|
||||
value: |
|
||||
Please provide all the necessary information to expedite the response.
|
||||
- type: input
|
||||
id: ov_version
|
||||
attributes:
|
||||
label: OpenVINO Version
|
||||
description: OpenVINO version, branch, or tag in OpenVINO GitHub
|
||||
placeholder: 2021.4.0 LTS / Master Branch / tag 2022.3.0
|
||||
validations:
|
||||
required: true
|
||||
- type: dropdown
|
||||
id: os
|
||||
attributes:
|
||||
label: Operating System
|
||||
description: What OS are you using?
|
||||
options:
|
||||
- Ubuntu 18.04 (LTS)
|
||||
- Ubuntu 20.04 (LTS)
|
||||
- Windows System
|
||||
- Red Hat Enterprise Linux 8
|
||||
- Android System
|
||||
- Raspbian Stretch OS
|
||||
- macOS Systems for Intel CPU
|
||||
- macOS Systems for Apple Silicon
|
||||
- WebAssembly
|
||||
- Other (Please specify in description)
|
||||
validations:
|
||||
required: true
|
||||
- type: dropdown
|
||||
id: device_use
|
||||
attributes:
|
||||
label: Device used for inference
|
||||
description: What hardware are you using for inference?
|
||||
options:
|
||||
- CPU
|
||||
- GPU
|
||||
- GNA
|
||||
- NCS2 (Intel Movidius)
|
||||
- HDDL
|
||||
- AUTO
|
||||
- HETERO
|
||||
- BATCH
|
||||
validations:
|
||||
required: true
|
||||
- type: dropdown
|
||||
id: framework
|
||||
attributes:
|
||||
label: Framework
|
||||
description: Framework used in model optimization
|
||||
options:
|
||||
- TensorFlow 1
|
||||
- Keras (TensorFlow 2)
|
||||
- Caffe
|
||||
- ONNX
|
||||
- PyTorch
|
||||
- mxnet
|
||||
- PaddlePaddle
|
||||
validations:
|
||||
required: false
|
||||
- type: input
|
||||
id: model_name
|
||||
attributes:
|
||||
label: Model used
|
||||
description: Please provide us the link to your model in the description
|
||||
placeholder: ResNet50 / YOLOv4
|
||||
validations:
|
||||
required: false
|
||||
- type: textarea
|
||||
id: bug_description
|
||||
attributes:
|
||||
label: Issue description
|
||||
description: What issue are you having, and what did you expect to happen instead?
|
||||
placeholder: Please provide a detailed description of what happened
|
||||
value: "Error when performing model optimization on yolov4 model."
|
||||
validations:
|
||||
required: true
|
||||
- type: textarea
|
||||
id: step_by_step
|
||||
attributes:
|
||||
label: Step-by-step reproduction
|
||||
description: How can we reproduce your issue?
|
||||
placeholder: Please provide detailed instructions on how to reproduce the issue
|
||||
validations:
|
||||
required: false
|
||||
- type: textarea
|
||||
id: logs
|
||||
attributes:
|
||||
label: Relevant log output
|
||||
description: Please copy and paste any relevant log output. This will be automatically formatted into code, so no need for backticks.
|
||||
render: shell
|
||||
- type: checkboxes
|
||||
id: terms
|
||||
attributes:
|
||||
label: Issue submission checklist
|
||||
description: By submitting this issue, you agree to follow our [Code of Conduct](https://github.com/intel/intel-one-mono/blob/main/CODE_OF_CONDUCT.md)
|
||||
options:
|
||||
- label: I report the issue. It's not a question
|
||||
required: true
|
||||
- label: I checked the problem with the documentation, FAQ, open issues, Stack Overflow, etc., and have not found the solution
|
||||
required: true
|
||||
- label: There is reproducer code and related data files such as images, videos, models, etc.
|
||||
required: true
|
||||
95
modules/openvino-master/.github/ISSUE_TEMPLATE/build.yml
vendored
Normal file
@ -0,0 +1,95 @@
|
||||
name: Build Issue Report
|
||||
description: This report is for the build/installation issue
|
||||
title: "[Build]: "
|
||||
labels: ["build", "support_request"]
|
||||
body:
|
||||
- type: markdown
|
||||
attributes:
|
||||
value: |
|
||||
Please provide all the necessary information to expedite the response.
|
||||
- type: input
|
||||
id: ov_version
|
||||
attributes:
|
||||
label: OpenVINO Version
|
||||
description: OpenVINO version, branch, or tag in OpenVINO GitHub
|
||||
placeholder: 2021.4.0 LTS / Master Branch / tag 2022.3.0
|
||||
validations:
|
||||
required: true
|
||||
- type: dropdown
|
||||
id: os
|
||||
attributes:
|
||||
label: Operating System
|
||||
description: What OS are you using?
|
||||
options:
|
||||
- Ubuntu 18.04 (LTS)
|
||||
- Ubuntu 20.04 (LTS)
|
||||
- Ubuntu 22.04 (LTS)
|
||||
- Windows System
|
||||
- Red Hat Enterprise Linux 8
|
||||
- OpenSUSE
|
||||
- Android System
|
||||
- Raspbian Stretch OS
|
||||
- macOS Systems for Intel CPU
|
||||
- macOS Systems for Apple Silicon
|
||||
- WebAssembly
|
||||
- WSL2 for Windows
|
||||
- Other (Please specify in description)
|
||||
validations:
|
||||
required: true
|
||||
- type: dropdown
|
||||
id: architecture
|
||||
attributes:
|
||||
label: Hardware Architecture
|
||||
description: What is your hardware architecture used in this test?
|
||||
options:
|
||||
- x86 (64 bits)
|
||||
- x86 (32 bits)
|
||||
- ARM (64 bits)
|
||||
- ARM (32 bits)
|
||||
- RISC-V
|
||||
- Other (please specify in the description)
|
||||
validations:
|
||||
required: true
|
||||
- type: textarea
|
||||
id: target_platform
|
||||
attributes:
|
||||
label: Target Platform
|
||||
description: |
|
||||
You can also provide us the full system log with the following commands
|
||||
Windows cmd - "systeminfo"
|
||||
Linux terminal - "lscpu" and "lscpu -e"
|
||||
placeholder: Paste your full platform/system information here
|
||||
validations:
|
||||
required: false
|
||||
- type: textarea
|
||||
id: build_description
|
||||
attributes:
|
||||
label: Build issue description
|
||||
description: What issue are you facing during the build/installation?
|
||||
placeholder: Please provide a detailed description of what happened
|
||||
validations:
|
||||
required: true
|
||||
- type: textarea
|
||||
id: build_script
|
||||
attributes:
|
||||
label: Build script or step-by-step instructions to reproduce
|
||||
description: How can we reproduce your issue?
|
||||
placeholder: Please provide detailed instructions on how to reproduce the issue
|
||||
validations:
|
||||
required: false
|
||||
- type: textarea
|
||||
id: build_logs
|
||||
attributes:
|
||||
label: Relevant log output
|
||||
description: Please copy and paste any relevant log output. This will be automatically formatted into code, so there is no need for backticks.
|
||||
render: shell
|
||||
- type: checkboxes
|
||||
id: terms
|
||||
attributes:
|
||||
label: Issue submission checklist
|
||||
description: By submitting this issue, you agree to follow our [Code of Conduct](https://github.com/intel/intel-one-mono/blob/main/CODE_OF_CONDUCT.md)
|
||||
options:
|
||||
- label: I report the issue. It's not a question
|
||||
required: true
|
||||
- label: I checked the problem with the documentation, FAQ, open issues, Stack Overflow, etc., and have not found the solution
|
||||
required: true
|
||||
30
modules/openvino-master/.github/ISSUE_TEMPLATE/documentation.yml
vendored
Normal file
@ -0,0 +1,30 @@
|
||||
name: Documentation issue Report
|
||||
description: This report is related to the documentation
|
||||
title: "[Docs]: "
|
||||
labels: ["docs", "support_request"]
|
||||
body:
|
||||
- type: markdown
|
||||
attributes:
|
||||
value: |
|
||||
Please provide all the necessary information to expedite the response.
|
||||
- type: input
|
||||
id: doc_link
|
||||
attributes:
|
||||
label: Documentation link
|
||||
description: Please provide the link for the documentation issue
|
||||
placeholder: e.g. intel.com/content/www/us/en/developer/tools/openvino-toolkit/system-requirements.html
|
||||
validations:
|
||||
required: true
|
||||
- type: textarea
|
||||
id: build_description
|
||||
attributes:
|
||||
label: Documentation issue description
|
||||
description: What issue are you facing with the documentation?
|
||||
placeholder: Please provide a detailed description of what happened
|
||||
validations:
|
||||
required: true
|
||||
- type: checkboxes
id: terms
attributes:
label: Issue submission checklist
|
||||
description: By submitting this issue, you agree to follow our [Code of Conduct](https://github.com/intel/intel-one-mono/blob/main/CODE_OF_CONDUCT.md)
|
||||
options:
|
||||
- label: I report the documentation issue. It's not a question
|
||||
required: true
|
||||
33
modules/openvino-master/.github/ISSUE_TEMPLATE/feature_request.yml
vendored
Normal file
@ -0,0 +1,33 @@
|
||||
name: Feature request
|
||||
description: Suggest a feature or improvement for the OpenVINO toolkit
|
||||
title: "[Feature Request]: "
|
||||
labels: ["enhancement", "feature"]
|
||||
assignees:
|
||||
- octocat
|
||||
body:
|
||||
- type: textarea
|
||||
id: request_description
|
||||
attributes:
|
||||
label: Request Description
|
||||
description: What is the request you would like us to improve on?
|
||||
placeholder: Please provide a detailed description of your request
|
||||
value: "To have OpenVINO support yolov8 model (with description)"
|
||||
validations:
|
||||
required: true
|
||||
- type: textarea
|
||||
id: feature_usecase
|
||||
attributes:
|
||||
label: Feature Use Case
|
||||
description: What is the use case of the feature you are proposing?
|
||||
placeholder: Please provide the use case where this will be useful
|
||||
value: "Recent autonomous vehicles have been using the yolov8 model to perform object segmentation."
|
||||
validations:
|
||||
required: false
|
||||
- type: checkboxes
|
||||
id: check2
|
||||
attributes:
|
||||
label: Issue submission checklist
|
||||
description: By submitting this issue, you agree to follow our [Code of Conduct](https://github.com/intel/intel-one-mono/blob/main/CODE_OF_CONDUCT.md)
|
||||
options:
|
||||
- label: The feature request or improvement must be related to OpenVINO
|
||||
required: true
|
||||
148
modules/openvino-master/.github/ISSUE_TEMPLATE/performance.yml
vendored
Normal file
@ -0,0 +1,148 @@
|
||||
name: Performance Issue Report
|
||||
description: This report is for the performance-related issue
|
||||
title: "[Performance]: "
|
||||
labels: ["performance", "support_request"]
|
||||
body:
|
||||
- type: markdown
|
||||
attributes:
|
||||
value: |
|
||||
Please provide all the necessary information to expedite the response.
|
||||
- type: input
|
||||
id: ov_version
|
||||
attributes:
|
||||
label: OpenVINO Version
|
||||
description: OpenVINO version, branch, or tag in OpenVINO GitHub
|
||||
placeholder: 2021.4.0 LTS / Master Branch / tag 2022.3.0
|
||||
validations:
|
||||
required: false
|
||||
- type: dropdown
|
||||
id: os
|
||||
attributes:
|
||||
label: Operating System
|
||||
description: What OS are you using?
|
||||
options:
|
||||
- Ubuntu 18.04 (LTS)
|
||||
- Ubuntu 20.04 (LTS)
|
||||
- Ubuntu 22.04 (LTS)
|
||||
- Windows System
|
||||
- Red Hat Enterprise Linux 8
|
||||
- OpenSUSE
|
||||
- Android System
|
||||
- Raspbian Stretch OS
|
||||
- macOS Systems for Intel CPU
|
||||
- macOS Systems for Apple Silicon
|
||||
- WebAssembly
|
||||
- WSL2 on Windows
|
||||
- Other (Please specify in description)
|
||||
validations:
|
||||
required: true
|
||||
- type: dropdown
|
||||
id: device_use
|
||||
attributes:
|
||||
label: Device used for inference
|
||||
description: What hardware are you using for inference?
|
||||
options:
|
||||
- CPU
|
||||
- iGPU
|
||||
- dGPU
|
||||
- NPU
|
||||
validations:
|
||||
required: false
|
||||
- type: dropdown
|
||||
id: openvino_installation
|
||||
attributes:
|
||||
label: OpenVINO installation
|
||||
description: How did you install OpenVINO on your system?
|
||||
options:
|
||||
- PyPi
|
||||
- Docker
|
||||
- Build from source
|
||||
- Other virtual machines
|
||||
validations:
|
||||
required: true
|
||||
- type: dropdown
|
||||
id: openvino_api
|
||||
attributes:
|
||||
label: Programming Language
|
||||
description: What is the programming language you use in your performance test?
|
||||
options:
|
||||
- Python
|
||||
- C++
|
||||
- Other
|
||||
validations:
|
||||
required: true
|
||||
- type: dropdown
|
||||
id: architecture
|
||||
attributes:
|
||||
label: Hardware Architecture
|
||||
description: What is your hardware architecture used in this test?
|
||||
options:
|
||||
- x86 (64 bits)
|
||||
- x86 (32 bits)
|
||||
- ARM (64 bits)
|
||||
- ARM (32 bits)
|
||||
- RISC-V
|
||||
- Other (please specify in the description)
|
||||
validations:
|
||||
required: true
|
||||
- type: input
|
||||
id: model_name
|
||||
attributes:
|
||||
label: Model used
|
||||
description: Please provide us the link to your model in the description
|
||||
placeholder: ResNet50 / YOLOv4
|
||||
validations:
|
||||
required: true
|
||||
- type: dropdown
|
||||
id: model_quantized
|
||||
attributes:
|
||||
label: Quantized model?
|
||||
description: Is your model quantized (e.g., with POT)?
|
||||
options:
|
||||
- 'Yes'
|
||||
- 'No'
|
||||
validations:
|
||||
required: true
|
||||
- type: textarea
|
||||
id: target_platform
|
||||
attributes:
|
||||
label: Target Platform
|
||||
description: |
|
||||
You can also provide us the full system log with the following commands
|
||||
Windows cmd - "systeminfo"
|
||||
Linux terminal - "lscpu" and "lscpu -e"
|
||||
placeholder: Paste your full platform/system information here
|
||||
validations:
|
||||
required: false
|
||||
- type: textarea
|
||||
id: performance_description
|
||||
attributes:
|
||||
label: Performance issue description
|
||||
description: What issue are you having, and what did you expect to happen instead?
|
||||
placeholder: Please provide a detailed description of what happened
|
||||
value: |
|
||||
Let us know the application you use for performance tests,
|
||||
e.g., hello_classification.py / benchmark_app / own test script
|
||||
If you are using your own test script, can the issue be reproduced using benchmark_app?
|
||||
validations:
|
||||
required: true
|
||||
- type: textarea
|
||||
id: step_by_step
|
||||
attributes:
|
||||
label: Step-by-step reproduction
|
||||
description: How can we reproduce your issue?
|
||||
placeholder: Please provide detailed instructions on how to reproduce the issue
|
||||
validations:
|
||||
required: false
|
||||
- type: checkboxes
|
||||
id: terms
|
||||
attributes:
|
||||
label: Issue submission checklist
|
||||
description: By submitting this issue, you agree to follow our [Code of Conduct](https://github.com/intel/intel-one-mono/blob/main/CODE_OF_CONDUCT.md)
|
||||
options:
|
||||
- label: I report the performance issue. It's not a question
|
||||
required: true
|
||||
- label: I checked the problem with the documentation, FAQ, open issues, Stack Overflow, etc., and have not found the solution
|
||||
required: true
|
||||
- label: There is reproducer code and related data files such as images, videos, models, etc.
|
||||
required: true
|
||||
36
modules/openvino-master/.github/ISSUE_TEMPLATE/pre_release_feedback.yml
vendored
Normal file
@ -0,0 +1,36 @@
|
||||
name: Pre-release Feedback
|
||||
description: Feedback on Pre-release OpenVINO
|
||||
title: "[Pre-Release Feedback]:"
|
||||
labels: ["Pre-release", "support_request"]
|
||||
body:
|
||||
- type: input
|
||||
id: pre_version
|
||||
attributes:
|
||||
label: Pre-release Version
|
||||
description: What is the pre-release version you are using?
|
||||
placeholder: 2023.0.0.dev20230427
|
||||
validations:
|
||||
required: true
|
||||
- type: textarea
|
||||
id: feedback
|
||||
attributes:
|
||||
label: Pre-release feedback
|
||||
description: What is the issue or feedback on the pre-release?
|
||||
placeholder: Please describe the issue and/or feedback
|
||||
value: "Inference performance drop in OpenVINO 2022.4."
|
||||
validations:
|
||||
required: true
|
||||
- type: textarea
|
||||
id: thoughts
|
||||
attributes:
|
||||
label: New Feature Feedback?
|
||||
description: Do you have any feedback on the new features released in the pre-release?
|
||||
placeholder: Any thoughts on the new feature are welcome
|
||||
validations:
|
||||
required: false
|
||||
- type: markdown
|
||||
attributes:
|
||||
value: |
|
||||
By submitting this issue, you agree to follow our [Code of Conduct](https://github.com/intel/intel-one-mono/blob/main/CODE_OF_CONDUCT.md)
|
||||
Have a new feature you would like to see in OpenVINO? File a feature request <a href="https://github.com/openvinotoolkit/openvino/issues/new/choose">HERE</a>.
|
||||
You can also implement the feature by following the <a href="https://github.com/openvinotoolkit/openvino/blob/master/CONTRIBUTING.md">CONTRIBUTING.MD</a>
|
||||
159
modules/openvino-master/.github/dependabot.yml
vendored
Normal file
@ -0,0 +1,159 @@
|
||||
# See help here: https://docs.github.com/en/free-pro-team@latest/github/administering-a-repository/enabling-and-disabling-version-updates
|
||||
|
||||
version: 2
|
||||
updates:
|
||||
#
|
||||
# Python product dependencies
|
||||
#
|
||||
|
||||
# Python API, Frontends
|
||||
- package-ecosystem: pip
|
||||
directory: "/src/bindings/python/"
|
||||
schedule:
|
||||
interval: "daily"
|
||||
time: "09:00"
|
||||
timezone: "Poland"
|
||||
open-pull-requests-limit: 3
|
||||
assignees:
|
||||
- "jiwaszki"
|
||||
- "p-wysocki"
|
||||
- "akuporos"
|
||||
- "rkazants"
|
||||
- "ceciliapeng2011"
|
||||
- "meiyang-intel"
|
||||
- "mbencer"
|
||||
- "tomdol"
|
||||
- "jane-intel"
|
||||
versioning-strategy: increase-if-necessary
|
||||
|
||||
# Tests
|
||||
- package-ecosystem: pip
|
||||
directory: "/tests"
|
||||
schedule:
|
||||
interval: "daily"
|
||||
time: "09:00"
|
||||
timezone: "Poland"
|
||||
open-pull-requests-limit: 3
|
||||
assignees:
|
||||
- "jiwaszki"
|
||||
- "p-wysocki"
|
||||
- "akuporos"
|
||||
- "rkazants"
|
||||
versioning-strategy: increase-if-necessary
|
||||
|
||||
# Model Optimizer, openvino_dev and Benchmark tool
|
||||
- package-ecosystem: pip
|
||||
directory: "/tools"
|
||||
schedule:
|
||||
interval: "daily"
|
||||
time: "09:00"
|
||||
timezone: "Asia/Dubai"
|
||||
open-pull-requests-limit: 3
|
||||
assignees:
|
||||
- "rkazants"
|
||||
- "andrei-kochin"
|
||||
- "jiwaszki"
|
||||
- "p-wysocki"
|
||||
- "akuporos"
|
||||
- "Wovchena"
|
||||
allow:
|
||||
- dependency-name: "*"
|
||||
dependency-type: "production"
|
||||
versioning-strategy: increase-if-necessary
|
||||
|
||||
# POT requirements
|
||||
- package-ecosystem: pip
|
||||
directory: "/tools/pot"
|
||||
schedule:
|
||||
interval: "daily"
|
||||
time: "09:00"
|
||||
timezone: "Asia/Dubai"
|
||||
open-pull-requests-limit: 3
|
||||
assignees:
|
||||
- "AlexKoff88"
|
||||
- "KodiaqQ"
|
||||
- "jiwaszki"
|
||||
- "p-wysocki"
|
||||
- "akuporos"
|
||||
- "rkazants"
|
||||
versioning-strategy: increase-if-necessary
|
||||
|
||||
#
|
||||
# Python Samples
|
||||
#
|
||||
|
||||
- package-ecosystem: pip
|
||||
directory: "/samples/python/hello_reshape_ssd/"
|
||||
schedule:
|
||||
interval: "daily"
|
||||
time: "09:00"
|
||||
timezone: "Asia/Dubai"
|
||||
open-pull-requests-limit: 3
|
||||
assignees:
|
||||
- "Wovchena"
|
||||
- "jiwaszki"
|
||||
- "p-wysocki"
|
||||
- "akuporos"
|
||||
- "rkazants"
|
||||
versioning-strategy: increase-if-necessary
|
||||
|
||||
- package-ecosystem: pip
|
||||
directory: "/samples/python/classification_sample_async/"
|
||||
schedule:
|
||||
interval: "daily"
|
||||
time: "09:00"
|
||||
timezone: "Asia/Dubai"
|
||||
open-pull-requests-limit: 3
|
||||
assignees:
|
||||
- "Wovchena"
|
||||
- "jiwaszki"
|
||||
- "p-wysocki"
|
||||
- "akuporos"
|
||||
- "rkazants"
|
||||
versioning-strategy: increase-if-necessary
|
||||
|
||||
- package-ecosystem: pip
|
||||
directory: "/samples/python/benchmark/bert_benchmark/"
|
||||
schedule:
|
||||
interval: "daily"
|
||||
time: "09:00"
|
||||
timezone: "Asia/Dubai"
|
||||
open-pull-requests-limit: 3
|
||||
assignees:
|
||||
- "Wovchena"
|
||||
- "jiwaszki"
|
||||
- "p-wysocki"
|
||||
- "akuporos"
|
||||
- "rkazants"
|
||||
versioning-strategy: increase-if-necessary
|
||||
|
||||
- package-ecosystem: pip
|
||||
directory: "/samples/python/hello_classification/"
|
||||
schedule:
|
||||
interval: "daily"
|
||||
time: "09:00"
|
||||
timezone: "Asia/Dubai"
|
||||
open-pull-requests-limit: 3
|
||||
assignees:
|
||||
- "Wovchena"
|
||||
- "jiwaszki"
|
||||
- "p-wysocki"
|
||||
- "akuporos"
|
||||
- "rkazants"
|
||||
versioning-strategy: increase-if-necessary
|
||||
|
||||
#
|
||||
# Github actions - CI
|
||||
#
|
||||
|
||||
# Github actions
|
||||
- package-ecosystem: github-actions
|
||||
directory: "/"
|
||||
schedule:
|
||||
interval: "daily"
|
||||
time: "09:00"
|
||||
timezone: "Asia/Dubai"
|
||||
assignees:
|
||||
- "ilyachur"
|
||||
- "ilya-lavrenov"
|
||||
open-pull-requests-limit: 3
|
||||
13
modules/openvino-master/.github/dependency_review.yml
vendored
Normal file
@ -0,0 +1,13 @@
|
||||
fail-on-severity: 'low'
allow-licenses:
  - 'BSD-2-Clause'
  - 'BSD-3-Clause'
  - 'BSD-2-Clause AND BSD-3-Clause'
  - 'MIT'
  - 'Apache-2.0'
fail-on-scopes:
  - 'runtime'
  - 'development'
  - 'unknown'
license-check: true
vulnerability-check: true
|
||||
0
modules/openvino-master/.github/github_org_control/__init__.py
vendored
Normal file
139
modules/openvino-master/.github/github_org_control/check_org.py
vendored
Normal file
@ -0,0 +1,139 @@
|
||||
# Copyright (C) 2018-2021 Intel Corporation
|
||||
# SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
"""
|
||||
Check GitHub organization and invite members
|
||||
"""
|
||||
|
||||
# pylint: disable=fixme,no-member,too-many-locals
|
||||
|
||||
import sys
|
||||
from pathlib import Path
|
||||
from argparse import ArgumentParser
|
||||
|
||||
sys.path.append(str(Path(__file__).resolve().parents[1]))
|
||||
from github_org_control.configs import Config
|
||||
from github_org_control.github_api import GithubOrgApi, get_dev_emails, print_users
|
||||
from github_org_control.ldap_api import LdapApi, print_user_info, InfoLevel
|
||||
|
||||
|
||||
def remove_members(gh_api, cfg_emails, org_emails, dev_emails, org_emails_no_in_ldap):
|
||||
"""Checks and remove members"""
|
||||
print(
|
||||
f"\n{'=' * 10} Check accounts below and remove from the GitHub organization or "
|
||||
f"configuration {'=' * 10}"
|
||||
)
|
||||
|
||||
cfg_emails_no_in_org = sorted(cfg_emails.difference(org_emails))
|
||||
print(
|
||||
f"\nCfg developer emails - absent in GitHub organization {len(cfg_emails_no_in_org)}:",
|
||||
"; ".join(cfg_emails_no_in_org),
|
||||
)
|
||||
|
||||
non_member_ignored_logins = set(Config().IGNORE_LOGINS).difference(
|
||||
set(gh_api.org_members_by_login.keys())
|
||||
)
|
||||
print(
|
||||
f"\nIgnored logins - absent in GitHub organization {len(non_member_ignored_logins)}:\n",
|
||||
"\n".join(non_member_ignored_logins),
|
||||
)
|
||||
|
||||
org_emails_no_in_dev = sorted(org_emails.difference(dev_emails))
|
||||
print(
|
||||
f"\nOrg member emails - absent in cfg and LDAP PDLs {len(org_emails_no_in_dev)}:",
|
||||
"; ".join(org_emails_no_in_dev),
|
||||
)
|
||||
|
||||
print(
|
||||
f"\nOrg member emails - absent in LDAP at all {len(org_emails_no_in_ldap)}:",
|
||||
"; ".join(sorted(org_emails_no_in_ldap)),
|
||||
)
|
||||
|
||||
print("\nOrg members - no real name:")
|
||||
members_to_fix_name = sorted(gh_api.members_to_fix_name, key=lambda member: member.email)
|
||||
print_users(members_to_fix_name)
|
||||
print(
|
||||
"\nOrg member emails - no real name:",
|
||||
"; ".join([member.email.lower() for member in members_to_fix_name]),
|
||||
)
|
||||
|
||||
print("\nOrg members - no Intel emails:")
|
||||
print_users(gh_api.members_to_remove)
|
||||
|
||||
gh_api.remove_users(org_emails_no_in_ldap | gh_api.members_to_remove)
|
||||
|
||||
|
||||
def main():
|
||||
"""The main entry point function"""
|
||||
arg_parser = ArgumentParser()
|
||||
arg_parser.add_argument(
|
||||
"--cfg-file",
|
||||
metavar="PATH",
|
||||
default=Config.default_cfg_path,
|
||||
help=f"Path to json configuration file, e.g. {Config.default_cfg_path}",
|
||||
)
|
||||
arg_parser.add_argument("--teams", action="store_true", help="Check GitHub teams")
|
||||
arg_parser.add_argument("--no-ldap", action="store_true", help="Don't use LDAP info")
|
||||
args, unknown_args = arg_parser.parse_known_args()
|
||||
|
||||
Config(args.cfg_file, unknown_args)
|
||||
gh_api = GithubOrgApi()
|
||||
|
||||
if args.teams:
|
||||
gh_api.get_org_teams()
|
||||
return
|
||||
|
||||
cfg_emails = get_dev_emails()
|
||||
print(f"\nCfg developer emails {len(cfg_emails)}:", "; ".join(sorted(cfg_emails)))
|
||||
|
||||
dev_emails = set()
|
||||
dev_emails.update(cfg_emails)
|
||||
|
||||
if not args.no_ldap:
|
||||
ldap_api = LdapApi()
|
||||
ldap_emails = ldap_api.get_user_emails()
|
||||
dev_emails.update(ldap_emails)
|
||||
print(f"\nLDAP developer emails {len(ldap_emails)}:", "; ".join(sorted(ldap_emails)))
|
||||
|
||||
cfg_emails_no_in_ldap = ldap_api.get_absent_emails(cfg_emails)
|
||||
print(
|
||||
f"\nCfg developer emails - absent in LDAP at all {len(cfg_emails_no_in_ldap)}:",
|
||||
"; ".join(sorted(cfg_emails_no_in_ldap)),
|
||||
)
|
||||
|
||||
cfg_ldap_inters = cfg_emails.intersection(ldap_emails)
|
||||
print(
|
||||
f"\nCfg developer emails - present in LDAP developers {len(cfg_ldap_inters)}:",
|
||||
"; ".join(sorted(cfg_ldap_inters)),
|
||||
)
|
||||
|
||||
org_emails = gh_api.get_org_emails()
|
||||
print(f"\nOrg emails {len(org_emails)}:", "; ".join(sorted(org_emails)))
|
||||
|
||||
org_emails_no_in_ldap = set()
|
||||
if not args.no_ldap:
|
||||
org_ldap_diff = org_emails.difference(ldap_emails)
|
||||
print(
|
||||
f"\nOrg member emails - absent in LDAP developers {len(org_ldap_diff)}:",
|
||||
"; ".join(sorted(org_ldap_diff)),
|
||||
)
|
||||
|
||||
for email in org_ldap_diff:
|
||||
user_info = ldap_api.get_user_info_by_email(email)
|
||||
if user_info:
|
||||
print_user_info(user_info, InfoLevel.PDL)
|
||||
else:
|
||||
org_emails_no_in_ldap.add(email)
|
||||
|
||||
org_pending_invitation_emails = gh_api.get_org_invitation_emails()
|
||||
invite_emails = dev_emails.difference(org_emails).difference(org_pending_invitation_emails)
|
||||
print(f"\nInvite emails {len(invite_emails)}:", "; ".join(sorted(invite_emails)))
|
||||
|
||||
valid_github_users = gh_api.get_valid_github_users(invite_emails)
|
||||
gh_api.invite_users(valid_github_users)
|
||||
|
||||
remove_members(gh_api, cfg_emails, org_emails, dev_emails, org_emails_no_in_ldap)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
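# Illustrative invocations, sketched from the argparse options defined above
# (paths and the token value are placeholders, not values from this repository):
#   python check_org.py --cfg-file config.json              # full org + LDAP check
#   python check_org.py --teams                             # only print GitHub teams
#   python check_org.py --no-ldap GITHUB_TOKEN=<token>      # skip LDAP, pass token as 'name=value' arg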
|
||||
261
modules/openvino-master/.github/github_org_control/check_pr.py
vendored
Normal file
@ -0,0 +1,261 @@
|
||||
# Copyright (C) 2018-2021 Intel Corporation
|
||||
# SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
"""
|
||||
Check GitHub PRs and set labels by type and categories, e.g. 'ExternalPR', 'category: ci'
|
||||
"""
|
||||
|
||||
# pylint: disable=fixme,no-member
|
||||
|
||||
import re
|
||||
import sys
|
||||
import datetime
|
||||
from enum import Enum
|
||||
from pathlib import Path
|
||||
from argparse import ArgumentParser
|
||||
|
||||
sys.path.append(str(Path(__file__).resolve().parents[1]))
|
||||
from github_org_control import github_api
|
||||
from github_org_control.configs import Config
|
||||
|
||||
|
||||
class PrType(Enum):
|
||||
"""Constants for type of GitHub pull request by author membership"""
|
||||
|
||||
EXTERNAL = "ExternalPR"
|
||||
INTEL = "ExternalIntelPR"
|
||||
ORG = "OpenvinoPR"
|
||||
BAD = "BadPR"
|
||||
|
||||
|
||||
def get_pr_labels(pull):
|
||||
"""Gets PR labels as set"""
|
||||
pr_labels = set()
|
||||
for label in pull.labels:
|
||||
pr_labels.add(label.name)
|
||||
return pr_labels
|
||||
|
||||
|
||||
def set_pr_labels(pull, labels):
|
||||
"""Sets new PR labels (all previously set labels are removed)"""
|
||||
if not labels or Config().DRY_RUN:
|
||||
return
|
||||
print("Set PR labels:", labels)
|
||||
# set_labels() should accept list but fails with empty "AssertionError:"
|
||||
pull.set_labels(labels)
|
||||
|
||||
|
||||
def add_pr_labels(pull, labels):
|
||||
"""Adds PR labels"""
|
||||
if not labels or Config().DRY_RUN:
|
||||
return
|
||||
print("Add PR labels:", labels)
|
||||
for label in labels:
|
||||
pull.add_to_labels(label)
|
||||
|
||||
|
||||
def get_pr_type_by_labels(pull):
|
||||
"""Gets PR type using labels"""
|
||||
pr_labels = get_pr_labels(pull)
|
||||
pr_types = set(type.value for type in PrType)
|
||||
pr_types_labels = pr_labels & pr_types
|
||||
if not pr_types_labels:
|
||||
return None
|
||||
if len(pr_types_labels) > 1:
|
||||
print(f"Duplicated labels: {pr_types_labels}")
|
||||
return PrType.BAD
|
||||
return PrType(pr_types_labels.pop())
|
||||
|
||||
|
||||
def get_label_by_team_name_re(team_name):
|
||||
"""Generates label by PR reviwer team name using regular expressions"""
|
||||
if "admins" in team_name:
|
||||
return "category: ci"
|
||||
re_compile_label = re.compile(rf"{Config().GITHUB_REPO}-(.+)-maintainers")
|
||||
re_label = re_compile_label.match(team_name)
|
||||
if re_label:
|
||||
return f"category: {re_label.group(1).strip()}"
|
||||
return None
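# Example of the mapping above (illustrative only; the team names are made up
# and GITHUB_REPO is assumed to be "openvino" as in config.json):
def _team_name_label_example():
    """Not called anywhere; shows how a reviewer team name becomes a category label."""
    assert get_label_by_team_name_re("openvino-admins") == "category: ci"
    assert get_label_by_team_name_re("openvino-docs-maintainers") == "category: docs"
    assert get_label_by_team_name_re("random-team") is None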
|
||||
|
||||
|
||||
def get_label_by_team_name_map(team_name):
|
||||
"""Generates label by PR reviwer team name using config map"""
|
||||
return Config().TEAM_TO_LABEL.get(team_name)
|
||||
|
||||
|
||||
def get_category_labels(pull):
|
||||
"""Gets list of category labels by all PR reviwer teams"""
|
||||
labels = []
|
||||
pr_labels = get_pr_labels(pull)
|
||||
for reviewer_team in pull.get_review_requests()[1]:
|
||||
reviewer_label = get_label_by_team_name_map(reviewer_team.name)
|
||||
if reviewer_label and reviewer_label not in pr_labels:
|
||||
labels.append(reviewer_label)
|
||||
return labels
|
||||
|
||||
|
||||
def get_pr_info_str(pull):
|
||||
"""Gets info about PR using a few workarounds"""
|
||||
pr_title = pull.title.encode("ASCII", "ignore").decode()
|
||||
|
||||
# Workaround for PyGithub issue: https://github.com/PyGithub/PyGithub/issues/512
|
||||
pr_created_at = pull.created_at.replace(tzinfo=datetime.timezone.utc).astimezone()
|
||||
|
||||
return (
|
||||
f"PR: {pull.number} - {pr_title} - Created: {pr_created_at} - "
|
||||
f"Labels: {get_pr_labels(pull)} - Type: {get_pr_type_by_labels(pull)}"
|
||||
)
|
||||
|
||||
|
||||
def update_labels(gh_api, pull, non_org_intel_pr_users, non_org_pr_users):
|
||||
"""Checks and updates labels"""
|
||||
print("Check and update labels:")
|
||||
pr_type_by_labels = get_pr_type_by_labels(pull)
|
||||
add_labels = []
|
||||
|
||||
# Checks PR source type
|
||||
if gh_api.is_org_user(pull.user):
|
||||
print(" - Org user")
|
||||
elif github_api.is_intel_email(pull.user.email) or github_api.is_intel_company(
|
||||
pull.user.company
|
||||
):
|
||||
print(" - Non org user with Intel email or company")
|
||||
non_org_intel_pr_users.add(pull.user)
|
||||
if pr_type_by_labels is not PrType.INTEL:
|
||||
print(f'NO "{PrType.INTEL.value}" label: ', end="")
|
||||
github_api.print_users(pull.user)
|
||||
add_labels.append(PrType.INTEL.value)
|
||||
elif github_api.is_user_ignored(pull.user):
|
||||
print(" - IGNORED non org user with NO Intel email or company")
|
||||
else:
|
||||
print(" - Non org user with NO Intel email or company")
|
||||
non_org_pr_users.add(pull.user)
|
||||
if pr_type_by_labels is not PrType.EXTERNAL:
|
||||
print(f'NO "{PrType.EXTERNAL.value}" label: ', end="")
|
||||
github_api.print_users(pull.user)
|
||||
add_labels.append(PrType.EXTERNAL.value)
|
||||
|
||||
add_labels += get_category_labels(pull)
|
||||
add_pr_labels(pull, add_labels)
|
||||
|
||||
|
||||
def get_wrong_commits(pull):
|
||||
"""Returns commits with incorrect user and email"""
|
||||
pr_author_email = (pull.user.email or "").lower()
|
||||
print("GitHub PR author email:", pr_author_email)
|
||||
print("Check commits:")
|
||||
wrong_commits = set()
|
||||
for commit in pull.get_commits():
|
||||
# import pprint; pprint.pprint(commit.raw_data)
|
||||
print("Commit SHA:", commit.sha)
|
||||
# Use raw data because commit author can be non GitHub user
|
||||
commit_author_email = (commit.raw_data["commit"]["author"]["email"] or "").lower()
|
||||
commit_committer_email = (commit.raw_data["commit"]["committer"]["email"] or "").lower()
|
||||
print(" Commit author email:", commit_author_email)
|
||||
print(" Commit committer email:", commit_committer_email)
|
||||
if not github_api.is_valid_user(commit.author):
|
||||
print(
|
||||
" ERROR: User with the commit author email is absent in GitHub:",
|
||||
commit.raw_data["commit"]["author"]["name"],
|
||||
)
|
||||
wrong_commits.add(commit.sha)
|
||||
if not github_api.is_valid_user(commit.committer):
|
||||
print(
|
||||
" ERROR: User with the commit committer email is absent in GitHub:",
|
||||
commit.raw_data["commit"]["committer"]["name"],
|
||||
)
|
||||
wrong_commits.add(commit.sha)
|
||||
if not commit.raw_data["commit"]["verification"]["verified"]:
|
||||
print(
|
||||
" WARNING: The commit is not verified. Reason:",
|
||||
commit.raw_data["commit"]["verification"]["reason"],
|
||||
)
|
||||
if pr_author_email != commit_author_email or pr_author_email != commit_committer_email:
|
||||
print(" WARNING: Commit emails and GitHub PR author public email are differnt")
|
||||
return wrong_commits
|
||||
|
||||
|
||||
def main():
|
||||
"""The main entry point function"""
|
||||
arg_parser = ArgumentParser()
|
||||
arg_parser.add_argument(
|
||||
"--cfg-file",
|
||||
metavar="PATH",
|
||||
default=Config.default_cfg_path,
|
||||
help=f"Path to json configuration file, e.g. {Config.default_cfg_path}",
|
||||
)
|
||||
arg_parser.add_argument(
|
||||
"--pr", metavar="NUMBER", help="Get GitHub pull request with the number"
|
||||
)
|
||||
arg_parser.add_argument(
|
||||
"--pr-state",
|
||||
default="open",
|
||||
choices=["open", "closed"],
|
||||
help="Set GitHub pull request state",
|
||||
)
|
||||
arg_parser.add_argument(
|
||||
"--newer", metavar="MINUTES", help="Get newly created GitHub pull request only"
|
||||
)
|
||||
arg_parser.add_argument(
|
||||
"--check-commits",
|
||||
action="store_true",
|
||||
help="Check and compare git commit email with GitHub account email",
|
||||
)
|
||||
args, unknown_args = arg_parser.parse_known_args()
|
||||
|
||||
Config(args.cfg_file, unknown_args)
|
||||
gh_api = github_api.GithubOrgApi()
|
||||
|
||||
if args.pr:
|
||||
pulls = [gh_api.repo.get_pull(int(args.pr))]
|
||||
else:
|
||||
pulls = gh_api.repo.get_pulls(state=args.pr_state)
|
||||
print(f"\nPRs count ({args.pr_state}):", pulls.totalCount)
|
||||
|
||||
if args.newer:
|
||||
pr_created_after = (
|
||||
datetime.datetime.now() - datetime.timedelta(minutes=int(args.newer))
|
||||
).astimezone()
|
||||
print("Checking PRs created after:", pr_created_after)
|
||||
|
||||
non_org_intel_pr_users = set()
|
||||
non_org_pr_users = set()
|
||||
wrong_pulls = {}
|
||||
|
||||
for pull in pulls:
|
||||
pr_created_at = pull.created_at.replace(tzinfo=datetime.timezone.utc).astimezone()
|
||||
if args.newer and pr_created_at <= pr_created_after:
|
||||
print(f"\nIGNORE: {get_pr_info_str(pull)}")
|
||||
continue
|
||||
|
||||
print(f"\n{get_pr_info_str(pull)}")
|
||||
if args.check_commits:
|
||||
wrong_commits = get_wrong_commits(pull)
|
||||
if wrong_commits:
|
||||
wrong_pulls[pull.number] = wrong_commits
|
||||
else:
|
||||
update_labels(gh_api, pull, non_org_intel_pr_users, non_org_pr_users)
|
||||
|
||||
if wrong_pulls:
|
||||
for pull_number, wrong_commits in wrong_pulls.items():
|
||||
print(
|
||||
f"\nERROR: Remove or replace wrong commits in the PR {pull_number}:\n ",
|
||||
"\n ".join(wrong_commits),
|
||||
)
|
||||
print(
|
||||
"\nAbout commit signature verification:\n ",
|
||||
"https://docs.github.com/en/github/authenticating-to-github/"
|
||||
"managing-commit-signature-verification/about-commit-signature-verification",
|
||||
)
|
||||
sys.exit(1)
|
||||
|
||||
if non_org_intel_pr_users:
|
||||
print("\nNon org user with Intel email or company:")
|
||||
github_api.print_users(non_org_intel_pr_users)
|
||||
if non_org_pr_users:
|
||||
print("\nNon org user with NO Intel email or company:")
|
||||
github_api.print_users(non_org_pr_users)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
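# Illustrative invocations, sketched from the argparse options defined above
# (the PR number and time window are placeholders):
#   python check_pr.py --pr 12345 --check-commits   # validate commit emails of one PR
#   python check_pr.py --newer 60                   # label PRs created within the last hour
#   python check_pr.py --pr-state closed            # iterate over closed PRs instead of open ones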
|
||||
51
modules/openvino-master/.github/github_org_control/config.json
vendored
Normal file
@ -0,0 +1,51 @@
|
||||
{
|
||||
"GITHUB_TOKEN": "<Put token here or set as arg or as env variable>",
|
||||
"GITHUB_ORGANIZATION": "openvinotoolkit",
|
||||
"GITHUB_REPO": "openvino",
|
||||
"IGNORE_LOGINS": [
|
||||
"openvino-ci",
|
||||
"openvino-pushbot",
|
||||
"workbench-ci-bot",
|
||||
"openvino-pot-ci",
|
||||
"sysicvvpux",
|
||||
"ote-ci-bot"
|
||||
],
|
||||
"MAX_MEMBERS_TO_REMOVE": 15,
|
||||
"EMAILS_FILE_PATH": "dev_emails-test.txt",
|
||||
"PROXIES": {
|
||||
"HTTP_PROXY": null,
|
||||
"HTTPS_PROXY": null,
|
||||
"NO_PROXY": "localhost,127.0.0.1,.intel.com"
|
||||
},
|
||||
"DRY_RUN": false,
|
||||
"TEAM_TO_LABEL": {
|
||||
"openvino-ci-maintainers": "category: CI",
|
||||
"openvino-maintainers": "category: inference",
|
||||
"openvino-docs-maintainers": "category: docs",
|
||||
"openvino-ie-maintainers": "category: inference",
|
||||
"openvino-ie-cpu-maintainers": "category: CPU",
|
||||
"openvino-ie-gna-maintainers": "category: GNA",
|
||||
"openvino-ie-gpu-maintainers": "category: GPU",
|
||||
"openvino-ie-lpt-maintainers": "category: LP transformations",
|
||||
"openvino-ie-transformations-maintainers": "category: transformations",
|
||||
"openvino-ie-auto-multi-maintainers": "category: AUTO",
|
||||
"openvino-auto-batch-maintainers": "category: AUTO BATCH",
|
||||
"openvino-hetero-maintainers": "category: HETERO",
|
||||
"openvino-ie-python-api-maintainers": "category: Python API",
|
||||
"openvino-ie-template-maintainers": "category: TEMPLATE",
|
||||
"openvino-ir-frontend-maintainers": "category: IR FE",
|
||||
"openvino-ie-paddle-maintainers": "category: PDPD FE",
|
||||
"openvino-tf-frontend-maintainers": "category: TF FE",
|
||||
"openvino-onnx-frontend-maintainers": "category: ONNX FE",
|
||||
"openvino-ie-tests-maintainers": "category: IE Tests",
|
||||
"openvino-mo-maintainers": "category: MO",
|
||||
"openvino-ngraph-maintainers": "category: Core",
|
||||
"openvino-scripts-maintainers": "category: build",
|
||||
"openvino-tests-maintainers": "category: IE Tests",
|
||||
"openvino-tools-maintainers": "category: tools",
|
||||
"openvino-pot-maintainers": "category: POT",
|
||||
"openvino-configuration-mgmt": "category: dependency_changes",
|
||||
"openvino-samples-maintainers": "category: samples",
|
||||
"openvino-c-api-maintainers": "category: C API"
|
||||
}
|
||||
}
|
||||
120
modules/openvino-master/.github/github_org_control/configs.py
vendored
Normal file
@ -0,0 +1,120 @@
|
||||
# Copyright (C) 2018-2021 Intel Corporation
|
||||
# SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
"""
|
||||
Configurations management
|
||||
"""
|
||||
|
||||
# pylint: disable=fixme,broad-except
|
||||
|
||||
import os
|
||||
import sys
|
||||
import ast
|
||||
import json
|
||||
from pathlib import Path
|
||||
|
||||
|
||||
if sys.version_info[:2] < (3, 7):
|
||||
raise Exception("Python version must be >= 3.7")
|
||||
|
||||
|
||||
class ConfigException(Exception):
|
||||
"""Base configuration exception"""
|
||||
|
||||
|
||||
class Config:
|
||||
"""Configuration wrapper"""
|
||||
|
||||
_instance = None
|
||||
_properties = None
|
||||
|
||||
default_cfg_path = Path(__file__).resolve().parent / "config.json"
|
||||
|
||||
def __new__(cls, *_args, **_kwargs):
|
||||
if not Config._instance:
|
||||
Config._instance = super(Config, cls).__new__(cls)
|
||||
return Config._instance
|
||||
|
||||
def __init__(self, file_path=None, cli_args=None):
|
||||
"""
|
||||
:param file_path: Path to json configuration file
|
||||
:type file_path: String
|
||||
|
||||
:param args: List of argparse arguments with patterns: 'name=value' or 'name'
|
||||
:type args: list
|
||||
"""
|
||||
if Config._properties:
|
||||
return
|
||||
|
||||
self._file_path = file_path or Config.default_cfg_path
|
||||
self._cli_args = cli_args or []
|
||||
|
||||
self._json_cfg = {}
|
||||
self._args = {}
|
||||
|
||||
self._load_cfg()
|
||||
self._parse_cli_args()
|
||||
|
||||
Config._properties = {}
|
||||
for name, value in self._json_cfg.items():
|
||||
if hasattr(self, name):
|
||||
raise ConfigException(f"Duplicating prosperity: {name}")
|
||||
property_value = self._args.get(name) or os.getenv(name)
|
||||
if property_value:
|
||||
# Try to set property_value as Python literal structures, e.g. DRY_RUN=False
|
||||
try:
|
||||
property_value = ast.literal_eval(property_value)
|
||||
except Exception:
|
||||
pass
|
||||
if not isinstance(property_value, type(value)):
|
||||
raise ConfigException(f"Python type of {name} parameter must be {type(value)}")
|
||||
else:
|
||||
property_value = value
|
||||
Config._properties[name] = property_value
|
||||
|
||||
self.set_proxy()
|
||||
|
||||
def __getattr__(self, attr_name):
|
||||
if attr_name in self._properties:
|
||||
return self._properties.get(attr_name)
|
||||
raise AttributeError(f"'{self.__class__.__name__}' object has no attribute '{attr_name}'")
|
||||
|
||||
def _load_cfg(self):
|
||||
"""Load the json configuration file"""
|
||||
try:
|
||||
with open(self._file_path, encoding="utf-8") as conf:
|
||||
self._json_cfg = json.load(conf)
|
||||
except Exception as exc:
|
||||
raise ConfigException("Failed to load configuration from:", self._file_path) from exc
|
||||
|
||||
def _parse_cli_args(self):
|
||||
"""Parse argparse arguments with patterns: 'name=value' or 'name'"""
|
||||
for cli_arg in self._cli_args:
|
||||
arg = cli_arg.split("=")
|
||||
if arg[0] not in self._json_cfg:
|
||||
raise ConfigException(f"Unsupported argument: {arg}")
|
||||
self._args[arg[0]] = True if len(arg) == 1 else "=".join(arg[1:])
|
||||
|
||||
@property
|
||||
def properties(self):
|
||||
"""Get all properties as Dict"""
|
||||
return self._properties
|
||||
|
||||
def set_proxy(self):
|
||||
"""Set proxies"""
|
||||
for proxy_name, url in self._properties["PROXIES"].items():
|
||||
if url is not None:
|
||||
print(f"Set proxy: {proxy_name}={url}")
|
||||
os.environ[proxy_name] = url
|
||||
|
||||
|
||||
def _test():
|
||||
"""Test and debug"""
|
||||
print("Config.default_cfg_path:", Config.default_cfg_path)
|
||||
cfg = Config(cli_args=["DRY_RUN", 'PROXIES={"NO_PROXY": "localhost"}'])
|
||||
print("Config.properties:", cfg.properties)
|
||||
print("cfg.PROXIES:", cfg.PROXIES)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
_test()
|
||||
9
modules/openvino-master/.github/github_org_control/dev_emails-test.txt
vendored
Normal file
@ -0,0 +1,9 @@
|
||||
# good comment
Last_name, First_name <first_name.last_name@intel.com>
first_name.last_name@intel.com
openvino_pushbot@intel.com

# Wrong emails
foo@foo.com
foo1 foo2
foo1 foo2@intel.com
|
||||
384
modules/openvino-master/.github/github_org_control/github_api.py
vendored
Normal file
@ -0,0 +1,384 @@
|
||||
# Copyright (C) 2018-2021 Intel Corporation
|
||||
# SPDX-License-Identifier: Apache-2.0
|
||||
|
||||
"""
|
||||
GitHub API for controlling organization
|
||||
"""
|
||||
|
||||
# pylint: disable=fixme,no-member
|
||||
|
||||
import re
|
||||
import sys
|
||||
import time
|
||||
import typing
|
||||
from pathlib import Path
|
||||
|
||||
from github import Github, GithubException, RateLimitExceededException, IncompletableObject
|
||||
from github.PaginatedList import PaginatedList
|
||||
|
||||
sys.path.append(str(Path(__file__).resolve().parents[1]))
|
||||
from github_org_control.configs import Config
|
||||
|
||||
|
||||
class GithubApiException(Exception):
|
||||
"""Base GitHub API exception"""
|
||||
|
||||
|
||||
def is_valid_user(user):
|
||||
"""Checks that user is valid github.Github object"""
|
||||
try:
|
||||
return user and user.login
|
||||
except IncompletableObject:
|
||||
return False
|
||||
|
||||
|
||||
def is_user_ignored(user):
|
||||
"""Checks that user should be ignored"""
|
||||
if is_valid_user(user) and user.login.lower() not in Config().IGNORE_LOGINS:
|
||||
return False
|
||||
return True
|
||||
|
||||
|
||||
def is_valid_name(name):
|
||||
"""Checks that GitHub user's name is valid"""
|
||||
return name and len(name) >= 3 and " " in name
|
||||
|
||||
|
||||
def is_intel_email(email):
|
||||
"""Checks that email is valid Intel email"""
|
||||
return email and len(email) > 10 and " " not in email and email.lower().endswith("@intel.com")
|
||||
|
||||
|
||||
def is_intel_company(company):
|
||||
"""Checks that company contains intel"""
|
||||
return company and "intel" in company.lower()
|
||||
|
||||
|
||||
def is_valid_intel_user(user):
|
||||
"""Checks that user is valid GitHub and Intel user"""
|
||||
try:
|
||||
return is_valid_user(user) and is_valid_name(user.name) and is_intel_email(user.email)
|
||||
except IncompletableObject:
|
||||
return False
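# Quick illustration of the helper checks above (illustrative only; the
# addresses and names below are made up):
def _validator_examples():
    """Not called anywhere; shows what the validators accept and reject."""
    assert is_intel_email("first.last@intel.com")        # > 10 chars, no spaces, @intel.com
    assert not is_intel_email("someone@example.com")     # wrong domain
    assert is_valid_name("First Last")                    # at least 3 chars and contains a space
    assert not is_valid_name("F")
    assert is_intel_company("Intel Corporation")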
|
||||
|
||||
|
||||
def print_users(users):
|
||||
"""Print list of users in different formats: list, set, PaginatedList"""
|
||||
if isinstance(users, (list, set, PaginatedList)):
|
||||
users_count = users.totalCount if isinstance(users, PaginatedList) else len(users)
|
||||
print(f"GitHub users {users_count} (login - name - company - email - valid):")
|
||||
else:
|
||||
users = [users]
|
||||
for user in users:
|
||||
if not is_valid_user(user):
|
||||
print("WRONG GitHub user: ???")
|
||||
continue
|
||||
|
||||
try:
|
||||
name = user.name
|
||||
except IncompletableObject:
|
||||
name = "???"
|
||||
|
||||
try:
|
||||
company = user.company
|
||||
except IncompletableObject:
|
||||
company = "???"
|
||||
|
||||
try:
|
||||
email = user.email
|
||||
except IncompletableObject:
|
||||
email = "???"
|
||||
|
||||
valid_check = "OK" if is_valid_intel_user(user) else "FIX"
|
||||
if not is_intel_email(email):
|
||||
valid_check += " email"
|
||||
if not is_valid_name(name):
|
||||
valid_check += " name"
|
||||
print(f'{user.login} - "{name}" - "{company}" - {email} - {valid_check}')
|
||||
|
||||
|
||||
def get_dev_emails():
|
||||
"""
|
||||
Read a file with developer emails. Supported email formats
|
||||
first_name.last_name@intel.com
|
||||
Import from Outlook: Last_name, First_name <first_name.last_name@intel.com>
|
||||
"""
|
||||
re_email = re.compile(r".+<(.+)>")
|
||||
emails = set()
|
||||
cfg = Config()
|
||||
with open(cfg.properties["EMAILS_FILE_PATH"]) as file_obj:
|
||||
for line in file_obj:
|
||||
line = line.strip().lower()
|
||||
if not line or line.startswith("#"):
|
||||
continue
|
||||
re_outlook_email = re_email.match(line)
|
||||
if re_outlook_email:
|
||||
line = re_outlook_email.group(1).strip()
|
||||
if not is_intel_email(line):
|
||||
print(f'Wrong email in {cfg.properties["EMAILS_FILE_PATH"]}: {line}')
|
||||
continue
|
||||
emails.add(line)
|
||||
return emails
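# Example of the Outlook-style line handled by re_email in get_dev_emails()
# (illustrative only; the address is made up and already lower-cased, as the
# function lower-cases each line before matching):
def _outlook_line_example():
    """Not called anywhere; reduces an Outlook-format entry to a bare address."""
    sample = "last_name, first_name <first_name.last_name@intel.com>"
    assert re.match(r".+<(.+)>", sample).group(1) == "first_name.last_name@intel.com"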
|
||||
|
||||
|
||||
class GithubOrgApi:
|
||||
"""Common API for GitHub organization"""
|
||||
|
||||
def __init__(self):
|
||||
self._cfg = Config()
|
||||
self.github = Github(self._cfg.GITHUB_TOKEN)
|
||||
self.github_org = self.github.get_organization(self._cfg.GITHUB_ORGANIZATION)
|
||||
self.repo = self.github.get_repo(f"{self._cfg.GITHUB_ORGANIZATION}/{self._cfg.GITHUB_REPO}")
|
||||
self.github_users_by_email = {}
|
||||
self.org_members_by_login = {}
|
||||
self.members_to_remove = set()
|
||||
self.members_to_fix_name = set()
|
||||
|
||||
def is_org_user(self, user):
|
||||
"""Checks that user is a member of GitHub organization"""
|
||||
if is_valid_user(user):
|
||||
# user.get_organization_membership(self.github_org) doesn't work with org members
|
||||
# permissions, GITHUB_TOKEN must be org owner now
|
||||
return self.github_org.has_in_members(user)
|
||||
return False
|
||||
|
||||
def get_org_emails(self):
|
||||
"""Gets and prints emails of all GitHub organization members"""
|
||||
org_members = self.github_org.get_members()
|
||||
org_emails = set()
|
||||
|
||||
print(f"\nOrg members {org_members.totalCount} (login - name - company - email - valid):")
|
||||
for org_member in org_members:
|
||||
self.org_members_by_login[org_member.login.lower()] = org_member
|
||||
print_users(org_member)
|
||||
if is_intel_email(org_member.email):
|
||||
email = org_member.email.lower()
|
||||
org_emails.add(email)
|
||||
self.github_users_by_email[email] = org_member
|
||||
if not is_valid_name(org_member.name):
|
||||
self.members_to_fix_name.add(org_member)
|
||||
else:
|
||||
self.members_to_remove.add(org_member)
|
||||
|
||||
print("\nOrg members - no Intel emails:")
|
||||
print_users(self.members_to_remove)
|
||||
|
||||
print("\nOrg members - no real name:")
|
||||
print_users(self.members_to_fix_name)
|
||||
print(
|
||||
"\nOrg member emails - no real name:",
|
||||
"; ".join([member.email.lower() for member in self.members_to_fix_name]),
|
||||
)
|
||||
|
||||
return org_emails
|
||||
|
||||
def get_org_invitation_emails(self):
|
||||
"""Gets GitHub organization teams prints info"""
|
||||
org_invitations = self.github_org.invitations()
|
||||
org_invitation_emails = set()
|
||||
|
||||
print(
|
||||
f"\nOrg invitations {org_invitations.totalCount} "
|
||||
"(login - name - company - email - valid):"
|
||||
)
|
||||
for org_invitation in org_invitations:
|
||||
print_users(org_invitation)
|
||||
if is_user_ignored(org_invitation):
|
||||
continue
|
||||
if is_intel_email(org_invitation.email):
|
||||
org_invitation_emails.add(org_invitation.email.lower())
|
||||
else:
|
||||
print("Strange org invitation:", org_invitation)
|
||||
|
||||
print(
|
||||
f"\nOrg invitation emails {len(org_invitation_emails)}:",
|
||||
"; ".join(org_invitation_emails),
|
||||
)
|
||||
return org_invitation_emails
|
||||
|
||||
def get_org_teams(self):
|
||||
"""Gets GitHub organization teams prints info"""
|
||||
teams = []
|
||||
org_teams = self.github_org.get_teams()
|
||||
print("\nOrg teams count:", org_teams.totalCount)
|
||||
for team in org_teams:
|
||||
teams.append(team.name)
|
||||
print(f"\nTeam: {team.name} - parent: {team.parent}")
|
||||
|
||||
repos = team.get_repos()
|
||||
print("Repos:")
|
||||
for repo in repos:
|
||||
print(f" {repo.name} -", team.get_repo_permission(repo))
|
||||
|
||||
team_maintainers = team.get_members(role="maintainer")
|
||||
team_maintainer_logins = set()
|
||||
for maintainer in team_maintainers:
|
||||
team_maintainer_logins.add(maintainer.login)
|
||||
team_members = team.get_members(role="member")
|
||||
team_member_logins = set()
|
||||
for member in team_members:
|
||||
team_member_logins.add(member.login)
|
||||
members = team.get_members(role="all")
|
||||
member_emails = []
|
||||
print("Members (role - login - name - company - email - valid):")
|
||||
for user in members:
|
||||
if user.login in team_maintainer_logins:
|
||||
print(" Maintainer - ", end="")
|
||||
elif user.login in team_member_logins:
|
||||
print(" Member - ", end="")
|
||||
else:
|
||||
# It is not possible to check child teams members
|
||||
print(" ??? - ", end="")
|
||||
print_users(user)
|
||||
if is_intel_email(user.email) and not is_user_ignored(user):
|
||||
member_emails.append(user.email.lower())
|
||||
print(f"Intel emails {len(member_emails)}:", "; ".join(member_emails))
|
||||
return teams
|
||||
|
||||
def get_github_user_by_email(self, email):
|
||||
"""Gets GitHub user by email"""
|
||||
if email in self.github_users_by_email:
|
||||
            return self.github_users_by_email.get(email)

        def search_users():
            paginated_users = self.github.search_users(f"{email} in:email")
            # Minimize the GitHub Rate Limit
            users = []
            for user in paginated_users:
                users.append(user)
            if len(users) == 1:
                return users[0]
            if len(users) == 0:
                return None
            raise GithubApiException(
                f"ERROR: Found {len(users)} GitHub accounts with the same email {email}"
            )

        try:
            user = search_users()
        except RateLimitExceededException:
            print("WARNING: RateLimitExceededException")
            time.sleep(30)
            user = search_users()
        self.github_users_by_email[email] = user

        return user

    def get_valid_github_users(self, emails):
        """Gets valid GitHub users by email and prints status"""
        valid_users = set()
        wrong_emails = set()
        no_account_emails = set()
        no_account_names = set()
        print(f"\nGitHub users from {len(emails)} invite emails (email - status):")
        for email in emails:
            if not is_intel_email(email):
                print(f"{email} - Non Intel email")
                wrong_emails.add(email)
                continue

            # You can make up to 30 requests per minute; https://developer.github.com/v3/search/
            time.sleep(2)
            user = self.get_github_user_by_email(email)

            if not user:
                print(f"{email} - No valid GitHub account")
                no_account_emails.add(email)
                continue

            if user.email and user.email.lower() == email:
                if is_valid_name(user.name):
                    print(f"{email} - OK")
                    valid_users.add(user)
                else:
                    print(f"{email} - No valid name in GitHub account: ", end="")
                    print_users(user)
                    no_account_names.add(email)
            else:
                print(f"{email} - Non public or wrong email in GitHub account: ", end="")
                print_users(user)
                no_account_emails.add(email)

        print("\nValid users:")
        print_users(valid_users)

        print(f"\nWrong emails {len(wrong_emails)}:", "; ".join(wrong_emails))

        print(
            f"\nIntel emails - No valid GitHub account {len(no_account_emails)}:",
            "; ".join(no_account_emails),
        )

        print(
            f"\nIntel emails - No valid name in GitHub account {len(no_account_names)}:",
            "; ".join(no_account_names),
        )
        return valid_users

    def invite_users(self, users):
        """Invites users to GitHub organization and prints status"""
        if not isinstance(users, typing.Iterable):
            users = [users]
        print(f"\nInvite {len(users)} users:")

        for user in users:
            if isinstance(user, str):
                print(f"Email: {user}")
                self.github_org.invite_user(email=user)
            else:
                print(f'{user.login} - "{user.name}" - {user.email} - ', end="")
                try:
                    if is_user_ignored(user):
                        print("Ignored")
                        continue
                    if self._cfg.DRY_RUN:
                        print("Dry run")
                        continue
                    self.github_org.invite_user(user=user)
                    print("OK")
                except GithubException as exc:
                    print(f'FAIL: {exc.data["errors"][0]["message"]}')

    def remove_users(self, users):
        """Removes users from GitHub organization"""
        if not isinstance(users, typing.Iterable):
            users = [users]
        print(f"\nRemove {len(users)} users:")

        dry_run = self._cfg.DRY_RUN
        if not dry_run and len(users) > self._cfg.MAX_MEMBERS_TO_REMOVE:
            print(
                "WARNING: Review is required for removing members more than "
                f"{self._cfg.MAX_MEMBERS_TO_REMOVE}"
            )
            # TODO: Add notification
            dry_run = True

        for user in users:
            member = self.get_github_user_by_email(user) if isinstance(user, str) else user
            print(f'{member.login} - "{member.name}" - {member.email} - ', end="")
            try:
                if is_user_ignored(member):
                    print("Ignored")
                    continue
                if dry_run:
                    print("Dry run")
                    continue
                self.github_org.remove_from_membership(member)
                print("OK")
            except GithubException as exc:
                print(f'FAIL: {exc.data["errors"][0]["message"]}')


def _test():
    """Test and debug"""
    Config(cli_args=["DRY_RUN=True"])
    dev_emails = get_dev_emails()
    print("dev_emails:", dev_emails)

    gh_api = GithubOrgApi()
    gh_api.get_org_emails()


if __name__ == "__main__":
    _test()
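The org-control API above is driven by other scripts in the same folder; purely as a hedged sketch of an end-to-end invite pass (the module paths, the set-typed return values, and the `sync_membership` helper are assumptions for illustration, not code from this dump):

```python
# Hypothetical driver: combine get_org_emails / get_dev_emails with the
# invite flow shown above. Import paths and return types are assumed.
from github_org_control.configs import Config
from github_org_control.github_api import GithubOrgApi, get_dev_emails


def sync_membership():
    Config(cli_args=["DRY_RUN=True"])  # keep the dry-run safety net on
    gh_api = GithubOrgApi()

    org_emails = gh_api.get_org_emails()  # assumed: set of current member emails
    dev_emails = get_dev_emails()         # assumed: set of emails that should be members

    absent = dev_emails - org_emails      # emails with no matching org member yet
    valid_users = gh_api.get_valid_github_users(absent)
    gh_api.invite_users(valid_users)


if __name__ == "__main__":
    sync_membership()
```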
247
modules/openvino-master/.github/github_org_control/ldap_api.py
vendored
Normal file
@ -0,0 +1,247 @@
# Copyright (C) 2018-2021 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

"""
Gets info about users and groups via LDAP
"""

# pylint: disable=fixme,no-member

import sys
from enum import Enum
from pathlib import Path

from ldap3 import Server, Connection, ALL, SUBTREE

sys.path.append(str(Path(__file__).resolve().parents[1]))
from github_org_control.configs import Config


class LdapApiException(Exception):
    """Base LDAP API exception"""


class InfoLevel(Enum):
    """Constants for printing user info from LDAP"""

    PDL = "PDL"  # Public Distribution List (group of e-mail addresses)
    FULL = "Full"


def print_user_info(info, info_level=None):
    """Pretty-print of a user info data structure (dict). info_level is the InfoLevel Enum"""
    if not info or not info.get("mail"):
        raise LdapApiException("ERROR: No info or absent mail")

    def get_membership():
        if info_level == InfoLevel.PDL:
            membership_info = " PDLs:"
        elif info_level == InfoLevel.FULL:
            membership_info = " memberOf :"
        else:
            return ""
        # Grouping groups by purpose
        if info_level == InfoLevel.PDL:
            sort_key = lambda i: i.split(",", 1)[0].lower()
        else:
            sort_key = lambda i: i.split(",", 1)[1] + i.split(",", 1)[0].lower()
        for item in sorted(info["memberOf"], key=sort_key):
            if info_level == InfoLevel.PDL and "OU=Delegated" not in item:
                continue
            membership_info += f"\n {item}"
        return membership_info

    try:
        text_info = (
            f'\n{info["cn"]} <{info["mail"]}>; {info["sAMAccountName"]}; {info["employeeID"]}'
            f'\n Org group: {info["intelSuperGroupDescr"]} ({info["intelSuperGroupShortName"]}) /'
            f' {info["intelGroupDescr"]} ({info["intelGroupShortName"]}) /'
            f' {info["intelDivisionDescr"]} ({info["intelDivisionShortName"]}) /'
            f' {info["intelOrgUnitDescr"]}'
            f'\n Manager: {info.get("manager")}'
            f'\n Location: {info["intelRegionCode"]} / {info["co"]} / {info["intelSiteCode"]} /'
            f' {info["intelBldgCode"]} ({info.get("intelSiteName")}) /'
            f' {info["physicalDeliveryOfficeName"]}'
            f'\n Other: {info["employeeType"]} | {info["intelExportCountryGroup"]} |'
            f' {info["whenCreated"]} | {info["intelCostCenterDescr"]} | {info["jobDescription"]}'
        )
    except Exception as exc:
        raise LdapApiException(
            f'ERROR: Failed to get info about "{info["mail"]}". '
            f"Exception occurred:\n{repr(exc)}"
        ) from exc
    print(text_info)

    membership = get_membership()
    if info_level == InfoLevel.PDL and membership:
        print(membership)
    elif info_level == InfoLevel.FULL:
        for key in sorted(info):
            if isinstance(info[key], list):
                if key == "memberOf":
                    print(membership)
                else:
                    print(f" {key} :")
                    for item in info[key]:
                        print(" ", item)
            else:
                print(f" {key} : {info[key]}")


class LdapApi:
    """LDAP API for getting user info and emails"""

    _binary_blobs = ["thumbnailPhoto", "msExchUMSpokenName", "msExchBlockedSendersHash"]
    _check_existing = [
        "intelExportCountryGroup",
        "physicalDeliveryOfficeName",
        "intelSuperGroupShortName",
        "intelGroupShortName",
        "intelDivisionShortName",
    ]

    null = "<null>"

    def __init__(self):
        self._cfg = Config()
        self.server = Server(self._cfg.LDAP_SERVER, get_info=ALL)
        self.connection = Connection(
            self.server, user=self._cfg.LDAP_USER, password=self._cfg.LDAP_PASSWORD, auto_bind=True
        )
        self.connection.bind()

    def get_user_emails(self, groups=None):
        """Gets emails of LDAP groups and sub-groups"""
        print("\nGet emails from LDAP groups:")
        processed_ldap_members = {}

        def process_group_members(member, parent_group):
            if member in processed_ldap_members:
                processed_ldap_members[member]["parent_groups"].append(parent_group)
                print(
                    "\nWARNING: Ignore LDAP member to avoid duplication and recursive cycling "
                    f"of PDLs: {member}\n "
                    f'email: {processed_ldap_members[member].get("email")}\n parent_groups:'
                )
                for group in processed_ldap_members[member].get("parent_groups", []):
                    print(7 * " ", group)

                return
            processed_ldap_members[member] = {"email": None, "parent_groups": [parent_group]}

            # AD moves terminated users to the boneyard OU in case the user returns,
            # so it can be reactivated with little effort.
            # After 30 days it is removed and the unix personality becomes unlinked.
            if "OU=Boneyard" in member:
                return
            self.connection.search(
                member, r"(objectClass=*)", SUBTREE, attributes=["cn", "member", "mail"]
            )

            # print(self.connection.entries)
            if not self.connection.response:
                raise LdapApiException(f"ERROR: empty response. LDAP member: {member}")

            # Check that the member is a worker.
            # The response can contain several items, but only the first item is valid
            if "OU=Workers" in member:
                if self.connection.response[0]["attributes"]["mail"]:
                    processed_ldap_members[member]["email"] = self.connection.response[0][
                        "attributes"
                    ]["mail"].lower()
                    return
                raise LdapApiException(
                    f"ERROR: no mail. LDAP worker: {member}\n" f"{self.connection.entries}"
                )

            if len(self.connection.response) > 1:
                raise LdapApiException(
                    f"ERROR: multiple responses for {member}: "
                    f"{len(self.connection.response)}\n"
                    f"{self.connection.entries}"
                )

            if self.connection.response[0]["attributes"]["member"]:
                for group_member in self.connection.response[0]["attributes"]["member"]:
                    process_group_members(group_member, member)
            else:
                print(f"\nERROR: no members in LDAP group: {member}\n{self.connection.entries}")

        for group in groups or self._cfg.LDAP_PDLs:
            print("\nProcess ROOT LDAP group:", group)
            process_group_members(group, "ROOT")
        return {
            member.get("email") for member in processed_ldap_members.values() if member.get("email")
        }

    def _get_user_info(self, query):
        """Gets user info from LDAP as dict matching key and values pairs from query"""
        query_filter = "".join(f"({key}={value})" for key, value in query.items())

        for domain in self._cfg.LDAP_DOMAINS:
            search_base = f"OU=Workers,DC={domain},DC=corp,DC=intel,DC=com"
            self.connection.search(
                search_base,
                f"(&(objectcategory=person)(objectclass=user)(intelflags=1){query_filter})",
                SUBTREE,
                attributes=["*"],
            )

            if self.connection.response:
                if len(self.connection.response) > 1:
                    raise LdapApiException(
                        f"ERROR: multiple responses for {query_filter}: "
                        f"{len(self.connection.response)}\n"
                        f"{self.connection.entries}"
                    )
                info = self.connection.response[0]["attributes"]

                # remove long binary blobs
                for blob in LdapApi._binary_blobs:
                    info[blob] = b""
                for key in LdapApi._check_existing:
                    if not info.get(key):
                        info[key] = LdapApi.null
                return info
        return {}

    def get_user_info_by_idsid(self, idsid):
        """Gets user info from LDAP as dict using account name for searching"""
        return self._get_user_info({"sAMAccountName": idsid})

    def get_user_info_by_name(self, name):
        """Gets user info from LDAP as dict using common name for searching"""
        return self._get_user_info({"cn": name})

    def get_user_info_by_email(self, email):
        """Gets user info from LDAP as dict using emails for searching"""
        return self._get_user_info({"mail": email})

    def get_absent_emails(self, emails):
        """Checks users by email in LDAP and returns absent emails"""
        absent_emails = set()
        for email in emails:
            if not self.get_user_info_by_email(email):
                absent_emails.add(email)
        return absent_emails


def _test():
    """Test and debug"""
    ldap = LdapApi()

    emails = ldap.get_user_emails()
    print(f'\nLDAP emails count: {len(emails)}\n{"; ".join(emails)}')

    emails = ["foo@intel.com"]

    for email in emails:
        info = ldap.get_user_info_by_email(email)
        if info:
            print_user_info(info, InfoLevel.PDL)
        else:
            print(f"\n{email} - not found")


if __name__ == "__main__":
    _test()
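As a hedged illustration only (the import path and the example address are assumptions), the query dict passed to `_get_user_info` is flattened into an LDAP filter and ANDed with the fixed person/user/intelflags clause, so a lookup by mail reduces to:

```python
# Illustration: how the filter string is built and how a lookup is printed.
# The module path and "jane.doe@intel.com" are placeholders, not real data.
from github_org_control.ldap_api import LdapApi, InfoLevel, print_user_info

query = {"mail": "jane.doe@intel.com"}
query_filter = "".join(f"({key}={value})" for key, value in query.items())
# query_filter == "(mail=jane.doe@intel.com)", which _get_user_info embeds into
# "(&(objectcategory=person)(objectclass=user)(intelflags=1)(mail=...))"

ldap = LdapApi()
info = ldap.get_user_info_by_email("jane.doe@intel.com")
if info:
    print_user_info(info, InfoLevel.PDL)  # prints the user plus delegated PDL memberships
```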
1
modules/openvino-master/.github/github_org_control/requirements-dev.txt
vendored
Normal file
@ -0,0 +1 @@
pylint==2.11.1
2
modules/openvino-master/.github/github_org_control/requirements.txt
vendored
Normal file
@ -0,0 +1,2 @@
PyGithub==1.55
ldap3==2.7
145
modules/openvino-master/.github/labeler.yml
vendored
Normal file
@ -0,0 +1,145 @@
'category: AUTO BATCH':
- 'src/plugins/auto_batch/**/*'

'category: AUTO':
- 'src/plugins/auto/**/*'

'category: build':
- 'cmake/**/*'
- '**/CMakeLists.txt'
- '**/*.cmake'

'category: C API':
- 'src/bindings/c/**/*'

'category: CI':
- '.github/**/*'
- '.ci/**/*'
- 'Jenkinsfile'

'category: Core':
- 'src/core/**/*'
- 'src/common/itt/**/*'
- 'src/common/util/**/*'
- 'src/frontends/common/**/*'
- 'src/common/conditional_compilation/**/*'

'category: CPP API':
- 'src/inference/include/**/*'
- 'src/core/include/**/*'
- 'src/frontends/common/include/**/*'
- 'src/frontends/onnx/frontend/include/**/*'
- 'src/frontends/tensorflow/include/**/*'
- 'src/frontends/tensorflow_lite/include/**/*'
- 'src/frontends/pytorch/include/**/*'
- 'src/frontends/paddle/include/**/*'

'category: CPU':
- 'src/plugins/intel_cpu/**/*'
- 'src/common/snippets/**/*'
- 'thirdparty/xbyak/**/*'

'category: dependency_changes':
- '**/requirement*.txt'
- '**/constraints*.txt'
- 'scripts/**/*'
- '.gitmodules'
- '**/setup.py'
- any: ['thirdparty/**/*',
        '!thirdparty/**/CMakeLists.txt']

'category: docs':
- 'docs/**/*'
- '**/*.md'

'category: extensions':
- 'src/core/include/openvino/core/extension.hpp'
- 'src/frontends/common/include/openvino/frontend/extension.hpp'
- 'src/frontends/common/include/openvino/frontend/extension/**/*'

'category: GNA':
- 'src/plugins/intel_gna/**/*'

'category: GPU':
- 'src/plugins/intel_gpu/**/*'
- 'src/tests/**/gpu/**/*'
- 'thirdparty/ocl/**/*'

'category: HETERO':
- 'src/plugins/hetero/**/*'

'category: IE Tests':
- 'thirdparty/gtest/**/*'
- 'src/frontends/tests/frontend/shared/**/*'
- any: ['src/tests/**/*',
        '!src/tests/**/gpu/**/*',
        '!src/tests/**/inference_engine/**/*']

'category: inference':
- 'src/inference/**/*'
- 'src/tests/functional/inference_engine/**/*'

'category: IR FE':
- 'src/frontends/ir/**/*'

'category: LP transformations':
- 'src/common/low_precision_transformations/**/*'

'category: MO':
- 'tools/mo/**/*'
- 'tools/ovc/**/*'

'category: ONNX FE':
- 'src/frontends/onnx/**/*'
- 'thirdparty/onnx/**/*'

'category: packaging':
- 'cmake/**/packaging/**/*'
- 'src/bindings/python/wheel/**/*'
- 'tools/openvino_dev/**/*'

'category: PDPD FE':
- 'src/frontends/paddle/**/*'

'category: POT':
- 'tools/pot/**/*'

'category: preprocessing':
- 'src/common/preprocessing/**/*'

'category: Python API':
- 'src/bindings/python/**/*'

'category: samples':
- 'samples/**/*'
- 'thirdparty/zlib/**/*'
- 'thirdparty/gflags/**/*'
- 'thirdparty/json/**/*'
- 'thirdparty/cnpy/**/*'

'category: TEMPLATE':
- 'src/plugins/template/**/*'

'category: TF FE':
- 'src/frontends/tensorflow/**/*'
- 'src/frontends/tensorflow_common/**/*'
- 'tests/layer_tests/tensorflow_tests/**/*'

'category: TFL FE':
- 'src/frontends/tensorflow_lite/**/*'
- 'src/frontends/tensorflow_common/**/*'
- 'tests/layer_tests/tensorflow_lite_tests/**/*'

'category: PyTorch FE':
- 'src/frontends/pytorch/**/*'
- 'tests/layer_tests/pytorch_tests/**/*'
- 'src/bindings/python/src/openvino/frontend/pytorch/**/*'

'category: tools':
- any: ['tools/**',
        '!tools/pot/**/*',
        '!tools/mo/**/*']

'category: transformations':
- 'src/common/transformations/**/*'
- 'src/common/offline_transformations/**/*'
6
modules/openvino-master/.github/pull_request_template.md
vendored
Normal file
@ -0,0 +1,6 @@
### Details:
- *item1*
- *...*

### Tickets:
- *ticket-id*
119
modules/openvino-master/.github/workflows/build_doc.yml
vendored
Normal file
@ -0,0 +1,119 @@
|
||||
name: Documentation
|
||||
on:
|
||||
pull_request:
|
||||
push:
|
||||
branches:
|
||||
- 'master'
|
||||
- 'releases/**'
|
||||
|
||||
env:
|
||||
DOXY_VER: '1.9.6'
|
||||
DOXYREST_VER: '2.1.3'
|
||||
|
||||
concurrency:
|
||||
group: ${{ github.workflow }}-${{ github.head_ref && github.ref || github.run_id }}
|
||||
cancel-in-progress: true
|
||||
|
||||
jobs:
|
||||
Build_Doc:
|
||||
runs-on: ubuntu-20.04
|
||||
steps:
|
||||
- name: Clone OpenVINO
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
submodules: true
|
||||
lfs: true
|
||||
|
||||
- name: Install apt-get dependencies
|
||||
uses: awalsh128/cache-apt-pkgs-action@v1.3.0
|
||||
with:
|
||||
packages: graphviz texlive liblua5.2-0 libclang1-9 libclang-cpp9
|
||||
version: 3.0
|
||||
|
||||
- uses: actions/setup-python@v4
|
||||
id: cp310
|
||||
with:
|
||||
python-version: '3.10'
|
||||
cache: 'pip'
|
||||
cache-dependency-path: |
|
||||
docs/requirements.txt
|
||||
docs/openvino_sphinx_theme/setup.py
|
||||
|
||||
- name: Install python dependencies
|
||||
run: |
|
||||
python3 -m pip install -r docs/requirements.txt
|
||||
(cd docs/openvino_sphinx_theme && python3 setup.py install)
|
||||
|
||||
- name: Download and install doxygen && doxyrest
|
||||
run: |
|
||||
# install doxyrest
|
||||
wget https://github.com/vovkos/doxyrest/releases/download/doxyrest-$DOXYREST_VER/doxyrest-$DOXYREST_VER-linux-amd64.tar.xz
|
||||
tar -xf doxyrest-$DOXYREST_VER-linux-amd64.tar.xz
|
||||
echo "$(pwd)/doxyrest-$DOXYREST_VER-linux-amd64/bin/" >> $GITHUB_PATH
|
||||
# install doxygen
|
||||
wget https://www.doxygen.nl/files/doxygen-$DOXY_VER.linux.bin.tar.gz
|
||||
tar -xzf doxygen-$DOXY_VER.linux.bin.tar.gz
|
||||
echo "$(pwd)/doxygen-$DOXY_VER/bin/" >> $GITHUB_PATH
|
||||
|
||||
- name: CMake configure
|
||||
run: cmake -DENABLE_DOCS=ON -B build
|
||||
|
||||
- name: Cache documentation
|
||||
id: cache_sphinx_docs
|
||||
uses: actions/cache@v3
|
||||
with:
|
||||
path: build/docs/_build/.doctrees
|
||||
key: sphinx-docs-cache
|
||||
|
||||
- name: Get number of CPU cores
|
||||
uses: SimenB/github-actions-cpu-cores@v2
|
||||
id: cpu-cores
|
||||
|
||||
- name: Build docs
|
||||
run: cmake --build build --target sphinx_docs --parallel ${{ steps.cpu-cores.outputs.count }}
|
||||
|
||||
- name: Archive docs HTML
|
||||
run: (cd build/docs && zip -r openvino_docs_html.zip _build)
|
||||
|
||||
- name: Set PR number
|
||||
run: |
|
||||
PR_NUMBER=$(echo $GITHUB_REF | awk 'BEGIN { FS = "/" } ; { print $3 }')
|
||||
echo "PR_NUMBER=$PR_NUMBER" >> $GITHUB_ENV
|
||||
|
||||
- name: 'Upload doxygen.log'
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: doxygen_build_log_${{ env.PR_NUMBER }}.log
|
||||
path: build/docs/doxygen.log
|
||||
|
||||
- name: 'Upload sphinx.log'
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: sphinx_build_log_${{ env.PR_NUMBER }}.log
|
||||
path: build/docs/sphinx.log
|
||||
|
||||
- name: 'Upload docs html'
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: openvino_docs_html_${{ env.PR_NUMBER }}.zip
|
||||
path: build/docs/openvino_docs_html.zip
|
||||
|
||||
- name: Run Pytest
|
||||
run: |
|
||||
pytest --doxygen="./build/docs/doxygen.log" \
|
||||
--include_pot \
|
||||
--sphinx="./build/docs/sphinx.log" \
|
||||
--suppress-warnings="./docs/suppress_warnings.txt" \
|
||||
--confcutdir="./docs/scripts/tests/" \
|
||||
--html="./build/docs/_artifacts/doc-generation.html" \
|
||||
--doxygen-strip="$(pwd)" \
|
||||
--sphinx-strip="$(pwd)/build/docs/rst" \
|
||||
--doxygen-xfail="./docs/doxygen-xfail.txt" \
|
||||
--self-contained-html ./docs/scripts/tests/test_docs.py
|
||||
|
||||
- name: 'Upload test results'
|
||||
if: failure()
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: openvino_docs_pytest
|
||||
path: build/docs/_artifacts/
|
||||
17
modules/openvino-master/.github/workflows/check_pr_commits.yml
vendored
Normal file
@ -0,0 +1,17 @@
name: PR Commits
on: [pull_request]

jobs:
  Checks:
    runs-on: ubuntu-22.04
    steps:
      - name: Clone OpenVINO
        uses: actions/checkout@v4

      - name: Install dependencies
        run: python3 -m pip install -r ./.github/github_org_control/requirements.txt

      - name: PR commits
        run: python3 ./.github/github_org_control/check_pr.py --pr=${{ github.event.number }} --check-commits DRY_RUN
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
48
modules/openvino-master/.github/workflows/code_snippets.yml
vendored
Normal file
@ -0,0 +1,48 @@
name: Code snippets
on:
  push:
    paths:
      - '.github/workflows/code_snippets.yml'
      - 'docs/snippets/**'
    branches:
      - 'master'
      - 'releases/**'
  pull_request:
    paths:
      - '.github/workflows/code_snippets.yml'
      - 'docs/snippets/**'

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  Build:
    strategy:
      fail-fast: false
      matrix:
        os: ['ubuntu-22.04', 'macos-latest', 'windows-latest']
    runs-on: ${{ matrix.os }}
    steps:
      - name: Clone OpenVINO
        uses: actions/checkout@v4
        with:
          submodules: recursive
          lfs: true

      - name: Install OpenCL
        uses: awalsh128/cache-apt-pkgs-action@v1.3.0
        if: runner.os == 'Linux'
        with:
          packages: ocl-icd-opencl-dev opencl-headers
          version: 3.0

      - name: CMake configure
        run: cmake -DCMAKE_BUILD_TYPE=Release -B build

      - name: Get number of CPU cores
        uses: SimenB/github-actions-cpu-cores@v2
        id: cpu-cores

      - name: Build snippets
        run: cmake --build build --target ie_docs_snippets --parallel ${{ steps.cpu-cores.outputs.count }}
98
modules/openvino-master/.github/workflows/code_style.yml
vendored
Normal file
@ -0,0 +1,98 @@
|
||||
name: Code Style
|
||||
on: [pull_request]
|
||||
|
||||
concurrency:
|
||||
group: ${{ github.workflow }}-${{ github.ref }}
|
||||
cancel-in-progress: true
|
||||
|
||||
jobs:
|
||||
clang-format:
|
||||
runs-on: ubuntu-20.04
|
||||
permissions:
|
||||
pull-requests: write
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
with:
|
||||
submodules: recursive
|
||||
|
||||
- name: Install clang-format-9
|
||||
run: |
|
||||
sudo apt update
|
||||
sudo apt --assume-yes install clang-format-9
|
||||
|
||||
- name: Install dependencies
|
||||
run: |
|
||||
python3 -m pip install --upgrade pip
|
||||
python3 -m pip install -r ./src/bindings/python/requirements.txt
|
||||
# Add for -DENABLE_PYTHON=ON, no cython
|
||||
python3 -m pip install -r ./src/bindings/python/src/compatibility/openvino/requirements-dev.txt
|
||||
|
||||
# Run cmake with -DENABLE_PROFILING_ITT=ON -DSELECTIVE_BUILD=COLLECT in order to enable codestyle check for ITT collector
|
||||
- name: CMake configure
|
||||
run: cmake -DENABLE_PYTHON=ON -DENABLE_TESTS=ON -DENABLE_PROFILING_ITT=ON -DSELECTIVE_BUILD=COLLECT -B build
|
||||
|
||||
- name: Create code style diff
|
||||
run: cmake --build build --target clang_format_fix_all -j8
|
||||
|
||||
- name: suggester / clang-format
|
||||
if: startsWith(github.event_name, 'pull_request')
|
||||
uses: reviewdog/action-suggester@v1
|
||||
with:
|
||||
github_token: ${{ secrets.GITHUB_TOKEN }}
|
||||
level: warning
|
||||
fail_on_error: true
|
||||
|
||||
ShellCheck:
|
||||
runs-on: ubuntu-22.04
|
||||
permissions:
|
||||
pull-requests: write
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
with:
|
||||
submodules: recursive
|
||||
|
||||
- name: Install ShellCheck
|
||||
run: |
|
||||
sudo apt update
|
||||
sudo apt --assume-yes install shellcheck
|
||||
|
||||
- name: CMake configure
|
||||
run: cmake -B build
|
||||
|
||||
- name: Shellcheck cmake target
|
||||
run: cmake --build build --target ie_shellcheck -j8
|
||||
|
||||
# always provide suggestions even for scripts skipped in the ie_shellcheck target
|
||||
- name: ShellCheck action
|
||||
if: always()
|
||||
uses: reviewdog/action-shellcheck@v1
|
||||
with:
|
||||
level: style
|
||||
reporter: github-pr-review
|
||||
check_all_files_with_shebangs: true
|
||||
fail_on_error: true
|
||||
exclude: |
|
||||
"*/thirdparty/*"
|
||||
"./temp/*"
|
||||
|
||||
NamingConventionCheck:
|
||||
runs-on: ubuntu-22.04
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
with:
|
||||
submodules: recursive
|
||||
|
||||
- name: Install Clang dependency
|
||||
run: |
|
||||
sudo apt update
|
||||
sudo apt --assume-yes remove clang-7 clang-8 clang-9 clang-10 clang-11 clang-12 clang-13
|
||||
sudo apt --assume-yes install libclang-14-dev
|
||||
|
||||
- name: Install Python-based dependencies
|
||||
run: python3 -m pip install -r cmake/developer_package/ncc_naming_style/requirements_dev.txt
|
||||
|
||||
- name: CMake configure
|
||||
run: cmake -B build
|
||||
|
||||
- name: Naming convention check
|
||||
run: cmake --build build --target ncc_all -j8
|
||||
149
modules/openvino-master/.github/workflows/coverage.yml
vendored
Normal file
@ -0,0 +1,149 @@
|
||||
name: Code coverage
|
||||
on: workflow_dispatch
|
||||
|
||||
concurrency:
|
||||
group: ${{ github.workflow }}-${{ github.ref }}
|
||||
cancel-in-progress: true
|
||||
|
||||
jobs:
|
||||
Coverage:
|
||||
runs-on: ${{ matrix.config.os }}
|
||||
strategy:
|
||||
fail-fast: false
|
||||
matrix:
|
||||
config:
|
||||
- { name: "Ubuntu gcc", os: ubuntu-latest-16-cores, cc: "gcc", cxx: "g++" }
|
||||
|
||||
steps:
|
||||
- name: Setup python
|
||||
uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: '3.10.10'
|
||||
architecture: 'x64'
|
||||
|
||||
|
||||
- name: Setup ccache
|
||||
uses: hendrikmuhs/ccache-action@v1.2
|
||||
with:
|
||||
max-size: 50G
|
||||
|
||||
- name: Clone OpenVINO
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
submodules: recursive
|
||||
|
||||
- name: Install dependencies
|
||||
run: |
|
||||
sudo apt --assume-yes update
|
||||
sudo -E ${{ github.workspace }}/install_build_dependencies.sh
|
||||
sudo apt --assume-yes install lcov
|
||||
|
||||
python3 -m pip install --upgrade pip
|
||||
python3 -m pip install -r ${{ github.workspace }}/src/bindings/python/wheel/requirements-dev.txt
|
||||
python3 -m pip install -r ${{ github.workspace }}/src/bindings/python/requirements.txt
|
||||
# For running Python API tests
|
||||
python3 -m pip install -r ${{ github.workspace }}/src/bindings/python/src/compatibility/openvino/requirements-dev.txt
|
||||
# For running Paddle frontend unit tests
|
||||
python3 -m pip install -r ${{ github.workspace }}/src/frontends/paddle/tests/requirements.txt
|
||||
# For running ONNX frontend unit tests
|
||||
python3 -m pip install -r ${{ github.workspace }}/src/frontends/onnx/tests/requirements.txt
|
||||
# For running TensorFlow frontend unit tests
|
||||
python3 -m pip install -r ${{ github.workspace }}/src/frontends/tensorflow/tests/requirements.txt
|
||||
# For MO unit tests
|
||||
python3 -m pip install -r ${{ github.workspace }}/tools/mo/requirements_mxnet.txt
|
||||
python3 -m pip install -r ${{ github.workspace }}/tools/mo/requirements_caffe.txt
|
||||
python3 -m pip install -r ${{ github.workspace }}/tools/mo/requirements_kaldi.txt
|
||||
python3 -m pip install -r ${{ github.workspace }}/tools/mo/requirements_onnx.txt
|
||||
python3 -m pip install -r ${{ github.workspace }}/tools/mo/requirements_tf2.txt
|
||||
python3 -m pip install -r ${{ github.workspace }}/tools/mo/requirements_dev.txt
|
||||
|
||||
- name: Get number of CPU cores
|
||||
uses: SimenB/github-actions-cpu-cores@v2
|
||||
id: cpu-cores
|
||||
|
||||
- name: Build OpenVINO with CMake
|
||||
uses: ashutoshvarma/action-cmake-build@master
|
||||
with:
|
||||
build-dir: ${{ github.workspace }}/build
|
||||
cc: ${{ matrix.config.cc }}
|
||||
cxx: ${{ matrix.config.cxx }}
|
||||
configure-options: >
|
||||
-GNinja
|
||||
-DCMAKE_VERBOSE_MAKEFILE=ON
|
||||
-DENABLE_PYTHON=ON
|
||||
-DENABLE_ONEDNN_FOR_GPU=ON
|
||||
-DBUILD_SHARED_LIBS=ON
|
||||
-DENABLE_TESTS=ON
|
||||
-DENABLE_OV_ONNX_FRONTEND=ON
|
||||
-DENABLE_FASTER_BUILD=ON
|
||||
-DENABLE_STRICT_DEPENDENCIES=OFF
|
||||
-DENABLE_COVERAGE=ON
|
||||
-DCMAKE_C_COMPILER_LAUNCHER=ccache
|
||||
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache
|
||||
-DCMAKE_C_LINKER_LAUNCHER=ccache
|
||||
-DCMAKE_CXX_LINKER_LAUNCHER=ccache
|
||||
-DENABLE_SYSTEM_SNAPPY=ON
|
||||
build-type: Release
|
||||
parallel: ${{ steps.cpu-cores.outputs.count }}
|
||||
|
||||
- name: Install wheel packages
|
||||
run: cmake -DCOMPONENT=python_wheels -DCMAKE_INSTALL_PREFIX=${{ github.workspace }}/install_pkg -P '${{ github.workspace }}/build/cmake_install.cmake'
|
||||
|
||||
- name: Install python wheels
|
||||
run: python3 -m pip install openvino-dev --find-links=${{ github.workspace }}/install_pkg/tools
|
||||
|
||||
- name: List binaries
|
||||
run: ls -la ${{ github.workspace }}/bin/intel64/Release
|
||||
|
||||
- name: Install OpenVINO
|
||||
run: cmake -DCMAKE_INSTALL_PREFIX=${{ github.workspace }}/install_pkg -P '${{ github.workspace }}/build/cmake_install.cmake'
|
||||
|
||||
- name: Run OV core unit tests
|
||||
run: ${{ github.workspace }}/bin/intel64/Release/ov_core_unit_tests
|
||||
|
||||
- name: Run OV Proxy plugin tests
|
||||
run: ${{ github.workspace }}/bin/intel64/Release/ov_proxy_plugin_tests
|
||||
|
||||
- name: Run OV Hetero Func tests
|
||||
run: ${{ github.workspace }}/bin/intel64/Release/ov_hetero_func_tests
|
||||
|
||||
- name: Run IR frontend tests
|
||||
run: ${{ github.workspace }}/bin/intel64/Release/ov_ir_frontend_tests # --gtest_print_time=1 --gtest_output=xml:${{ github.workspace }}/testdata/TEST-IRFrontend.xml
|
||||
|
||||
- name: Run ONNX frontend tests
|
||||
run: ${{ github.workspace }}/bin/intel64/Release/ov_onnx_frontend_tests --gtest_filter=-*IE_GPU*
|
||||
|
||||
#- name: Run Paddle frontend unit tests
|
||||
# run: ${{ github.workspace }}/bin/intel64/Release/paddle_tests --gtest_filter=-*IE_GPU*
|
||||
|
||||
- name: Run TensorFlow frontend unit tests
|
||||
run: ${{ github.workspace }}/bin/intel64/Release/ov_tensorflow_frontend_tests --gtest_filter=-*IE_GPU*
|
||||
|
||||
- name: Build coverage with CMake
|
||||
uses: ashutoshvarma/action-cmake-build@master
|
||||
with:
|
||||
build-dir: ${{ github.workspace }}/coverage
|
||||
cc: ${{ matrix.config.cc }}
|
||||
cxx: ${{ matrix.config.cxx }}
|
||||
target: ov_coverage
|
||||
configure-options: >
|
||||
-DCMAKE_VERBOSE_MAKEFILE=ON
|
||||
-DCMAKE_C_COMPILER_LAUNCHER=ccache
|
||||
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache
|
||||
-DCMAKE_C_LINKER_LAUNCHER=ccache
|
||||
-DCMAKE_CXX_LINKER_LAUNCHER=ccache
|
||||
parallel: ${{ steps.cpu-cores.outputs.count }}
|
||||
|
||||
|
||||
- name: Print info
|
||||
run: |
|
||||
ls -laR
|
||||
pwd
|
||||
- name: Generate report
|
||||
run: |
|
||||
lcov --capture --directory ${{ github.workspace }}/. --output-file coverage.info
|
||||
genhtml coverage.info --output-directory coverage-report
|
||||
- name: Collect coverage
|
||||
uses: codecov/codecov-action@v3
|
||||
with:
|
||||
verbose: true
|
||||
17
modules/openvino-master/.github/workflows/dependency_review.yml
vendored
Normal file
@ -0,0 +1,17 @@
name: 'Dependency Review'
on: [pull_request]

permissions:
  contents: read

jobs:
  dependency-review:
    runs-on: ubuntu-latest
    steps:
      - name: Clone OpenVINO
        uses: actions/checkout@v4

      - name: Dependency Review
        uses: actions/dependency-review-action@v3
        with:
          config-file: './.github/dependency_review.yml'
18
modules/openvino-master/.github/workflows/files_size.yml
vendored
Normal file
@ -0,0 +1,18 @@
name: Files Size
on: [push, pull_request]

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  Check_Files_Size:
    runs-on: ubuntu-22.04
    steps:
      - uses: actions/checkout@v4

      - name: git ls-tree
        run: git ls-tree -r -t -l --full-name HEAD | sort -n -r -k 4

      - name: git lfs ls-files
        run: git lfs ls-files --size
16
modules/openvino-master/.github/workflows/labeler.yml
vendored
Normal file
@ -0,0 +1,16 @@
name: "Pull Request Labeler"
on:
- pull_request_target

jobs:
  triage:
    permissions:
      contents: read
      pull-requests: write
    runs-on: ubuntu-latest
    steps:
      - uses: actions/labeler@v4
        with:
          repo-token: "${{ secrets.GITHUB_TOKEN }}"
          configuration-path: '.github/labeler.yml'
          sync-labels: 'true'
825
modules/openvino-master/.github/workflows/linux.yml
vendored
Normal file
@ -0,0 +1,825 @@
|
||||
name: Tests on Linux (Ubuntu 22.04, Python 3.11)
|
||||
on:
|
||||
schedule:
|
||||
# at 00:00 on Wednesday and Saturday
|
||||
- cron: '0 0 * * 3,6'
|
||||
workflow_dispatch:
|
||||
pull_request:
|
||||
paths-ignore:
|
||||
- '**/docs/**'
|
||||
- 'docs/**'
|
||||
- '**/**.md'
|
||||
- '**.md'
|
||||
- '**/layer_tests_summary/**'
|
||||
- '**/conformance/**'
|
||||
push:
|
||||
paths-ignore:
|
||||
- '**/docs/**'
|
||||
- 'docs/**'
|
||||
- '**/**.md'
|
||||
- '**.md'
|
||||
- '**/layer_tests_summary/**'
|
||||
- '**/conformance/**'
|
||||
branches:
|
||||
- master
|
||||
|
||||
concurrency:
|
||||
group: ${{ github.head_ref || github.run_id }}-linux
|
||||
cancel-in-progress: true
|
||||
|
||||
jobs:
|
||||
Build:
|
||||
defaults:
|
||||
run:
|
||||
shell: bash
|
||||
runs-on: ubuntu-latest-8-cores
|
||||
env:
|
||||
CMAKE_BUILD_TYPE: 'Release'
|
||||
CMAKE_GENERATOR: 'Ninja'
|
||||
CMAKE_CXX_COMPILER_LAUNCHER: ccache
|
||||
CMAKE_C_COMPILER_LAUNCHER: ccache
|
||||
OPENVINO_REPO: ${{ github.workspace }}/openvino
|
||||
OPENVINO_CONTRIB_REPO: ${{ github.workspace }}/openvino_contrib
|
||||
INSTALL_DIR: ${{ github.workspace }}/install
|
||||
INSTALL_TEST_DIR: ${{ github.workspace }}/install/tests
|
||||
SAMPLES_INSTALL_DIR: ${{ github.workspace }}/install/samples
|
||||
LAYER_TESTS_INSTALL_DIR: ${{ github.workspace }}/install/tests/layer_tests
|
||||
MODEL_HUB_TESTS_INSTALL_DIR: ${{ github.workspace }}/install/tests/model_hub_tests
|
||||
BUILD_DIR: ${{ github.workspace }}/build
|
||||
DATA_PATH: ${{ github.workspace }}/testdata
|
||||
MODELS_PATH: ${{ github.workspace }}/testdata
|
||||
OV_TEMP: ${{ github.workspace }}/openvino_temp
|
||||
PYTHON_STATIC_ARGS: -m "not dynamic_library"
|
||||
steps:
|
||||
- name: Clone OpenVINO
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
path: 'openvino'
|
||||
submodules: 'recursive'
|
||||
|
||||
- name: Clone OpenVINO Contrib
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
repository: 'openvinotoolkit/openvino_contrib'
|
||||
path: 'openvino_contrib'
|
||||
submodules: 'recursive'
|
||||
|
||||
- name: Clone testdata for C API tests
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
repository: 'openvinotoolkit/testdata'
|
||||
path: 'testdata'
|
||||
submodules: 'recursive'
|
||||
lfs: 'true'
|
||||
|
||||
#
|
||||
# Dependencies
|
||||
#
|
||||
|
||||
- name: Install build dependencies
|
||||
run: |
|
||||
sudo -E ${{ env.OPENVINO_REPO }}/install_build_dependencies.sh
|
||||
sudo -E apt update
|
||||
sudo -E apt --assume-yes install openjdk-11-jdk libbz2-dev clang unzip libpugixml-dev libtbb-dev intel-opencl-icd ocl-icd-opencl-dev opencl-headers
|
||||
|
||||
wget https://github.com/ninja-build/ninja/releases/download/v1.10.2/ninja-linux.zip
|
||||
unzip ninja-linux.zip
|
||||
sudo cp -v ninja /usr/local/bin/
|
||||
|
||||
- uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: '3.11'
|
||||
|
||||
- name: Install python dependencies
|
||||
run: |
|
||||
# For Python API
|
||||
python3 -m pip install --upgrade pip
|
||||
python3 -m pip install Scons
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/wheel/requirements-dev.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/requirements.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/requirements_test.txt
|
||||
|
||||
# For running Python API tests
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/src/compatibility/openvino/requirements-dev.txt
|
||||
|
||||
# For running ONNX frontend unit tests
|
||||
python3 -m pip install --force-reinstall -r ${{ env.OPENVINO_REPO }}/src/frontends/onnx/tests/requirements.txt
|
||||
|
||||
# For running TensorFlow frontend unit tests
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/frontends/tensorflow/tests/requirements.txt
|
||||
|
||||
# For running Paddle frontend unit tests
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/frontends/paddle/tests/requirements.txt
|
||||
|
||||
- name: Install MO dependencies
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_mxnet.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_caffe.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_kaldi.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_onnx.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_tf2.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_dev.txt
|
||||
|
||||
#
|
||||
# Build
|
||||
#
|
||||
|
||||
- name: Setup ccache
|
||||
uses: hendrikmuhs/ccache-action@v1.2
|
||||
with:
|
||||
max-size: "2000M"
|
||||
# Should save cache only if run in the master branch of the base repo
|
||||
# github.ref_name is 'ref/PR_#' in case of the PR, and 'branch_name' when executed on push
|
||||
save: ${{ github.ref_name == 'master' && 'true' || 'false' }}
|
||||
verbose: 2
|
||||
key: linux-ubuntu
|
||||
restore-keys: |
|
||||
linux-ubuntu
|
||||
|
||||
- name: Get tools versions
|
||||
run: |
|
||||
ninja --version || exit 1
|
||||
ccache --version || exit 1
|
||||
python3 --version || exit 1
|
||||
cmake --version || exit 1
|
||||
|
||||
- name: Get number of CPU cores
|
||||
uses: SimenB/github-actions-cpu-cores@v2
|
||||
id: cpu-cores
|
||||
|
||||
- name: CMake configure
|
||||
run: |
|
||||
cmake \
|
||||
-GNinja \
|
||||
-DENABLE_CPPLINT=OFF \
|
||||
-DENABLE_NCC_STYLE=OFF \
|
||||
-DENABLE_TESTS=ON \
|
||||
-DENABLE_PYTHON=ON \
|
||||
-DCMAKE_VERBOSE_MAKEFILE=ON \
|
||||
-DCMAKE_BUILD_TYPE=Release \
|
||||
-DBUILD_SHARED_LIBS=ON \
|
||||
-DENABLE_ONEDNN_FOR_GPU=OFF \
|
||||
-DENABLE_OV_ONNX_FRONTEND=ON \
|
||||
-DCMAKE_COMPILE_WARNING_AS_ERROR=OFF \
|
||||
-DENABLE_STRICT_DEPENDENCIES=OFF \
|
||||
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache \
|
||||
-DCMAKE_C_COMPILER_LAUNCHER=ccache \
|
||||
-DCMAKE_CXX_LINKER_LAUNCHER=ccache \
|
||||
-DCMAKE_C_LINKER_LAUNCHER=ccache \
|
||||
-DENABLE_SYSTEM_SNAPPY=ON \
|
||||
-DENABLE_SYSTEM_TBB=ON \
|
||||
-DBUILD_nvidia_plugin=OFF \
|
||||
-DENABLE_DEBUG_CAPS=ON \
|
||||
-DCUSTOM_OPERATIONS="calculate_grid;complex_mul;fft;grid_sample;sparse_conv;sparse_conv_transpose" \
|
||||
-DOPENVINO_EXTRA_MODULES=${{ env.OPENVINO_CONTRIB_REPO }}/modules \
|
||||
-S ${{ env.OPENVINO_REPO }} \
|
||||
-B ${{ env.BUILD_DIR }}
|
||||
|
||||
- name: Clean ccache stats
|
||||
run: ccache --zero-stats --show-config
|
||||
|
||||
- name: Build
|
||||
run: cmake --build ${{ env.BUILD_DIR }} --parallel ${{ steps.cpu-cores.outputs.count }} --config Release
|
||||
|
||||
- name: Show ccache stats
|
||||
run: ccache --show-stats
|
||||
|
||||
- name: Cmake Layer Tests
|
||||
run: cmake -GNinja -S ${{ env.OPENVINO_REPO }}/tests/layer_tests -B ${{ env.BUILD_DIR }}/layer_tests
|
||||
|
||||
- name: Cmake Model Hub Tests
|
||||
run: cmake -GNinja -S ${{ env.OPENVINO_REPO }}/tests/model_hub_tests -B ${{ env.BUILD_DIR }}/model_hub_tests
|
||||
|
||||
- name: Build Layer Tests
|
||||
run: cmake --build ${{ env.BUILD_DIR }}/layer_tests --parallel --config Release
|
||||
|
||||
- name: Build Model Hub Tests
|
||||
run: cmake --build ${{ env.BUILD_DIR }}/model_hub_tests --parallel --config Release
|
||||
|
||||
- name: Install wheel packages
|
||||
run: cmake -DCOMPONENT=python_wheels -DCMAKE_INSTALL_PREFIX=${{ env.INSTALL_DIR }} -P ${{ env.BUILD_DIR }}/cmake_install.cmake
|
||||
|
||||
- name: Install Layer Tests
|
||||
run: cmake -DCOMPONENT=tests -DCMAKE_INSTALL_PREFIX=${{ env.INSTALL_DIR }} -P ${{ env.BUILD_DIR }}/layer_tests/cmake_install.cmake
|
||||
|
||||
- name: Install Model Hub Tests
|
||||
run: cmake -DCOMPONENT=tests -DCMAKE_INSTALL_PREFIX=${{ env.INSTALL_DIR }} -P ${{ env.BUILD_DIR }}/model_hub_tests/cmake_install.cmake
|
||||
|
||||
- name: Install python wheels
|
||||
run: python3 -m pip install openvino-dev --find-links=${{ env.INSTALL_DIR }}/tools
|
||||
|
||||
- name: Install tests
|
||||
run: cmake -DCMAKE_INSTALL_PREFIX=${{ env.INSTALL_DIR }} -DCOMPONENT=tests -P ${{ env.BUILD_DIR }}/cmake_install.cmake
|
||||
|
||||
- name: Install OpenVINO
|
||||
run: cmake -DCMAKE_INSTALL_PREFIX=${{ env.INSTALL_DIR }} -P ${{ env.BUILD_DIR }}/cmake_install.cmake
|
||||
|
||||
- name: CMake Samples Tests
|
||||
run: cmake -GNinja -S ${{ env.OPENVINO_REPO }}/tests/samples_tests -B ${{ env.BUILD_DIR }}/samples_tests
|
||||
|
||||
- name: Build Samples Tests
|
||||
run: cmake --build ${{ env.BUILD_DIR }}/samples_tests --config Release
|
||||
|
||||
- name: Install Samples Tests
|
||||
run: cmake -DCOMPONENT=tests -DCMAKE_INSTALL_PREFIX=${{ env.INSTALL_DIR }} -P ${{ env.BUILD_DIR }}/samples_tests/cmake_install.cmake
|
||||
|
||||
- name: Pack Artifacts
|
||||
run: |
|
||||
pushd ${{ env.INSTALL_DIR }}
|
||||
tar -czvf ${{ env.BUILD_DIR }}/openvino_package.tar.gz --exclude=tests *
|
||||
popd
|
||||
|
||||
pushd ${{ env.INSTALL_DIR }}
|
||||
tar -czvf ${{ env.BUILD_DIR }}/openvino_tests.tar.gz tests/
|
||||
popd
|
||||
|
||||
- name: Build cpp samples
|
||||
run: ${{ env.SAMPLES_INSTALL_DIR }}/cpp/build_samples.sh -i ${{ env.INSTALL_DIR }} -b ${{ env.BUILD_DIR }}/cpp_samples
|
||||
|
||||
- name: Build c samples
|
||||
run: ${{ env.SAMPLES_INSTALL_DIR }}/c/build_samples.sh -i ${{ env.INSTALL_DIR }} -b ${{ env.BUILD_DIR }}/c_samples
|
||||
|
||||
#
|
||||
# Tests
|
||||
#
|
||||
|
||||
- name: Samples tests
|
||||
run: |
|
||||
python3 -m pip install --ignore-installed PyYAML -r ${{ env.INSTALL_TEST_DIR }}/smoke_tests/requirements.txt
|
||||
export LD_LIBRARY_PATH=${{ env.IE_APP_PATH }}:$LD_LIBRARY_PATH
|
||||
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
|
||||
python3 -m pytest -sv ${{ env.INSTALL_TEST_DIR }}/smoke_tests \
|
||||
--env_conf ${{ env.INSTALL_TEST_DIR }}/smoke_tests/env_config.yml \
|
||||
--junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-SamplesSmokeTests.xml
|
||||
env:
|
||||
IE_APP_PATH: ${{ env.INSTALL_DIR }}/samples_bin
|
||||
IE_APP_PYTHON_PATH: ${{ env.INSTALL_DIR }}/samples/python
|
||||
SHARE: ${{ env.INSTALL_TEST_DIR }}/smoke_tests/samples_smoke_tests_data
|
||||
WORKSPACE: ${{ env.INSTALL_DIR }}
|
||||
|
||||
# Present in the "Build" job because these tests require the build directory
|
||||
- name: ONNX frontend tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_onnx_frontend_tests --gtest_print_time=1 --gtest_filter=-*IE_GPU*:*FrontEndLoadFromTest.testLoadFromTwoStreams*:*FrontEndLoadFromTest.testLoadFromTwoFiles* \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-ONNXFrontend.xml
|
||||
|
||||
#
|
||||
# Upload build artifacts
|
||||
#
|
||||
|
||||
- name: Upload openvino package
|
||||
if: ${{ always() }}
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: openvino_package
|
||||
path: ${{ env.BUILD_DIR }}/openvino_package.tar.gz
|
||||
if-no-files-found: 'error'
|
||||
|
||||
- name: Upload openvino tests package
|
||||
if: ${{ always() }}
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: openvino_tests
|
||||
path: ${{ env.BUILD_DIR }}/openvino_tests.tar.gz
|
||||
if-no-files-found: 'error'
|
||||
|
||||
CXX_Unit_Tests:
|
||||
needs: Build
|
||||
defaults:
|
||||
run:
|
||||
shell: bash
|
||||
runs-on: ubuntu-22.04
|
||||
env:
|
||||
INSTALL_DIR: ${{ github.workspace }}/install
|
||||
INSTALL_TEST_DIR: ${{ github.workspace }}/install/tests
|
||||
|
||||
steps:
|
||||
- name: Create Directories
|
||||
run: |
|
||||
mkdir -p ${{ env.INSTALL_DIR }} ${{ env.INSTALL_TEST_DIR }}
|
||||
|
||||
#
|
||||
# Dependencies
|
||||
#
|
||||
|
||||
- name: Install dependencies
|
||||
run: |
|
||||
sudo -E apt update
|
||||
sudo -E apt --assume-yes install openjdk-11-jdk libbz2-dev clang unzip libpugixml-dev libtbb-dev intel-opencl-icd ocl-icd-opencl-dev opencl-headers
|
||||
|
||||
- name: Download OpenVINO package
|
||||
uses: actions/download-artifact@v3
|
||||
with:
|
||||
name: openvino_package
|
||||
path: ${{ env.INSTALL_DIR }}
|
||||
|
||||
- name: Download OpenVINO tests package
|
||||
uses: actions/download-artifact@v3
|
||||
with:
|
||||
name: openvino_tests
|
||||
path: ${{ env.INSTALL_TEST_DIR }}
|
||||
|
||||
- name: Extract OpenVINO packages
|
||||
run: |
|
||||
pushd ${{ env.INSTALL_DIR }}
|
||||
tar -xzf openvino_package.tar.gz -C ${{ env.INSTALL_DIR }} && rm openvino_package.tar.gz || exit 1
|
||||
popd
|
||||
pushd ${{ env.INSTALL_TEST_DIR }}
|
||||
tar -xzf openvino_tests.tar.gz -C ${{ env.INSTALL_DIR }} && rm openvino_tests.tar.gz || exit 1
|
||||
popd
|
||||
|
||||
#
|
||||
# Tests
|
||||
#
|
||||
|
||||
- name: OpenVINO Core Unit Tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_core_unit_tests --gtest_print_time=1 --gtest_filter=-*IE_GPU* \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-OVCoreUT.xml
|
||||
|
||||
- name: OpenVINO Inference Functional Tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_inference_functional_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-InferenceFunc.xml
|
||||
|
||||
- name: OpenVINO Inference Unit Tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_inference_unit_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-InferenceUnit.xml
|
||||
|
||||
- name: Low Precision Transformations Tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_lp_transformations_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-LpTransformations.xml
|
||||
|
||||
- name: OpenVINO Conditional compilation tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_conditional_compilation_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-ConditionalCompilation.xml
|
||||
|
||||
- name: IR frontend tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_ir_frontend_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-IRFrontend.xml
|
||||
|
||||
# Disabled in Azure: https://github.com/openvinotoolkit/openvino/blob/master/.ci/azure/linux.yml#L403
|
||||
# - name: PaddlePaddle frontend tests
|
||||
# run: |
|
||||
# source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
# ${{ env.INSTALL_TEST_DIR }}/paddle_tests --gtest_print_time=1 --gtest_filter=*smoke* \
|
||||
# --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-PaddleTests.xml
|
||||
|
||||
# Present in the "Build" job as these tests require build directory
|
||||
# - name: ONNX frontend tests
|
||||
# run: |
|
||||
# source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
# ${{ env.INSTALL_TEST_DIR }}/ov_onnx_frontend_tests --gtest_print_time=1 --gtest_filter=-*IE_GPU*:*FrontEndLoadFromTest.testLoadFromTwoStreams*:*FrontEndLoadFromTest.testLoadFromTwoFiles* \
|
||||
# --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-ONNXFrontend.xml
|
||||
|
||||
- name: TensorFlow Common tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_tensorflow_common_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-TensorFlowCommonFrontend.xml
|
||||
|
||||
- name: TensorFlow frontend tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_tensorflow_frontend_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-TensorFlowFrontend.xml
|
||||
|
||||
- name: TensorFlow Lite frontend tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_tensorflow_lite_frontend_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-TensorFlowLiteFrontend.xml
|
||||
|
||||
- name: Transformations Tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_transformations_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-Transformations.xml
|
||||
|
||||
- name: Common test utils tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_util_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-CommonUtilTests.xml
|
||||
|
||||
- name: Snippets func tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_snippets_func_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-SnippetsFuncTests.xml
|
||||
|
||||
- name: CPU plugin unit tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_cpu_unit_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-CPUUnitTests.xml
|
||||
|
||||
# Disabled in Azure: https://github.com/openvinotoolkit/openvino/blob/master/.ci/azure/linux.yml#L409
|
||||
# - name: GNA plugin unit tests
|
||||
# run: |
|
||||
# source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
# ${{ env.INSTALL_TEST_DIR }}/ov_gna_unit_tests --gtest_print_time=1 \
|
||||
# --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-GNAUnitTests.xml
|
||||
|
||||
- name: AUTO UT
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_auto_unit_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-ov_auto_unit_tests.xml
|
||||
|
||||
- name: Template plugin tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_template_func_tests --gtest_print_time=1 \
|
||||
--gtest_filter=*smoke* \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-TemplateFuncTests.xml
|
||||
|
||||
- name: Inference Engine C API tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/InferenceEngineCAPITests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-InferenceEngineCAPITests.xml
|
||||
|
||||
- name: OpenVINO C API tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_capi_test --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-OpenVINOCAPITests.xml
|
||||
|
||||
- name: AutoBatch FuncTests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_auto_batch_func_tests --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-ov_auto_batch_func_tests.xml
|
||||
|
||||
- name: Proxy Plugin Tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_proxy_plugin_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-OVProxyTests.xml
|
||||
|
||||
- name: Hetero Func Tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_hetero_func_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-OVHeteroFuncTests.xml
|
||||
|
||||
- name: Upload Test Results
|
||||
uses: actions/upload-artifact@v3
|
||||
if: ${{ always() }}
|
||||
with:
|
||||
name: test-results-cpp
|
||||
path: ${{ env.INSTALL_TEST_DIR }}/TEST*.xml
|
||||
if-no-files-found: 'error'
|
||||
|
||||
Python_Unit_Tests:
|
||||
needs: Build
|
||||
defaults:
|
||||
run:
|
||||
shell: bash
|
||||
runs-on: ubuntu-22.04
|
||||
env:
|
||||
OPENVINO_REPO: ${{ github.workspace }}/openvino
|
||||
OPENVINO_CONTRIB_REPO: ${{ github.workspace }}/openvino_contrib
|
||||
INSTALL_DIR: ${{ github.workspace }}/install
|
||||
INSTALL_TEST_DIR: ${{ github.workspace }}/install/tests
|
||||
SAMPLES_INSTALL_DIR: ${{ github.workspace }}/install/samples
|
||||
LAYER_TESTS_INSTALL_DIR: ${{ github.workspace }}/install/tests/layer_tests
|
||||
MODEL_HUB_TESTS_INSTALL_DIR: ${{ github.workspace }}/install/tests/model_hub_tests
|
||||
BUILD_DIR: ${{ github.workspace }}/build
|
||||
DATA_PATH: ${{ github.workspace }}/testdata
|
||||
MODELS_PATH: ${{ github.workspace }}/testdata
|
||||
OV_TEMP: ${{ github.workspace }}/openvino_temp
|
||||
PYTHON_STATIC_ARGS: -m "not dynamic_library"
|
||||
|
||||
steps:
|
||||
- name: Create Directories
|
||||
run: |
|
||||
mkdir -p ${{ env.INSTALL_DIR }} ${{ env.INSTALL_TEST_DIR }}
|
||||
|
||||
- name: Clone OpenVINO
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
path: 'openvino'
|
||||
submodules: 'recursive'
|
||||
|
||||
#
|
||||
# Dependencies
|
||||
#
|
||||
|
||||
- name: Install dependencies
|
||||
run: |
|
||||
sudo -E apt update
|
||||
sudo -E apt --assume-yes install openjdk-11-jdk libbz2-dev clang unzip libpugixml-dev libtbb-dev intel-opencl-icd ocl-icd-opencl-dev opencl-headers
|
||||
|
||||
- uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: '3.11'
|
||||
|
||||
- name: Install python dependencies
|
||||
run: |
|
||||
# For Python API
|
||||
python3 -m pip install --upgrade pip
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/wheel/requirements-dev.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/requirements.txt
|
||||
|
||||
# For running Python API tests
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/src/compatibility/openvino/requirements-dev.txt
|
||||
|
||||
# For running ONNX frontend unit tests
|
||||
python3 -m pip install --force-reinstall -r ${{ env.OPENVINO_REPO }}/src/frontends/onnx/tests/requirements.txt
|
||||
|
||||
# For running TensorFlow frontend unit tests
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/frontends/tensorflow/tests/requirements.txt
|
||||
|
||||
# For running Paddle frontend unit tests
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/frontends/paddle/tests/requirements.txt
|
||||
|
||||
# For torchvision to OpenVINO preprocessing converter
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/src/openvino/preprocess/torchvision/requirements.txt
|
||||
|
||||
- name: Install MO dependencies
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_mxnet.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_caffe.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_kaldi.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_onnx.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_tf2.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_dev.txt
|
||||
|
||||
- name: Download OpenVINO package
|
||||
uses: actions/download-artifact@v3
|
||||
with:
|
||||
name: openvino_package
|
||||
path: ${{ env.INSTALL_DIR }}
|
||||
|
||||
- name: Download OpenVINO tests package
|
||||
uses: actions/download-artifact@v3
|
||||
with:
|
||||
name: openvino_tests
|
||||
path: ${{ env.INSTALL_TEST_DIR }}
|
||||
|
||||
- name: Extract OpenVINO packages
|
||||
run: |
|
||||
pushd ${{ env.INSTALL_DIR }}
|
||||
tar -xzf openvino_package.tar.gz -C ${{ env.INSTALL_DIR }} && rm openvino_package.tar.gz || exit 1
|
||||
popd
|
||||
|
||||
pushd ${{ env.INSTALL_TEST_DIR }}
|
||||
tar -xzf openvino_tests.tar.gz -C ${{ env.INSTALL_DIR }} && rm openvino_tests.tar.gz || exit 1
|
||||
popd
|
||||
|
||||
- name: Install Python wheels
|
||||
run: |
|
||||
python3 -m pip install openvino-dev --find-links=${{ env.INSTALL_DIR }}/tools
|
||||
|
||||
- name: nGraph and IE Python Bindings Tests
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
python3 -m pytest -s ${{ env.INSTALL_TEST_DIR }}/pyngraph ${{ env.PYTHON_STATIC_ARGS }} \
|
||||
--junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-Pyngraph.xml \
|
||||
--ignore=${{ env.INSTALL_TEST_DIR }}/pyngraph/tests/test_onnx/test_zoo_models.py \
|
||||
--ignore=${{ env.INSTALL_TEST_DIR }}/pyngraph/tests/test_onnx/test_backend.py
|
||||
|
||||
- name: Python API 2.0 Tests
|
||||
run: |
|
||||
# For python imports to import pybind_mock_frontend
|
||||
export PYTHONPATH=${{ env.INSTALL_TEST_DIR }}:$PYTHONPATH
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
export LD_LIBRARY_PATH=${{ env.INSTALL_TEST_DIR }}:$LD_LIBRARY_PATH
|
||||
|
||||
python3 -m pytest -sv ${{ env.INSTALL_TEST_DIR }}/pyopenvino ${{ env.PYTHON_STATIC_ARGS }} \
|
||||
--junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-Pyngraph.xml \
|
||||
--ignore=${{ env.INSTALL_TEST_DIR }}/pyopenvino/tests/test_utils/test_utils.py \
|
||||
--ignore=${{ env.INSTALL_TEST_DIR }}/pyopenvino/tests/test_onnx/test_zoo_models.py \
|
||||
--ignore=${{ env.INSTALL_TEST_DIR }}/pyopenvino/tests/test_onnx/test_backend.py
|
||||
|
||||
- name: Model Optimizer UT
|
||||
run: |
|
||||
|
||||
export PYTHONPATH=${{ env.OPENVINO_REPO }}/tools/mo/:${{ env.LAYER_TESTS_INSTALL_DIR }}:${{ env.INSTALL_TEST_DIR }}:${{ env.INSTALL_DIR }}/python/python3.11:$PYTHONPATH
|
||||
|
||||
# TODO: figure out why they need to be reinstalled
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_mxnet.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_caffe.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_kaldi.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_onnx.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_tf2.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_dev.txt
|
||||
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
|
||||
python3 -m pytest -s ${{ env.INSTALL_TEST_DIR }}/mo/unit_tests \
|
||||
--junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-ModelOptimizer.xml
|
||||
|
||||
- name: PyTorch Layer Tests
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
export PYTHONPATH=${{ env.OPENVINO_REPO }}/tools/mo/:${{ env.LAYER_TESTS_INSTALL_DIR }}:$PYTHONPATH
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
|
||||
python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/pytorch_tests -m precommit --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-pytorch.xml
|
||||
env:
|
||||
TEST_DEVICE: CPU
|
||||
TEST_PRECISION: FP16
|
||||
|
||||
- name: TensorFlow 1 Layer Tests - TF FE
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
|
||||
export PYTHONPATH=${{ env.OPENVINO_REPO }}/tools/mo/:${{ env.LAYER_TESTS_INSTALL_DIR }}:$PYTHONPATH
|
||||
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
|
||||
python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/tensorflow_tests/ --use_new_frontend -m precommit_tf_fe --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-tf_fe.xml
|
||||
env:
|
||||
TEST_DEVICE: CPU
|
||||
TEST_PRECISION: FP16
|
||||
|
||||
- name: TensorFlow 2 Layer Tests - TF FE
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
export PYTHONPATH=${{ env.OPENVINO_REPO }}/tools/mo/:${{ env.LAYER_TESTS_INSTALL_DIR }}:$PYTHONPATH
|
||||
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
|
||||
python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/tensorflow2_keras_tests/ --use_new_frontend -m precommit_tf_fe --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-tf2_fe.xml
|
||||
env:
|
||||
TEST_DEVICE: CPU
|
||||
TEST_PRECISION: FP16
|
||||
|
||||
- name: JAX Layer Tests - TF FE
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
export PYTHONPATH=${{ env.LAYER_TESTS_INSTALL_DIR }}:$PYTHONPATH
|
||||
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
|
||||
python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/jax_tests/ -m precommit --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-jax.xml
|
||||
env:
|
||||
TEST_DEVICE: CPU
|
||||
|
||||
- name: TensorFlow Hub Tests - TF FE
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.MODEL_HUB_TESTS_INSTALL_DIR }}/tf_hub_tests/requirements.txt
|
||||
|
||||
export PYTHONPATH=${{ env.MODEL_HUB_TESTS_INSTALL_DIR }}:$PYTHONPATH
|
||||
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
|
||||
python3 -m pytest ${{ env.MODEL_HUB_TESTS_INSTALL_DIR }}/tf_hub_tests/ -m ${{ env.TYPE }} --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-tf_hub_tf_fe.xml --html=${{ env.INSTALL_TEST_DIR }}/TEST-tf_hub_tf_fe.html --self-contained-html
|
||||
env:
|
||||
TYPE: ${{ github.event_name == 'schedule' && 'nightly' || 'precommit'}}
|
||||
TEST_DEVICE: CPU
|
||||
|
||||
- name: TensorFlow 1 Layer Tests - Legacy FE
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
export PYTHONPATH=${{ env.OPENVINO_REPO }}/tools/mo/:${{ env.LAYER_TESTS_INSTALL_DIR }}:$PYTHONPATH
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
|
||||
python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/tensorflow_tests/test_tf_Roll.py --ir_version=10 --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-tf_Roll.xml
|
||||
|
||||
- name: TensorFlow 2 Layer Tests - Legacy FE
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
export PYTHONPATH=${{ env.OPENVINO_REPO }}/tools/mo/:${{ env.LAYER_TESTS_INSTALL_DIR }}:$PYTHONPATH
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
|
||||
python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/tensorflow2_keras_tests/test_tf2_keras_activation.py \
|
||||
--ir_version=11 --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-tf2_Activation.xml -k "sigmoid"
|
||||
env:
|
||||
TEST_DEVICE: CPU
|
||||
TEST_PRECISION: FP16
|
||||
|
||||
- name: TensorFlow Lite Layer Tests - TFL FE
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
export PYTHONPATH=${{ env.OPENVINO_REPO }}/tools/mo/:${{ env.LAYER_TESTS_INSTALL_DIR }}:$PYTHONPATH
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
|
||||
python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/tensorflow_lite_tests/ --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-tfl_fe.xml
|
||||
env:
|
||||
TEST_DEVICE: CPU
|
||||
TEST_PRECISION: FP16
|
||||
|
||||
- name: MO Python API Tests
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
export PYTHONPATH=${{ env.OPENVINO_REPO }}/tools/mo/:${{ env.LAYER_TESTS_INSTALL_DIR }}:$PYTHONPATH
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
|
||||
python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/mo_python_api_tests --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-test_mo_convert.xml
|
||||
env:
|
||||
TEST_DEVICE: CPU
|
||||
TEST_PRECISION: FP16
|
||||
|
||||
- name: Python Frontend tests
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
export PYTHONPATH=${{ env.OPENVINO_REPO }}/tools/mo/:${{ env.LAYER_TESTS_INSTALL_DIR }}:$PYTHONPATH
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
|
||||
python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/py_frontend_tests --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-test_py_fontend.xml
|
||||
|
||||
- name: Conversion UT
|
||||
run: |
|
||||
# For python imports to import pybind_mock_frontend
|
||||
export PYTHONPATH=${{ env.INSTALL_TEST_DIR }}:$PYTHONPATH
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
|
||||
python3 -m pytest -s ${{ env.OPENVINO_REPO }}/tools/ovc/unit_tests --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-OpenVinoConversion.xml
|
||||
|
||||
- name: Upload Test Results
|
||||
uses: actions/upload-artifact@v3
|
||||
if: ${{ always() }}
|
||||
with:
|
||||
name: test-results-python
|
||||
path: |
|
||||
${{ env.INSTALL_TEST_DIR }}/TEST*.html
|
||||
${{ env.INSTALL_TEST_DIR }}/TEST*.xml
|
||||
if-no-files-found: 'error'
|
||||
|
||||
CPU_Functional_Tests:
|
||||
needs: Build
|
||||
defaults:
|
||||
run:
|
||||
shell: bash
|
||||
runs-on: ubuntu-latest-4-cores
|
||||
env:
|
||||
INSTALL_DIR: ${{ github.workspace }}/install
|
||||
INSTALL_TEST_DIR: ${{ github.workspace }}/install/tests
|
||||
PARALLEL_TEST_SCRIPT: ${{ github.workspace }}/install/tests/functional_test_utils/run_parallel.py
|
||||
PARALLEL_TEST_CACHE: ${{ github.workspace }}/install/tests/test_cache.lst
|
||||
|
||||
steps:
|
||||
- name: Create Directories
|
||||
run: mkdir -p ${{ env.INSTALL_DIR }} ${{ env.INSTALL_TEST_DIR }}
|
||||
|
||||
- name: Install dependencies
|
||||
run: |
|
||||
sudo -E apt update
|
||||
sudo -E apt --assume-yes install openjdk-11-jdk libbz2-dev clang unzip libpugixml-dev libtbb-dev intel-opencl-icd ocl-icd-opencl-dev opencl-headers
|
||||
|
||||
- name: Download OpenVINO package
|
||||
uses: actions/download-artifact@v3
|
||||
with:
|
||||
name: openvino_package
|
||||
path: ${{ env.INSTALL_DIR }}
|
||||
|
||||
- name: Download OpenVINO tests package
|
||||
uses: actions/download-artifact@v3
|
||||
with:
|
||||
name: openvino_tests
|
||||
path: ${{ env.INSTALL_TEST_DIR }}
|
||||
|
||||
- name: Extract OpenVINO packages
|
||||
run: |
|
||||
pushd ${{ env.INSTALL_DIR }}
|
||||
tar -xzf openvino_package.tar.gz -C ${{ env.INSTALL_DIR }} && rm openvino_package.tar.gz || exit 1
|
||||
popd
|
||||
pushd ${{ env.INSTALL_TEST_DIR }}
|
||||
tar -xzf openvino_tests.tar.gz -C ${{ env.INSTALL_DIR }} && rm openvino_tests.tar.gz || exit 1
|
||||
popd
|
||||
|
||||
- name: Install python dependencies
|
||||
run: |
|
||||
python3 -m pip install --upgrade pip
|
||||
python3 -m pip install -r ${{ env.INSTALL_TEST_DIR }}/functional_test_utils/requirements.txt
|
||||
|
||||
- name: Cache Tests Execution Time
|
||||
id: tests-functional-cpu-cache
|
||||
uses: actions/cache@v3
|
||||
with:
|
||||
path: ${{ env.PARALLEL_TEST_CACHE }}
|
||||
key: ${{ runner.os }}-tests-functional-cpu-cache
|
||||
|
||||
- name: Intel CPU plugin func tests (parallel)
|
||||
run: |
|
||||
source ${{ env.INSTALL_DIR }}/setupvars.sh
|
||||
python3 ${{ env.PARALLEL_TEST_SCRIPT }} -e ${{ env.INSTALL_TEST_DIR }}/ov_cpu_func_tests -c ${{ env.PARALLEL_TEST_CACHE }} -w ${{ env.INSTALL_TEST_DIR }} -s suite -rf 0 -- --gtest_print_time=1 --gtest_filter=*smoke*
|
||||
timeout-minutes: 25
|
||||
|
||||
- name: Upload Test Results
|
||||
uses: actions/upload-artifact@v3
|
||||
if: ${{ always() }}
|
||||
with:
|
||||
name: test-results-functional-cpu
|
||||
path: |
|
||||
${{ env.INSTALL_TEST_DIR }}/TEST*.xml
|
||||
${{ env.INSTALL_TEST_DIR }}/logs/failed/*.log
|
||||
${{ env.INSTALL_TEST_DIR }}/logs/crashed/*.log
|
||||
${{ env.INSTALL_TEST_DIR }}/logs/hanged/*.log
|
||||
${{ env.INSTALL_TEST_DIR }}/logs/interapted/*.log
|
||||
${{ env.INSTALL_TEST_DIR }}/logs/disabled_tests.log
|
||||
if-no-files-found: 'error'
|
||||
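For reference, the parallelized CPU functional test run above can be reproduced outside of CI. This is a minimal sketch, not the documented interface of run_parallel.py; the flag meanings noted in the comments are assumptions read off the invocation, and the paths are illustrative.

```bash
# Hypothetical local reproduction of the "Intel CPU plugin func tests (parallel)" step.
# Assumed flag meanings: -e test binary, -c execution-time cache, -w output directory,
# -s suite-level splitting, -rf number of reruns for failed tests.
source ./install/setupvars.sh
python3 ./install/tests/functional_test_utils/run_parallel.py \
  -e ./install/tests/ov_cpu_func_tests \
  -c ./install/tests/test_cache.lst \
  -w ./install/tests \
  -s suite -rf 0 \
  -- --gtest_print_time=1 --gtest_filter=*smoke*
```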
438
modules/openvino-master/.github/workflows/linux_debian.yml
vendored
Normal file
@ -0,0 +1,438 @@
|
||||
name: Linux Debian (Ubuntu 20.04, Python 3.11)
|
||||
on:
|
||||
schedule:
|
||||
# run daily at 00:00
|
||||
- cron: '0 0 * * *'
|
||||
workflow_dispatch:
|
||||
# pull_request:
|
||||
# paths-ignore:
|
||||
# - '**/docs/**'
|
||||
# - 'docs/**'
|
||||
# - '**/**.md'
|
||||
# - '**.md'
|
||||
# - '**/layer_tests_summary/**'
|
||||
# - '**/conformance/**'
|
||||
# push:
|
||||
# paths-ignore:
|
||||
# - '**/docs/**'
|
||||
# - 'docs/**'
|
||||
# - '**/**.md'
|
||||
# - '**.md'
|
||||
# - '**/layer_tests_summary/**'
|
||||
# - '**/conformance/**'
|
||||
# branches:
|
||||
# - master
|
||||
|
||||
concurrency:
|
||||
group: ${{ github.head_ref || github.run_id }}-linux-debian
|
||||
cancel-in-progress: true
|
||||
|
||||
jobs:
|
||||
Build:
|
||||
# TODO: remove. Temporary measure to prevent the workflow from scheduling on forks.
|
||||
if: ${{ github.repository_owner == 'openvinotoolkit' }}
|
||||
defaults:
|
||||
run:
|
||||
shell: bash
|
||||
runs-on: ubuntu-20.04-8-cores
|
||||
env:
|
||||
CMAKE_BUILD_TYPE: 'Release'
|
||||
CMAKE_GENERATOR: 'Ninja'
|
||||
CMAKE_CXX_COMPILER_LAUNCHER: ccache
|
||||
CMAKE_C_COMPILER_LAUNCHER: ccache
|
||||
CMAKE_CXX_LINKER_LAUNCHER: ccache
|
||||
CMAKE_C_LINKER_LAUNCHER: ccache
|
||||
BUILD_TYPE: Release
|
||||
OPENVINO_REPO: ${{ github.workspace }}/openvino
|
||||
BUILD_DIR: ${{ github.workspace }}/build
|
||||
INSTALL_DIR: ${{ github.workspace }}/install
|
||||
INSTALL_TEST_DIR: ${{ github.workspace }}/install/tests
|
||||
LAYER_TESTS_INSTALL_DIR: ${{ github.workspace }}/install/tests/layer_tests
|
||||
DATA_PATH: ${{ github.workspace }}/testdata
|
||||
MODELS_PATH: ${{ github.workspace }}/testdata
|
||||
OV_TEMP: ${{ github.workspace }}/openvino_temp
|
||||
SAMPLES_INSTALL_DIR: /usr/share/openvino/samples
|
||||
PYTHON_STATIC_ARGS: -m "not dynamic_library and not template_plugin"
|
||||
steps:
|
||||
- name: Clone OpenVINO
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
path: 'openvino'
|
||||
submodules: 'recursive'
|
||||
|
||||
- name: Clone testdata for C API tests
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
repository: 'openvinotoolkit/testdata'
|
||||
path: 'testdata'
|
||||
submodules: 'recursive'
|
||||
lfs: 'true'
|
||||
|
||||
- name: Create Directories
|
||||
run: |
|
||||
mkdir -p ${{ env.BUILD_DIR }}
|
||||
mkdir -p ${{ env.INSTALL_DIR }}
|
||||
|
||||
- name: Setup Python 3.11
|
||||
uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: '3.11'
|
||||
|
||||
#
|
||||
# Dependencies
|
||||
#
|
||||
|
||||
- name: Install build dependencies
|
||||
run: |
|
||||
sudo -E apt update
|
||||
sudo -E ${{ env.OPENVINO_REPO }}/install_build_dependencies.sh
|
||||
|
||||
# 'clang' is used as a default compiler
|
||||
sudo apt --assume-yes install clang
|
||||
sudo apt --assume-yes install --no-install-recommends libopencv-imgproc-dev libopencv-imgcodecs-dev
|
||||
|
||||
# Speed up build
|
||||
sudo apt -y --no-install-recommends install unzip
|
||||
wget https://github.com/ninja-build/ninja/releases/download/v1.10.2/ninja-linux.zip
|
||||
unzip ninja-linux.zip
|
||||
sudo cp -v ninja /usr/local/bin/
|
||||
|
||||
# Speed up tests
|
||||
git clone https://github.com/google/gtest-parallel.git
|
||||
|
||||
- name: Install python dependencies
|
||||
run: |
|
||||
python3 -m pip install --upgrade pip
|
||||
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/wheel/requirements-dev.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/requirements.txt
|
||||
|
||||
# For running Python API tests
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/src/compatibility/openvino/requirements-dev.txt
|
||||
|
||||
# For running Paddle frontend unit tests
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/frontends/paddle/tests/requirements.txt
|
||||
|
||||
# For running ONNX frontend unit tests
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/frontends/onnx/tests/requirements.txt
|
||||
|
||||
# For running TensorFlow frontend unit tests
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/frontends/tensorflow/tests/requirements.txt
|
||||
|
||||
# For MO unit tests
|
||||
python3 -m pip install -U pip
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_mxnet.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_caffe.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_kaldi.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_onnx.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_tf2.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_dev.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/frontends/paddle/tests/requirements.txt
|
||||
|
||||
# for Python API tests
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/requirements_test.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements.txt
|
||||
|
||||
- name: Setup ccache
|
||||
uses: hendrikmuhs/ccache-action@v1.2
|
||||
with:
|
||||
max-size: "2000M"
|
||||
# Should save cache only if run in the master branch of the base repo
|
||||
# github.ref_name is 'ref/PR_#' in case of the PR, and 'branch_name' when executed on push
|
||||
save: ${{ github.ref_name == 'master' && 'true' || 'false' }}
|
||||
verbose: 2
|
||||
key: ${{ github.job }}-linux-debian
|
||||
restore-keys: |
|
||||
${{ github.job }}-linux-debian
|
||||
|
||||
- name: Get tools versions
|
||||
run: |
|
||||
ninja --version
|
||||
ccache --version
|
||||
python3 --version
|
||||
cmake --version
|
||||
|
||||
#
|
||||
# Build
|
||||
#
|
||||
|
||||
- name: Get number of CPU cores
|
||||
uses: SimenB/github-actions-cpu-cores@v2
|
||||
id: cpu-cores
|
||||
|
||||
- name: CMake configure
|
||||
run: |
|
||||
cmake \
|
||||
-GNinja \
|
||||
-DENABLE_CPPLINT=OFF \
|
||||
-DCMAKE_BUILD_TYPE=${{ env.BUILD_TYPE }} \
|
||||
-DCMAKE_COMPILE_WARNING_AS_ERROR=OFF \
|
||||
-DENABLE_PYTHON=ON \
|
||||
-DENABLE_INTEL_GNA=OFF \
|
||||
-DENABLE_TESTS=ON \
|
||||
-DENABLE_FASTER_BUILD=ON \
|
||||
-DENABLE_STRICT_DEPENDENCIES=OFF \
|
||||
-DENABLE_SYSTEM_SNAPPY=ON \
|
||||
-DENABLE_PYTHON_PACKAGING=ON \
|
||||
-DCPACK_GENERATOR=DEB \
|
||||
-S ${{ env.OPENVINO_REPO }} \
|
||||
-B ${{ env.BUILD_DIR }}
|
||||
|
||||
- name: Clean ccache stats
|
||||
run: ccache --zero-stats --show-config
|
||||
|
||||
- name: Build
|
||||
run: cmake --build ${{ env.BUILD_DIR }} --parallel ${{ steps.cpu-cores.outputs.count }} --config ${{ env.BUILD_TYPE }}
|
||||
|
||||
- name: Show ccache stats
|
||||
run: ccache --show-stats
|
||||
|
||||
- name: CMake Layer Tests
|
||||
run: cmake -GNinja -S ${{ env.OPENVINO_REPO }}/tests/layer_tests -B ${{ env.BUILD_DIR }}/layer_tests
|
||||
|
||||
- name: Build Layer Tests
|
||||
run: cmake --build ${{ env.BUILD_DIR }}/layer_tests --parallel --config ${{ env.BUILD_TYPE }}
|
||||
|
||||
# to check that wheel packages tested later contain all the dependencies like TBB or pugixml
|
||||
- name: Remove debian dependencies
|
||||
run: sudo apt-get remove libtbb2 libpugixml1v5 -y
|
||||
|
||||
- name: Install wheel packages
|
||||
run: cmake -DCOMPONENT=python_wheels -DCMAKE_INSTALL_PREFIX=${{ env.INSTALL_DIR }} -P ${{ env.BUILD_DIR }}/cmake_install.cmake
|
||||
|
||||
- name: Install Python Samples
|
||||
run: cmake -DCOMPONENT=python_samples -DCMAKE_INSTALL_PREFIX=${{ env.INSTALL_DIR }} -P ${{ env.BUILD_DIR }}/cmake_install.cmake
|
||||
|
||||
- name: Install Layer Tests
|
||||
run: cmake -DCOMPONENT=tests -DCMAKE_INSTALL_PREFIX=${{ env.INSTALL_DIR }} -P ${{ env.BUILD_DIR }}/layer_tests/cmake_install.cmake
|
||||
|
||||
- name: Install tests
|
||||
run: cmake -DCMAKE_INSTALL_PREFIX=${{ env.INSTALL_DIR }} -DCOMPONENT=tests -P ${{ env.BUILD_DIR }}/cmake_install.cmake
|
||||
|
||||
- name: List install test files
|
||||
run: ls -alR ${{ env.INSTALL_DIR }}
|
||||
|
||||
- name: Install python wheels
|
||||
run: python3 -m pip install openvino-dev --find-links=${{ env.INSTALL_DIR }}/tools
|
||||
|
||||
- name: Build Debian packages
|
||||
run: |
|
||||
sudo apt-get install libtbb-dev libpugixml-dev -y
|
||||
cmake --build ${{ env.BUILD_DIR }} --config ${{ env.BUILD_TYPE }} --target package --parallel
|
||||
|
||||
- name: Install Debian packages
|
||||
run: |
|
||||
pushd ${{ env.BUILD_DIR }}
|
||||
# install debian packages from previous release
|
||||
sudo apt-get -y update
|
||||
sudo apt-get install --no-install-recommends gnupg wget -y
|
||||
wget https://apt.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB
|
||||
sudo apt-key add GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB
|
||||
echo "deb https://apt.repos.intel.com/openvino/2023 ubuntu20 main" | sudo tee /etc/apt/sources.list.d/intel-openvino-2023.list
|
||||
sudo apt-get update -o Dir::Etc::sourcelist=/etc/apt/sources.list.d/intel-openvino-2023.list
|
||||
sudo apt-get install openvino -y
|
||||
# install our local one and make sure the conflicts are resolved
|
||||
sudo apt-get install --no-install-recommends dpkg-dev -y
|
||||
rm -r _CPack_Packages
|
||||
dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz
|
||||
echo "deb [trusted=yes] file:${{ env.BUILD_DIR }} ./" | sudo tee /etc/apt/sources.list.d/openvino-local.list
|
||||
sudo apt-get update
|
||||
sudo apt-get install openvino -y
|
||||
popd
|
||||
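The step above first installs the released openvino package from the Intel APT repository and then exposes the build directory as a local APT source, which forces apt to resolve the freshly built packages against the installed release and surfaces any file conflicts. A commented sketch of that pattern, assuming the built .deb files are in the current directory:

```bash
# Index the locally built .deb files so apt can treat the directory as a repository.
sudo apt-get install --no-install-recommends dpkg-dev -y
dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz

# Register the directory as a trusted local APT source and reinstall openvino from it;
# conflicts with the previously installed release package show up during this install.
echo "deb [trusted=yes] file:$(pwd) ./" | sudo tee /etc/apt/sources.list.d/openvino-local.list
sudo apt-get update
sudo apt-get install openvino -y
```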
|
||||
- name: List install files
|
||||
run: ls -alR ${{ env.INSTALL_DIR }}
|
||||
|
||||
- name: Build cpp samples - gcc
|
||||
run: ${{ env.SAMPLES_INSTALL_DIR }}/cpp/build_samples.sh -i ${{ env.INSTALL_DIR }}
|
||||
|
||||
- name: Build c samples
|
||||
run: ${{ env.SAMPLES_INSTALL_DIR }}/c/build_samples.sh -i ${{ env.INSTALL_DIR }}
|
||||
|
||||
- name: OpenVINO Core Unit Tests
|
||||
run: |
|
||||
export LD_LIBRARY_PATH=${{ env.INSTALL_TEST_DIR }}:$LD_LIBRARY_PATH
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_core_unit_tests --gtest_print_time=1 --gtest_filter=-*IE_GPU* \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-OVCoreUT.xml
|
||||
|
||||
- name: Proxy Plugin Tests
|
||||
run: |
|
||||
export LD_LIBRARY_PATH=${{ env.INSTALL_TEST_DIR }}:$LD_LIBRARY_PATH
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_proxy_plugin_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-OVProxyTests.xml
|
||||
|
||||
- name: Hetero Func Tests
|
||||
run: |
|
||||
export LD_LIBRARY_PATH=${{ env.INSTALL_TEST_DIR }}:$LD_LIBRARY_PATH
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_hetero_func_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-OVHeteroFuncTests.xml
|
||||
|
||||
- name: ONNX frontend tests
|
||||
run: |
|
||||
export LD_LIBRARY_PATH=${{ env.INSTALL_TEST_DIR }}:$LD_LIBRARY_PATH
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_onnx_frontend_tests --gtest_print_time=1 --gtest_filter=-*IE_GPU*:*FrontEndLoadFromTest.testLoadFromTwoStreams*:*FrontEndLoadFromTest.testLoadFromTwoFiles* \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-ONNXFrontend.xml
|
||||
|
||||
- name: TensorFlow frontend tests
|
||||
run: |
|
||||
export LD_LIBRARY_PATH=${{ env.INSTALL_TEST_DIR }}:$LD_LIBRARY_PATH
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_tensorflow_frontend_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-TensorFlowFrontend.xml
|
||||
|
||||
# Disabled in Azure: https://github.com/openvinotoolkit/openvino/blob/master/.ci/azure/linux.yml#L403
|
||||
# - name: PaddlePaddle frontend tests
|
||||
# run: |
|
||||
# ${{ env.INSTALL_TEST_DIR }}/paddle_tests --gtest_print_time=1 --gtest_filter=*smoke* \
|
||||
# --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-PaddleTests.xml
|
||||
|
||||
- name: TensorFlow Common tests
|
||||
run: |
|
||||
export LD_LIBRARY_PATH=${{ env.INSTALL_TEST_DIR }}:$LD_LIBRARY_PATH
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_tensorflow_common_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-TensorFlowCommonFrontend.xml
|
||||
|
||||
- name: TensorFlow Lite frontend tests
|
||||
run: |
|
||||
export LD_LIBRARY_PATH=${{ env.INSTALL_TEST_DIR }}:$LD_LIBRARY_PATH
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_tensorflow_lite_frontend_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-TensorFlowLiteFrontend.xml
|
||||
|
||||
- name: Snippets func tests
|
||||
run: |
|
||||
export LD_LIBRARY_PATH=${{ env.INSTALL_TEST_DIR }}:$LD_LIBRARY_PATH
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_snippets_func_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-SnippetsFuncTests.xml
|
||||
|
||||
- name: CPU plugin unit tests
|
||||
run: |
|
||||
export LD_LIBRARY_PATH=${{ env.INSTALL_TEST_DIR }}:$LD_LIBRARY_PATH
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_cpu_unit_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-CPUUnitTests.xml
|
||||
|
||||
- name: AUTO UT
|
||||
run: |
|
||||
export LD_LIBRARY_PATH=${{ env.INSTALL_TEST_DIR }}:$LD_LIBRARY_PATH
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_auto_unit_tests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-ov_auto_unit_tests.xml
|
||||
|
||||
- name: Template plugin tests
|
||||
run: |
|
||||
export LD_LIBRARY_PATH=${{ env.INSTALL_TEST_DIR }}:$LD_LIBRARY_PATH
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_template_func_tests --gtest_print_time=1 \
|
||||
--gtest_filter=*smoke* \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-TemplateFuncTests.xml
|
||||
|
||||
- name: Inference Engine C API tests
|
||||
run: |
|
||||
export LD_LIBRARY_PATH=${{ env.INSTALL_TEST_DIR }}:$LD_LIBRARY_PATH
|
||||
${{ env.INSTALL_TEST_DIR }}/InferenceEngineCAPITests --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-InferenceEngineCAPITests.xml
|
||||
|
||||
- name: OpenVINO C API tests
|
||||
run: |
|
||||
export LD_LIBRARY_PATH=${{ env.INSTALL_TEST_DIR }}:$LD_LIBRARY_PATH
|
||||
${{ env.INSTALL_TEST_DIR }}/ov_capi_test --gtest_print_time=1 \
|
||||
--gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-OpenVINOCAPITests.xml
|
||||
|
||||
- name: nGraph and IE Python Bindings Tests
|
||||
run: |
|
||||
export LD_LIBRARY_PATH=${{ env.INSTALL_TEST_DIR }}:$LD_LIBRARY_PATH
|
||||
python3 -m pytest -s ${{ env.INSTALL_TEST_DIR }}/pyngraph ${{ env.PYTHON_STATIC_ARGS }} \
|
||||
--junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-Pyngraph.xml \
|
||||
--ignore=${{ env.INSTALL_TEST_DIR }}/pyngraph/tests/test_onnx/test_zoo_models.py \
|
||||
--ignore=${{ env.INSTALL_TEST_DIR }}/pyngraph/tests/test_onnx/test_backend.py
|
||||
|
||||
- name: Python API 2.0 Tests
|
||||
run: |
|
||||
# For python imports to import pybind_mock_frontend
|
||||
export PYTHONPATH=${{ env.INSTALL_TEST_DIR }}:${{ env.OPENVINO_REPO }}/tools/mo:$PYTHONPATH
|
||||
|
||||
export LD_LIBRARY_PATH=${{ env.INSTALL_TEST_DIR }}:$LD_LIBRARY_PATH
|
||||
|
||||
python3 -m pytest -sv ${{ env.INSTALL_TEST_DIR }}/pyopenvino \
|
||||
--junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-Pyngraph.xml \
|
||||
--ignore=${{ env.INSTALL_TEST_DIR }}/pyopenvino/tests/test_utils/test_utils.py
|
||||
|
||||
- name: ONNX Frontend Python Tests
|
||||
run: |
|
||||
# For python imports to import pybind_mock_frontend
|
||||
export PYTHONPATH=${{ env.INSTALL_TEST_DIR }}:${{ env.OPENVINO_REPO }}/tools/mo:$PYTHONPATH
|
||||
|
||||
export LD_LIBRARY_PATH=${{ env.INSTALL_TEST_DIR }}:$LD_LIBRARY_PATH
|
||||
|
||||
python3 -m pytest -sv ${{ env.OPENVINO_REPO }}/src/frontends/onnx/tests \
|
||||
--junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-ONNX-FE-PYTHON.xml \
|
||||
--ignore=${{ env.OPENVINO_REPO }}/src/frontends/onnx/tests/test_python/test_zoo_models.py \
|
||||
--ignore=${{ env.OPENVINO_REPO }}/src/frontends/onnx/tests/test_python/test_backend.py
|
||||
|
||||
- name: Model Optimizer UT
|
||||
run: |
|
||||
|
||||
export PYTHONPATH=${{ env.OPENVINO_REPO }}/tools/mo/:${{ env.OPENVINO_REPO }}/tools/ovc/:${{ env.LAYER_TESTS_INSTALL_DIR }}:${{ env.INSTALL_TEST_DIR }}:${{ env.INSTALL_DIR }}/python/python3.11:$PYTHONPATH
|
||||
|
||||
# Need to be reinstalled to have correct numpy version
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_mxnet.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_caffe.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_kaldi.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_onnx.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_tf2.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_dev.txt
|
||||
|
||||
python3 -m pytest -s ${{ env.INSTALL_TEST_DIR }}/mo/unit_tests \
|
||||
--junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-ModelOptimizer.xml
|
||||
|
||||
# run not all smoke filter to save time in post-commit
|
||||
- name: CPU FuncTests
|
||||
run: ${{ env.INSTALL_TEST_DIR }}/ov_cpu_func_tests --gtest_filter=*OVClass*:*CoreThreadingTests* --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-ov_cpu_func_tests.xml
|
||||
|
||||
- name: CMake Samples Tests
|
||||
run: cmake -GNinja -S ${{ env.OPENVINO_REPO }}/tests/samples_tests -B ${{ env.BUILD_DIR }}/samples_tests
|
||||
|
||||
- name: Install Samples Tests
|
||||
run: cmake -DCOMPONENT=tests -DCMAKE_INSTALL_PREFIX=${{ env.INSTALL_DIR }} -P ${{ env.BUILD_DIR }}/samples_tests/cmake_install.cmake
|
||||
|
||||
- name: Samples Smoke Tests
|
||||
run: |
|
||||
python3 -m pip install --ignore-installed PyYAML -r ${{ env.INSTALL_TEST_DIR }}/smoke_tests/requirements.txt
|
||||
|
||||
export LD_LIBRARY_PATH=${{ env.IE_APP_PATH }}:$LD_LIBRARY_PATH
|
||||
|
||||
python3 -m pytest -sv ${{ env.INSTALL_TEST_DIR }}/smoke_tests -k "not GNA" \
|
||||
--env_conf ${{ env.INSTALL_TEST_DIR }}/smoke_tests/env_config.yml \
|
||||
--junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-SamplesSmokeTests.xml
|
||||
env:
|
||||
IE_APP_PATH: ${{ env.INSTALL_DIR }}/samples_bin
|
||||
IE_APP_PYTHON_PATH: ${{ env.INSTALL_DIR }}/share/openvino/samples/python
|
||||
LD_LIBRARY_PATH: ${{ env.INSTALL_DIR }}/samples_bin
|
||||
SHARE: ${{ env.INSTALL_TEST_DIR }}/smoke_tests/samples_smoke_tests_data
|
||||
WORKSPACE: ${{ env.INSTALL_DIR }}
|
||||
|
||||
- name: TensorFlow 1 Layer Tests - Legacy FE
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
export PYTHONPATH=${{ env.OPENVINO_REPO }}/tools/mo/:${{ env.LAYER_TESTS_INSTALL_DIR }}:$PYTHONPATH
|
||||
|
||||
python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/tensorflow_tests/test_tf_Roll.py --ir_version=10 --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-tf_Roll.xml
|
||||
|
||||
- name: TensorFlow Lite Layer Tests - TFL FE
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
export PYTHONPATH=${{ env.OPENVINO_REPO }}/tools/mo/:${{ env.LAYER_TESTS_INSTALL_DIR }}:$PYTHONPATH
|
||||
|
||||
# Need to be reinstalled to have correct numpy version
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_caffe.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_kaldi.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_onnx.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_tf2.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_dev.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_mxnet.txt
|
||||
|
||||
python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/tensorflow_lite_tests/ --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-tfl_fe.xml
|
||||
env:
|
||||
TEST_DEVICE: CPU
|
||||
|
||||
- name: Upload Test Results
|
||||
uses: actions/upload-artifact@v3
|
||||
if: ${{ always() }}
|
||||
with:
|
||||
name: test-results
|
||||
path: ${{ env.INSTALL_TEST_DIR }}/TEST*.xml
|
||||
if-no-files-found: 'error'
|
||||
61
modules/openvino-master/.github/workflows/mo.yml
vendored
Normal file
@ -0,0 +1,61 @@
name: MO
on:
push:
paths:
- 'tools/mo/**'
- '.github/workflows/mo.yml'
branches:
- 'master'
- 'releases/**'
pull_request:
paths:
- 'tools/mo/**'
- '.github/workflows/mo.yml'

concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true

jobs:
Pylint-UT:
runs-on: ubuntu-22.04
steps:
- uses: actions/checkout@v4
with:
submodules: recursive

- name: Setup Python
uses: actions/setup-python@v4
with:
python-version: '3.10'

- name: Cache pip
uses: actions/cache@v3
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('tools/mo/requirements*.txt') }}
restore-keys: |
${{ runner.os }}-pip-
${{ runner.os }}-

- name: Install dependencies
run: |
python -m pip install --upgrade pip setuptools
# For UT
pip install unittest-xml-reporting==3.0.2
# MO requirements
pip install -r requirements_mxnet.txt
pip install -r requirements_caffe.txt
pip install -r requirements_kaldi.txt
pip install -r requirements_onnx.txt
pip install -r requirements_tf2.txt
pip install -r requirements_dev.txt
working-directory: tools/mo

- name: Pylint-MO
run: pylint -d C,R,W openvino/tools/mo
working-directory: tools/mo

- name: Pylint-OVC
run: pylint -d C,R,W openvino/tools/ovc
working-directory: tools/ovc
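The Pylint gate above can be reproduced locally once the same requirement sets are installed; a minimal sketch, assuming it is run from a checkout of the repository:

```bash
cd tools/mo
python -m pip install --upgrade pip setuptools
pip install -r requirements_mxnet.txt -r requirements_caffe.txt -r requirements_kaldi.txt \
            -r requirements_onnx.txt -r requirements_tf2.txt -r requirements_dev.txt
# Same invocations as the workflow: convention, refactoring and warning checks are disabled.
pylint -d C,R,W openvino/tools/mo
cd ../ovc && pylint -d C,R,W openvino/tools/ovc
```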
166
modules/openvino-master/.github/workflows/py_checks.yml
vendored
Normal file
@ -0,0 +1,166 @@
|
||||
name: Python API Checks
|
||||
|
||||
on:
|
||||
workflow_dispatch:
|
||||
push:
|
||||
paths:
|
||||
- 'src/bindings/python/**'
|
||||
- 'samples/python/**'
|
||||
- '.github/workflows/py_checks.yml'
|
||||
branches:
|
||||
- 'master'
|
||||
- 'releases/**'
|
||||
pull_request:
|
||||
paths:
|
||||
- 'src/bindings/python/**'
|
||||
- 'samples/python/**'
|
||||
- '.github/workflows/py_checks.yml'
|
||||
|
||||
concurrency:
|
||||
group: ${{ github.workflow }}-${{ github.ref }}
|
||||
cancel-in-progress: true
|
||||
|
||||
jobs:
|
||||
linters:
|
||||
runs-on: ubuntu-20.04
|
||||
steps:
|
||||
- name: Code checkout
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
submodules: recursive
|
||||
|
||||
- name: Setup Python
|
||||
uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: '3.7'
|
||||
|
||||
- name: Install dependencies
|
||||
run: python -m pip install -r src/bindings/python/requirements_test.txt
|
||||
|
||||
# samples code-style
|
||||
- name: Run flake8 on samples
|
||||
run: python -m flake8 ./ --config=setup.cfg
|
||||
working-directory: samples/python
|
||||
|
||||
- name: Create code style diff for samples
|
||||
if: failure()
|
||||
run: |
|
||||
python -m black -l 160 -S ./
|
||||
git diff > samples_diff.diff
|
||||
working-directory: samples/python
|
||||
|
||||
- uses: actions/upload-artifact@v3
|
||||
if: failure()
|
||||
with:
|
||||
name: samples_diff
|
||||
path: samples_diff.diff
|
||||
|
||||
# IE Python API Flake code-style
|
||||
- name: Run flake8 on IE Python API
|
||||
run: python -m flake8 ./ --config=setup.cfg
|
||||
working-directory: src/bindings/python/src/compatibility/openvino
|
||||
|
||||
- name: Create code style diff for IE Python API
|
||||
if: failure()
|
||||
run: |
|
||||
python -m black -l 160 -S ./
|
||||
git diff > ie_python_diff.diff
|
||||
working-directory: src/bindings/python/src/compatibility/openvino
|
||||
|
||||
- uses: actions/upload-artifact@v3
|
||||
if: failure()
|
||||
with:
|
||||
name: ie_python_diff
|
||||
path: ie_python_diff.diff
|
||||
|
||||
# nGraph Python API Flake code-style
|
||||
- name: Run flake8 on nGraph Python API
|
||||
run: python -m flake8 ./src/compatibility/ngraph --config=setup.cfg
|
||||
working-directory: src/bindings/python
|
||||
|
||||
- name: Create code style diff for nGraph Python API
|
||||
if: failure()
|
||||
run: |
|
||||
python -m black -l 160 -S ./
|
||||
git diff > pyngraph_diff.diff
|
||||
working-directory: src/bindings/python/src/compatibility/ngraph
|
||||
|
||||
- uses: actions/upload-artifact@v3
|
||||
if: failure()
|
||||
with:
|
||||
name: pyngraph_diff
|
||||
path: pyngraph_diff.diff
|
||||
|
||||
# Python API 2.0 Flake code-style
|
||||
- name: Run flake8 on Python API 2.0
|
||||
run: python -m flake8 ./src/openvino --config=setup.cfg
|
||||
working-directory: src/bindings/python
|
||||
|
||||
- name: Create code style diff for Python API 2.0
|
||||
if: failure()
|
||||
run: |
|
||||
python -m black -l 160 -S ./
|
||||
git diff > pyopenvino_diff.diff
|
||||
working-directory: src/bindings/python/src/openvino
|
||||
|
||||
- uses: actions/upload-artifact@v3
|
||||
if: failure()
|
||||
with:
|
||||
name: pyopenvino_diff
|
||||
path: pyopenvino_diff.diff
|
||||
|
||||
# wheel Flake code-style
|
||||
- name: Run flake8 on wheel
|
||||
run: python -m flake8 ./ --config=../setup.cfg
|
||||
working-directory: src/bindings/python/wheel
|
||||
|
||||
- name: Create code style diff for wheel
|
||||
if: failure()
|
||||
run: |
|
||||
python -m black -l 160 -S ./
|
||||
git diff > wheel_diff.diff
|
||||
working-directory: src/bindings/python/wheel
|
||||
|
||||
- uses: actions/upload-artifact@v3
|
||||
if: failure()
|
||||
with:
|
||||
name: wheel_diff
|
||||
path: wheel_diff.diff
|
||||
|
||||
# Python API 2.0 tests Flake code-style
|
||||
- name: Run flake8 on python tests
|
||||
# ignore lack of docs in tests
|
||||
run: python -m flake8 tests/ --config=setup.cfg
|
||||
working-directory: src/bindings/python
|
||||
|
||||
# IE Python API mypy check
|
||||
- name: Run mypy on IE Python API
|
||||
run: python -m mypy ./ --config-file ./setup.cfg
|
||||
working-directory: src/bindings/python/src/compatibility/openvino
|
||||
|
||||
# nGraph Python API mypy check
|
||||
- name: Run mypy on nGraph Python API
|
||||
run: python -m mypy ./src/compatibility/ngraph --config-file ./setup.cfg
|
||||
working-directory: src/bindings/python
|
||||
|
||||
# Python API 2.0 mypy check
|
||||
- name: Run mypy on Python API 2.0
|
||||
run: python -m mypy ./src/openvino --config-file ./setup.cfg
|
||||
working-directory: src/bindings/python
|
||||
|
||||
- name: Run Bandit
|
||||
run: python -m bandit -r ./ -f screen
|
||||
working-directory: src/bindings/python/src/compatibility/openvino
|
||||
|
||||
# layer_tests Flake code-style
|
||||
- name: Run flake8 on python tests in openvino/tests/layer_tests
|
||||
run: |
|
||||
modified_files=$(git diff --name-only)
|
||||
for file in $modified_files; do
|
||||
if [[ $file == "openvino/tests/layer_tests/"* ]]; then
|
||||
if [[ -f "$file" ]]; then
|
||||
python -m flake8 "$file" --config=./setup.cfg
|
||||
fi
|
||||
fi
|
||||
done
|
||||
|
||||
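The failure branches above only publish a diff; the formatting itself comes from black. A minimal local sketch for one of the checked directories (samples/python as the example), assuming the Python test requirements are installed:

```bash
cd samples/python
python -m pip install -r ../../src/bindings/python/requirements_test.txt
python -m flake8 ./ --config=setup.cfg
# Apply the same formatting the workflow uses to build the code style diff artifact.
python -m black -l 160 -S ./
git diff > samples_diff.diff
```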
25
modules/openvino-master/.github/workflows/stale_prs_and_issues.yml
vendored
Normal file
@ -0,0 +1,25 @@
name: 'Close stale issues and PRs'
on:
workflow_dispatch:
schedule:
- cron: '0 0 * * *'

permissions:
issues: write
pull-requests: write

jobs:
stale:
runs-on: ubuntu-latest
steps:
- uses: actions/stale@v8
with:
stale-issue-message: 'This issue will be closed in a week because of 9 months of no activity.'
stale-pr-message: 'This PR will be closed in a week because of 2 weeks of no activity.'
close-issue-message: 'This issue was closed because it has been stalled for 9 months with no activity.'
close-pr-message: 'This PR was closed because it has been stalled for 2 weeks with no activity.'
days-before-pr-stale: 14
days-before-issue-stale: 274
days-before-close: 7
ascending: true
exempt-pr-labels: 'no_stale'
715
modules/openvino-master/.github/workflows/windows.yml
vendored
Normal file
@ -0,0 +1,715 @@
|
||||
name: Tests on Windows (VS 2022, Python 3.11)
|
||||
on:
|
||||
workflow_dispatch:
|
||||
# pull_request:
|
||||
# paths-ignore:
|
||||
# - '**/docs/**'
|
||||
# - 'docs/**'
|
||||
# - '**/**.md'
|
||||
# - '**.md'
|
||||
# - '**/layer_tests_summary/**'
|
||||
# - '**/conformance/**'
|
||||
# push:
|
||||
# paths-ignore:
|
||||
# - '**/docs/**'
|
||||
# - 'docs/**'
|
||||
# - '**/**.md'
|
||||
# - '**.md'
|
||||
# - '**/layer_tests_summary/**'
|
||||
# - '**/conformance/**'
|
||||
# branches:
|
||||
# - master
|
||||
|
||||
concurrency:
|
||||
group: ${{ github.head_ref || github.run_id }}-windows
|
||||
cancel-in-progress: true
|
||||
|
||||
env:
|
||||
CMAKE_BUILD_TYPE: 'Release'
|
||||
CMAKE_GENERATOR: 'Ninja'
|
||||
CMAKE_CXX_COMPILER_LAUNCHER: sccache
|
||||
CMAKE_C_COMPILER_LAUNCHER: sccache
|
||||
OPENVINO_REPO: "${{ github.workspace }}\\openvino"
|
||||
OPENVINO_CONTRIB_REPO: "${{ github.workspace }}\\openvino_contrib"
|
||||
INSTALL_DIR: "${{ github.workspace }}\\install"
|
||||
INSTALL_TEST_DIR: "${{ github.workspace }}\\install\\tests"
|
||||
SAMPLES_INSTALL_DIR: "${{ github.workspace }}\\install\\samples"
|
||||
LAYER_TESTS_INSTALL_DIR: "${{ github.workspace }}\\install\\tests\\layer_tests"
|
||||
BUILD_DIR: "${{ github.workspace }}\\build"
|
||||
DATA_PATH: "${{ github.workspace }}\\testdata"
|
||||
MODELS_PATH: "${{ github.workspace }}\\testdata"
|
||||
OV_TEMP: "${{ github.workspace }}\\openvino_temp"
|
||||
PYTHON_STATIC_ARGS: -m "not dynamic_library and not template_plugin"
|
||||
VCVARSPATH: "C:\\Program Files\\Microsoft Visual Studio\\2022\\Enterprise\\VC\\Auxiliary\\Build\\vcvarsall.bat"
|
||||
|
||||
jobs:
|
||||
Build:
|
||||
defaults:
|
||||
run:
|
||||
shell: pwsh
|
||||
runs-on: windows-latest-8-cores
|
||||
steps:
|
||||
- name: Clone OpenVINO
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
path: 'openvino'
|
||||
submodules: 'recursive'
|
||||
|
||||
- name: Clone OpenVINO Contrib
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
repository: 'openvinotoolkit/openvino_contrib'
|
||||
path: 'openvino_contrib'
|
||||
submodules: 'recursive'
|
||||
|
||||
- name: Clone testdata for C API tests
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
repository: 'openvinotoolkit/testdata'
|
||||
path: 'testdata'
|
||||
submodules: 'recursive'
|
||||
lfs: 'true'
|
||||
|
||||
#
|
||||
# Dependencies
|
||||
#
|
||||
|
||||
- uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: '3.11'
|
||||
|
||||
- name: Install python dependencies
|
||||
run: |
|
||||
# For Python API
|
||||
python3 -m pip install Scons
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/wheel/requirements-dev.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/requirements.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/requirements_test.txt
|
||||
|
||||
# For running Python API tests
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/src/compatibility/openvino/requirements-dev.txt
|
||||
|
||||
# For running ONNX frontend unit tests
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/frontends/onnx/tests/requirements.txt
|
||||
|
||||
# For running TensorFlow frontend unit tests
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/frontends/tensorflow/tests/requirements.txt
|
||||
|
||||
# For running Paddle frontend unit tests
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/frontends/paddle/tests/requirements.txt
|
||||
|
||||
- name: Install MO dependencies
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_mxnet.txt `
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_caffe.txt `
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_kaldi.txt `
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_onnx.txt `
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_tf2.txt `
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_dev.txt
|
||||
|
||||
- name: Install build dependencies
|
||||
run: |
|
||||
choco install --no-progress ninja
|
||||
choco install --no-progress shellcheck
|
||||
|
||||
- name: Get tools versions
|
||||
run: |
|
||||
python3 --version
|
||||
cmake --version
|
||||
|
||||
#
|
||||
# Build
|
||||
#
|
||||
|
||||
- name: Get number of CPU cores
|
||||
uses: SimenB/github-actions-cpu-cores@v2
|
||||
id: cpu-cores
|
||||
|
||||
- uses: ilammy/msvc-dev-cmd@v1
|
||||
|
||||
- name: Setup sccache
|
||||
uses: hendrikmuhs/ccache-action@v1.2
|
||||
with:
|
||||
variant: sccache
|
||||
max-size: "2000M"
|
||||
# Should save cache only if run in the master branch of the base repo
|
||||
# github.ref_name is 'ref/PR_#' in case of the PR, and 'branch_name' when executed on push
|
||||
save: ${{ github.ref_name == 'master' && 'true' || 'false' }}
|
||||
key: ${{ github.job }}-windows
|
||||
restore-keys: |
|
||||
${{ github.job }}-windows
|
||||
|
||||
- name: CMake configure
|
||||
run: |
|
||||
& "${{ env.VCVARSPATH }}" x64 && cmake -G "Ninja Multi-Config" `
|
||||
-DENABLE_CPPLINT=OFF `
|
||||
-DENABLE_ONEDNN_FOR_GPU=OFF `
|
||||
-DBUILD_SHARED_LIBS=OFF `
|
||||
-DENABLE_TESTS=ON `
|
||||
-DCMAKE_COMPILE_WARNING_AS_ERROR=OFF `
|
||||
-DENABLE_STRICT_DEPENDENCIES=OFF `
|
||||
-DENABLE_PYTHON=ON `
|
||||
-DBUILD_nvidia_plugin=OFF `
|
||||
-DCMAKE_DISABLE_FIND_PACKAGE_PkgConfig=ON `
|
||||
-DCUSTOM_OPERATIONS="calculate_grid;complex_mul;fft;grid_sample;sparse_conv;sparse_conv_transpose" `
|
||||
-DOPENVINO_EXTRA_MODULES=${{ env.OPENVINO_CONTRIB_REPO }}\modules `
|
||||
-DCMAKE_BUILD_TYPE=Release `
|
||||
-S ${{ env.OPENVINO_REPO }} `
|
||||
-B ${{ env.BUILD_DIR }}
|
||||
|
||||
- name: Build
|
||||
run: |
|
||||
& "${{ env.VCVARSPATH }}" x64 && cmake --build ${{ env.BUILD_DIR }} --parallel ${{ steps.cpu-cores.outputs.count }} --config Release
|
||||
|
||||
- name: Install
|
||||
run: cmake -DCMAKE_INSTALL_PREFIX=${{ env.INSTALL_DIR }} -P ${{ env.BUILD_DIR }}/cmake_install.cmake
|
||||
|
||||
- name: Install Wheels
|
||||
run: python3 -m pip install openvino-dev --find-links=${{ env.INSTALL_DIR }}\tools
|
||||
|
||||
- name: CMake Samples Tests
|
||||
run: |
|
||||
& "${{ env.VCVARSPATH }}" x64 && cmake -S ${{ env.OPENVINO_REPO }}/tests/samples_tests -B ${{ env.BUILD_DIR }}/samples_tests
|
||||
|
||||
- name: Install Samples Tests
|
||||
run: cmake -DCOMPONENT=tests -DCMAKE_INSTALL_PREFIX=${{ env.INSTALL_DIR }} -P ${{ env.BUILD_DIR }}/samples_tests/cmake_install.cmake
|
||||
|
||||
- name: Install Tests
|
||||
run: cmake -DCMAKE_INSTALL_PREFIX=${{ env.INSTALL_DIR }} -DCOMPONENT=tests -P ${{ env.BUILD_DIR }}\cmake_install.cmake
|
||||
|
||||
- name: Cmake Layer Tests
|
||||
run: |
|
||||
& "${{ env.VCVARSPATH }}" x64 && cmake -S ${{ env.OPENVINO_REPO }}/tests/layer_tests -B ${{ env.BUILD_DIR }}/layer_tests
|
||||
|
||||
- name: Build Layer Tests
|
||||
run: cmake --build ${{ env.BUILD_DIR }}/layer_tests --parallel --config Release
|
||||
|
||||
- name: Install Layer Tests
|
||||
run: cmake -DCOMPONENT=tests -DCMAKE_INSTALL_PREFIX=${{ env.INSTALL_DIR }} -P ${{ env.BUILD_DIR }}/layer_tests/cmake_install.cmake
|
||||
|
||||
- name: Pack Artifacts
|
||||
run: |
|
||||
$file=Get-ChildItem -Path "${{ env.INSTALL_DIR }}" -Exclude "tests"
|
||||
$compress = @{
|
||||
Path = $file
|
||||
CompressionLevel = "Optimal"
|
||||
DestinationPath = "${{ env.BUILD_DIR }}/openvino_package.zip"
|
||||
}
|
||||
Compress-Archive @compress
|
||||
|
||||
$file=Get-ChildItem -Path "${{ env.INSTALL_DIR }}\tests"
|
||||
$compress = @{
|
||||
Path = $file
|
||||
CompressionLevel = "Optimal"
|
||||
DestinationPath = "${{ env.BUILD_DIR }}/openvino_tests.zip"
|
||||
}
|
||||
Compress-Archive @compress
|
||||
|
||||
- name: Build cpp samples
|
||||
run: |
|
||||
& "${{ env.VCVARSPATH }}" x64
|
||||
& ${{ env.SAMPLES_INSTALL_DIR }}/cpp/build_samples_msvc.bat -i ${{ env.INSTALL_DIR }}
|
||||
|
||||
- name: Build c samples
|
||||
run: |
|
||||
& "${{ env.VCVARSPATH }}" x64
|
||||
& ${{ env.SAMPLES_INSTALL_DIR }}/c/build_samples_msvc.bat -i ${{ env.INSTALL_DIR }}
|
||||
|
||||
- name: Samples tests
|
||||
shell: cmd
|
||||
run: |
|
||||
python3 -m pip install --ignore-installed PyYAML -r ${{ env.INSTALL_TEST_DIR }}/smoke_tests/requirements.txt
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && python3 -m pytest -sv ${{ env.INSTALL_TEST_DIR }}/smoke_tests --env_conf ${{ env.INSTALL_TEST_DIR }}/smoke_tests/env_config.yml --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-SamplesSmokeTests.xml
|
||||
env:
|
||||
IE_APP_PATH: ${{ env.INSTALL_DIR }}/samples_bin
|
||||
IE_APP_PYTHON_PATH: ${{ env.INSTALL_DIR }}/samples/python
|
||||
SHARE: ${{ env.INSTALL_TEST_DIR }}/smoke_tests/samples_smoke_tests_data
|
||||
WORKSPACE: ${{ env.INSTALL_DIR }}
|
||||
|
||||
# Present in the "Build" job due to the fact that these tests require build directory
|
||||
- name: ONNX frontend tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_onnx_frontend_tests --gtest_print_time=1 --gtest_filter=-*IE_GPU* --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-ONNXFrontend.xml
|
||||
|
||||
- name: List installed files
|
||||
if: ${{ always() }}
|
||||
run: |
|
||||
Get-ChildItem -Recurse -Directory ${{ env.INSTALL_DIR }}
|
||||
|
||||
- name: Upload openvino package
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: openvino_package
|
||||
path: ${{ env.BUILD_DIR }}/openvino_package.zip
|
||||
if-no-files-found: 'error'
|
||||
|
||||
- name: Upload openvino tests package
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: openvino_tests
|
||||
path: ${{ env.BUILD_DIR }}/openvino_tests.zip
|
||||
if-no-files-found: 'error'
|
||||
|
||||
Python_Unit_Tests:
|
||||
needs: Build
|
||||
defaults:
|
||||
run:
|
||||
shell: pwsh
|
||||
runs-on: windows-latest
|
||||
env:
|
||||
OPENVINO_REPO: "${{ github.workspace }}\\openvino"
|
||||
OPENVINO_CONTRIB_REPO: "${{ github.workspace }}\\openvino_contrib"
|
||||
INSTALL_DIR: "${{ github.workspace }}\\install"
|
||||
INSTALL_TEST_DIR: "${{ github.workspace }}\\install\\tests"
|
||||
SAMPLES_INSTALL_DIR: "${{ github.workspace }}\\install\\samples"
|
||||
LAYER_TESTS_INSTALL_DIR: "${{ github.workspace }}\\install\\tests\\layer_tests"
|
||||
BUILD_DIR: "${{ github.workspace }}\\build"
|
||||
DATA_PATH: "${{ github.workspace }}\\testdata"
|
||||
MODELS_PATH: "${{ github.workspace }}\\testdata"
|
||||
PYTHON_STATIC_ARGS: -m "not dynamic_library and not template_plugin"
|
||||
|
||||
steps:
|
||||
- name: Create Directories
|
||||
run: |
|
||||
mkdir ${{ env.INSTALL_DIR }}
|
||||
mkdir ${{ env.INSTALL_TEST_DIR }}
|
||||
|
||||
- name: Download OpenVINO package
|
||||
uses: actions/download-artifact@v3
|
||||
with:
|
||||
name: openvino_package
|
||||
path: ${{ env.INSTALL_DIR }}
|
||||
|
||||
- name: Download OpenVINO tests package
|
||||
uses: actions/download-artifact@v3
|
||||
with:
|
||||
name: openvino_tests
|
||||
path: ${{ env.INSTALL_TEST_DIR }}
|
||||
|
||||
- name: Extract OpenVINO packages
|
||||
run: |
|
||||
pushd ${{ env.INSTALL_DIR }}
|
||||
Expand-Archive openvino_package.zip -DestinationPath "${{ env.INSTALL_DIR }}"
|
||||
popd
|
||||
pushd ${{ env.INSTALL_TEST_DIR }}
|
||||
Expand-Archive openvino_tests.zip -DestinationPath "${{ env.INSTALL_TEST_DIR }}"
|
||||
popd
|
||||
|
||||
- name: Check extraction
|
||||
run: |
|
||||
ls "${{ github.workspace }}"
|
||||
ls "${{ env.INSTALL_DIR }}"
|
||||
ls "${{ env.INSTALL_TEST_DIR }}"
|
||||
|
||||
- name: Clone OpenVINO
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
path: 'openvino'
|
||||
submodules: 'recursive'
|
||||
|
||||
- name: Clone OpenVINO Contrib
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
repository: 'openvinotoolkit/openvino_contrib'
|
||||
path: 'openvino_contrib'
|
||||
submodules: 'recursive'
|
||||
|
||||
- uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: '3.11'
|
||||
|
||||
- name: Install python dependencies
|
||||
run: |
|
||||
# For Python API
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/wheel/requirements-dev.txt
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/requirements.txt
|
||||
|
||||
# For running Python API tests
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/bindings/python/src/compatibility/openvino/requirements-dev.txt
|
||||
|
||||
# For running ONNX frontend unit tests
|
||||
python3 -m pip install --force-reinstall -r ${{ env.OPENVINO_REPO }}/src/frontends/onnx/tests/requirements.txt
|
||||
|
||||
# For running TensorFlow frontend unit tests
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/frontends/tensorflow/tests/requirements.txt
|
||||
|
||||
# For running Paddle frontend unit tests
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/src/frontends/paddle/tests/requirements.txt
|
||||
|
||||
- name: Install MO dependencies
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_mxnet.txt `
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_caffe.txt `
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_kaldi.txt `
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_onnx.txt `
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_tf2.txt `
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_dev.txt
|
||||
|
||||
- name: Install Python wheels
|
||||
run: |
|
||||
python3 -m pip install openvino-dev --force-reinstall --find-links=${{ env.INSTALL_DIR }}\tools
|
||||
|
||||
- name: nGraph and IE Python Bindings Tests
|
||||
shell: cmd
|
||||
run: |
|
||||
set PYTHONPATH=${{ env.OPENVINO_REPO }}\tools\mo;${{ env.LAYER_TESTS_INSTALL_DIR }};%PYTHONPATH%
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && python3 -m pytest -s ${{ env.INSTALL_TEST_DIR }}/pyngraph ${{ env.PYTHON_STATIC_ARGS }} --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-Pyngraph.xml --ignore=${{ env.INSTALL_TEST_DIR }}/pyngraph/tests/test_onnx/test_zoo_models.py --ignore=${{ env.INSTALL_TEST_DIR }}/pyngraph/tests/test_onnx/test_backend.py
|
||||
|
||||
- name: Python API 2.0 Tests
|
||||
shell: cmd
|
||||
run: |
|
||||
set PYTHONPATH=${{ env.OPENVINO_REPO }}\tools\mo;${{ env.LAYER_TESTS_INSTALL_DIR }};%PYTHONPATH%
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && python3 -m pytest -sv ${{ env.INSTALL_TEST_DIR }}/pyopenvino ${{ env.PYTHON_STATIC_ARGS }} --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-Pyngraph.xml --ignore=${{ env.INSTALL_TEST_DIR }}/pyopenvino/tests/test_utils/test_utils.py --ignore=${{ env.INSTALL_TEST_DIR }}/pyopenvino/tests/test_onnx/test_zoo_models.py --ignore=${{ env.INSTALL_TEST_DIR }}/pyopenvino/tests/test_onnx/test_backend.py
|
||||
|
||||
- name: Model Optimizer UT
|
||||
shell: cmd
|
||||
run: |
|
||||
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_mxnet.txt ^
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_caffe.txt ^
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_kaldi.txt ^
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_onnx.txt ^
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_tf2.txt ^
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_dev.txt
|
||||
|
||||
set PYTHONPATH=${{ env.OPENVINO_REPO }}\tools\mo;${{ env.LAYER_TESTS_INSTALL_DIR }};${{ env.INSTALL_TEST_DIR }};${{ env.INSTALL_DIR }}\python\python3.11;%PYTHONPATH%
|
||||
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && python3 -m pytest -s ${{ env.INSTALL_TEST_DIR }}/mo/unit_tests --ignore=${{ env.INSTALL_TEST_DIR }}/mo/unit_tests/mo/front/mxnet --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-ModelOptimizer.xml
|
||||
|
||||
# Ticket - 115085
|
||||
# - name: PyTorch Layer Tests
|
||||
# shell: cmd
|
||||
# run: |
|
||||
#
|
||||
# python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_mxnet.txt ^
|
||||
# -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_caffe.txt ^
|
||||
# -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_kaldi.txt ^
|
||||
# -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_onnx.txt ^
|
||||
# -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_tf2.txt ^
|
||||
# -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_dev.txt
|
||||
#
|
||||
# python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
#
|
||||
# set PYTHONPATH=${{ env.OPENVINO_REPO }}\tools\mo;${{ env.LAYER_TESTS_INSTALL_DIR }};%PYTHONPATH%
|
||||
#
|
||||
# call "${{ env.INSTALL_DIR }}\\setupvars.bat" && python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/pytorch_tests -m precommit --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-pytorch.xml
|
||||
# env:
|
||||
# TEST_DEVICE: CPU
|
||||
|
||||
- name: TensorFlow 1 Layer Tests - TF FE
|
||||
if: ${{ always() }}
|
||||
shell: cmd
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
|
||||
set PYTHONPATH=${{ env.OPENVINO_REPO }}\tools\mo;${{ env.LAYER_TESTS_INSTALL_DIR }};%PYTHONPATH%
|
||||
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/tensorflow_tests/ --use_new_frontend -m precommit_tf_fe --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-tf_fe.xml
|
||||
env:
|
||||
TEST_DEVICE: CPU
|
||||
|
||||
- name: TensorFlow 2 Layer Tests - TF FE
|
||||
if: ${{ always() }}
|
||||
shell: cmd
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
|
||||
set PYTHONPATH=${{ env.OPENVINO_REPO }}\tools\mo;${{ env.LAYER_TESTS_INSTALL_DIR }};%PYTHONPATH%
|
||||
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/tensorflow2_keras_tests/ --use_new_frontend -m precommit_tf_fe --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-tf2_fe.xml
|
||||
env:
|
||||
TEST_DEVICE: CPU
|
||||
|
||||
- name: TensorFlow 1 Layer Tests - Legacy FE
|
||||
if: ${{ always() }}
|
||||
shell: cmd
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
|
||||
set PYTHONPATH=${{ env.OPENVINO_REPO }}\tools\mo;${{ env.LAYER_TESTS_INSTALL_DIR }};%PYTHONPATH%
|
||||
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/tensorflow_tests/test_tf_Roll.py --ir_version=10 --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-tf_Roll.xml
|
||||
|
||||
- name: TensorFlow 2 Layer Tests - Legacy FE
|
||||
if: ${{ always() }}
|
||||
shell: cmd
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
|
||||
set PYTHONPATH=${{ env.OPENVINO_REPO }}\tools\mo;${{ env.LAYER_TESTS_INSTALL_DIR }};%PYTHONPATH%
|
||||
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/tensorflow2_keras_tests/test_tf2_keras_activation.py --ir_version=11 --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-tf2_Activation.xml -k "sigmoid"
|
||||
env:
|
||||
TEST_DEVICE: CPU
|
||||
|
||||
- name: TensorFlow Lite Layer Tests - TFL FE
|
||||
if: ${{ always() }}
|
||||
shell: cmd
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
|
||||
python3 -m pip install -r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_mxnet.txt ^
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_caffe.txt ^
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_kaldi.txt ^
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_onnx.txt ^
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_tf2.txt ^
|
||||
-r ${{ env.OPENVINO_REPO }}/tools/mo/requirements_dev.txt
|
||||
|
||||
set PYTHONPATH=${{ env.OPENVINO_REPO }}\tools\mo;${{ env.LAYER_TESTS_INSTALL_DIR }};%PYTHONPATH%
|
||||
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/tensorflow_lite_tests/ --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-tfl_fe.xml
|
||||
env:
|
||||
TEST_DEVICE: CPU
|
||||
|
||||
- name: MO Python API Tests
|
||||
if: ${{ always() }}
|
||||
shell: cmd
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
|
||||
set PYTHONPATH=${{ env.OPENVINO_REPO }}\tools\mo;${{ env.LAYER_TESTS_INSTALL_DIR }};%PYTHONPATH%
|
||||
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/mo_python_api_tests --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-test_mo_convert.xml
|
||||
env:
|
||||
TEST_DEVICE: CPU
|
||||
|
||||
- name: Python Frontend tests
|
||||
if: ${{ always() }}
|
||||
shell: cmd
|
||||
run: |
|
||||
python3 -m pip install -r ${{ env.LAYER_TESTS_INSTALL_DIR }}/requirements.txt
|
||||
|
||||
set PYTHONPATH=${{ env.OPENVINO_REPO }}\tools\mo;${{ env.LAYER_TESTS_INSTALL_DIR }};%PYTHONPATH%
|
||||
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && python3 -m pytest ${{ env.LAYER_TESTS_INSTALL_DIR }}/py_frontend_tests --junitxml=${{ env.INSTALL_TEST_DIR }}/TEST-test_py_fontend.xml
|
||||
|
||||
- name: Upload Test Results
|
||||
uses: actions/upload-artifact@v3
|
||||
if: ${{ always() }}
|
||||
with:
|
||||
name: test-results-python
|
||||
path: ${{ env.INSTALL_TEST_DIR }}/TEST*.xml
|
||||
if-no-files-found: 'error'
|
||||
|
||||
CXX_Unit_Tests:
|
||||
needs: Build
|
||||
defaults:
|
||||
run:
|
||||
shell: pwsh
|
||||
runs-on: windows-latest
|
||||
env:
|
||||
INSTALL_DIR: "${{ github.workspace }}\\install"
|
||||
INSTALL_TEST_DIR: "${{ github.workspace }}\\install\\tests"
|
||||
|
||||
steps:
|
||||
- name: Create Directories
|
||||
run: |
|
||||
mkdir ${{ env.INSTALL_DIR }}
|
||||
mkdir ${{ env.INSTALL_TEST_DIR }}
|
||||
|
||||
- name: Download OpenVINO package
|
||||
uses: actions/download-artifact@v3
|
||||
with:
|
||||
name: openvino_package
|
||||
path: ${{ env.INSTALL_DIR }}
|
||||
|
||||
- name: Download OpenVINO tests package
|
||||
uses: actions/download-artifact@v3
|
||||
with:
|
||||
name: openvino_tests
|
||||
path: ${{ env.INSTALL_TEST_DIR }}
|
||||
|
||||
- name: Extract OpenVINO packages
|
||||
run: |
|
||||
pushd ${{ env.INSTALL_DIR }}
|
||||
Expand-Archive openvino_package.zip -DestinationPath "${{ env.INSTALL_DIR }}"
|
||||
popd
|
||||
pushd ${{ env.INSTALL_TEST_DIR }}
|
||||
Expand-Archive openvino_tests.zip -DestinationPath "${{ env.INSTALL_TEST_DIR }}"
|
||||
popd
|
||||
|
||||
- name: Check extraction
|
||||
run: |
|
||||
ls "${{ github.workspace }}"
|
||||
ls "${{ env.INSTALL_DIR }}"
|
||||
ls "${{ env.INSTALL_TEST_DIR }}"
|
||||
|
||||
- name: OpenVINO Core unit tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_core_unit_tests --gtest_print_time=1 --gtest_filter=-*IE_GPU* --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-NGraphUT.xml
|
||||
|
||||
- name: OpenVINO Inference functional tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_inference_functional_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-InferenceFunc.xml
|
||||
|
||||
- name: OpenVINO Inference unit tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_inference_unit_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-InferenceUnit.xml
|
||||
|
||||
- name: Low Precision Transformations Tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_lp_transformations_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-LpTransformations.xml
|
||||
|
||||
- name: OpenVINO Conditional compilation tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_conditional_compilation_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-ConditionalCompilation.xml
|
||||
|
||||
- name: IR frontend tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_ir_frontend_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-IRFrontend.xml
|
||||
|
||||
# - name: PaddlePaddle frontend tests # Disabled in Azure: https://github.com/openvinotoolkit/openvino/blob/master/.ci/azure/linux.yml#L403
|
||||
# shell: cmd
|
||||
# run: |
|
||||
# call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/paddle_tests --gtest_print_time=1 --gtest_filter=*smoke* --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-PaddleTests.xml
|
||||
|
||||
# - name: ONNX frontend tests # Present in the "Build" job due to the fact that these tests require build directory
|
||||
# shell: cmd
|
||||
# run: |
|
||||
# call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_onnx_frontend_tests --gtest_print_time=1 --gtest_filter=-*IE_GPU* --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-ONNXFrontend.xml
|
||||
|
||||
- name: TensorFlow Common tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_tensorflow_common_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-TensorFlowCommonFrontend.xml
|
||||
|
||||
- name: TensorFlow frontend tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_tensorflow_frontend_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-TensorFlowFrontend.xml
|
||||
|
||||
- name: TensorFlow Lite frontend tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_tensorflow_lite_frontend_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-TensorFlowLiteFrontend.xml
|
||||
|
||||
- name: Transformations Tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_transformations_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-Transformations.xml
|
||||
|
||||
- name: Common test utils tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_util_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-commonUtilsTests.xml
|
||||
|
||||
- name: Snippets func tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_snippets_func_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-SnippetsFuncTests.xml
|
||||
|
||||
- name: CPU plugin unit tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_cpu_unit_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-CPUUnitTests.xml
|
||||
|
||||
# - name: GNA plugin unit tests # Disabled in Azure: https://github.com/openvinotoolkit/openvino/blob/master/.ci/azure/linux.yml#L434
|
||||
# shell: cmd
|
||||
# run: |
|
||||
# call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_gna_unit_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-GNAUnitTests.xml
|
||||
|
||||
- name: AUTO UT
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_auto_unit_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-ov_auto_unit_tests.xml
|
||||
|
||||
- name: Template plugin tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_template_func_tests --gtest_print_time=1 --gtest_filter=*smoke* --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-TemplateFuncTests.xml
|
||||
|
||||
- name: Inference Engine C API tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/InferenceEngineCAPITests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-InferenceEngineCAPITests.xml
|
||||
|
||||
- name: OpenVINO C API tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_capi_test --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-OpenVINOCAPITests.xml
|
||||
|
||||
- name: AutoBatch FuncTests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_auto_batch_func_tests --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-ov_auto_batch_func_tests.xml
|
||||
|
||||
- name: Proxy Plugin Tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_proxy_plugin_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-OVProxyTests.xml
|
||||
|
||||
- name: Hetero Func Tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_hetero_func_tests --gtest_print_time=1 --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-OVHeteroFuncTests.xml
|
||||
|
||||
- name: Upload Test Results
|
||||
uses: actions/upload-artifact@v3
|
||||
if: ${{ always() }}
|
||||
with:
|
||||
name: test-results-cpp
|
||||
path: ${{ env.INSTALL_TEST_DIR }}/TEST*.xml
|
||||
if-no-files-found: 'error'
|
||||
|
||||
CPU_Functional_Tests:
|
||||
needs: Build
|
||||
defaults:
|
||||
run:
|
||||
shell: pwsh
|
||||
runs-on: windows-latest
|
||||
env:
|
||||
INSTALL_DIR: "${{ github.workspace }}\\install"
|
||||
INSTALL_TEST_DIR: "${{ github.workspace }}\\install\\tests"
|
||||
|
||||
steps:
|
||||
- name: Create Directories
|
||||
run: |
|
||||
mkdir ${{ env.INSTALL_DIR }}
|
||||
mkdir ${{ env.INSTALL_TEST_DIR }}
|
||||
|
||||
- name: Download OpenVINO package
|
||||
uses: actions/download-artifact@v3
|
||||
with:
|
||||
name: openvino_package
|
||||
path: ${{ env.INSTALL_DIR }}
|
||||
|
||||
- name: Download OpenVINO tests package
|
||||
uses: actions/download-artifact@v3
|
||||
with:
|
||||
name: openvino_tests
|
||||
path: ${{ env.INSTALL_TEST_DIR }}
|
||||
|
||||
- name: Extract OpenVINO packages
|
||||
run: |
|
||||
pushd ${{ env.INSTALL_DIR }}
|
||||
Expand-Archive openvino_package.zip -DestinationPath "${{ env.INSTALL_DIR }}"
|
||||
popd
|
||||
pushd ${{ env.INSTALL_TEST_DIR }}
|
||||
Expand-Archive openvino_tests.zip -DestinationPath "${{ env.INSTALL_TEST_DIR }}"
|
||||
popd
|
||||
|
||||
- name: Check extraction
|
||||
run: |
|
||||
ls "${{ github.workspace }}"
|
||||
ls "${{ env.INSTALL_DIR }}"
|
||||
ls "${{ env.INSTALL_TEST_DIR }}"
|
||||
|
||||
- name: Intel CPU plugin func tests
|
||||
shell: cmd
|
||||
run: |
|
||||
call "${{ env.INSTALL_DIR }}\\setupvars.bat" && ${{ env.INSTALL_TEST_DIR }}/ov_cpu_func_tests --gtest_print_time=1 --gtest_filter=*smoke* --gtest_output=xml:${{ env.INSTALL_TEST_DIR }}/TEST-CPUFuncTests.xml
|
||||
|
||||
- name: Upload Test Results
|
||||
uses: actions/upload-artifact@v3
|
||||
if: ${{ always() }}
|
||||
with:
|
||||
name: test-results-functional-cpu
|
||||
path: ${{ env.INSTALL_TEST_DIR }}/TEST*.xml
|
||||
if-no-files-found: 'error'

modules/openvino-master/.gitignore (new file)
@@ -0,0 +1,63 @@
# build/artifact dirs
|
||||
_*
|
||||
[Bb]uild*/
|
||||
cmake-build*
|
||||
|
||||
# but ensure we don't skip __init__.py and __main__.py
|
||||
!__init__.py
|
||||
!__main__.py
|
||||
# and sphinx documentation folders
|
||||
!docs/_*
|
||||
|
||||
# developer tools
|
||||
*.idea
|
||||
.vscode
|
||||
.vs/
|
||||
.vsconan/
|
||||
.DS_Store
|
||||
**/tags
|
||||
compile_commands.json
|
||||
bin/
|
||||
.local_vimrc
|
||||
.gdb_history
|
||||
.vimspector.json
|
||||
doc/
|
||||
docs/build_documentation/work_dir/
|
||||
temp/
|
||||
.repo/
|
||||
CMakeLists.txt.user
|
||||
docs/IE_PLUGIN_DG/html/
|
||||
CMakeUserPresets.json
|
||||
|
||||
*.project
|
||||
*.cproject
|
||||
*.pydevproject
|
||||
*.settings
|
||||
*/gen/
|
||||
*.swp
|
||||
/config.xml
|
||||
|
||||
# Python-specific
|
||||
*.?env*
|
||||
*.pyc
|
||||
__pycache__
|
||||
# Tests-specific
|
||||
*.coverage
|
||||
*htmlcov
|
||||
*pylint_report.txt
|
||||
*pylint_report_comments.txt
|
||||
|
||||
# Artifacts
|
||||
/tools/mo/*.bin
|
||||
/tools/mo/*.xml
|
||||
/tools/mo/*.json
|
||||
/tools/mo/*.so
|
||||
/tools/mo/*.txt
|
||||
/tools/mo/*.pb
|
||||
/tools/mo/*.pbtxt
|
||||
/tools/mo/!CMakeLists.txt
|
||||
/tools/mo/*.mapping
|
||||
/tools/mo/*.dat
|
||||
/tools/mo/*.svg
|
||||
/src/plugins/intel_cpu/tools/commit_slider/*.json
|
||||
/src/plugins/intel_cpu/tools/commit_slider/slider_cache/*

modules/openvino-master/.gitmodules (new file)
@@ -0,0 +1,77 @@
[submodule "src/plugins/intel_cpu/thirdparty/onednn"]
|
||||
path = src/plugins/intel_cpu/thirdparty/onednn
|
||||
url = https://github.com/openvinotoolkit/oneDNN.git
|
||||
ignore = dirty
|
||||
[submodule "thirdparty/xbyak"]
|
||||
path = thirdparty/xbyak
|
||||
url = https://github.com/herumi/xbyak.git
|
||||
ignore = dirty
|
||||
[submodule "thirdparty/zlib/zlib"]
|
||||
path = thirdparty/zlib/zlib
|
||||
url = https://github.com/madler/zlib.git
|
||||
ignore = dirty
|
||||
[submodule "thirdparty/pugixml"]
|
||||
path = thirdparty/pugixml
|
||||
url = https://github.com/zeux/pugixml.git
|
||||
ignore = dirty
|
||||
[submodule "thirdparty/ade"]
|
||||
path = thirdparty/ade
|
||||
url = https://github.com/opencv/ade.git
|
||||
ignore = dirty
|
||||
[submodule "thirdparty/gflags/gflags"]
|
||||
path = thirdparty/gflags/gflags
|
||||
url = https://github.com/gflags/gflags.git
|
||||
ignore = dirty
|
||||
[submodule "thirdparty/gtest/gtest"]
|
||||
path = thirdparty/gtest/gtest
|
||||
url = https://github.com/openvinotoolkit/googletest.git
|
||||
ignore = dirty
|
||||
[submodule "thirdparty/ocl/icd_loader"]
|
||||
path = thirdparty/ocl/icd_loader
|
||||
url = https://github.com/KhronosGroup/OpenCL-ICD-Loader.git
|
||||
ignore = dirty
|
||||
[submodule "thirdparty/ocl/cl_headers"]
|
||||
path = thirdparty/ocl/cl_headers
|
||||
url = https://github.com/KhronosGroup/OpenCL-Headers.git
|
||||
ignore = dirty
|
||||
[submodule "thirdparty/ocl/clhpp_headers"]
|
||||
path = thirdparty/ocl/clhpp_headers
|
||||
url = https://github.com/KhronosGroup/OpenCL-CLHPP.git
|
||||
ignore = dirty
|
||||
[submodule "thirdparty/onnx"]
|
||||
path = thirdparty/onnx/onnx
|
||||
url = https://github.com/onnx/onnx.git
|
||||
[submodule "thirdparty/protobuf"]
|
||||
path = thirdparty/protobuf/protobuf
|
||||
url = https://github.com/protocolbuffers/protobuf.git
|
||||
[submodule "src/bindings/python/thirdparty/pybind11"]
|
||||
path = src/bindings/python/thirdparty/pybind11
|
||||
url = https://github.com/pybind/pybind11.git
|
||||
[submodule "thirdparty/ittapi/ittapi"]
|
||||
path = thirdparty/ittapi/ittapi
|
||||
url = https://github.com/intel/ittapi.git
|
||||
[submodule "ncc"]
|
||||
path = cmake/developer_package/ncc_naming_style/ncc
|
||||
url = https://github.com/nithinn/ncc.git
|
||||
[submodule "thirdparty/onednn_gpu"]
|
||||
path = src/plugins/intel_gpu/thirdparty/onednn_gpu
|
||||
url = https://github.com/oneapi-src/oneDNN.git
|
||||
[submodule "tools/pot/thirdparty/open_model_zoo"]
|
||||
path = thirdparty/open_model_zoo
|
||||
url = https://github.com/openvinotoolkit/open_model_zoo.git
|
||||
[submodule "thirdparty/json/nlohmann_json"]
|
||||
path = thirdparty/json/nlohmann_json
|
||||
url = https://github.com/nlohmann/json.git
|
||||
shallow = true
|
||||
[submodule "thirdparty/flatbuffers/flatbuffers"]
|
||||
path = thirdparty/flatbuffers/flatbuffers
|
||||
url = https://github.com/google/flatbuffers.git
|
||||
[submodule "thirdparty/snappy"]
|
||||
path = thirdparty/snappy
|
||||
url = https://github.com/google/snappy.git
|
||||
[submodule "ARMComputeLibrary"]
|
||||
path = src/plugins/intel_cpu/thirdparty/ComputeLibrary
|
||||
url = https://github.com/ARM-software/ComputeLibrary.git
|
||||
[submodule "src/plugins/intel_cpu/thirdparty/mlas"]
|
||||
path = src/plugins/intel_cpu/thirdparty/mlas
|
||||
url = https://github.com/openvinotoolkit/mlas.git

modules/openvino-master/CMakeLists.txt (new file)
@@ -0,0 +1,155 @@
# Copyright (C) 2018-2023 Intel Corporation
|
||||
# SPDX-License-Identifier: Apache-2.0
|
||||
#
|
||||
|
||||
if(DEFINED BUILD_SHARED_LIBS AND NOT BUILD_SHARED_LIBS)
|
||||
# 3.17: 'target_link_libraries' does not work correctly when called from
|
||||
# different directory where 'add_library' is called: CMake generates
|
||||
# incorrect OpenVINOConfig.cmake in this case
|
||||
# 3.18: add_library cannot create ALIAS for non-GLOBAL targets
|
||||
cmake_minimum_required(VERSION 3.18)
|
||||
else()
|
||||
if(CPACK_GENERATOR STREQUAL "DEB")
|
||||
# we have to use CPACK_DEBIAN_PACKAGE_SHLIBDEPS_PRIVATE_DIRS variable
|
||||
cmake_minimum_required(VERSION 3.20)
|
||||
else()
|
||||
# default choice
|
||||
cmake_minimum_required(VERSION 3.13)
|
||||
endif()
|
||||
endif()
|
||||
|
||||
if(POLICY CMP0091)
|
||||
cmake_policy(SET CMP0091 NEW) # Enables use of MSVC_RUNTIME_LIBRARY
|
||||
endif()
|
||||
|
||||
project(OpenVINO DESCRIPTION "OpenVINO toolkit")
|
||||
|
||||
find_package(IEDevScripts REQUIRED
|
||||
PATHS "${OpenVINO_SOURCE_DIR}/cmake/developer_package"
|
||||
NO_CMAKE_FIND_ROOT_PATH
|
||||
NO_DEFAULT_PATH)
|
||||
|
||||
include(CTest)
|
||||
include(cmake/features.cmake)
|
||||
|
||||
# These options are shared with 3rdparty plugins by means of developer package
|
||||
include(cmake/dependencies.cmake)
|
||||
|
||||
if(ENABLE_COVERAGE)
|
||||
include(cmake/coverage.cmake)
|
||||
endif()
|
||||
|
||||
# resolving dependencies for the project
|
||||
message (STATUS "CMAKE_VERSION ......................... " ${CMAKE_VERSION})
|
||||
message (STATUS "OpenVINO_SOURCE_DIR ................... " ${OpenVINO_SOURCE_DIR})
|
||||
message (STATUS "OpenVINO_BINARY_DIR ................... " ${OpenVINO_BINARY_DIR})
|
||||
message (STATUS "CMAKE_GENERATOR ....................... " ${CMAKE_GENERATOR})
|
||||
message (STATUS "CPACK_GENERATOR ....................... " ${CPACK_GENERATOR})
|
||||
message (STATUS "CMAKE_C_COMPILER_ID ................... " ${CMAKE_C_COMPILER_ID})
|
||||
message (STATUS "CMAKE_CXX_COMPILER_ID ................. " ${CMAKE_CXX_COMPILER_ID})
|
||||
if(OV_GENERATOR_MULTI_CONFIG)
|
||||
string(REPLACE ";" " " config_types "${CMAKE_CONFIGURATION_TYPES}")
|
||||
message (STATUS "CMAKE_CONFIGURATION_TYPES ............. " ${config_types})
|
||||
unset(config_types)
|
||||
if(CMAKE_GENERATOR MATCHES "^Ninja Multi-Config$")
|
||||
message (STATUS "CMAKE_DEFAULT_BUILD_TYPE .............. " ${CMAKE_DEFAULT_BUILD_TYPE})
|
||||
endif()
|
||||
else()
|
||||
message (STATUS "CMAKE_BUILD_TYPE ...................... " ${CMAKE_BUILD_TYPE})
|
||||
endif()
|
||||
if(CMAKE_GENERATOR_PLATFORM)
|
||||
message (STATUS "CMAKE_GENERATOR_PLATFORM .............. " ${CMAKE_GENERATOR_PLATFORM})
|
||||
endif()
|
||||
if(CMAKE_GENERATOR_TOOLSET)
|
||||
message (STATUS "CMAKE_GENERATOR_TOOLSET ............... " ${CMAKE_GENERATOR_TOOLSET})
|
||||
endif()
|
||||
if(CMAKE_TOOLCHAIN_FILE)
|
||||
message (STATUS "CMAKE_TOOLCHAIN_FILE .................. " ${CMAKE_TOOLCHAIN_FILE})
|
||||
endif()
|
||||
if(NOT OV_GLIBC_VERSION VERSION_EQUAL 0.0)
|
||||
message (STATUS "GLIBC_VERSION ......................... " ${OV_GLIBC_VERSION})
|
||||
endif()
|
||||
|
||||
# remove file with exported targets to force its regeneration
|
||||
file(REMOVE "${CMAKE_BINARY_DIR}/ngraphTargets.cmake")
|
||||
file(REMOVE "${CMAKE_BINARY_DIR}/InferenceEngineTargets.cmake")
|
||||
file(REMOVE "${CMAKE_BINARY_DIR}/OpenVINOTargets.cmake")
|
||||
|
||||
# remove exported developer targets to force its regeneration
|
||||
macro(ov_clean_dev_targets)
|
||||
foreach(component IN LISTS openvino_export_components)
|
||||
file(REMOVE "${CMAKE_BINARY_DIR}/${component}_dev_targets.cmake")
|
||||
file(REMOVE "${CMAKE_BINARY_DIR}/ov_${component}_dev_targets.cmake")
|
||||
unset(${component} CACHE)
|
||||
endforeach()
|
||||
unset(openvino_export_components CACHE)
|
||||
unset(openvino_installed_targets CACHE)
|
||||
endmacro()
|
||||
ov_clean_dev_targets()
|
||||
|
||||
#
|
||||
# Build
|
||||
#
|
||||
|
||||
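# Adds the given TARGETS to the list of targets exported for the developer package under
# the specified COMPONENT, replacing ALIAS targets with their original names.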
function(openvino_developer_export_targets)
|
||||
cmake_parse_arguments(EXPORT "" "COMPONENT" "TARGETS" ${ARGN})
|
||||
|
||||
if(EXPORT_UNPARSED_ARGUMENTS)
|
||||
message(FATAL_ERROR "openvino_developer_export_targets has unparsed arguments: ${EXPORT_UNPARSED_ARGUMENTS}")
|
||||
endif()
|
||||
|
||||
set(${EXPORT_COMPONENT} "${${EXPORT_COMPONENT}};${EXPORT_TARGETS}")
|
||||
|
||||
# to allow exporting of aliased targets with the original names
|
||||
foreach(target_name IN LISTS ${EXPORT_COMPONENT})
|
||||
if(TARGET "${target_name}")
|
||||
get_target_property(original_name ${target_name} ALIASED_TARGET)
|
||||
if(TARGET "${original_name}")
|
||||
message(STATUS "The name ${target_name} is an ALIAS for ${original_name}. "
|
||||
"It will be exported to the OpenVINODeveloperPackage with the original name.")
|
||||
list(REMOVE_ITEM ${EXPORT_COMPONENT} ${target_name})
|
||||
list(APPEND ${EXPORT_COMPONENT} ${original_name})
|
||||
endif()
|
||||
endif()
|
||||
endforeach()
|
||||
|
||||
list(REMOVE_DUPLICATES ${EXPORT_COMPONENT})
|
||||
set(${EXPORT_COMPONENT} "${${EXPORT_COMPONENT}}" CACHE INTERNAL
|
||||
"A list of OpenVINO ${EXPORT_COMPONENT} exported targets" FORCE)
|
||||
|
||||
list(APPEND openvino_export_components ${EXPORT_COMPONENT})
|
||||
list(REMOVE_DUPLICATES openvino_export_components)
|
||||
set(openvino_export_components "${openvino_export_components}" CACHE INTERNAL
|
||||
"A list of OpenVINO exported components" FORCE)
|
||||
endfunction()
|
||||
|
||||
# add target with processed tests model zoo
|
||||
if (ENABLE_TESTS)
|
||||
include(cmake/test_model_zoo.cmake)
|
||||
endif()
|
||||
|
||||
include(thirdparty/dependencies.cmake)
|
||||
add_subdirectory(src)
|
||||
|
||||
if(ENABLE_SAMPLES OR ENABLE_TESTS)
|
||||
add_subdirectory(samples)
|
||||
endif()
|
||||
|
||||
# Enable interpreter backend for tests
|
||||
if (ENABLE_TESTS OR ENABLE_TEMPLATE)
|
||||
add_subdirectory(src/plugins/template/backend)
|
||||
endif()
|
||||
include(cmake/extra_modules.cmake)
|
||||
add_subdirectory(docs)
|
||||
add_subdirectory(tools)
|
||||
add_subdirectory(scripts)
|
||||
add_subdirectory(licensing)
|
||||
|
||||
#
|
||||
# CPack
|
||||
#
|
||||
|
||||
# provides a callback function to describe each component in repo
|
||||
include(cmake/packaging/packaging.cmake)
|
||||
|
||||
ie_cpack(${IE_CPACK_COMPONENTS_ALL})

modules/openvino-master/CONTRIBUTING.md (new file)
@@ -0,0 +1,88 @@
# Contributing to OpenVINO
|
||||
|
||||
## How to contribute to the OpenVINO project
|
||||
|
||||
OpenVINO™ is always looking for opportunities to improve and your contributions
|
||||
play a big role in this process. There are several ways you can make the
|
||||
product better:
|
||||
|
||||
|
||||
### Provide Feedback
|
||||
|
||||
* **Report bugs / issues**
|
||||
If you experience faulty behavior in OpenVINO or its components, you can
|
||||
[create a new issue](https://github.com/openvinotoolkit/openvino/issues)
|
||||
in the GitHub issue tracker.
|
||||
|
||||
* **Propose new features / improvements**
|
||||
If you have a suggestion for improving OpenVINO or want to share your ideas, you can open a new
|
||||
[GitHub Discussion](https://github.com/openvinotoolkit/openvino/discussions).
|
||||
If your idea is already well defined, you can also create a
[Feature Request Issue](https://github.com/openvinotoolkit/openvino/issues/new?assignees=octocat&labels=enhancement%2Cfeature&projects=&template=feature_request.yml&title=%5BFeature+Request%5D%3A+).
In both cases, provide a detailed description, including use cases, benefits, and potential challenges.
|
||||
If your points are especially well aligned with the product vision, they will be included in the
|
||||
[development roadmap](./ROADMAP.md).
|
||||
User feedback is crucial for OpenVINO development and even if your input is not immediately prioritized,
|
||||
it may be used at a later time or undertaken by the community, regardless of the official roadmap.
|
||||
|
||||
|
||||
### Contribute Code Changes
|
||||
|
||||
* **Fix Bugs or Develop New Features**
|
||||
If you want to help improve OpenVINO, choose one of the issues reported in the
[GitHub Issue Tracker](https://github.com/openvinotoolkit/openvino/issues) and
[create a Pull Request](./CONTRIBUTING_PR.md) addressing it. Consider one of the
tasks listed as [first-time contributions](https://github.com/openvinotoolkit/openvino/issues/17502).
If the feature you want to develop is more complex or not well defined by the reporter,
it is always a good idea to [discuss it](https://github.com/openvinotoolkit/openvino/discussions)
with OpenVINO developers first. Before creating a new PR, check whether somebody is already
working on it. If so, you can still help, after aligning with the other developer.

Importantly, always check whether the change has already been implemented before you start
working on it. You can build OpenVINO using the latest master branch and make sure that it
still needs your changes. Also, do not address issues that only affect older non-LTS releases,
like 2022.2.
|
||||
|
||||
* **Develop a New Device Plugin**
|
||||
Since the market of computing devices is constantly evolving, OpenVINO is always open to extending
|
||||
its support for new hardware. If you want to run inference on a device that is currently not supported,
|
||||
you can see how to develop a new plugin for it in the
|
||||
[Plugin Developer Guide](https://docs.openvino.ai/canonical/openvino_docs_ie_plugin_dg_overview.html).
|
||||
|
||||
|
||||
### Improve documentation
|
||||
|
||||
* **OpenVINO developer documentation** is contained entirely in this repository, under the
|
||||
[./docs/dev](https://github.com/openvinotoolkit/openvino/tree/master/docs/dev) folder.
|
||||
|
||||
* **User documentation** is built from several sources and published at
[docs.openvino.ai](https://docs.openvino.ai), which is the recommended place for reading
these documents. Use the files maintained in this repository only for editing purposes.
|
||||
|
||||
* The easiest way to help with documentation is to review it and provide feedback on the
|
||||
existing articles. Whether you notice a mistake, see the possibility of improving the text,
|
||||
or think more information should be added, you can reach out to any of the documentation
|
||||
contributors to discuss the potential changes.
|
||||
|
||||
You can also create a Pull Request directly, following the [editor's guide](./docs/CONTRIBUTING_DOCS.md).
|
||||
|
||||
|
||||
### Promote and Support OpenVINO
|
||||
|
||||
* **Popularize OpenVINO**
|
||||
Articles, tutorials, blog posts, demos, videos, and any other involvement
in the OpenVINO community are always welcome contributions. If you discuss
|
||||
or present OpenVINO on various social platforms, you are raising awareness
|
||||
of the product among A.I. enthusiasts and enabling other people to discover
|
||||
the toolkit. Feel free to reach out to OpenVINO developers if you need help
|
||||
with making such community-based content.
|
||||
|
||||
* **Help Other Community Members**
|
||||
If you are an experienced OpenVINO user and want to help, you can always
|
||||
share your expertise with the community. Check GitHub Discussions and
|
||||
Issues to see if you can help someone.
|
||||
|
||||
|
||||
## License
|
||||
|
||||
By contributing to the OpenVINO project, you agree that your contributions will be
|
||||
licensed under the terms stated in the [LICENSE](./LICENSE.md) file.

modules/openvino-master/CONTRIBUTING_DOCS.md (new file)
@@ -0,0 +1,111 @@
# OpenVINO Documentation Guide
|
||||
|
||||
## Basic article structure
|
||||
|
||||
OpenVINO documentation is built using Sphinx and reStructuredText formatting.
That means the following basic formatting rules need to be used:
|
||||
|
||||
|
||||
### White Spaces
|
||||
|
||||
OpenVINO documentation is developed to be easily readable in both html and
|
||||
reStructuredText. Here are some suggestions on how to make it render nicely
|
||||
and improve document clarity.
|
||||
|
||||
### Headings (including the article title)
|
||||
|
||||
They are made by "underscoring" text with punctuation marks (at least as
|
||||
many marks as letters in the underscored header). We use the following convention:
|
||||
|
||||
```
|
||||
H1
|
||||
====================
|
||||
|
||||
H2
|
||||
####################
|
||||
|
||||
H3
|
||||
++++++++++++++++++++
|
||||
|
||||
H4
|
||||
--------------------
|
||||
|
||||
H5
|
||||
....................
|
||||
```
|
||||
|
||||
### Line length
|
||||
|
||||
In programming, a limit of 80 characters per line is a common best-known method (BKM). It
applies fairly well to reading natural language, too. For this reason, we aim for lines of
around 70 to 100 characters. The limit is not a strict rule but rather a guideline to
follow in most cases. The breaks will not translate to html, and rightly so, but will
make reading and editing documents in GitHub or an editor much easier.
|
||||
|
||||
### Tables
|
||||
|
||||
Tables may be difficult to implement well in websites. For example, longer portions
|
||||
of text, like descriptions, may render them difficult to read (e.g. improper cell
|
||||
widths or heights). Complex tables may also be difficult to read in source files.
|
||||
To prevent that, check the [table directive documentation](https://www.sphinx-doc.org/en/master/usage/restructuredtext/directives.html#table-directives)
|
||||
and see our custom directives. Use the following guidelines for easier editing:
|
||||
|
||||
* For very big and complex data sets: use a list instead of a table or remove
|
||||
the problematic content from the table and implement it differently.
|
||||
* For very big and complex data sets that need to use tables: use an external
|
||||
file (e.g. PDF) and link to it.
|
||||
* For medium tables that look bad in source (e.g. due to long lines of text),
use the reStructuredText list table format (see the sketch after this list).
|
||||
* For medium and small tables, use the reStructuredText grid or simple table formats.
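
To illustrate the list-table format mentioned above, here is a minimal sketch; the table
title, option names, and descriptions are hypothetical and only demonstrate the directive's
structure:

```
.. list-table:: Hypothetical options table
   :header-rows: 1
   :widths: 25 75

   * - Option
     - Description
   * - ``--input``
     - Path to the input model (placeholder).
   * - ``--output_dir``
     - Directory for generated files (placeholder).
```

Each ``* -`` item starts a new row, and the indented ``- `` entries below it fill the
remaining columns, which keeps long descriptions readable in the source file.
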
## Cross-linking
|
||||
|
||||
There are several directives Sphinx uses for linking, each with its own purpose and format.
|
||||
Follow these guidelines for consistent results:
|
||||
|
||||
* Avoid absolute references to internal documents as much as possible (link to source, not html).
|
||||
* Note that Sphinx uses the "back-tick" character and not the "inverted-comma" => ` vs. '
* When using a file path that starts at the same directory, put "./" at its beginning.
* Always add a space before the opening angle bracket ("<") for target files.
|
||||
|
||||
Use the following formatting for different links:
|
||||
|
||||
* link to an external page / file
|
||||
* `` `text <url>`__ ``
|
||||
* use a double underscore for consistency
|
||||
|
||||
* link to an internal documentation page / file
|
||||
* `` :doc:`a docs page <relative file path>` ``
|
||||
* Link to an rst or md file within our documentation, so that it renders properly in html
|
||||
|
||||
* link to a header on the same page
|
||||
* `` `a header in the same article <this-is-section-header-title>`__ ``
* anchors are created automatically for all existing headers
* such an anchor looks like the header, with minor adjustments:
|
||||
* all letters are lower case,
|
||||
* remove all special glyphs, like brackets,
|
||||
* replace spaces with hyphens
|
||||
|
||||
* Create an anchor in an article
|
||||
* `` .. _anchor-in-the target-article: ``
|
||||
* put it before the header to which you want to link
|
||||
* See the rules for naming anchors / labels at the bottom of this article
|
||||
|
||||
* link to an anchor on a different page in our documentation
|
||||
* `` :ref:`the created anchor <anchor-in-the target-article>` ``
|
||||
* link to the anchor using just its name
|
||||
|
||||
|
||||
* anchors / labels
|
||||
|
||||
Read about anchors
|
||||
|
||||
Sphinx uses labels to create html anchors, which can be linked to from anywhere in documentation.
|
||||
Although they may be put at the top of any article to make linking to it very easy, we do not use
|
||||
this approach. Every label definition starts with an underscore; the underscore is not used in links.

Most importantly, every label needs to be globally unique. This means it is always good
practice to start each label with a clear identifier of the article it resides in, as in
the sketch below.
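
For illustration, here is a minimal sketch of the label-and-reference pattern described above;
the label name ``contributing-docs-tables`` and the section title are hypothetical placeholders,
not actual targets in the documentation:

```
.. The label below is a hypothetical placeholder; define it right before the header it anchors.

.. _contributing-docs-tables:

Tables
####################

From any other article, link to the anchored section by its label, without the leading underscore:
see :ref:`the table formatting guidelines <contributing-docs-tables>`.
```

Because Sphinx resolves ``:ref:`` targets globally, the label prefix (here, the article name)
is what keeps it unique.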

modules/openvino-master/CONTRIBUTING_PR.md (new file)
@@ -0,0 +1,63 @@
# How to Prepare a Good PR
|
||||
|
||||
OpenVINO is an open-source project and you can contribute to its code directly.
|
||||
To do so, follow these guidelines for creating Pull Requests, so that your
|
||||
changes get the highest chance of being merged.
|
||||
|
||||
|
||||
## General Rules of a Good Pull Request
|
||||
|
||||
* Create your own fork of the repository and use it to create PRs.
|
||||
Avoid creating change branches in the main repository.
|
||||
* Choose a proper branch for your work and create your own branch based on it.
|
||||
* Give your branches, commits, and Pull Requests meaningful names and descriptions.
|
||||
It helps to track changes later. If your changes cover a particular component,
|
||||
you can indicate it in the PR name as a prefix, for example: ``[DOCS] PR name``.
|
||||
* Follow the [OpenVINO code style guide](https://github.com/openvinotoolkit/openvino/blob/master/docs/dev/coding_style.md).
|
||||
* Make your PRs small - each PR should address one issue. Remove all changes
|
||||
unrelated to the PR.
|
||||
* Document your contribution! If your changes may impact how the user works with
|
||||
OpenVINO, provide the information in proper articles. You can do it yourself,
|
||||
or contact one of OpenVINO documentation contributors to work together on
|
||||
developing the right content.
|
||||
* For Work In Progress, or checking test results early, use a Draft PR.
|
||||
|
||||
|
||||
## Ensure Change Quality
|
||||
|
||||
Your pull request will be automatically tested by OpenVINO™'s pre-commit and marked
as "green" if it is ready for merging. If any builders fail, the status is "red" and
you need to fix the issues listed in the console logs. Any change to the PR branch will
automatically trigger the checks, so you don't need to recreate the PR; just wait
for the updated results.
|
||||
|
||||
Regardless of the automated tests, you should ensure the quality of your changes:
|
||||
|
||||
* Test your changes locally:
|
||||
* Make sure to double-check your code.
|
||||
* Run tests locally to identify and fix potential issues (execute test binaries
|
||||
from the artifacts directory, e.g. ``<source dir>/bin/intel64/Release/ieFuncTests``)
|
||||
* Before creating a PR, make sure that your branch is up to date with the latest
|
||||
state of the branch you want to contribute to (e.g. git fetch upstream && git
|
||||
merge upstream/master).
|
||||
|
||||
|
||||
## Branching Policy
|
||||
|
||||
* The "master" branch is used for development and constitutes the base for each new release.
|
||||
* Each OpenVINO release has its own branch: ``releases/<year>/<release number>``.
|
||||
* The final release each year is considered a Long Term Support version,
|
||||
which means it remains active.
|
||||
* Contributions are accepted only into active branches, which are:
|
||||
* the "master" branch for future releases,
|
||||
* the most recently published version for fixes,
|
||||
* LTS versions (for two years from their release dates).
|
||||
|
||||
|
||||
## Need Additional Help? Check these Articles
|
||||
|
||||
* [How to create a fork](https://help.github.com/articles/fork-a-rep)
|
||||
* [Install Git](https://git-scm.com/book/en/v2/Getting-Started-First-Time-Git-Setup)
|
||||
* If you want to add a new sample, please have a look at the Guide for contributing
to C++/C/Python IE samples and add the license statement at the top of new files,
as shown in the C++ example and the Python example.

modules/openvino-master/Jenkinsfile (new executable file)
@@ -0,0 +1,19 @@
#!groovy
|
||||
|
||||
properties([
|
||||
parameters([
|
||||
booleanParam(defaultValue: false,
|
||||
description: 'Cancel the rest of parallel stages if one of them fails and return status immediately',
|
||||
name: 'failFast'),
|
||||
booleanParam(defaultValue: true,
|
||||
description: 'Whether to propagate commit status to GitHub',
|
||||
name: 'propagateStatus'),
|
||||
string(defaultValue: '',
|
||||
description: 'Pipeline shared library version (branch/tag/commit). Determined automatically if empty',
|
||||
name: 'library_version')
|
||||
])
|
||||
])
|
||||
|
||||
loadOpenVinoLibrary {
|
||||
entrypoint(this)
|
||||
}

modules/openvino-master/LICENSE (new file)
@@ -0,0 +1,201 @@
Apache License
|
||||
Version 2.0, January 2004
|
||||
http://www.apache.org/licenses/
|
||||
|
||||
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
|
||||
|
||||
1. Definitions.
|
||||
|
||||
"License" shall mean the terms and conditions for use, reproduction,
|
||||
and distribution as defined by Sections 1 through 9 of this document.
|
||||
|
||||
"Licensor" shall mean the copyright owner or entity authorized by
|
||||
the copyright owner that is granting the License.
|
||||
|
||||
"Legal Entity" shall mean the union of the acting entity and all
|
||||
other entities that control, are controlled by, or are under common
|
||||
control with that entity. For the purposes of this definition,
|
||||
"control" means (i) the power, direct or indirect, to cause the
|
||||
direction or management of such entity, whether by contract or
|
||||
otherwise, or (ii) ownership of fifty percent (50%) or more of the
|
||||
outstanding shares, or (iii) beneficial ownership of such entity.
|
||||
|
||||
"You" (or "Your") shall mean an individual or Legal Entity
|
||||
exercising permissions granted by this License.
|
||||
|
||||
"Source" form shall mean the preferred form for making modifications,
|
||||
including but not limited to software source code, documentation
|
||||
source, and configuration files.
|
||||
|
||||
"Object" form shall mean any form resulting from mechanical
|
||||
transformation or translation of a Source form, including but
|
||||
not limited to compiled object code, generated documentation,
|
||||
and conversions to other media types.
|
||||
|
||||
"Work" shall mean the work of authorship, whether in Source or
|
||||
Object form, made available under the License, as indicated by a
|
||||
copyright notice that is included in or attached to the work
|
||||
(an example is provided in the Appendix below).
|
||||
|
||||
"Derivative Works" shall mean any work, whether in Source or Object
|
||||
form, that is based on (or derived from) the Work and for which the
|
||||
editorial revisions, annotations, elaborations, or other modifications
|
||||
represent, as a whole, an original work of authorship. For the purposes
|
||||
of this License, Derivative Works shall not include works that remain
|
||||
separable from, or merely link (or bind by name) to the interfaces of,
|
||||
the Work and Derivative Works thereof.
|
||||
|
||||
"Contribution" shall mean any work of authorship, including
|
||||
the original version of the Work and any modifications or additions
|
||||
to that Work or Derivative Works thereof, that is intentionally
|
||||
submitted to Licensor for inclusion in the Work by the copyright owner
|
||||
or by an individual or Legal Entity authorized to submit on behalf of
|
||||
the copyright owner. For the purposes of this definition, "submitted"
|
||||
means any form of electronic, verbal, or written communication sent
|
||||
to the Licensor or its representatives, including but not limited to
|
||||
communication on electronic mailing lists, source code control systems,
|
||||
and issue tracking systems that are managed by, or on behalf of, the
|
||||
Licensor for the purpose of discussing and improving the Work, but
|
||||
excluding communication that is conspicuously marked or otherwise
|
||||
designated in writing by the copyright owner as "Not a Contribution."
|
||||
|
||||
"Contributor" shall mean Licensor and any individual or Legal Entity
|
||||
on behalf of whom a Contribution has been received by Licensor and
|
||||
subsequently incorporated within the Work.
|
||||
|
||||
2. Grant of Copyright License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
copyright license to reproduce, prepare Derivative Works of,
|
||||
publicly display, publicly perform, sublicense, and distribute the
|
||||
Work and such Derivative Works in Source or Object form.
|
||||
|
||||
3. Grant of Patent License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
(except as stated in this section) patent license to make, have made,
|
||||
use, offer to sell, sell, import, and otherwise transfer the Work,
|
||||
where such license applies only to those patent claims licensable
|
||||
by such Contributor that are necessarily infringed by their
|
||||
Contribution(s) alone or by combination of their Contribution(s)
|
||||
with the Work to which such Contribution(s) was submitted. If You
|
||||
institute patent litigation against any entity (including a
|
||||
cross-claim or counterclaim in a lawsuit) alleging that the Work
|
||||
or a Contribution incorporated within the Work constitutes direct
|
||||
or contributory patent infringement, then any patent licenses
|
||||
granted to You under this License for that Work shall terminate
|
||||
as of the date such litigation is filed.
|
||||
|
||||
4. Redistribution. You may reproduce and distribute copies of the
|
||||
Work or Derivative Works thereof in any medium, with or without
|
||||
modifications, and in Source or Object form, provided that You
|
||||
meet the following conditions:
|
||||
|
||||
(a) You must give any other recipients of the Work or
|
||||
Derivative Works a copy of this License; and
|
||||
|
||||
(b) You must cause any modified files to carry prominent notices
|
||||
stating that You changed the files; and
|
||||
|
||||
(c) You must retain, in the Source form of any Derivative Works
|
||||
that You distribute, all copyright, patent, trademark, and
|
||||
attribution notices from the Source form of the Work,
|
||||
excluding those notices that do not pertain to any part of
|
||||
the Derivative Works; and
|
||||
|
||||
(d) If the Work includes a "NOTICE" text file as part of its
|
||||
distribution, then any Derivative Works that You distribute must
|
||||
include a readable copy of the attribution notices contained
|
||||
within such NOTICE file, excluding those notices that do not
|
||||
pertain to any part of the Derivative Works, in at least one
|
||||
of the following places: within a NOTICE text file distributed
|
||||
as part of the Derivative Works; within the Source form or
|
||||
documentation, if provided along with the Derivative Works; or,
|
||||
within a display generated by the Derivative Works, if and
|
||||
wherever such third-party notices normally appear. The contents
|
||||
of the NOTICE file are for informational purposes only and
|
||||
do not modify the License. You may add Your own attribution
|
||||
notices within Derivative Works that You distribute, alongside
|
||||
or as an addendum to the NOTICE text from the Work, provided
|
||||
that such additional attribution notices cannot be construed
|
||||
as modifying the License.
|
||||
|
||||
You may add Your own copyright statement to Your modifications and
|
||||
may provide additional or different license terms and conditions
|
||||
for use, reproduction, or distribution of Your modifications, or
|
||||
for any such Derivative Works as a whole, provided Your use,
|
||||
reproduction, and distribution of the Work otherwise complies with
|
||||
the conditions stated in this License.
|
||||
|
||||
5. Submission of Contributions. Unless You explicitly state otherwise,
|
||||
any Contribution intentionally submitted for inclusion in the Work
|
||||
by You to the Licensor shall be under the terms and conditions of
|
||||
this License, without any additional terms or conditions.
|
||||
Notwithstanding the above, nothing herein shall supersede or modify
|
||||
the terms of any separate license agreement you may have executed
|
||||
with Licensor regarding such Contributions.
|
||||
|
||||
6. Trademarks. This License does not grant permission to use the trade
|
||||
names, trademarks, service marks, or product names of the Licensor,
|
||||
except as required for reasonable and customary use in describing the
|
||||
origin of the Work and reproducing the content of the NOTICE file.
|
||||
|
||||
7. Disclaimer of Warranty. Unless required by applicable law or
|
||||
agreed to in writing, Licensor provides the Work (and each
|
||||
Contributor provides its Contributions) on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
||||
implied, including, without limitation, any warranties or conditions
|
||||
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
|
||||
PARTICULAR PURPOSE. You are solely responsible for determining the
|
||||
appropriateness of using or redistributing the Work and assume any
|
||||
risks associated with Your exercise of permissions under this License.
|
||||
|
||||
8. Limitation of Liability. In no event and under no legal theory,
|
||||
whether in tort (including negligence), contract, or otherwise,
|
||||
unless required by applicable law (such as deliberate and grossly
|
||||
negligent acts) or agreed to in writing, shall any Contributor be
|
||||
liable to You for damages, including any direct, indirect, special,
|
||||
incidental, or consequential damages of any character arising as a
|
||||
result of this License or out of the use or inability to use the
|
||||
Work (including but not limited to damages for loss of goodwill,
|
||||
work stoppage, computer failure or malfunction, or any and all
|
||||
other commercial damages or losses), even if such Contributor
|
||||
has been advised of the possibility of such damages.
|
||||
|
||||
9. Accepting Warranty or Additional Liability. While redistributing
|
||||
the Work or Derivative Works thereof, You may choose to offer,
|
||||
and charge a fee for, acceptance of support, warranty, indemnity,
|
||||
or other liability obligations and/or rights consistent with this
|
||||
License. However, in accepting such obligations, You may act only
|
||||
on Your own behalf and on Your sole responsibility, not on behalf
|
||||
of any other Contributor, and only if You agree to indemnify,
|
||||
defend, and hold each Contributor harmless for any liability
|
||||
incurred by, or claims asserted against, such Contributor by reason
|
||||
of your accepting any such warranty or additional liability.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
||||
|
||||
APPENDIX: How to apply the Apache License to your work.
|
||||
|
||||
To apply the Apache License to your work, attach the following
|
||||
boilerplate notice, with the fields enclosed by brackets "[]"
|
||||
replaced with your own identifying information. (Don't include
|
||||
the brackets!) The text should be enclosed in the appropriate
|
||||
comment syntax for the file format. We also recommend that a
|
||||
file or class name and description of purpose be included on the
|
||||
same "printed page" as the copyright notice for easier
|
||||
identification within third-party archives.
|
||||
|
||||
Copyright [yyyy] [name of copyright owner]
|
||||
|
||||
Licensed under the Apache License, Version 2.0 (the "License");
|
||||
you may not use this file except in compliance with the License.
|
||||
You may obtain a copy of the License at
|
||||
|
||||
http://www.apache.org/licenses/LICENSE-2.0
|
||||
|
||||
Unless required by applicable law or agreed to in writing, software
|
||||
distributed under the License is distributed on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
See the License for the specific language governing permissions and
|
||||
limitations under the License.

modules/openvino-master/README.md (new file)
@@ -0,0 +1,201 @@
<div align="center">
|
||||
<img src="docs/img/openvino-logo-purple-black.png" width="400px">
|
||||
|
||||
[](https://badge.fury.io/py/openvino)
|
||||
[](https://anaconda.org/conda-forge/openvino)
|
||||
[](https://formulae.brew.sh/formula/openvino)
|
||||
|
||||
[](https://pepy.tech/project/openvino)
|
||||
[](https://anaconda.org/conda-forge/openvino/files)
|
||||
[](https://formulae.brew.sh/formula/openvino)
|
||||
</div>
|
||||
|
||||
## Contents:
|
||||
|
||||
- [What is OpenVINO?](#what-is-openvino-toolkit)
|
||||
- [Components](#components)
|
||||
- [Supported Hardware matrix](#supported-hardware-matrix)
|
||||
- [License](#license)
|
||||
- [Documentation](#documentation)
|
||||
- [Tutorials](#tutorials)
|
||||
- [Products which use OpenVINO](#products-which-use-openvino)
|
||||
- [System requirements](#system-requirements)
|
||||
- [How to build](#how-to-build)
|
||||
- [How to contribute](#how-to-contribute)
|
||||
- [Get support](#get-a-support)
|
||||
- [See also](#see-also)
|
||||
|
||||
## What is OpenVINO toolkit?
|
||||
|
||||
OpenVINO™ is an open-source toolkit for optimizing and deploying AI inference.
|
||||
- Boost deep learning performance in computer vision, automatic speech recognition, natural language processing and other common tasks
|
||||
- Use models trained with popular frameworks like TensorFlow, PyTorch and more
|
||||
- Reduce resource demands and efficiently deploy on a range of Intel® platforms from edge to cloud
|
||||
|
||||
|
||||
This open-source version includes several components, namely [Model Optimizer], [OpenVINO™ Runtime], [Post-Training Optimization Tool], as well as CPU, GPU, GNA, multi-device and heterogeneous plugins to accelerate deep learning inference on Intel® CPUs and Intel® Processor Graphics.
|
||||
It supports pre-trained models from [Open Model Zoo], along with 100+ open
|
||||
source and public models in popular formats such as TensorFlow, ONNX, PaddlePaddle, MXNet, Caffe, and Kaldi.
|
||||
|
||||
### Components
|
||||
* [OpenVINO™ Runtime] - is a set of C++ libraries with C and Python bindings providing a common API to deliver inference solutions on the platform of your choice.
|
||||
* [core](./src/core) - provides the base API for model representation and modification.
|
||||
* [inference](./src/inference) - provides an API to infer models on the device.
|
||||
* [transformations](./src/common/transformations) - contains the set of common transformations which are used in OpenVINO plugins.
|
||||
* [low precision transformations](./src/common/low_precision_transformations) - contains the set of transformations that are used in low precision models
|
||||
* [bindings](./src/bindings) - contains all available OpenVINO bindings which are maintained by the OpenVINO team.
|
||||
* [c](./src/bindings/c) - C API for OpenVINO™ Runtime
|
||||
* [python](./src/bindings/python) - Python API for OpenVINO™ Runtime
|
||||
* [Plugins](./src/plugins) - contains OpenVINO plugins which are maintained in open-source by the OpenVINO team. For more information, take a look at the [list of supported devices](#supported-hardware-matrix).
|
||||
* [Frontends](./src/frontends) - contains available OpenVINO frontends that allow reading models from the native framework format.
|
||||
* [Model Optimizer] - is a cross-platform command-line tool that facilitates the transition between training and deployment environments, performs static model analysis, and adjusts deep learning models for optimal execution on end-point target devices.
|
||||
* [Post-Training Optimization Tool] - is designed to accelerate the inference of deep learning models by applying special methods without model retraining or fine-tuning, for example, post-training 8-bit quantization.
|
||||
* [Samples] - applications in C, C++ and Python languages that show basic OpenVINO use cases.
|
||||
|
||||
## Supported Hardware matrix
|
||||
|
||||
The OpenVINO™ Runtime can infer models on different hardware devices. This section provides the list of supported devices.
|
||||
|
||||
<table>
|
||||
<thead>
|
||||
<tr>
|
||||
<th>Device</th>
|
||||
<th>Plugin</th>
|
||||
<th>Library</th>
|
||||
<th>ShortDescription</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody>
|
||||
<tr>
|
||||
<td rowspan=2>CPU</td>
|
||||
<td> <a href="https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_supported_plugins_CPU.html#doxid-openvino-docs-o-v-u-g-supported-plugins-c-p-u">Intel CPU</a></td>
|
||||
<td><b><i><a href="./src/plugins/intel_cpu">openvino_intel_cpu_plugin</a></i></b></td>
|
||||
<td>Intel Xeon with Intel® Advanced Vector Extensions 2 (Intel® AVX2), Intel® Advanced Vector Extensions 512 (Intel® AVX-512), and AVX512_BF16, Intel Core Processors with Intel AVX2, Intel Atom Processors with Intel® Streaming SIMD Extensions (Intel® SSE)</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td> <a href="https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_supported_plugins_CPU.html">ARM CPU</a></td>
|
||||
<td><b><i><a href="https://github.com/openvinotoolkit/openvino_contrib/tree/master/modules/arm_plugin">openvino_arm_cpu_plugin</a></i></b></td>
|
||||
<td>Raspberry Pi™ 4 Model B, Apple® Mac mini with M1 chip, NVIDIA® Jetson Nano™, Android™ devices</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>GPU</td>
|
||||
<td><a href="https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_supported_plugins_GPU.html#doxid-openvino-docs-o-v-u-g-supported-plugins-g-p-u">Intel GPU</a></td>
|
||||
<td><b><i><a href="./src/plugins/intel_gpu">openvino_intel_gpu_plugin</a></i></b></td>
|
||||
<td>Intel Processor Graphics, including Intel HD Graphics and Intel Iris Graphics</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>GNA</td>
|
||||
<td><a href="https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_supported_plugins_GNA.html#doxid-openvino-docs-o-v-u-g-supported-plugins-g-n-a">Intel GNA</a></td>
|
||||
<td><b><i><a href="./src/plugins/intel_gna">openvino_intel_gna_plugin</a></i></b></td>
|
||||
<td>Intel Speech Enabling Developer Kit, Amazon Alexa* Premium Far-Field Developer Kit, Intel Pentium Silver J5005 Processor, Intel Pentium Silver N5000 Processor, Intel Celeron J4005 Processor, Intel Celeron J4105 Processor, Intel Celeron Processor N4100, Intel Celeron Processor N4000, Intel Core i3-8121U Processor, Intel Core i7-1065G7 Processor, Intel Core i7-1060G7 Processor, Intel Core i5-1035G4 Processor, Intel Core i5-1035G7 Processor, Intel Core i5-1035G1 Processor, Intel Core i5-1030G7 Processor, Intel Core i5-1030G4 Processor, Intel Core i3-1005G1 Processor, Intel Core i3-1000G1 Processor, Intel Core i3-1000G4 Processor</td>
|
||||
</tr>
|
||||
</tbody>
|
||||
</table>
|
||||
|
||||
OpenVINO™ Toolkit also contains several plugins that simplify loading models on multiple hardware devices:
|
||||
<table>
<thead>
<tr>
<th>Plugin</th>
<th>Library</th>
<th>Short Description</th>
</tr>
</thead>
<tbody>
<tr>
<td><a href="https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_supported_plugins_AUTO.html">Auto</a></td>
<td><b><i><a href="./src/plugins/auto">openvino_auto_plugin</a></i></b></td>
<td>The Auto plugin selects an Intel device for inference automatically</td>
</tr>
<tr>
<td><a href="https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_Automatic_Batching.html">Auto Batch</a></td>
<td><b><i><a href="./src/plugins/auto_batch">openvino_auto_batch_plugin</a></i></b></td>
<td>The Auto Batch plugin performs on-the-fly automatic batching (i.e. grouping inference requests together) to improve device utilization, with no programming effort from the user</td>
</tr>
<tr>
<td><a href="https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_Hetero_execution.html#doxid-openvino-docs-o-v-u-g-hetero-execution">Hetero</a></td>
<td><b><i><a href="./src/plugins/hetero">openvino_hetero_plugin</a></i></b></td>
<td>Heterogeneous execution enables automatic splitting of inference between several devices</td>
</tr>
<tr>
<td><a href="https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_Running_on_multiple_devices.html#doxid-openvino-docs-o-v-u-g-running-on-multiple-devices">Multi</a></td>
<td><b><i><a href="./src/plugins/auto">openvino_auto_plugin</a></i></b></td>
<td>The Multi plugin enables simultaneous inference of the same model on several devices in parallel</td>
</tr>
</tbody>
</table>
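
These meta-plugins are selected through the device string passed when compiling a model. Below is a hedged sketch: the IR path is a placeholder, and the GPU referenced by the HETERO/MULTI strings may not exist on a given machine.

```python
from openvino.runtime import Core

core = Core()
model = core.read_model("model.xml")                     # hypothetical IR file

auto_cm = core.compile_model(model, "AUTO")              # let OpenVINO pick a device
hetero_cm = core.compile_model(model, "HETERO:GPU,CPU")  # split one graph across devices
multi_cm = core.compile_model(model, "MULTI:GPU,CPU")    # run the same model on both in parallel
```
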
## License

OpenVINO™ Toolkit is licensed under [Apache License Version 2.0](LICENSE).

By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.
## Documentation

### User documentation

The latest documentation for OpenVINO™ Toolkit is available [here](https://docs.openvino.ai/). It contains detailed information about all OpenVINO components and provides everything you need to create an application based on a binary OpenVINO distribution or on your own OpenVINO build without source code modification.

### Developer documentation

[Developer documentation](./docs/dev/index.md) describes the architectural decisions applied inside the OpenVINO components and contains everything needed to contribute to OpenVINO.
## Tutorials

The list of OpenVINO tutorials:

- [Jupyter notebooks](https://github.com/openvinotoolkit/openvino_notebooks)
## Products which use OpenVINO

- [OpenCV](https://opencv.org/)
- [ONNX Runtime](https://onnxruntime.ai/)
- [OpenVINO™ Integration with TensorFlow](https://www.intel.com/content/www/us/en/developer/tools/devcloud/edge/build/ovtfoverview.html)
- [TNN](https://github.com/Tencent/TNN/tree/master)
## System requirements

The system requirements vary depending on the platform and are available on dedicated pages:

- [Linux](https://docs.openvino.ai/2023.1/openvino_docs_install_guides_installing_openvino_linux_header.html)
- [Windows](https://docs.openvino.ai/2023.1/openvino_docs_install_guides_installing_openvino_windows_header.html)
- [macOS](https://docs.openvino.ai/2023.1/openvino_docs_install_guides_installing_openvino_macos_header.html)
## How to build

See [How to build OpenVINO](./docs/dev/build.md) to get more information about the OpenVINO build process.
## How to contribute

See [Contributions Welcome](https://github.com/openvinotoolkit/openvino/issues/17502) for good first issues.

See [CONTRIBUTING](./CONTRIBUTING.md) for contribution details. Thank you!
## Get support

Report questions, issues, and suggestions using:

* [GitHub* Issues](https://github.com/openvinotoolkit/openvino/issues)
* The [`openvino`](https://stackoverflow.com/questions/tagged/openvino) tag on StackOverflow\*
* [Forum](https://software.intel.com/en-us/forums/computer-vision)
## Additional Resources

* [OpenVINO Wiki](https://github.com/openvinotoolkit/openvino/wiki)
* [OpenVINO Storage](https://storage.openvinotoolkit.org/)
* Additional OpenVINO™ toolkit modules:
  * [openvino_contrib](https://github.com/openvinotoolkit/openvino_contrib)
* [Intel® Distribution of OpenVINO™ toolkit Product Page](https://software.intel.com/content/www/us/en/develop/tools/openvino-toolkit.html)
* [Intel® Distribution of OpenVINO™ toolkit Release Notes](https://software.intel.com/en-us/articles/OpenVINO-RelNotes)
* [Neural Network Compression Framework (NNCF)](https://github.com/openvinotoolkit/nncf) - a suite of advanced algorithms for model inference optimization, including quantization, filter pruning, binarization, and sparsity.
* [OpenVINO™ Training Extensions (OTE)](https://github.com/openvinotoolkit/training_extensions) - a convenient environment to train deep learning models and convert them with OpenVINO for optimized inference.
* [OpenVINO™ Model Server (OVMS)](https://github.com/openvinotoolkit/model_server) - a scalable, high-performance solution for serving deep learning models optimized for Intel architectures.
* [Computer Vision Annotation Tool (CVAT)](https://github.com/opencv/cvat) - an online, interactive video and image annotation tool for computer vision purposes.
* [Dataset Management Framework (Datumaro)](https://github.com/openvinotoolkit/datumaro) - a framework and CLI tool to build, transform, and analyze datasets.

---
\* Other names and brands may be claimed as the property of others.
[Open Model Zoo]:https://github.com/openvinotoolkit/open_model_zoo
[OpenVINO™ Runtime]:https://docs.openvino.ai/2023.1/openvino_docs_OV_UG_OV_Runtime_User_Guide.html
[Model Optimizer]:https://docs.openvino.ai/2023.1/openvino_docs_MO_DG_Deep_Learning_Model_Optimizer_DevGuide.html
[Post-Training Optimization Tool]:https://docs.openvino.ai/2023.1/pot_introduction.html
[Samples]:https://github.com/openvinotoolkit/openvino/tree/master/samples
12
modules/openvino-master/SECURITY.md
Normal file
@ -0,0 +1,12 @@
# Security Policy

## Report a Vulnerability

Please report security issues or vulnerabilities to the [Intel® Security Center].

For more information on how Intel® works to resolve security issues, see
[Vulnerability Handling Guidelines].

[Intel® Security Center]:https://www.intel.com/security

[Vulnerability Handling Guidelines]:https://www.intel.com/content/www/us/en/security-center/vulnerability-handling-guidelines.html
11
modules/openvino-master/cmake/arm.toolchain.cmake
Normal file
@ -0,0 +1,11 @@
# Copyright (C) 2018-2023 Intel Corporation
# SPDX-License-Identifier: Apache-2.0
#

set(CMAKE_SYSTEM_NAME Linux)
set(CMAKE_SYSTEM_PROCESSOR armv7l)

set(CMAKE_C_COMPILER arm-linux-gnueabihf-gcc)
set(CMAKE_CXX_COMPILER arm-linux-gnueabihf-g++)
set(CMAKE_STRIP arm-linux-gnueabihf-strip)
set(PKG_CONFIG_EXECUTABLE arm-linux-gnueabihf-pkg-config CACHE PATH "Path to ARM pkg-config")