From 34d359fe035a92d48a399c6e6975c77513bd5139 Mon Sep 17 00:00:00 2001
From: Yuantao Feng
Date: Sat, 9 Oct 2021 03:13:49 +0800
Subject: [PATCH] Merge pull request #20422 from fengyuentau:dnn_face

Add DNN-based face detection and face recognition into modules/objdetect

* Add DNN-based face detector impl and interface
* Add a sample for DNN-based face detector
* add recog
* add notes
* move samples from samples/cpp to samples/dnn
* add documentation for dnn_face
* add set/get methods for input size, nms & score threshold and topk
* remove the DNN prefix from the face detector and face recognizer
* remove default values in the constructor of impl
* regenerate priors after setting input size
* two filenames for readnet
* Update face.hpp
* Update face_recognize.cpp
* Update face_match.cpp
* Update face.hpp
* Update face_recognize.cpp
* Update face_match.cpp
* Update face_recognize.cpp
* Update dnn_face.markdown
* Update dnn_face.markdown
* Update face.hpp
* Update dnn_face.markdown
* add regression test for face detection
* remove underscore prefix; fix warnings
* add reference & acknowledgement for face detection
* Update dnn_face.markdown
* Update dnn_face.markdown
* Update ts.hpp
* Update test_face.cpp
* Update face_match.cpp
* fix a compile error for python interface; add python examples for face detection and recognition
* Major changes for Vadim's comments:
  * Replace class name FaceDetector with FaceDetectorYN in related files
  * Declare local mat before loop in modules/objdetect/src/face_detect.cpp
  * Make input image and save flag optional in samples/dnn/face_detect(.cpp, .py)
  * Add camera support in samples/dnn/face_detect(.cpp, .py)
* correct file paths for regression test
* fix conversion warnings; remove extra spaces
* update face_recog
* Update dnn_face.markdown
* Fix warnings and errors for the default CI reports:
  * Remove trailing white spaces and extra new lines.
  * Fix conversion warnings for windows and iOS.
  * Add braces around initialization of subobjects.
* Fix warnings and errors for the default CI systems:
  * Add prefix 'FR_' for each value name in enum DisType to solve the redefinition error for iOS compilation; modify other code accordingly
  * Add bookmark '#tutorial_dnn_face' to solve warnings from doxygen
  * Correct documentation to solve warnings from doxygen
* update FaceRecognizerSF
* Fix the error for CI to find ONNX models correctly
* add suffix f to float assignments
* add backend & target options for initializing face recognizer
* add checkeq for checking input size and preset size
* update test and threshold
* changes in response to alalek's comments:
  * fix typos in samples/dnn/face_match.py
  * import numpy before importing cv2
  * add documentation to .setInputSize()
  * remove extra include in face_recognize.cpp
* fix some bugs
* Update dnn_face.markdown
* update thresholds; remove useless code
* add time suffix to YuNet filename in test
* objdetect: update test code
---
 doc/tutorials/dnn/dnn_face/dnn_face.markdown  |  95 ++++++
 .../dnn_text_spotting.markdown                |   2 +-
 .../dnn/table_of_content_dnn.markdown         |   1 +
 modules/objdetect/CMakeLists.txt              |   2 +-
 .../objdetect/include/opencv2/objdetect.hpp   |   1 +
 .../include/opencv2/objdetect/face.hpp        | 125 ++++++++
 modules/objdetect/src/face_detect.cpp         | 288 ++++++++++++++++++
 modules/objdetect/src/face_recognize.cpp      | 182 +++++++++++
 modules/objdetect/test/test_face.cpp          | 219 +++++++++++++
 modules/objdetect/test/test_main.cpp          |  17 +-
 modules/ts/include/opencv2/ts.hpp             |   1 +
 samples/dnn/CMakeLists.txt                    |   1 +
 samples/dnn/face_detect.cpp                   | 132 ++++++++
 samples/dnn/face_detect.py                    | 101 ++++++
 samples/dnn/face_match.cpp                    | 103 +++++++
 samples/dnn/face_match.py                     |  57 ++++
 samples/dnn/results/audrybt1.jpg              | Bin 0 -> 47680 bytes
 17 files changed, 1324 insertions(+), 3 deletions(-)
 create mode 100644 doc/tutorials/dnn/dnn_face/dnn_face.markdown
 create mode 100644 modules/objdetect/include/opencv2/objdetect/face.hpp
 create mode 100644
modules/objdetect/src/face_detect.cpp
 create mode 100644 modules/objdetect/src/face_recognize.cpp
 create mode 100644 modules/objdetect/test/test_face.cpp
 create mode 100644 samples/dnn/face_detect.cpp
 create mode 100644 samples/dnn/face_detect.py
 create mode 100644 samples/dnn/face_match.cpp
 create mode 100644 samples/dnn/face_match.py
 create mode 100644 samples/dnn/results/audrybt1.jpg

diff --git a/doc/tutorials/dnn/dnn_face/dnn_face.markdown b/doc/tutorials/dnn/dnn_face/dnn_face.markdown
new file mode 100644
index 0000000000..e5092b8b92
--- /dev/null
+++ b/doc/tutorials/dnn/dnn_face/dnn_face.markdown
@@ -0,0 +1,95 @@
+# DNN-based Face Detection And Recognition {#tutorial_dnn_face}
+
+@tableofcontents
+
+@prev_tutorial{tutorial_dnn_text_spotting}
+@next_tutorial{pytorch_cls_tutorial_dnn_conversion}
+
+| | |
+| -: | :- |
+| Original Author | Chengrui Wang, Yuantao Feng |
+| Compatibility | OpenCV >= 4.5.1 |
+
+## Introduction
+
+In this section, we introduce the DNN-based module for face detection and face recognition. The models can be obtained in [Models](#Models). The usage of `FaceDetectorYN` and `FaceRecognizerSF` is presented in [Usage](#Usage).
+
+## Models
+
+There are two pre-trained models (ONNX format) required by this module:
+- [Face Detection](https://github.com/ShiqiYu/libfacedetection.train/tree/master/tasks/task1/onnx):
+  - Size: 337KB
+  - Results on the WIDER Face Val set: 0.830 (easy), 0.824 (medium), 0.708 (hard)
+- [Face Recognition](https://drive.google.com/file/d/1ClK9WiB492c5OZFKveF3XiHCejoOxINW/view?usp=sharing)
+  - Size: 36.9MB
+  - Results:
+
+    | Database | Accuracy | Threshold (normL2) | Threshold (cosine) |
+    | -------- | -------- | ------------------ | ------------------ |
+    | LFW      | 99.60%   | 1.128              | 0.363              |
+    | CALFW    | 93.95%   | 1.149              | 0.340              |
+    | CPLFW    | 91.05%   | 1.204              | 0.275              |
+    | AgeDB-30 | 94.90%   | 1.202              | 0.277              |
+    | CFP-FP   | 94.80%   | 1.253              | 0.212              |
+
+## Usage
+
+### Face Detection
+
+```cpp
+// Initialize FaceDetectorYN
+Ptr<FaceDetectorYN> faceDetector = FaceDetectorYN::create(onnx_path, "", image.size(), score_thresh, nms_thresh, top_k);
+
+// Forward
+Mat faces;
+faceDetector->detect(image, faces);
+```
+
+The detection output `faces` is a two-dimensional array of type CV_32F: each row is one detected face instance, and the columns hold the location of the face and 5 facial landmarks. The format of each row is as follows:
+
+```
+x1, y1, w, h, x_re, y_re, x_le, y_le, x_nt, y_nt, x_rcm, y_rcm, x_lcm, y_lcm
+```
+
+where `x1, y1, w, h` are the top-left coordinates, width and height of the face bounding box, and `{x, y}_{re, le, nt, rcm, lcm}` stand for the coordinates of the right eye, left eye, nose tip, right corner of the mouth and left corner of the mouth respectively.
+
+### Face Recognition
+
+Following face detection, run the code below to extract the face feature from the facial image.
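The row layout described above can be illustrated with a short standalone Python sketch. The field names and the `parse_face_row` helper are descriptive inventions for this example, not part of the OpenCV API; note that the implementation also appends the detection score as a 15th value in each row.

```python
# Parse one detection row produced by FaceDetectorYN::detect().
# The API returns a plain float matrix; these names are only labels.
def parse_face_row(row):
    box = {"x1": row[0], "y1": row[1], "w": row[2], "h": row[3]}
    names = ["right_eye", "left_eye", "nose_tip",
             "mouth_right_corner", "mouth_left_corner"]
    landmarks = {n: (row[4 + 2 * i], row[5 + 2 * i])
                 for i, n in enumerate(names)}
    score = row[14] if len(row) > 14 else None  # appended by the detector
    return box, landmarks, score

box, lm, score = parse_face_row(
    [10.0, 20.0, 50.0, 60.0,   # x1, y1, w, h
     25.0, 40.0, 45.0, 40.0,   # right eye, left eye
     35.0, 55.0,               # nose tip
     28.0, 70.0, 42.0, 70.0,   # mouth corners
     0.95])                    # score
# box["w"] -> 50.0, lm["nose_tip"] -> (35.0, 55.0)
```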
+
+```cpp
+// Initialize FaceRecognizerSF with the model path (cv::String)
+Ptr<FaceRecognizerSF> faceRecognizer = FaceRecognizerSF::create(model_path, "");
+
+// Align and crop the facial image using the first face detected by FaceDetectorYN
+Mat aligned_face;
+faceRecognizer->alignCrop(image, faces.row(0), aligned_face);
+
+// Run feature extraction on the given aligned_face (cv::Mat)
+Mat feature;
+faceRecognizer->feature(aligned_face, feature);
+feature = feature.clone();
+```
+
+After obtaining the face features *feature1* and *feature2* of two facial images, run the code below to calculate the identity discrepancy between the two faces.
+
+```cpp
+// Calculate the discrepancy between two face features using the cosine distance.
+double cos_score = faceRecognizer->match(feature1, feature2, FaceRecognizerSF::DisType::FR_COSINE);
+// Calculate the discrepancy between two face features using the normL2 distance.
+double L2_score = faceRecognizer->match(feature1, feature2, FaceRecognizerSF::DisType::FR_NORM_L2);
+```
+
+For example, two faces have the same identity if the cosine distance is greater than or equal to 0.363, or the normL2 distance is less than or equal to 1.128.
+
+## Reference
+
+- https://github.com/ShiqiYu/libfacedetection
+- https://github.com/ShiqiYu/libfacedetection.train
+- https://github.com/zhongyy/SFace
+
+## Acknowledgement
+
+Thanks to [Professor Shiqi Yu](https://github.com/ShiqiYu/) and [Yuantao Feng](https://github.com/fengyuentau) for training and providing the face detection model.
+
+Thanks to [Professor Deng](http://www.whdeng.cn/), [PhD Candidate Zhong](https://github.com/zhongyy/) and [Master Candidate Wang](https://github.com/crywang/) for training and providing the face recognition model.
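A note on how the two distance measures relate: `match()` L2-normalizes both features before comparing, so for normalized features the identity ||a - b||^2 = 2 - 2*cos(a, b) holds. The standalone Python sketch below (no OpenCV needed; plain-list stand-ins for the feature Mats) illustrates this, and shows that the documented threshold pair (0.363 cosine, 1.128 normL2) is consistent under that identity:

```python
import math

# Plain-Python versions of the two distances used by match().
# Both features are normalized first, so the cosine and normL2
# scores are linked by ||a - b||^2 = 2 - 2*cos(a, b).
def cosine_score(f1, f2):
    n1 = math.sqrt(sum(x * x for x in f1))
    n2 = math.sqrt(sum(x * x for x in f2))
    return sum(a * b for a, b in zip(f1, f2)) / (n1 * n2)

def l2_score(f1, f2):
    n1 = math.sqrt(sum(x * x for x in f1))
    n2 = math.sqrt(sum(x * x for x in f2))
    return math.sqrt(sum((a / n1 - b / n2) ** 2 for a, b in zip(f1, f2)))

f1, f2 = [1.0, 0.0, 1.0], [1.0, 1.0, 0.0]
cos = cosine_score(f1, f2)   # 0.5
l2 = l2_score(f1, f2)        # sqrt(2 - 2*0.5) = 1.0

# The documented thresholds agree: sqrt(2 - 2*0.363) ~= 1.128
```

So the two decision rules (cosine >= 0.363, normL2 <= 1.128) accept essentially the same pairs of faces.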
diff --git a/doc/tutorials/dnn/dnn_text_spotting/dnn_text_spotting.markdown b/doc/tutorials/dnn/dnn_text_spotting/dnn_text_spotting.markdown
index b0be2627b2..5c465941ca 100644
--- a/doc/tutorials/dnn/dnn_text_spotting/dnn_text_spotting.markdown
+++ b/doc/tutorials/dnn/dnn_text_spotting/dnn_text_spotting.markdown
@@ -3,7 +3,7 @@
 @tableofcontents
 
 @prev_tutorial{tutorial_dnn_OCR}
-@next_tutorial{pytorch_cls_tutorial_dnn_conversion}
+@next_tutorial{tutorial_dnn_face}
 
 | | |
 | -: | :- |
diff --git a/doc/tutorials/dnn/table_of_content_dnn.markdown b/doc/tutorials/dnn/table_of_content_dnn.markdown
index 0d5e43ee11..3f74826dac 100644
--- a/doc/tutorials/dnn/table_of_content_dnn.markdown
+++ b/doc/tutorials/dnn/table_of_content_dnn.markdown
@@ -10,6 +10,7 @@ Deep Neural Networks (dnn module) {#tutorial_table_of_content_dnn}
 - @subpage tutorial_dnn_custom_layers
 - @subpage tutorial_dnn_OCR
 - @subpage tutorial_dnn_text_spotting
+- @subpage tutorial_dnn_face
 
 #### PyTorch models with OpenCV
 
 In this section you will find the guides, which describe how to run classification, segmentation and detection PyTorch DNN models with OpenCV.
diff --git a/modules/objdetect/CMakeLists.txt b/modules/objdetect/CMakeLists.txt
index 3fa0c5d33b..f4d5b22b74 100644
--- a/modules/objdetect/CMakeLists.txt
+++ b/modules/objdetect/CMakeLists.txt
@@ -1,5 +1,5 @@
 set(the_description "Object Detection")
-ocv_define_module(objdetect opencv_core opencv_imgproc opencv_calib3d WRAP java objc python js)
+ocv_define_module(objdetect opencv_core opencv_imgproc opencv_calib3d opencv_dnn WRAP java objc python js)
 
 if(HAVE_QUIRC)
   get_property(QUIRC_INCLUDE GLOBAL PROPERTY QUIRC_INCLUDE_DIR)
diff --git a/modules/objdetect/include/opencv2/objdetect.hpp b/modules/objdetect/include/opencv2/objdetect.hpp
index eaee1290ce..59dca7399b 100644
--- a/modules/objdetect/include/opencv2/objdetect.hpp
+++ b/modules/objdetect/include/opencv2/objdetect.hpp
@@ -768,5 +768,6 @@ protected:
 }
 
 #include "opencv2/objdetect/detection_based_tracker.hpp"
+#include "opencv2/objdetect/face.hpp"
 
 #endif
diff --git a/modules/objdetect/include/opencv2/objdetect/face.hpp b/modules/objdetect/include/opencv2/objdetect/face.hpp
new file mode 100644
index 0000000000..f2429c5f31
--- /dev/null
+++ b/modules/objdetect/include/opencv2/objdetect/face.hpp
@@ -0,0 +1,125 @@
+// This file is part of OpenCV project.
+// It is subject to the license terms in the LICENSE file found in the top-level directory
+// of this distribution and at http://opencv.org/license.html.
+
+#ifndef OPENCV_OBJDETECT_FACE_HPP
+#define OPENCV_OBJDETECT_FACE_HPP
+
+#include <opencv2/core.hpp>
+
+/** @defgroup dnn_face DNN-based face detection and recognition
+ */
+
+namespace cv
+{
+
+/** @brief DNN-based face detector, model download link: https://github.com/ShiqiYu/libfacedetection.train/tree/master/tasks/task1/onnx.
+ */
+class CV_EXPORTS_W FaceDetectorYN
+{
+public:
+    virtual ~FaceDetectorYN() {};
+
+    /** @brief Set the size for the network input, which overwrites the input size of the model at creation.
Call this method when the size of the input image does not match the input size specified when creating the model.
+     *
+     * @param input_size the size of the input image
+     */
+    CV_WRAP virtual void setInputSize(const Size& input_size) = 0;
+
+    CV_WRAP virtual Size getInputSize() = 0;
+
+    /** @brief Set the score threshold to filter out bounding boxes with a score lower than the given value
+     *
+     * @param score_threshold threshold for filtering out bounding boxes
+     */
+    CV_WRAP virtual void setScoreThreshold(float score_threshold) = 0;
+
+    CV_WRAP virtual float getScoreThreshold() = 0;
+
+    /** @brief Set the non-maximum-suppression threshold to suppress bounding boxes that have IoU greater than the given value
+     *
+     * @param nms_threshold threshold for the NMS operation
+     */
+    CV_WRAP virtual void setNMSThreshold(float nms_threshold) = 0;
+
+    CV_WRAP virtual float getNMSThreshold() = 0;
+
+    /** @brief Set the number of bounding boxes preserved before NMS
+     *
+     * @param top_k the number of top-ranked (by score) bounding boxes to preserve
+     */
+    CV_WRAP virtual void setTopK(int top_k) = 0;
+
+    CV_WRAP virtual int getTopK() = 0;
+
+    /** @brief A simple interface to detect faces in a given image
+     *
+     * @param image the image to detect faces in
+     * @param faces detection results stored in a cv::Mat
+     */
+    CV_WRAP virtual int detect(InputArray image, OutputArray faces) = 0;
+
+    /** @brief Creates an instance of this class with given parameters
+     *
+     * @param model the path to the requested model
+     * @param config the path to the config file for compatibility, which is not required for ONNX models
+     * @param input_size the size of the input image
+     * @param score_threshold the threshold used to filter out bounding boxes with a score smaller than the given value
+     * @param nms_threshold the threshold used to suppress bounding boxes with IoU bigger than the given value
+     * @param top_k keep the top K bounding boxes before NMS
+     * @param backend_id the id of the backend
+     * @param target_id the id of the target device
+     */
+    CV_WRAP static Ptr<FaceDetectorYN>
create(const String& model,
+                                        const String& config,
+                                        const Size& input_size,
+                                        float score_threshold = 0.9f,
+                                        float nms_threshold = 0.3f,
+                                        int top_k = 5000,
+                                        int backend_id = 0,
+                                        int target_id = 0);
+};
+
+/** @brief DNN-based face recognizer, model download link: https://drive.google.com/file/d/1ClK9WiB492c5OZFKveF3XiHCejoOxINW/view.
+ */
+class CV_EXPORTS_W FaceRecognizerSF
+{
+public:
+    virtual ~FaceRecognizerSF() {};
+
+    /** @brief Definition of the distance used for calculating the distance between two face features
+     */
+    enum DisType { FR_COSINE=0, FR_NORM_L2=1 };
+
+    /** @brief Aligns an image to put the face in the standard position
+     * @param src_img input image
+     * @param face_box the detection result used to indicate the face in the input image
+     * @param aligned_img output aligned image
+     */
+    CV_WRAP virtual void alignCrop(InputArray src_img, InputArray face_box, OutputArray aligned_img) const = 0;
+
+    /** @brief Extracts the face feature from an aligned image
+     * @param aligned_img input aligned image
+     * @param face_feature output face feature
+     */
+    CV_WRAP virtual void feature(InputArray aligned_img, OutputArray face_feature) = 0;
+
+    /** @brief Calculates the distance between two face features
+     * @param _face_feature1 the first input feature
+     * @param _face_feature2 the second input feature of the same size and the same type as _face_feature1
+     * @param dis_type defines the distance measure; optional values are "FR_COSINE" or "FR_NORM_L2"
+     */
+    CV_WRAP virtual double match(InputArray _face_feature1, InputArray _face_feature2, int dis_type = FaceRecognizerSF::FR_COSINE) const = 0;
+
+    /** @brief Creates an instance of this class with given parameters
+     * @param model the path of the onnx model used for face recognition
+     * @param config the path to the config file for compatibility, which is not required for ONNX models
+     * @param backend_id the id of the backend
+     * @param target_id the id of the target device
+     */
+    CV_WRAP static Ptr<FaceRecognizerSF> create(const String& model, const
String& config, int backend_id = 0, int target_id = 0); +}; + +} // namespace cv + +#endif diff --git a/modules/objdetect/src/face_detect.cpp b/modules/objdetect/src/face_detect.cpp new file mode 100644 index 0000000000..4095745b7e --- /dev/null +++ b/modules/objdetect/src/face_detect.cpp @@ -0,0 +1,288 @@ +// This file is part of OpenCV project. +// It is subject to the license terms in the LICENSE file found in the top-level directory +// of this distribution and at http://opencv.org/license.html. + +#include "precomp.hpp" + +#include "opencv2/imgproc.hpp" +#include "opencv2/core.hpp" +#include "opencv2/dnn.hpp" + +#include + +namespace cv +{ + +class FaceDetectorYNImpl : public FaceDetectorYN +{ +public: + FaceDetectorYNImpl(const String& model, + const String& config, + const Size& input_size, + float score_threshold, + float nms_threshold, + int top_k, + int backend_id, + int target_id) + { + net = dnn::readNet(model, config); + CV_Assert(!net.empty()); + + net.setPreferableBackend(backend_id); + net.setPreferableTarget(target_id); + + inputW = input_size.width; + inputH = input_size.height; + + scoreThreshold = score_threshold; + nmsThreshold = nms_threshold; + topK = top_k; + + generatePriors(); + } + + void setInputSize(const Size& input_size) override + { + inputW = input_size.width; + inputH = input_size.height; + generatePriors(); + } + + Size getInputSize() override + { + Size input_size; + input_size.width = inputW; + input_size.height = inputH; + return input_size; + } + + void setScoreThreshold(float score_threshold) override + { + scoreThreshold = score_threshold; + } + + float getScoreThreshold() override + { + return scoreThreshold; + } + + void setNMSThreshold(float nms_threshold) override + { + nmsThreshold = nms_threshold; + } + + float getNMSThreshold() override + { + return nmsThreshold; + } + + void setTopK(int top_k) override + { + topK = top_k; + } + + int getTopK() override + { + return topK; + } + + int detect(InputArray input_image, 
OutputArray faces) override + { + // TODO: more checkings should be done? + if (input_image.empty()) + { + return 0; + } + CV_CheckEQ(input_image.size(), Size(inputW, inputH), "Size does not match. Call setInputSize(size) if input size does not match the preset size"); + + // Build blob from input image + Mat input_blob = dnn::blobFromImage(input_image); + + // Forward + std::vector output_names = { "loc", "conf", "iou" }; + std::vector output_blobs; + net.setInput(input_blob); + net.forward(output_blobs, output_names); + + // Post process + Mat results = postProcess(output_blobs); + results.convertTo(faces, CV_32FC1); + return 1; + } +private: + void generatePriors() + { + // Calculate shapes of different scales according to the shape of input image + Size feature_map_2nd = { + int(int((inputW+1)/2)/2), int(int((inputH+1)/2)/2) + }; + Size feature_map_3rd = { + int(feature_map_2nd.width/2), int(feature_map_2nd.height/2) + }; + Size feature_map_4th = { + int(feature_map_3rd.width/2), int(feature_map_3rd.height/2) + }; + Size feature_map_5th = { + int(feature_map_4th.width/2), int(feature_map_4th.height/2) + }; + Size feature_map_6th = { + int(feature_map_5th.width/2), int(feature_map_5th.height/2) + }; + + std::vector feature_map_sizes; + feature_map_sizes.push_back(feature_map_3rd); + feature_map_sizes.push_back(feature_map_4th); + feature_map_sizes.push_back(feature_map_5th); + feature_map_sizes.push_back(feature_map_6th); + + // Fixed params for generating priors + const std::vector> min_sizes = { + {10.0f, 16.0f, 24.0f}, + {32.0f, 48.0f}, + {64.0f, 96.0f}, + {128.0f, 192.0f, 256.0f} + }; + const std::vector steps = { 8, 16, 32, 64 }; + + // Generate priors + priors.clear(); + for (size_t i = 0; i < feature_map_sizes.size(); ++i) + { + Size feature_map_size = feature_map_sizes[i]; + std::vector min_size = min_sizes[i]; + + for (int _h = 0; _h < feature_map_size.height; ++_h) + { + for (int _w = 0; _w < feature_map_size.width; ++_w) + { + for (size_t j = 0; j < 
min_size.size(); ++j) + { + float s_kx = min_size[j] / inputW; + float s_ky = min_size[j] / inputH; + + float cx = (_w + 0.5f) * steps[i] / inputW; + float cy = (_h + 0.5f) * steps[i] / inputH; + + Rect2f prior = { cx, cy, s_kx, s_ky }; + priors.push_back(prior); + } + } + } + } + } + + Mat postProcess(const std::vector& output_blobs) + { + // Extract from output_blobs + Mat loc = output_blobs[0]; + Mat conf = output_blobs[1]; + Mat iou = output_blobs[2]; + + // Decode from deltas and priors + const std::vector variance = {0.1f, 0.2f}; + float* loc_v = (float*)(loc.data); + float* conf_v = (float*)(conf.data); + float* iou_v = (float*)(iou.data); + Mat faces; + // (tl_x, tl_y, w, h, re_x, re_y, le_x, le_y, nt_x, nt_y, rcm_x, rcm_y, lcm_x, lcm_y, score) + // 'tl': top left point of the bounding box + // 're': right eye, 'le': left eye + // 'nt': nose tip + // 'rcm': right corner of mouth, 'lcm': left corner of mouth + Mat face(1, 15, CV_32FC1); + for (size_t i = 0; i < priors.size(); ++i) { + // Get score + float clsScore = conf_v[i*2+1]; + float iouScore = iou_v[i]; + // Clamp + if (iouScore < 0.f) { + iouScore = 0.f; + } + else if (iouScore > 1.f) { + iouScore = 1.f; + } + float score = std::sqrt(clsScore * iouScore); + face.at(0, 14) = score; + + // Get bounding box + float cx = (priors[i].x + loc_v[i*14+0] * variance[0] * priors[i].width) * inputW; + float cy = (priors[i].y + loc_v[i*14+1] * variance[0] * priors[i].height) * inputH; + float w = priors[i].width * exp(loc_v[i*14+2] * variance[0]) * inputW; + float h = priors[i].height * exp(loc_v[i*14+3] * variance[1]) * inputH; + float x1 = cx - w / 2; + float y1 = cy - h / 2; + face.at(0, 0) = x1; + face.at(0, 1) = y1; + face.at(0, 2) = w; + face.at(0, 3) = h; + + // Get landmarks + face.at(0, 4) = (priors[i].x + loc_v[i*14+ 4] * variance[0] * priors[i].width) * inputW; // right eye, x + face.at(0, 5) = (priors[i].y + loc_v[i*14+ 5] * variance[0] * priors[i].height) * inputH; // right eye, y + face.at(0, 6) = 
(priors[i].x + loc_v[i*14+ 6] * variance[0] * priors[i].width) * inputW; // left eye, x + face.at(0, 7) = (priors[i].y + loc_v[i*14+ 7] * variance[0] * priors[i].height) * inputH; // left eye, y + face.at(0, 8) = (priors[i].x + loc_v[i*14+ 8] * variance[0] * priors[i].width) * inputW; // nose tip, x + face.at(0, 9) = (priors[i].y + loc_v[i*14+ 9] * variance[0] * priors[i].height) * inputH; // nose tip, y + face.at(0, 10) = (priors[i].x + loc_v[i*14+10] * variance[0] * priors[i].width) * inputW; // right corner of mouth, x + face.at(0, 11) = (priors[i].y + loc_v[i*14+11] * variance[0] * priors[i].height) * inputH; // right corner of mouth, y + face.at(0, 12) = (priors[i].x + loc_v[i*14+12] * variance[0] * priors[i].width) * inputW; // left corner of mouth, x + face.at(0, 13) = (priors[i].y + loc_v[i*14+13] * variance[0] * priors[i].height) * inputH; // left corner of mouth, y + + faces.push_back(face); + } + + if (faces.rows > 1) + { + // Retrieve boxes and scores + std::vector faceBoxes; + std::vector faceScores; + for (int rIdx = 0; rIdx < faces.rows; rIdx++) + { + faceBoxes.push_back(Rect2i(int(faces.at(rIdx, 0)), + int(faces.at(rIdx, 1)), + int(faces.at(rIdx, 2)), + int(faces.at(rIdx, 3)))); + faceScores.push_back(faces.at(rIdx, 14)); + } + + std::vector keepIdx; + dnn::NMSBoxes(faceBoxes, faceScores, scoreThreshold, nmsThreshold, keepIdx, 1.f, topK); + + // Get NMS results + Mat nms_faces; + for (int idx: keepIdx) + { + nms_faces.push_back(faces.row(idx)); + } + return nms_faces; + } + else + { + return faces; + } + } +private: + dnn::Net net; + + int inputW; + int inputH; + float scoreThreshold; + float nmsThreshold; + int topK; + + std::vector priors; +}; + +Ptr FaceDetectorYN::create(const String& model, + const String& config, + const Size& input_size, + const float score_threshold, + const float nms_threshold, + const int top_k, + const int backend_id, + const int target_id) +{ + return makePtr(model, config, input_size, score_threshold, nms_threshold, 
top_k, backend_id, target_id); +} + +} // namespace cv diff --git a/modules/objdetect/src/face_recognize.cpp b/modules/objdetect/src/face_recognize.cpp new file mode 100644 index 0000000000..6550a13b4b --- /dev/null +++ b/modules/objdetect/src/face_recognize.cpp @@ -0,0 +1,182 @@ +// This file is part of OpenCV project. +// It is subject to the license terms in the LICENSE file found in the top-level directory +// of this distribution and at http://opencv.org/license.html. + +#include "precomp.hpp" + +#include "opencv2/dnn.hpp" + +#include + +namespace cv +{ + +class FaceRecognizerSFImpl : public FaceRecognizerSF +{ +public: + FaceRecognizerSFImpl(const String& model, const String& config, int backend_id, int target_id) + { + net = dnn::readNet(model, config); + CV_Assert(!net.empty()); + + net.setPreferableBackend(backend_id); + net.setPreferableTarget(target_id); + }; + void alignCrop(InputArray _src_img, InputArray _face_mat, OutputArray _aligned_img) const override + { + Mat face_mat = _face_mat.getMat(); + float src_point[5][2]; + for (int row = 0; row < 5; ++row) + { + for(int col = 0; col < 2; ++col) + { + src_point[row][col] = face_mat.at(0, row*2+col+4); + } + } + Mat warp_mat = getSimilarityTransformMatrix(src_point); + warpAffine(_src_img, _aligned_img, warp_mat, Size(112, 112), INTER_LINEAR); + }; + void feature(InputArray _aligned_img, OutputArray _face_feature) override + { + Mat inputBolb = dnn::blobFromImage(_aligned_img, 1, Size(112, 112), Scalar(0, 0, 0), true, false); + net.setInput(inputBolb); + net.forward(_face_feature); + }; + double match(InputArray _face_feature1, InputArray _face_feature2, int dis_type) const override + { + Mat face_feature1 = _face_feature1.getMat(), face_feature2 = _face_feature2.getMat(); + face_feature1 /= norm(face_feature1); + face_feature2 /= norm(face_feature2); + + if(dis_type == DisType::FR_COSINE){ + return sum(face_feature1.mul(face_feature2))[0]; + }else if(dis_type == DisType::FR_NORM_L2){ + return 
norm(face_feature1, face_feature2); + }else{ + throw std::invalid_argument("invalid parameter " + std::to_string(dis_type)); + } + + }; + +private: + Mat getSimilarityTransformMatrix(float src[5][2]) const { + float dst[5][2] = { {38.2946f, 51.6963f}, {73.5318f, 51.5014f}, {56.0252f, 71.7366f}, {41.5493f, 92.3655f}, {70.7299f, 92.2041f} }; + float avg0 = (src[0][0] + src[1][0] + src[2][0] + src[3][0] + src[4][0]) / 5; + float avg1 = (src[0][1] + src[1][1] + src[2][1] + src[3][1] + src[4][1]) / 5; + //Compute mean of src and dst. + float src_mean[2] = { avg0, avg1 }; + float dst_mean[2] = { 56.0262f, 71.9008f }; + //Subtract mean from src and dst. + float src_demean[5][2]; + for (int i = 0; i < 2; i++) + { + for (int j = 0; j < 5; j++) + { + src_demean[j][i] = src[j][i] - src_mean[i]; + } + } + float dst_demean[5][2]; + for (int i = 0; i < 2; i++) + { + for (int j = 0; j < 5; j++) + { + dst_demean[j][i] = dst[j][i] - dst_mean[i]; + } + } + double A00 = 0.0, A01 = 0.0, A10 = 0.0, A11 = 0.0; + for (int i = 0; i < 5; i++) + A00 += dst_demean[i][0] * src_demean[i][0]; + A00 = A00 / 5; + for (int i = 0; i < 5; i++) + A01 += dst_demean[i][0] * src_demean[i][1]; + A01 = A01 / 5; + for (int i = 0; i < 5; i++) + A10 += dst_demean[i][1] * src_demean[i][0]; + A10 = A10 / 5; + for (int i = 0; i < 5; i++) + A11 += dst_demean[i][1] * src_demean[i][1]; + A11 = A11 / 5; + Mat A = (Mat_(2, 2) << A00, A01, A10, A11); + double d[2] = { 1.0, 1.0 }; + double detA = A00 * A11 - A01 * A10; + if (detA < 0) + d[1] = -1; + double T[3][3] = { {1.0, 0.0, 0.0}, {0.0, 1.0, 0.0}, {0.0, 0.0, 1.0} }; + Mat s, u, vt, v; + SVD::compute(A, s, u, vt); + double smax = s.ptr(0)[0]>s.ptr(1)[0] ? 
s.ptr(0)[0] : s.ptr(1)[0]; + double tol = smax * 2 * FLT_MIN; + int rank = 0; + if (s.ptr(0)[0]>tol) + rank += 1; + if (s.ptr(1)[0]>tol) + rank += 1; + double arr_u[2][2] = { {u.ptr(0)[0], u.ptr(0)[1]}, {u.ptr(1)[0], u.ptr(1)[1]} }; + double arr_vt[2][2] = { {vt.ptr(0)[0], vt.ptr(0)[1]}, {vt.ptr(1)[0], vt.ptr(1)[1]} }; + double det_u = arr_u[0][0] * arr_u[1][1] - arr_u[0][1] * arr_u[1][0]; + double det_vt = arr_vt[0][0] * arr_vt[1][1] - arr_vt[0][1] * arr_vt[1][0]; + if (rank == 1) + { + if ((det_u*det_vt) > 0) + { + Mat uvt = u*vt; + T[0][0] = uvt.ptr(0)[0]; + T[0][1] = uvt.ptr(0)[1]; + T[1][0] = uvt.ptr(1)[0]; + T[1][1] = uvt.ptr(1)[1]; + } + else + { + double temp = d[1]; + d[1] = -1; + Mat D = (Mat_(2, 2) << d[0], 0.0, 0.0, d[1]); + Mat Dvt = D*vt; + Mat uDvt = u*Dvt; + T[0][0] = uDvt.ptr(0)[0]; + T[0][1] = uDvt.ptr(0)[1]; + T[1][0] = uDvt.ptr(1)[0]; + T[1][1] = uDvt.ptr(1)[1]; + d[1] = temp; + } + } + else + { + Mat D = (Mat_(2, 2) << d[0], 0.0, 0.0, d[1]); + Mat Dvt = D*vt; + Mat uDvt = u*Dvt; + T[0][0] = uDvt.ptr(0)[0]; + T[0][1] = uDvt.ptr(0)[1]; + T[1][0] = uDvt.ptr(1)[0]; + T[1][1] = uDvt.ptr(1)[1]; + } + double var1 = 0.0; + for (int i = 0; i < 5; i++) + var1 += src_demean[i][0] * src_demean[i][0]; + var1 = var1 / 5; + double var2 = 0.0; + for (int i = 0; i < 5; i++) + var2 += src_demean[i][1] * src_demean[i][1]; + var2 = var2 / 5; + double scale = 1.0 / (var1 + var2)* (s.ptr(0)[0] * d[0] + s.ptr(1)[0] * d[1]); + double TS[2]; + TS[0] = T[0][0] * src_mean[0] + T[0][1] * src_mean[1]; + TS[1] = T[1][0] * src_mean[0] + T[1][1] * src_mean[1]; + T[0][2] = dst_mean[0] - scale*TS[0]; + T[1][2] = dst_mean[1] - scale*TS[1]; + T[0][0] *= scale; + T[0][1] *= scale; + T[1][0] *= scale; + T[1][1] *= scale; + Mat transform_mat = (Mat_(2, 3) << T[0][0], T[0][1], T[0][2], T[1][0], T[1][1], T[1][2]); + return transform_mat; + } +private: + dnn::Net net; +}; + +Ptr FaceRecognizerSF::create(const String& model, const String& config, int backend_id, int target_id) +{ + 
return makePtr(model, config, backend_id, target_id); +} + +} // namespace cv diff --git a/modules/objdetect/test/test_face.cpp b/modules/objdetect/test/test_face.cpp new file mode 100644 index 0000000000..2e944c50df --- /dev/null +++ b/modules/objdetect/test/test_face.cpp @@ -0,0 +1,219 @@ +// This file is part of OpenCV project. +// It is subject to the license terms in the LICENSE file found in the top-level directory +// of this distribution and at http://opencv.org/license.html. + +#include "test_precomp.hpp" + +namespace opencv_test { namespace { + +// label format: +// image_name +// num_face +// face_1 +// face_.. +// face_num +std::map blobFromTXT(const std::string& path, int numCoords) +{ + std::ifstream ifs(path.c_str()); + CV_Assert(ifs.is_open()); + + std::map gt; + + Mat faces; + int faceNum = -1; + int faceCount = 0; + for (std::string line, key; getline(ifs, line); ) + { + std::istringstream iss(line); + if (line.find(".png") != std::string::npos) + { + // Get filename + iss >> key; + } + else if (line.find(" ") == std::string::npos) + { + // Get the number of faces + iss >> faceNum; + } + else + { + // Get faces + Mat face(1, numCoords, CV_32FC1); + for (int j = 0; j < numCoords; j++) + { + iss >> face.at(0, j); + } + faces.push_back(face); + faceCount++; + } + + if (faceCount == faceNum) + { + // Store faces + gt[key] = faces; + + faces.release(); + faceNum = -1; + faceCount = 0; + } + } + + return gt; +} + +TEST(Objdetect_face_detection, regression) +{ + // Pre-set params + float scoreThreshold = 0.7f; + float matchThreshold = 0.9f; + float l2disThreshold = 5.0f; + int numLM = 5; + int numCoords = 4 + 2 * numLM; + + // Load ground truth labels + std::map gt = blobFromTXT(findDataFile("dnn_face/detection/cascades_labels.txt"), numCoords); + // for (auto item: gt) + // { + // std::cout << item.first << " " << item.second.size() << std::endl; + // } + + // Initialize detector + std::string model = findDataFile("dnn/onnx/models/yunet-202109.onnx", 
false); + Ptr faceDetector = FaceDetectorYN::create(model, "", Size(300, 300)); + faceDetector->setScoreThreshold(0.7f); + + // Detect and match + for (auto item: gt) + { + std::string imagePath = findDataFile("cascadeandhog/images/" + item.first); + Mat image = imread(imagePath); + + // Set input size + faceDetector->setInputSize(image.size()); + + // Run detection + Mat faces; + faceDetector->detect(image, faces); + // std::cout << item.first << " " << item.second.rows << " " << faces.rows << std::endl; + + // Match bboxes and landmarks + std::vector matchedItem(item.second.rows, false); + for (int i = 0; i < faces.rows; i++) + { + if (faces.at(i, numCoords) < scoreThreshold) + continue; + + bool boxMatched = false; + std::vector lmMatched(numLM, false); + cv::Rect2f resBox(faces.at(i, 0), faces.at(i, 1), faces.at(i, 2), faces.at(i, 3)); + for (int j = 0; j < item.second.rows && !boxMatched; j++) + { + if (matchedItem[j]) + continue; + + // Retrieve bbox and compare IoU + cv::Rect2f gtBox(item.second.at(j, 0), item.second.at(j, 1), item.second.at(j, 2), item.second.at(j, 3)); + double interArea = (resBox & gtBox).area(); + double iou = interArea / (resBox.area() + gtBox.area() - interArea); + if (iou >= matchThreshold) + { + boxMatched = true; + matchedItem[j] = true; + } + + // Match landmarks if bbox is matched + if (!boxMatched) + continue; + for (int lmIdx = 0; lmIdx < numLM; lmIdx++) + { + float gtX = item.second.at(j, 4 + 2 * lmIdx); + float gtY = item.second.at(j, 4 + 2 * lmIdx + 1); + float resX = faces.at(i, 4 + 2 * lmIdx); + float resY = faces.at(i, 4 + 2 * lmIdx + 1); + float l2dis = cv::sqrt((gtX - resX) * (gtX - resX) + (gtY - resY) * (gtY - resY)); + + if (l2dis <= l2disThreshold) + { + lmMatched[lmIdx] = true; + } + } + } + EXPECT_TRUE(boxMatched) << "In image " << item.first << ", cannot match resBox " << resBox << " with any ground truth."; + if (boxMatched) + { + EXPECT_TRUE(std::all_of(lmMatched.begin(), lmMatched.end(), [](bool v) { return v; 
})) << "In image " << item.first << ", resBox " << resBox << " matched but its landmarks failed to match.";
+            }
+        }
+    }
+}
+
+TEST(Objdetect_face_recognition, regression)
+{
+    // Pre-set params
+    float score_thresh = 0.9f;
+    float nms_thresh = 0.3f;
+    double cosine_similar_thresh = 0.363;
+    double l2norm_similar_thresh = 1.128;
+
+    // Load ground truth labels
+    std::ifstream ifs(findDataFile("dnn_face/recognition/cascades_label.txt").c_str());
+    CV_Assert(ifs.is_open());
+
+    std::set<std::string> fSet;
+    std::map<std::string, Mat> featureMap;
+    std::map<std::pair<std::string, std::string>, int> gtMap;
+
+
+    for (std::string line, key; getline(ifs, line);)
+    {
+        std::string fname1, fname2;
+        int label;
+        std::istringstream iss(line);
+        iss >> fname1 >> fname2 >> label;
+        // std::cout << fname1 << " " << fname2 << " " << label << std::endl;
+        fSet.insert(fname1);
+        fSet.insert(fname2);
+        gtMap[std::make_pair(fname1, fname2)] = label;
+    }
+
+    // Initialize detector and recognizer
+    std::string detect_model = findDataFile("dnn/onnx/models/yunet-202109.onnx", false);
+    Ptr<FaceDetectorYN> faceDetector = FaceDetectorYN::create(detect_model, "", Size(150, 150), score_thresh, nms_thresh);
+
+    std::string recog_model = findDataFile("dnn/onnx/models/face_recognizer_fast.onnx", false);
+    Ptr<FaceRecognizerSF> faceRecognizer = FaceRecognizerSF::create(recog_model, "");
+
+    // Detect and match
+    for (auto fname: fSet)
+    {
+        std::string imagePath = findDataFile("dnn_face/recognition/" + fname);
+        Mat image = imread(imagePath);
+
+        Mat faces;
+        faceDetector->detect(image, faces);
+
+        Mat aligned_face;
+        faceRecognizer->alignCrop(image, faces.row(0), aligned_face);
+
+        Mat feature;
+        faceRecognizer->feature(aligned_face, feature);
+
+        featureMap[fname] = feature.clone();
+    }
+
+    for (auto item: gtMap)
+    {
+        Mat feature1 = featureMap[item.first.first];
+        Mat feature2 = featureMap[item.first.second];
+        int label = item.second;
+
+        double cos_score = faceRecognizer->match(feature1, feature2, FaceRecognizerSF::DisType::FR_COSINE);
+        double L2_score = faceRecognizer->match(feature1, feature2, FaceRecognizerSF::DisType::FR_NORM_L2);
+
+        EXPECT_TRUE(label == 0 ?
cos_score <= cosine_similar_thresh : cos_score > cosine_similar_thresh) << "Cosine match result of images " << item.first.first << " and " << item.first.second << " is different from ground truth (score: " << cos_score << "; Thresh: " << cosine_similar_thresh << ").";
+        EXPECT_TRUE(label == 0 ? L2_score > l2norm_similar_thresh : L2_score <= l2norm_similar_thresh) << "L2norm match result of images " << item.first.first << " and " << item.first.second << " is different from ground truth (score: " << L2_score << "; Thresh: " << l2norm_similar_thresh << ").";
+    }
+}
+
+}} // namespace
diff --git a/modules/objdetect/test/test_main.cpp b/modules/objdetect/test/test_main.cpp
index 93e4d2860e..4031f0522b 100644
--- a/modules/objdetect/test/test_main.cpp
+++ b/modules/objdetect/test/test_main.cpp
@@ -7,4 +7,19 @@
 #include
 #endif
-CV_TEST_MAIN("cv")
+static
+void initTests()
+{
+#ifdef HAVE_OPENCV_DNN
+    const char* extraTestDataPath =
+#ifdef WINRT
+        NULL;
+#else
+        getenv("OPENCV_DNN_TEST_DATA_PATH");
+#endif
+    if (extraTestDataPath)
+        cvtest::addDataSearchPath(extraTestDataPath);
+#endif // HAVE_OPENCV_DNN
+}
+
+CV_TEST_MAIN("cv", initTests())
diff --git a/modules/ts/include/opencv2/ts.hpp b/modules/ts/include/opencv2/ts.hpp
index 394bc6e0fa..2e7a241d8e 100644
--- a/modules/ts/include/opencv2/ts.hpp
+++ b/modules/ts/include/opencv2/ts.hpp
@@ -37,6 +37,7 @@
 #include
 #include
 #include
+#include
 #ifndef OPENCV_32BIT_CONFIGURATION
diff --git a/samples/dnn/CMakeLists.txt b/samples/dnn/CMakeLists.txt
index 209fbb586c..9a1aeed339 100644
--- a/samples/dnn/CMakeLists.txt
+++ b/samples/dnn/CMakeLists.txt
@@ -4,6 +4,7 @@ set(OPENCV_DNN_SAMPLES_REQUIRED_DEPS
 opencv_core
 opencv_imgproc
 opencv_dnn
+opencv_objdetect
 opencv_video
 opencv_imgcodecs
 opencv_videoio
diff --git a/samples/dnn/face_detect.cpp b/samples/dnn/face_detect.cpp
new file mode 100644
index 0000000000..8d91a10968
--- /dev/null
+++ b/samples/dnn/face_detect.cpp
@@ -0,0 +1,132 @@
+#include <opencv2/dnn.hpp>
+#include <opencv2/imgproc.hpp>
+#include <opencv2/highgui.hpp>
+#include <opencv2/objdetect.hpp>
+
+#include <iostream>
+
+using namespace cv;
+using namespace std;
+
+static Mat visualize(Mat input, Mat faces, int thickness=2)
+{
+    Mat output = input.clone();
+    for (int i = 0; i < faces.rows; i++)
+    {
+        // Print results
+        cout << "Face " << i
+             << ", top-left coordinates: (" << faces.at<float>(i, 0) << ", " << faces.at<float>(i, 1) << "), "
+             << "box width: " << faces.at<float>(i, 2) << ", box height: " << faces.at<float>(i, 3) << ", "
+             << "score: " << faces.at<float>(i, 14) << "\n";
+
+        // Draw bounding box
+        rectangle(output, Rect2i(int(faces.at<float>(i, 0)), int(faces.at<float>(i, 1)), int(faces.at<float>(i, 2)), int(faces.at<float>(i, 3))), Scalar(0, 255, 0), thickness);
+        // Draw landmarks
+        circle(output, Point2i(int(faces.at<float>(i, 4)), int(faces.at<float>(i, 5))), 2, Scalar(255, 0, 0), thickness);
+        circle(output, Point2i(int(faces.at<float>(i, 6)), int(faces.at<float>(i, 7))), 2, Scalar( 0, 0, 255), thickness);
+        circle(output, Point2i(int(faces.at<float>(i, 8)), int(faces.at<float>(i, 9))), 2, Scalar( 0, 255, 0), thickness);
+        circle(output, Point2i(int(faces.at<float>(i, 10)), int(faces.at<float>(i, 11))), 2, Scalar(255, 0, 255), thickness);
+        circle(output, Point2i(int(faces.at<float>(i, 12)), int(faces.at<float>(i, 13))), 2, Scalar( 0, 255, 255), thickness);
+    }
+    return output;
+}
+
+int main(int argc, char ** argv)
+{
+    CommandLineParser parser(argc, argv,
+        "{help h | | Print this message.}"
+        "{input i | | Path to the input image. Omit for detecting on default camera.}"
+        "{model m | yunet.onnx | Path to the model. Download yunet.onnx at https://github.com/ShiqiYu/libfacedetection.train/tree/master/tasks/task1/onnx.}"
+        "{score_threshold | 0.9 | Filter out faces of score < score_threshold.}"
+        "{nms_threshold | 0.3 | Suppress bounding boxes of iou >= nms_threshold.}"
+        "{top_k | 5000 | Keep top_k bounding boxes before NMS.}"
+        "{save s | false | Set true to save results. This flag is invalid when using camera.}"
+        "{vis v | true | Set true to open a window for result visualization. 
This flag is invalid when using camera.}"
+    );
+    if (argc == 1 || parser.has("help"))
+    {
+        parser.printMessage();
+        return -1;
+    }
+
+    String modelPath = parser.get<String>("model");
+
+    float scoreThreshold = parser.get<float>("score_threshold");
+    float nmsThreshold = parser.get<float>("nms_threshold");
+    int topK = parser.get<int>("top_k");
+
+    bool save = parser.get<bool>("save");
+    bool vis = parser.get<bool>("vis");
+
+    // Initialize FaceDetectorYN
+    Ptr<FaceDetectorYN> detector = FaceDetectorYN::create(modelPath, "", Size(320, 320), scoreThreshold, nmsThreshold, topK);
+
+    // If input is an image
+    if (parser.has("input"))
+    {
+        String input = parser.get<String>("input");
+        Mat image = imread(input);
+
+        // Set input size before inference
+        detector->setInputSize(image.size());
+
+        // Inference
+        Mat faces;
+        detector->detect(image, faces);
+
+        // Draw results on the input image
+        Mat result = visualize(image, faces);
+
+        // Save results if save is true
+        if (save)
+        {
+            cout << "Results saved to result.jpg\n";
+            imwrite("result.jpg", result);
+        }
+
+        // Visualize results
+        if (vis)
+        {
+            namedWindow(input, WINDOW_AUTOSIZE);
+            imshow(input, result);
+            waitKey(0);
+        }
+    }
+    else
+    {
+        int deviceId = 0;
+        VideoCapture cap;
+        cap.open(deviceId, CAP_ANY);
+        int frameWidth = int(cap.get(CAP_PROP_FRAME_WIDTH));
+        int frameHeight = int(cap.get(CAP_PROP_FRAME_HEIGHT));
+        detector->setInputSize(Size(frameWidth, frameHeight));
+
+        Mat frame;
+        TickMeter tm;
+        String msg = "FPS: ";
+        while (waitKey(1) < 0) // Press any key to exit
+        {
+            // Get frame
+            if (!cap.read(frame))
+            {
+                cerr << "No frames grabbed!\n";
+                break;
+            }
+
+            // Inference
+            Mat faces;
+            tm.start();
+            detector->detect(frame, faces);
+            tm.stop();
+
+            // Draw results on the input image
+            Mat result = visualize(frame, faces);
+            putText(result, msg + to_string(tm.getFPS()), Point(0, 15), FONT_HERSHEY_SIMPLEX, 0.5, Scalar(0, 255, 0));
+
+            // Visualize results
+            imshow("Live", result);
+
+            tm.reset();
+        }
+    }
+}
\ No newline at end of file
diff --git
a/samples/dnn/face_detect.py b/samples/dnn/face_detect.py
new file mode 100644
index 0000000000..65069d6590
--- /dev/null
+++ b/samples/dnn/face_detect.py
@@ -0,0 +1,101 @@
+import argparse
+
+import numpy as np
+import cv2 as cv
+
+def str2bool(v):
+    if v.lower() in ['on', 'yes', 'true', 'y', 't']:
+        return True
+    elif v.lower() in ['off', 'no', 'false', 'n', 'f']:
+        return False
+    else:
+        raise NotImplementedError
+
+parser = argparse.ArgumentParser()
+parser.add_argument('--input', '-i', type=str, help='Path to the input image.')
+parser.add_argument('--model', '-m', type=str, default='yunet.onnx', help='Path to the model. Download the model at https://github.com/ShiqiYu/libfacedetection.train/tree/master/tasks/task1/onnx.')
+parser.add_argument('--score_threshold', type=float, default=0.9, help='Filter out faces of score < score_threshold.')
+parser.add_argument('--nms_threshold', type=float, default=0.3, help='Suppress bounding boxes of iou >= nms_threshold.')
+parser.add_argument('--top_k', type=int, default=5000, help='Keep top_k bounding boxes before NMS.')
+parser.add_argument('--save', '-s', type=str2bool, default=False, help='Set true to save results. This flag is invalid when using camera.')
+parser.add_argument('--vis', '-v', type=str2bool, default=True, help='Set true to open a window for result visualization. 
This flag is invalid when using camera.')
+args = parser.parse_args()
+
+def visualize(input, faces, thickness=2):
+    output = input.copy()
+    if faces[1] is not None:
+        for idx, face in enumerate(faces[1]):
+            print('Face {}, top-left coordinates: ({:.0f}, {:.0f}), box width: {:.0f}, box height: {:.0f}, score: {:.2f}'.format(idx, face[0], face[1], face[2], face[3], face[-1]))
+
+            coords = face[:-1].astype(np.int32)
+            cv.rectangle(output, (coords[0], coords[1]), (coords[0]+coords[2], coords[1]+coords[3]), (0, 255, 0), 2)
+            cv.circle(output, (coords[4], coords[5]), 2, (255, 0, 0), 2)
+            cv.circle(output, (coords[6], coords[7]), 2, (0, 0, 255), 2)
+            cv.circle(output, (coords[8], coords[9]), 2, (0, 255, 0), 2)
+            cv.circle(output, (coords[10], coords[11]), 2, (255, 0, 255), 2)
+            cv.circle(output, (coords[12], coords[13]), 2, (0, 255, 255), 2)
+    return output
+
+if __name__ == '__main__':
+
+    # Instantiate FaceDetectorYN
+    detector = cv.FaceDetectorYN.create(
+        args.model,
+        "",
+        (320, 320),
+        args.score_threshold,
+        args.nms_threshold,
+        args.top_k
+    )
+
+    # If input is an image
+    if args.input is not None:
+        image = cv.imread(args.input)
+
+        # Set input size before inference
+        detector.setInputSize((image.shape[1], image.shape[0]))
+
+        # Inference
+        faces = detector.detect(image)
+
+        # Draw results on the input image
+        result = visualize(image, faces)
+
+        # Save results if save is true
+        if args.save:
+            print('Results saved to result.jpg\n')
+            cv.imwrite('result.jpg', result)
+
+        # Visualize results in a new window
+        if args.vis:
+            cv.namedWindow(args.input, cv.WINDOW_AUTOSIZE)
+            cv.imshow(args.input, result)
+            cv.waitKey(0)
+    else: # Omit input to call default camera
+        deviceId = 0
+        cap = cv.VideoCapture(deviceId)
+        frameWidth = int(cap.get(cv.CAP_PROP_FRAME_WIDTH))
+        frameHeight = int(cap.get(cv.CAP_PROP_FRAME_HEIGHT))
+        detector.setInputSize([frameWidth, frameHeight])
+
+        tm = cv.TickMeter()
+        while cv.waitKey(1) < 0:
+            hasFrame, frame = cap.read()
+            if not hasFrame:
+                print('No frames grabbed!')
+                break
+
+            # Inference
+            tm.start()
+            faces = detector.detect(frame) # faces is a tuple
+            tm.stop()
+
+            # Draw results on the input image
+            frame = visualize(frame, faces)
+
+            cv.putText(frame, 'FPS: {}'.format(tm.getFPS()), (0, 15), cv.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0))
+
+            # Visualize results in a new window
+            cv.imshow('Live', frame)
+
+            tm.reset()
\ No newline at end of file
diff --git a/samples/dnn/face_match.cpp b/samples/dnn/face_match.cpp
new file mode 100644
index 0000000000..f24134b890
--- /dev/null
+++ b/samples/dnn/face_match.cpp
@@ -0,0 +1,103 @@
+// This file is part of OpenCV project.
+// It is subject to the license terms in the LICENSE file found in the top-level directory
+// of this distribution and at http://opencv.org/license.html.
+
+#include "opencv2/dnn.hpp"
+#include "opencv2/imgproc.hpp"
+#include "opencv2/highgui.hpp"
+
+#include <iostream>
+
+#include "opencv2/objdetect.hpp"
+
+
+using namespace cv;
+using namespace std;
+
+
+int main(int argc, char ** argv)
+{
+    if (argc != 5)
+    {
+        std::cerr << "Usage " << argv[0] << ": "
+                  << "<det_onnx_path> "
+                  << "<reg_onnx_path> "
+                  << "<image1_path>"
+                  << " <image2_path>\n";
+        return -1;
+    }
+
+    String det_onnx_path = argv[1];
+    String reg_onnx_path = argv[2];
+    String image1_path = argv[3];
+    String image2_path = argv[4];
+    std::cout << image1_path << " " << image2_path << std::endl;
+    Mat image1 = imread(image1_path);
+    Mat image2 = imread(image2_path);
+
+    float score_thresh = 0.9f;
+    float nms_thresh = 0.3f;
+    double cosine_similar_thresh = 0.363;
+    double l2norm_similar_thresh = 1.128;
+    int top_k = 5000;
+
+    // Initialize FaceDetectorYN
+    Ptr<FaceDetectorYN> faceDetector;
+
+    faceDetector = FaceDetectorYN::create(det_onnx_path, "", image1.size(), score_thresh, nms_thresh, top_k);
+    Mat faces_1;
+    faceDetector->detect(image1, faces_1);
+    if (faces_1.rows < 1)
+    {
+        std::cerr << "Cannot find a face in " << image1_path << "\n";
+        return -1;
+    }
+
+    faceDetector = FaceDetectorYN::create(det_onnx_path, "", image2.size(), score_thresh, nms_thresh, top_k);
+    Mat faces_2;
+    faceDetector->detect(image2, faces_2);
+    if (faces_2.rows < 1)
+    {
+        std::cerr << "Cannot find a face in " << image2_path << "\n";
+        return -1;
+    }
+
+    // Initialize FaceRecognizerSF
+    Ptr<FaceRecognizerSF> faceRecognizer = FaceRecognizerSF::create(reg_onnx_path, "");
+
+
+    Mat aligned_face1,
aligned_face2;
+    faceRecognizer->alignCrop(image1, faces_1.row(0), aligned_face1);
+    faceRecognizer->alignCrop(image2, faces_2.row(0), aligned_face2);
+
+    Mat feature1, feature2;
+    faceRecognizer->feature(aligned_face1, feature1);
+    feature1 = feature1.clone();
+    faceRecognizer->feature(aligned_face2, feature2);
+    feature2 = feature2.clone();
+
+    double cos_score = faceRecognizer->match(feature1, feature2, FaceRecognizerSF::DisType::FR_COSINE);
+    double L2_score = faceRecognizer->match(feature1, feature2, FaceRecognizerSF::DisType::FR_NORM_L2);
+
+    if (cos_score >= cosine_similar_thresh)
+    {
+        std::cout << "They have the same identity;";
+    }
+    else
+    {
+        std::cout << "They have different identities;";
+    }
+    std::cout << " Cosine Similarity: " << cos_score << ", threshold: " << cosine_similar_thresh << ". (higher value means higher similarity, max 1.0)\n";
+
+    if (L2_score <= l2norm_similar_thresh)
+    {
+        std::cout << "They have the same identity;";
+    }
+    else
+    {
+        std::cout << "They have different identities;";
+    }
+    std::cout << " NormL2 Distance: " << L2_score << ", threshold: " << l2norm_similar_thresh << ". (lower value means higher similarity, min 0.0)\n";
+
+    return 0;
+}
diff --git a/samples/dnn/face_match.py b/samples/dnn/face_match.py
new file mode 100644
index 0000000000..b36c9f6367
--- /dev/null
+++ b/samples/dnn/face_match.py
@@ -0,0 +1,57 @@
+import argparse
+
+import numpy as np
+import cv2 as cv
+
+parser = argparse.ArgumentParser()
+parser.add_argument('--input1', '-i1', type=str, help='Path to the input image1.')
+parser.add_argument('--input2', '-i2', type=str, help='Path to the input image2.')
+parser.add_argument('--face_detection_model', '-fd', type=str, help='Path to the face detection model. Download the model at https://github.com/ShiqiYu/libfacedetection.train/tree/master/tasks/task1/onnx.')
+parser.add_argument('--face_recognition_model', '-fr', type=str, help='Path to the face recognition model. 
Download the model at https://drive.google.com/file/d/1ClK9WiB492c5OZFKveF3XiHCejoOxINW/view.')
+args = parser.parse_args()
+
+# Read the input images
+img1 = cv.imread(args.input1)
+img2 = cv.imread(args.input2)
+
+# Instantiate face detector and recognizer
+detector = cv.FaceDetectorYN.create(
+    args.face_detection_model,
+    "",
+    (img1.shape[1], img1.shape[0])
+)
+recognizer = cv.FaceRecognizerSF.create(
+    args.face_recognition_model,
+    ""
+)
+
+# Detect faces
+detector.setInputSize((img1.shape[1], img1.shape[0]))
+face1 = detector.detect(img1)
+detector.setInputSize((img2.shape[1], img2.shape[0]))
+face2 = detector.detect(img2)
+assert face1[1].shape[0] > 0, 'Cannot find a face in {}'.format(args.input1)
+assert face2[1].shape[0] > 0, 'Cannot find a face in {}'.format(args.input2)
+
+# Align faces
+face1_align = recognizer.alignCrop(img1, face1[1][0])
+face2_align = recognizer.alignCrop(img2, face2[1][0])
+
+# Extract features
+face1_feature = recognizer.feature(face1_align)
+face2_feature = recognizer.feature(face2_align)
+
+# Calculate distance (0: cosine, 1: L2)
+cosine_similarity_threshold = 0.363
+cosine_score = recognizer.match(face1_feature, face2_feature, 0)
+msg = 'different identities'
+if cosine_score >= cosine_similarity_threshold:
+    msg = 'the same identity'
+print('They have {}. Cosine Similarity: {}, threshold: {} (higher value means higher similarity, max 1.0).'.format(msg, cosine_score, cosine_similarity_threshold))
+
+l2_similarity_threshold = 1.128
+l2_score = recognizer.match(face1_feature, face2_feature, 1)
+msg = 'different identities'
+if l2_score <= l2_similarity_threshold:
+    msg = 'the same identity'
+print('They have {}. 
NormL2 Distance: {}, threshold: {} (lower value means higher similarity, min 0.0).'.format(msg, l2_score, l2_similarity_threshold))
\ No newline at end of file
diff --git a/samples/dnn/results/audrybt1.jpg b/samples/dnn/results/audrybt1.jpg
new file mode 100644
index 0000000000000000000000000000000000000000..5829f82309cf904f7eddbe42ae555798ab3acff6
GIT binary patch
(binary JPEG data, 47680 bytes, omitted)
zX5u4f=;#aJ9;awkiE1WT%!B;8oq2BgPF`|$-Blezbv6AH~NKK__Pq5`$cWM*?}2EN39KS-W6Sm=S@u-GJ~Z#KIX zHpcDx##7HRx9}CWOO{IIU90e^**TV+4)v;SvAt3fu zX7jBOFpUxLT>K~@hYQwoc#;{C1IutX)FW2ge~5kjI`1L$>d9)4xnKf6GkwtOP211r zv{z1!=F+z?3Hgo|`V1Augae5Yn=_4)iPDyD?)kIsV#nhU=-T=bvL!oAb0}ShA~A*~ z(>{Y~kxrtTHWflrP~V`ew&8B_R(%xvJE~;FktcmWXa)C#ZIP$Ot_yl{Y546l=rD7M zw$|`)wL=PDZanZQL&in$#Lpb_a!bo7(>CdwkAW@|q{2=iFH<^|ri(~@xGhgTL`pmP zLWH{6K7az5l*B(9C~vB`;?rS|m?1sp>lEorfjKkjTC%puRfNeTZ};Z}b@-~3Pz40GkXctcmyZv)hor{}$&eG)40FEc@G#qzQ&HLW?h zX||KSk(q4Y0^_%^k{D$$Un=(&#f)ud$7e`?a4T1=^rzWXZuW5qsKkrq8I0er`dzUG z*ykuRzdfrSNtf6uuL6Z>CfD| zG_}0D&2~tz|K9Fk52}OIJt;@c9q0!MNZ< zQ2y%95sh=5T;(SqNN-u%*uiIe^6mVGOc@(X_)5aXhZ$*@23sd1OFShP+83`@8NoJj z=6HSz6K9<#X*XwcL+vGzcu^T-Ud^Moi9Us)eVGE$(4k25=f-3L@^;8xUe9&r{7xsa zR2{5U?5q*s;^*Py5)>lXt_6Vnbc_I#8$}FyDY(AsMVp75T4M)BE>lkeL^iSyve+f2 zmy&60Fnh&i2w0g!5AaoVbKCOqlAJw&7G{;yAaLZtvy$lPAabh~X_|wMrEKf*OQ3<3 zw>k)n@(&>oU;!pbR8joPo?+BP8x=A1I$vvbpdNNIs|31Fi=+GH6PU{AZR+DNuJnGE z`8zJRoo|WWbg?RT%YL5)cfXyJ$TEV*|8BUVWaKN@_! 
zpAJ$h-2}`|L27c#q)8r-4w)mRK=w0lxU=Q&quYY}aJbSgH+{6UX~-w1*2|@1p#a|h z0I1(=LPd@~|3A{D;rtd{`QrD#t=lAIA)F8_47aGVy*kdcZ+vd6;^Io)|IOK|)7QE( zSF9S6+x@9-71;pEy490ZiRyHFwDwO|T@$U|8+f72l^1j?`oy~a?R4&IO3RxT6TowP z{pIUL<%^8>@Ap=ch<4)ITZhSxfIihO+~2>z4)Ce0uI8`*n!ra(F+v_j(Zr>5zXSdm z{Rdd$ml+ZkOab@42>;2IlhJ9N+DXoJM9V^u*BoTO$>K!QI#j&_4d-=bGIglYeuFUH z;VMb?&z+@le`{qTNj`l0;_)w=5#wsDe7D+i5RM*)y{i9B*0ohCB}rh-^LEO={JfdQ zO?F%gGgpqwhxq`Fr2F)0CVQXAmB<93Z)(G%g}>>ntmb4P_X8iT1wDA3yzea0xj(u2 zlh1QF`JB{bSB?Am$}ny~+e#T0PuFt6YS|L(I-r(LN~v9qSzuUG_t)EwB$WQ`=lN`_ z{{X&?qz|rjOhJFYxfs{fn!HWty!>f34zN-qn9?qlzs<)y{wdNh+>#qC#M<~^zo8l@ zzS5DFlHbR?oFD*Mpwys``5Y8t_)Bm!t&yT34#@kbT zxMB20l2GyaK9M@d9$yPYV!ZUg%YQY+R5NT z{izGH>}pMo1BU7x)NGBqN4QK;(>dsUFK+!v<ivDHB%izga`FSN>A&Ns>JH#EHy1pS6U5IqJ6qP8@Uc^qtIN(=- zO7Fe!qMT}ZjcudZ+r0#J*Z97QZs(7=>)YPr87=r+5dM|rgonkrB{X}KQ;G0?{KsZ= zTI`=PZ3`fhvp!?tL$U59Nub@;e&rvijob#8?r)=xQ#wEf^VlPCepheu_Q)eG@+i2K ztE+KJA|I!l+vNUEfAPY!zne-dw)ppe$Yu(DKYfci%rH1UFM~ov%J6aU6||Cj@`dew ze&mH8QFu?&8_F?X!(UtbsS|;}EcrR@wp}*ag+KdF3?*AgO~MvZ!I?}3Mv1n&Hs>Eo zaeD0=@4Lx>QL~*pckw~2r!$oag(dB+vNY0936(#R5wLKTW_H-t#=VcDC%%t5YF?~8 z-$uyAHoF+ww>QLbtV-Y)9mE>2B<-k7{TfMdMEgU8ZviW+`qQ487@JT~Dy;jQD*{&D zRit^7Iwd$dG3)t8*~@z*^#25Ae%rx!R z4MoWsQOR)yRHGK(XTI93(;U(d!~y2a-}8%ok{M<9Ydc;|y-}T^aGVn8!>T(nc04I2 z1@F~*PwzXuI5}pZheQ(!C7J;89(79Ae%0@jL}9e+PlIl=^5gOezobvexJa=}HBJT2 zP=mRswr}zY5}VgdD+|7pi{nM6+%Npu-dR8_06*fYk7JMZjVOM+UKsu(@T$7X0({K$RQ! 
z9@63bA0Tc<{$1wQ7x(M6SX6lEm=}wVb}ipzsL{;-&zpl{Ipgp|`M_9y2b&pB|PkRGw%AwxxRTLi_$rfAg?Gdm;6p zx7aWd4ZRjn@bLH#P{f#yV^<&`|ELKyND>Dv{RjkG&KXlwwgk_?VuJ!2 zSIh$4Mhh+=T1!bQeYM(Z!^L*MMve0Z{pY#pQ!k-Fvi_M>^o7=|bu;xZLY=dcXxx}y zkKt1SO*ZhoG77$G=z*n{njLtf=fWltRS5u!JnlOxAHA;rj6nCbu#fk)Co?ts3B`Kr z)vsA~eu83#vPONKPoQNlr67_&Hz#42UF|sqELMfA18qt5-~3G|Bb&e@r(;f+U)nVY z@_~w<7r5tIQF%cmywk?0cZQv%%J&e0yo0wM-emrM?Gf6XV}_fRJ(~aw^suN8nYZR7 zMF{u)UB6_0kJUZzLUgW_%DncD*5fB*q$PWIe0W;B`@2$v}OJ7zy|kG+4~YE3IpY0I>BC;meOuR1cTH7?XaY`cLJxc za`LHd_pm%RH@3=v`pN#jgJW$=v!t6sLL16P?;3W}f9Xz7wq?6X;|J+r?$c@F*!kb4 z?D~I{NlsE@M7J5E>fOr?3gM{zppK~RsnC#8bmBDU5{cA{vc=HG3-n~NKyx!E$#}@j{fBGVwz^0|6V*&EPvU)IxHlQ+pfPuun^0(!y9wkG5|6rrP>*KEq-PZ%7kd z^LnR0TAxd>0Bk|+Dbmg6FQy0|&Z#t?bbU*c^GHH+{n(Ufr^h}nU?xD za1(tC)CO-O-en#PwzyTSvlua}H(ut7-(@2X)u}l4&RejYL{6kG!*Rs1#3%jz`e{uL zq(iX6pJcW>%CEafp4vn-e(|g@De-gin0$m9{T{JQT}%*)yIoo7bUv!kLCUH?-wLMh zgJt`V8#v^&O#9^<%fK15uQUpnNu-jmIgnTz$_Sd0FZF$vS!gAKcZQcUio|-#7duZ0 zL5sZV2GE=F1hTz6XD}3$G6RyQod4+KI$oBIk_-u2%;G-c6H^rr0W8*dIjrRNGJ3WvEwEwtvB=5CwfnIGi+2eyS-kc|Yr;U8!slX=>&u=gmxm1Gjy;b5 zdwL4^?-_7u!tU8rA~w8AZrrRj{|8MK$O*eOc?oj)hujDk!Fc@PPY+znGKQjjM{{X8 zUHlXHUHW`a*`q$n?dR9km~z>`Bdk(2*{vKCHJhX)2o9|Z$cP~cxtw;M@2m(_jLHkt zitM_578OINxHH*XKlTFn)Q~>;o5dW_XNSe@Sfm5SxNl1QrSl`uk605#Ha;TWE8;zn z!jgH=ULCg*e4xDq<~g}q+oFv4>x|_MleSAqj^3Nk4pDY?*&)m*Q1WjmXoqfepMfzFBVU!)|1!n$LfnM7DQtn|?8)H4 zZN#zehMAh38@?wMNoPOX*U8I9!r0t`#-^kEy&B#2#`bojbLjCIpPkO2A{#O4ijIF@ zFEewGFkpGl!nl4?h5dH6B;+520WHzeUjb|2^7*^P%h)` ziK(rN_~DRxEyn1!W$M{gAhh3j_{15kq{eu_MdnZF`yVciR`OtE}zy^TJC7Hih}^-%yV+f%*UV zeH5h(gr3Z=)g`;4nr^JS+DZ3(FHT9q+{{MW?=!gwq>lxL$|et081!}cax@cITw??| zsTB@M(#BW>XpuC8tQ!y1CmrqB$-U{A7hc#xja#Q(i)rhE;Z%rd|EkE!PXX6=Wwwa! 
z#dQnU`hM$_K{`f+VS-P_(3!sg^6!hC>B!GSG3jAwoi@9?9;OI_+5*a&En{RDi+7z6Ptn zhY?NFWCK>hV>i5dq*q*0ooWJx(mB6D^PXmbHW>&Huq>PZ)vnS@&yK`t)NQf@e&B2K zUnZ0GskWxy8mXjgQ)dcM19Ze6M`ShjJicZgeUp_PgL}(EY1y^mzKxWz-8}aLrYX~L z@8TGi(oCb#PQ51ULE)siQL%tq?cd6lukQ_qWtrd(kYI+)V0w5Gi{$$f+W9FTBPi)# z +#fdH%U%X7nC-OI%MqV8B;l0==zh=JV5f6K|o7Ie|tkX+!>+92g>>X$bYbm^`u z3EVhOFhK!a6t6$Me5YrC@TD%3&0(ys;txiUb9NYJNS7nZon$+Y&FF`e_1iKd^Yj)2 z{}perLEx2sEkiU5t0_AGLejSvX(H)d-uSkte)`Z6)s82C#YaZANdkrN)P{CgiC4e9 z$8q^i-Ale#{3aj&@8~Z|465+R$&YC|wQ+VURCKEU0b1li+#j(hV1_+( zdgKR3@8jNaCSR5VQG*Dd9nsKkn`Xjx1fP-9*Y|I!^=VCa4jb+37C52f23R+B2hlxM z{B#ydM95nBmB_Tf$j;-4r?T=a-qCg4f$0C^+`@hNF1MZakH(+omg-(>C#%C3)~1c- zbZMr*v;#F}RHIhCSwHN_A^Osdo4%_(BuDTh9W6oKyb(Az;A$CkH_zfg-TP|R) z@W*!L45$2K_y~DdYJ-vsnKNo32Z}z*FmmhbpQX$yuG6e+EB^O6C#QZoszt8j?=j_( zD_@6)%FR@$q?T#g{ciq9eT*tCY<#b&i3#8e3A~|?0{D=OTD$e44P!GkOOsxP^=mm7 zv_;yb$vtC@L?rLQ#XtIlCGH_}mD?y*;jlQU6XxE=O0*sM zxKc$>vH8c)+mTZBK!+kzE5cHY-rp+}` z?tWoApE0?5JiMZ;uo~tks@i`U!vIWq(CM5#6%Hd(+zp~Xo%c*u1G+c>@ZX;XKmu69 zKTKP1HuJ?+{OwK0QiYdue`#H#Zd`Mw3B&HiCxxmk)n*x)qaK05Uo2};#w|HR zIJ(*R?sDoJXak|t_jyCPQuhwK^f>ooLwPGneHA6e?MWF99SeoHi6GXFFie{;UX}s+ zbUzvA<+xy%fW5@uo5!Cgg;)JGPFs3hW-~N1OEz;Ul)jDXC=alFyFAM2^pWFd+?@vg zcJV9waBs66)rt_!owT1-@{L&2Bz;9=_Ff;X!$hGE=)eb1@)&fB-ABT;2STx6or@yI z%^IKQ58VWLIgKq&O_HVhA!Pt=c9x$z+V4G4jzRuwdN3`vE%UZW?ZpFFwn-m~k*G1I zRwcmoOt;z3)9e4!(^*F~`G@^~bTc}nOOP&U7>Kk|N)8DD>FzEG=@3v_Qb0f?MoBne zG!oJ=MuWgc4jFuZd(Qbi&;L93+3tJi{#@7ldS9=14HHw-lg{7%rE&iONJOg^ts_7K z1&y^KJ_wO#4+47~^A2ILzwvo)&b2uW>HJA@V$ITRw#?GQqrPs!?$kFaSRQDKHNvqQ zI(bHAF6Xtmre?me!sdwO+i;{ell)Bkd4 z1dKR1&-hc#T2kWSBvFap`(;;6ILMo{vZxgvD2P&tzxK^>-gT*4-$H<1G`hKX_`O%4f1+CK`>Jheyi(^yo9`>4Vx*IFKy#Y*qGa$A;EUb~)=+cMkP zr=0v<**sf6!tL#>V;;R@z+1o+k-@KM(fjH>4RS{;kfviwA@F}@>ykdgH9j6|JE+UE zjdJ|jH6V9Y?AdrG<83EP0?sxj8Stfk0`+O%$`iPD7bRGjpXxQX-y|o!No6%47!pf` z$Lh|{);E1<_%g(~Jw;l}heOb3N(*g>pAV7v=0K9*b6$6mb9S3Cl|B1Zhe<&|evHHF z3brF8Jt_N%KXX4wC8kfT%D}BI^4|LOoARK`=Ltp89pV(%+geuS&cml3hO_${p($=Bbs_v`=O 
z^(tA|dRs4CwgsJa7lamN)3R49*OO^P$>EP&qw-S8QCUyYdXqx~nl>ic&m9sJvl+cF z*}@{}(LUIM-*`5_ZF4mNW-?b1CG-+`g1Pam;4;d6i!HTdwA=39RBZrKd{}Q)`z7;E z)h^Fe*&>T0c};13VpY7F&i9e^C@X@1ECq)M=ps8-AoR^!>0+XH@P4G8U^Lfc{yj8zG^;a*FK5yLflXs?sRp$(3KINbi5t*F55~oOWQ#Hpz=Y< ztVj(Z4N;{Yr3@jJ!z*$ESstqpA7=!h2nG7z5iYz{FX>1<54kR*IC#X8vK;Ga{x-J+ zpH!)1K0BzM(hmYE5!_Ew0`m>$TpP6F*)b9FX+YuafB4z7tcyEOEJ*6tobi$Fl-{%XqaIIdLJ zDp&r{zitM<{4@BT`BQU)0o(%0FJB4j*Qj>3I_*93dh~48VJE0eK;f0^}_#9Sy-db(@;8Q;=Xl>;pC?fZb+G2lc@930hnwF4|!q|S) zvOBX;H@A9tYJH+q=3EpVpL8LtrThL2l2t9dDbYf>)-=bH*}o6ME^DSV(XVd?`y#*s zdBUgW=?gseKi>22kCEF0!uy?40R(f4gO74|WXvvBKlS$S{n7Ypl1DL<-AR-Q%?#b4 z#5peOs+5_5I=NJ<)0rNBklrZ_-Gywa$EtWdc1%W@119feAPf@2jmn)1h?4w|J%U~@TJH}%(kKFo$22On2q`5$i#(;bO}-_FWo0FI zMXv{l5d@Z-2aJv8|W*T0{7S3wbb>VY4X`E!9s{9^7jKIDnYhkV(TnoGF>~vNZ7PC&;`V zdiHd_l42w|nLb^Wvx!Zxbnwk1ZN~)+-_|s{Yi&bt@~|SbxTKJ zKWu1ATd;Q}M)M|^XkY~vBdyHs1)QYsn8hd}nAh2^NOsenVIQ|-3*&K#SzySJ)q>M( z>vX~f5At!wxADs3g$L#^n-kR|NW29p-!JX`K)pOt^H>gL;z>ZL*56HpNRS2kPdVC6 z!$XCQ{~2kH1^)d>^|1xf@=d*XE`(w?eGRuPb!V~N9DTi@2WgX)cV5o)lt0@%PQIDwmb@v3Ncr`v9ZvuT7lDra|)?VcHk)Q|9&i zV60SN=H-FlEvB|K8(B^Dm0G34S#E6b#}Qs08934L37@NdTfq-tEm2q$yWVqrvY!*b zL#YpH`VUaQlINidueSV~Ho17ZdGLC$VIhxgwSipgaqVhv`e^hr(XWh^sZFRt5ke8f&uy+BdOl)^Ge079t@jKCxP2|N z=g1Y!grAP+E45kPI;%k6!$YKp?Yw)%=3ej_aP|IS7rIG?8XM9udsC8#>JtEh7DB(K z3$IP-;q@C$No;(>ayfb8!Lgm3&DKR6#-J+iEVminF!e7ZEa|d1xBmf3T%V!{=c$N1 z;#=lYoiHd>_B7cfqOt0Wgtud%4*}~BQ;+J`!_8u!X70K>f&4F5M=|Jim;%OdbNj~suMZi|E{{9KA z)%v%nC(m{0nuW8zD>ipX$nU2Uh4vwuK3Fr(`Tq;dy^Va91hzWG$l(L2;-IDtLV4mko90XDKLg#fsXq@It+LyMk-xl|6ZeKy%E(ZFp==dFHCmk(5c>CZjnz#iDbu z$^qgw`eK&%6&=H}F)1pBb6|vjcPuB`$E5MBN1Vplo_bW#VCSXp^!J}rBv67=dK@++ zk*71B_1()w=@cf6N0a3nnbI))FCS8$u%^&BULMM^Q9wQS=Am2eObLn!|RZtVciMUCzzi;I=Yr5_s#0t{M9Cmm{qq z?;vW@=c2H#H1GB!Hi)CII9>MVt8F%C+|8I?7XIV_vTkL`r=4x>>LML^HX>%$`egm)u1ZZwhCxuEzj z$f4fc@`HS1jW0GrT(Lq9N$wrKXu+Y+$=g0WMdzv$$fhA13Jgof8+*jd-Ws2ktxfrC9CYb0#34F>+&Dkb>n6owOR zzWu!nsDsXjzcfG)B_YspSY@47w3Ks}>yh!~qmh5^ag#JwS}Zj7zTe7JaQX$ql2LCA 
z21D})@72waC-(?wWw0v(>&1kX@U`UW#YJ*+WzTZo{irq>c&(_pW=jYN0+Ff6tpo=R z2lY~*JeIl^Mcf^J{(R(l*JeJEc*7D6&x)o6ggW7;J7Tb!FaBkBvD=U}Y3EEU8TvFp z=qCfofDTSJEq#F~;mt~2OiTYOm@eR7-*qT*BctKr_XZm}B?v*bt{ME-nu)96_$F}m zi?I1~0IsHS8Mfxatq)EaG}5SB7YzR<$`lP#jJ6*J_C+BZ8L!oUqM|mThOkwJB%XzUUHL>?%jt5A+HoBkdA&4rfb#w^c8pYElm9e^NptZ&YEi4&|Wsj zIhHp0n~*h<2w4WC$950{BRNq63Q8faXkIIqMjUIoKFCwFlMQ4OX&NybHK2@%EOGIz zui2qYNkNB({@d%La{k{{V7vn_^kXmRI%pWw+E$Zv6&73tLH?=Otn~eA>olUBKDX%d z7Je;Q-DmO_V@5I9tR?)z{k1SD62|Ha8t!9`a*Rs80THRlJ6j8bE7&UY-CSz-RO>u; zxy+lD?0h+n;%Y4K!Lwox5``IxTpea611sZ;%r_6OaSmtDCBL*GBK>^5d)jRM9P=|C zO;C|3(`SYP)}2^eFB=e;d)<#JyQlGPK{dn#1CNTtm zfA3JUTPD3g$i?VL$AuN*!k){O68y3(F_$hUTZfp07k95P3hPgO`F3G=?p3KcjK%fF zPIGnG{({FT?^czF$eH<601r55v$VjFvL!^JW@7)js##LAEPk9zP~xJQlhwXjwb5X( z;QI69PNkO)CyusFY!|~iLGX7f!NaG~p&7b8D+$GQ%WZ?EM&LIv-{0!wB11j$P%oZc zeHEj0iEITkP(v_84Q&6-rHgKGwRPb;l!rj4GF39w52K+ox=q=04Pef<7HH4)zwmP5 zZ7~RXv~onuLe`2|<_zx>^2t^?VDm8pzt&SWDVSE5WyGC%#2hFPFg-Rw!by9cb75b* z-za&-d}gNAb|>@kXigs@`|ho&DsJ9ueMxf(JpAprXF`)jI`5ydp%VPcawP*Z0c*Jg*%?|c!IK&ktRpZcu^WrjEDc?tXAK* zc4tpVv>xF0hzrJDP)2(PvuP@pknUo{+z=?vtmM zu`jJ2c^eXqDS$p){Rc?>*O{o7Fb)OT^%ihYyAI39C6kXzo7y&d4*=Fv9|y0g!Q}5W zf?Krad)XT6k$qeD3yr9Lu*RP@un%isV>TdF-FH7Eb+?$?+$=i4`AR*PRy>u1p-ZRP>Wq~{OvPcu&~ zdO(aHe+=u0j0UB6v+3{e<2eHwkSWlvL}=36!sjTK={l)Ww87P=$o4r`M`}d5nsByl zF1uGAuHQxAs&#ky)5N6mE)--a7~#B}EkddFag=d6xzhFg3*hk~b8~7Yvu_wBi^)Ph z7VFsu6!~a<*j00ScE*)TCgeOIwyy}F%g>@V_l5EE;C>GPODAAOb+psg8I|j!#)<~b zBilExLS25}rViHc3sl9JSnW{Z!Ug(?S|djy1zQYOX`xu!QaD>gSncv*LzqwfrcUYO znY8YthL7n|Hy`j$w@b9AAjwxvQiHOS9$Z)$L9NX90iu1S%gaVQtm7Ji4g9cqSO_cu z%H-8MRVWV{^A)ieuii7GC{LfKDl{{#W)e@+1Cs-4Puq_BMD2I0<~tNyGCq}-Z5RC} z3afE7b-^WZBv6$nTZ~J;{~wliRdAk^Kkuu%)>O%EykUb^>BYwr2`%BgTIsuSbz4@n zb4=88GRG3(B1~^Of&MU&@Ur-IbXK8SJU! 
z3=aF)FE>uxf25|exn<>94}U3Zs(mJXV*NO0Vo$ka0JdEs4rJ>SjsR;eTsjP`CwG$K zc*{fwSx5=J%Y0&W__2J=D(-NM|yI!O7eXZ9tyy=6(Nafgb+i8%e~FCrkxjsC~x zfB%Y-_JEVT2b@+#HO#J|2n46gUpMnF?jS)z0p@>du^onSI*e|}LCfki>0faoXEPY}Q* zF@$OHF2#)V2cOR9?eP+MtShJ6CL#Ck;?4{0J7;vC{P5&<;ncwfaFm@IAe=WV_wT%H z5&rc5nj%0h*GG8zd$Ye@F*SEi@t5b1FRZ}aQcWyVYfgPcP3&#xU7IQpQr{*e4F$Z8 zY?;~VqWFq->1#RFVtH>HxwDNe8cgY&I*Cfk01K*7v)b*1t)qAM;O3n8cH@=~xxJQBbq2=N%qWvvLIV^9tN` z=Pe?3$Osu3AaeB}WdyKp=S&8&^COp#HgPA2eB%%flU(~$=_g&p;#PxIs%sNC_*eTI z9h;=q>pEXuOSpf|+N!gj@vZk209UR(e|_h_31+tMF6p2*121A;eO=LObTRA7onEaX zq3Kypm(*dY7`fkAkYghmz$O++@W>{%3gxxVzW4HpY=~k#2n2KIv;kRjN>mSHOfh6% zFrdY&yK2<)Uvv5b-D>wsS?z#So7nE&cSnkpf6Rt0)9(A?@fk@c1K1GCm$lf0Cs2+_ls`SB(X8^ovl~zC(ru|=HQcb>Y4~+#`s>SMU zn8miy-H*bK1C`GzZS=^6*5d`2-RvXpVFS@1yb(>#zi^q*?RG*Uxjp(%F6~~3`jn#Q zj_DG3TznP@zw04cJ#{<&Q&Hsg5s_~9Z$m`eYV(*H9e@P$VDjvCVv~0V(ak(cX=QlW zeX!H3s7JxZi~U_rF+0dL({t4gzcmfX_rg8g5=0;=S)LhUd>RU0*@W5i;GOxVR9DzN z<$~3Rs(jvzPuDBpT_sHMG3l``@;S`Xqm7 zl5Vhv-dN_w$E`yh15J55#dxxreFq4`A*O2;sny~VsfI)baA1mp0Tf7!y<6ce>#dYi z4S{6wWy=;7EGf zl(@)sG_hVpaS4RQ(^RA+eH;kg& zi*+aGHzk10?JIg*W9}>xXnpvh5}Pfys^?awT)a}RFC!68uMqnc&r=TPv)1fh6@o%Y z4_VTUY#f-=^7<}PY24q>w4FQX9q&o*%P=ymWpyuqMiVZG(%3%2)<`HqGH_u-e&%%W za`pnBYPTfi0oA9ezusC(Sy?cb5{cPJKD_{ha+n!I2|Y6K2Y<)Uc1^#p>(w;MUwkcO zImGcw-e9>;(ZI?+cwrObj3RQrc59kZyW&MFHor|K#htDzqQo@o1Ac0%phNMXzySca zy(PfozA1$F741VV4~>?pd1e%Q*z3L@Ckwe^Cth1oOcqB#o^hujqNU0x^bN@Aok&ZW zeKgQm;;J0Ws{0LoRXX-F-egx~4{u6)lu7PPl^O3OpHhYaAVrwvGmS;%leTyXCk^56 z%5%^?6h-5M4u8nP$PQxbDo?0DXlB zgDq7$=@YPZvO|H0wevOWsc7(ilk$0#VeVZj?u&F1Q62bQ=&RPSmtvE8 z@=t@sXK9!^rv7H7MwKM?(f2tr6q0k<{fvnV$g<&T8DLW>kCyOg9mgJ9k)4!@yP2AV zoBBYThrVsIFSCszXF*EaL|hJzB8$8}OZ%*Miv|_uE0alL^_tULVf^*W8` z9nlx%yG1hO=2qmF9;;@J#9gwO{$V-T;ls-?Z25dp)}rB(@>mY`3hN87p$#U}Mq%e9 zSEYbsi583FP?Mpw2?t<-e7WHRXmhcpzg$26DtjAq~Y?%nz@Hyc`ikaaJwQi72i=L0_!> zn`*)C5UJSwAAkuf4Z`z4R8`1**mx=KTK0l9_Z*XwD!lbK<}hwPyjpG8fM)v@XmM`)pUf22AL-%>`7Z%43{{=a8}bxbr&JXqVL1@ZjxDbu_}>%19Gs%pE~wi`6URAWWxl0j-N!FMU(<~ 
zT%2xkA3xTSC&-OlV%w;dC|ihlxMMd-uDL&1%%jv;l{0{U|8`B2=R<(g7N6Fp=!VX( zlPRg$pWti3-!*fQ5+lbl+cH_+wez-^n#a)!UQ{mPN7gLhZVzrS)yA9QX1*SRy$urL zy1qe7yV^E~ZljV~gojU!+gZ)}NKV~QHJF{C+p|AJ3#Tx057u;0upQi|*Z0VsbI6fz zOE$E~HqX8@@slC;g{h6V$-io&+TJ1oN)irB_ zSC)7~zmae6L2xxy+r$)To!@N44C>)KUn1I5iOC2@Sv@X#i zhAZ3zi6L&`ACGQgXDNh6#-vBt;n}g`lONqZw*h_^rcKFZDbNy;4&-%%qd@lTx_!T? zd*_AhZsK?gX;03nPr(+(r51eD#cKKdA1F(5tyN-Ai*zuP#(4)2cb>V;8tWZuNH|rv z4FbCTpx`&+mM6CNpB)_A@8%8rJ|phU;$_HWUqiL#>Q;x2Y0avBiRVgln`8MBB;xt) zS*{~K6>gp*Kju`5)jth6q{na=vw%W@WMoyYSwTp!7L6@{T(WDljm56qa6| z_`XN2{%FBx(^q)Tkm|!6tzE5YhB0F-~;kBaFU3ej~;}=U!!xkI1(AG zDX|uV2yPMkH!W>*mm+qj_Fci2PU*am&pcKkb?M*Ci#yUXJOL`Ub1ILADuaI=4?slx ze7?HK^f~`|n+$6G@Vl4XeM?h+7ROLrHd87izjVF_hX@Lyc>D#iD%(6$)|QRfn;QL~ zH^X~6j>`p#B9t`WPFlQc$aszd{koUhQfWpgJ8eNy&Q2N{ntEXf5cMWO$&)q#F7{x7 z*P~Ce4a9i_h{?0bamz#}YZ6+2(!T0M@eeg{KU;h>ySzFSz-xPWU`Hsw|Lld&XPK#= zn&Q++AOQg6j#jVHCUQZAWl`++l7*Iiu!pl$54l>9J8IPZf=7g zd$bQbs9r@<%3syaF20I&mrNjE@{2kC{g2$~vJ9v?6K)n!Qx*pP5GaoOR5EZ{{r0Ni zo^iNWF#S;UOYi3!x!V^Vr)mc{95z)Udzp*ff8gP^Nd4f$2l_t;viTAtNvGA~ntK8y z!&@+(Ri!9>fJzqH=}r||W2xcQ$TR##bxjklea)eJja^?kGguUvbz}474>D@QyYAD;jQp`5*l9tIzUutsy6@%mG#x2}{7=N5PqHR1b_??Ko(jL_$-?)AKZlM&~*Xk7A z*4E)14@vc3455R+zk40l^&g-mOO>OYv0%8~jn$>jxf?I@M}{n^UlC?qfgDlPEBgIA zSD&A7V1~0bh5C1vE=bg_@@iAL~{&YDSuz{xAd>*9H+&Pw|XsxB?7qf(` zu`@-1{qlhQGC4Y&;+o*8*@MEM8K)0WTn3YAetUYL;+5W*V^v?W$juSMHQy97u{akQ@qD@61}pQw*| z`pgUbb7DK@utdiLq4JA^3ixI~AHH6mb8z4<@k$9L26y1uRQ7v`ym?Rl16^S{D-3UM*O(KzqOf};lZ%`6#L%yJ zLefA9Z&Mm4ktns`creK9fvc-@y3?ywFN9X8b;1GEm6MU#2zpN-sHHDNW00i6oHq9| z4BTNqZ>^1hr2Pl*q3^1`kRdis^fFP3C1w-KLC2%``+bAoT)qS$(keMzKQ-3g7kh{d z#s}eb$u5uHYWk&~9uI4kxa9lb&_3*5|5QixGU|O$V=cKM7O0UecO9*&V)2#Qf|Xw^7FT3!l6h_SIcr*zR-S;T3- zzPKk6B+1kL!2jLnDiVR=M@fc7QCgtkg_C+F-MC{5U^zMs&o3DT=R$&z4jQ@fX*g5Eq6g=ABE1B8V4d<I`@n4y zLuKRtN`r`k@m~MQ+a7MV#ymq29GJC2?xbTiTjMLnMt^<)L@TguN(=#lIH<6wPrA0^ zCm0swX#{RWB;0MKx9yum^3>%rq$mrT*B|uJOL5Snm4SJm6?>)&8N`Hc#_S_)27pA9 zb>p?J3@bNj1?Pv!>MrzyIen5;yfpIpr;ghEa!gp^(6y8?Tn`l(k2s5uqB@iQ+|akh 
z7aeQ^{pW8dI$ZLD$$CuhPm+ z&s$`gCn@F!b*gY{_8ik*OU4ZVGJi%3hXb(xQ71&k2@B&OkmcAk`%r}gMbgresNdnz zU^wnDxvQ#=TTXz}aW^3qTix4)spe0(g_(!U<}(T`g!v5;;X*+rqaO@k*qM%yg=QpPjFplGvGYYw)JRYdM{2 z*m^%buHC>epx3wV>=h3eDeyU4Mk#(}MeBp(MwERHLJngl_@=tjk?j@2_>|$J=~}$k z`-|t&*svUSx_Kubat3)*_jXbw&^q_G1=_$O@yU)*CT^^_xB~nl{c2lgQK2VkkpiLC znZPbn>N!_*d5Kk+T|29SD+WWukA4Sdj1OyjNnZ~POv}U5e|Ag>;H|ZKh*`}!87~}+ z=X88f$hepc-o((wVPlm+&Nj<=4eEnO z!`Gd52=nwQD{q(_MJ&(&n%)M*fbTo!;6vRo#2NGQw+$nS6)mQ03rYomR7n}fMKo}1 z!PHc+KbDeMxmL&tA@)J~j0t^Xj`GZ{W)X@oFfKS!G_?)ylNhJ&$E#JMB+|V;Th@CZ zxkqrUSWhXN+HhDLT%Xs6IC~UW_)^NV;;dbb^l#@x14{&&X3ntr=>w$UV#@O`5x{iF zIke|9L3J^!J5(tCHpN43Jv(&ATig$^$=FL2IsPA@S@wjs3B!wC-l6~x24&Q)}jkEYI=BwX*u(;tbRx;uEs)C=WiL!YwJx%8<`%6z* zcUY**BN z*P;Q`l?4I-M6(dP2xHIi;T7VY#s~r2@5q;N_3Azqj z)q_Y@HHUt58{(8`-C`8rW^;aM8Cj#Wnm>LEwKlxHfXUz#{Yh zb6`_(o^i4`K+Xul8kerGFLi9i4!IL>9T#d#!`Kq?o@uf=TwWe1{43RuT+9DeF1;c$ z6TM?wR8a2+Px4IhYyo7&<*`2S7XUDrg+3>*q;{!q*Osr=zhv2uVRJE6d6Z-NmBl6X z0h4Aw(G)1bjU8eo@mcQZpl{KhyzFaE%$N(CsP%9q$bk7bO-Y=QsR#uV5*0R=Y?b@y zEeID+uQT3Om{<&m!P3aP_Yuw~+j{Mu9fwq>8oW6S>0KQtamBJ7Q4xO|%zqR*gT<6R za^EC)V2qrw=0(eHoe@FZ5bK^p?m2ohXC(;5ljqOy->tla5oj*6FmaypzUwJiL;wdl zmi_}kyHs-OHnf`1zIR3u8+6I2%HH(#FDgMr-MX;$o$?RF1jXmtY|8dl_r>8pNU3Y> zKbJb;9@UV<%R8R1tv)IXja_TKx|d3kR$~SeB<}*`z0}erX@{cldpUzswr&z~w!EO~ zULloWmuu*d7>$8wTa^xV_L0Rnxs=pS(iDOX!s-+a1Qu?U4)oE4YoTN^P^*Zk`tr5b zz<-T}LDd@xX@41%vtS*yM7C);cmRhT_P{+1=e!d8N)`5;v|+P3mi-Sv*29GeaYRPs zelikmsg*FFoQToQR!OE6)zSx=A#4EXoa;1nb3-r~{t&yt6SozW_~rY&*vUynjMu)3 zeXAY^pi>BfPoB#Q-)3HEif)~LA4PddcCR(Suo7nuy{0gG4w>gmKAVz#HIbnG@K;Tb zaKnt?W895*8GDsV8oVCM)fk8rM~9UEOlz+R2ym)(c_p*k(v_eLAH-Q@3b5;AmCY-1 z489{Bn+@XPA^0vgT@`-!6>Tuom48A8=b**BMA26bEADGa4-;&2PAfR@9^vmmcNMIT#=K;`DOKVN2JYSl=Gsb=Nfw?9M9kLAs}nO3YP#m}Y-PpQ54*Ll|7-F^PG-ZcouyxiU%%8pGAfk5r6IcuhY; zyoL>ywN%liPyX~g4PDW!GiS^ex8P&m7bG5lJ(HM>!1)hA9Ql!v*kLNP1@Hbz%`b-9 z3-v%>EyWzU2b-PW*bd~$ecFSZGp&?{`yz4}?9$e=r*Qz3lh35{M?vt8IUV8&O;=u4g!jf;Z6acu5b2+IGCOP$Jh3f-qpGpmB 
z2l50BuFm7QOKiV*LC8VnEW;ovozE!vw-NZsH}X_6EG9LoXd7XZWX*@-PwsgWRu^1n z@p7|d%q}W6)tq?c= zwpR$fiZvdYpxAy6cGW;RSX&$a8Y~CsMO5zOjK|HB`&%2-*aD}I{@K7m7&;nIbT4DV zv^G+>E4>Pz%UGw;R3OMU03b!(zn{9@9*o+=HVJ@8zK7{vK`+v9t>3(@A5&MS2hsx` z&a*IGi}b1-UTCievN%+R&Yv$Ge|o8%g94vD^AsuX&EpT)Kka1_ir5Iug8GcC!u#5y zPJ|E!tNGXO+feOauwB`>ciunEthY@aV6{EAYAsL%WNfDhyP^|g{nWbnjs?TJ`0zAx z_JI&(6;K9vX{*LODMR~ENe=EqE0j70PoCZY3_l9YzzBr-&mqyh_;>3;%f%& z?I%fJJs(J0*Wu zag@G~GfeEqD!i`D5VltiDZ_VgP&D^DH1%-D{4q8<*C&AiUW;od?TlxWX8J5NtZu9f z%Ze39Os!)c0Yop++@xbn4n-K~QzN>T>L=9>}EhOTP+5JokM@qZ!P5k(_d@N>@G0 zpk*BY+5?(39Dy=aiR0eU68sT!PKao=M(`?{XNFPKl-4vnZQ?d6ZrVfxu{>5g(d@bh z@AgqaU%<$4=#1q_{qq_ZNRHeHzL6?N9!E=lqW>oA!{p+IGRip*6U;K0+QlC4qbO5k zxq(kTD$G3CC<)_5>T41+*`#x@_C7x3z;Cl5J&5EDBLhzZG-@-NHJf{AbOLsNX!eIU zpS)w*xmb*eO$IceAW-PF&cfXD`_i_;Gs@K|cc%`pMQ+cbDWR9d%@fMO!OCn(9sb%T zz5Ak`r{%$g+fk`nN&W|3Ac)8b=O6OlHUCt>&JN^Dczf$Stk=^_*u?O^ks??CdVWlw zOGbPzC9*v;FHcgV-c?F7@v&asbK}1T)+GM{IL&6SLp8p{zk#qzv_i04VxWRXS%5)d zrhZU_;7Lib7YPX!Cq3`Q{8rT9j0QmEbc(*!iLKJe=hr#!w-Fw2|VuX_(^+#0L{Gju-+aD*i6Q9W8x%kiDM|*4kvn`KS z`C=2l!DU9*-apEhB@%%j7GMRKZ+^HujaT|NRDv!ITX!2*|1$lM5A6Z*b zKCsrel_}TmNOI>ouPYgt@f04zVXx7nOBN-@5r{{hJNv7>;% uGnxMYjZusI*su8Dhy5VzzA%!#+l%&y)F<`Z;atWwXe#o5qT{0eTmFCU9RAw? literal 0 HcmV?d00001