
I am trying to integrate the OpenCV framework into my CocoaPods (pod library) project.

These are the steps I have followed to integrate OpenCV:

  1. Deployed the OpenCV framework to a private repository and made the necessary changes.
  2. Created the CocoaPods project and added the URL of the OpenCV repository to the Podfile.
  3. Added the OpenCV dependency in the .podspec file.
  4. Ran pod install in the terminal (this installs OpenCV into my pod project successfully).
  5. Added the .cpp, .hpp, .h (wrapper) and .mm files, namely OpenCV.cpp, OpenCV.hpp, OpenCV_Wrapper.h and OpenCV_Wrapper.mm (a rough sketch of the wrapper files is shown after this list).
  6. Imported the headers in the umbrella.h file; in this case the imports are
#import "OpenCV.hpp"
#import "OpenCV_Wrapper.h"

Now, when I build the project, I get the following errors:

  • Core.hpp header must be compiled as C++
  • OpenCV 4.x+ requires enabled C++11 support
  • 'array' file not found

So far I have followed several articles; some of them are listed below:

  1. https://medium.com/@borisohayon/ios-opencv-and-swift-1ee3e3a5735b
  2. https://medium.com/@yiweini/opencv-with-swift-step-by-step-c3cc1d1ee5f1
  3. https://medium.com/onfido-tech/building-a-simple-lane-detection-ios-app-using-opencv-4f70d8a6e6bc

All three of those articles integrate OpenCV directly into an app project, but my requirement is a little different, as I have to integrate it into a pod project.

There are similar questions already, but their solutions don't work, at least in my case.
