I am trying to integrate the OpenCV framework into my CocoaPods project.
These are the steps I have followed to integrate OpenCV:
- Deployed the OpenCV framework to the private repository and made the necessary changes.
- Created the CocoaPods project and added the URL of the OpenCV repository to the Podfile.
- Added the OpenCV dependency in the `.podspec` file (see the sketch after this list).
- Ran `pod install` in the terminal. (This successfully installs OpenCV in my pod project.)
- Added `.cpp`, `.hpp`, `.h` (wrapper) and `.mm` files, named `OpenCV.cpp`, `OpenCV.hpp`, `OpenCV_Wrapper.h` and `OpenCV_Wrapper.mm`.
- Also imported the headers in the umbrella header (`umbrella.h`); in this case the imports are:

```objc
#import "OpenCV.hpp"
#import "OpenCV_Wrapper.h"
```
Now when I build the project, I get the following errors:
- Core.hpp header must be compiled as C++
- OpenCV 4.x+ requires enabled C++11 support
- 'array' file not found
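As far as I can tell, the first two messages come from `#error` guards inside the OpenCV headers themselves. Paraphrased (not the literal OpenCV source), they amount to checks like this:

```cpp
// Paraphrased guards, not the literal OpenCV source.
// opencv2/core.hpp refuses to be included from a file that is not compiled as C++:
#ifndef __cplusplus
#  error core.hpp header must be compiled as C++
#endif

// ...and the OpenCV 4.x headers additionally insist on the C++11 language level:
#if defined(__cplusplus) && __cplusplus < 201103L
#  error "OpenCV 4.x+ requires enabled C++11 support"
#endif
```

The third message refers to the C++ standard header `<array>`, which is presumably pulled in somewhere down the OpenCV header chain.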
So far I have followed several articles; some of them are listed below:
- https://medium.com/@borisohayon/ios-opencv-and-swift-1ee3e3a5735b
- https://medium.com/@yiweini/opencv-with-swift-step-by-step-c3cc1d1ee5f1
- https://medium.com/onfido-tech/building-a-simple-lane-detection-ios-app-using-opencv-4f70d8a6e6bc
All three articles integrate OpenCV directly into an app project, but my requirement is slightly different, as I have to integrate it into a pod project.
There are similar questions already, but their solutions do not work, at least in my case.