I have already built an augmented reality app for iPhone using Swift and ARKit. Right now, the 3D models (.scn files) are stored locally within the app. The app has a menu where users can pick a model to render on screen. It will not let you tap a model until the ground is detected, and then it orients the 3D object on the detected surface. This is all completed; you will not have to write any code for plane detection, just the database work.
I would like to host the models on Firestore and give each model specific longitude and latitude coordinates. My app first detects the ground to prepare for orienting the object (this part is already taken care of). If a horizontal plane is detected and the user is within a 500-foot radius of a model's coordinates, the full 3D model (.scn file and its textures) should render on their screen, just as it did when the objects were stored locally.
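To make the radius requirement concrete, here is a minimal sketch of the proximity check, assuming the model's coordinates have already been fetched from the database. The function name and the 500-foot constant conversion are my own illustration, not existing app code; CoreLocation computes distances in meters, so 500 feet converts to roughly 152.4 m.

```swift
import CoreLocation

// 500 feet expressed in meters, since CLLocation distances are metric.
let renderRadiusMeters: CLLocationDistance = 500 * 0.3048

// Hypothetical helper: should a model anchored at `modelCoordinate`
// render for a user currently at `userLocation`?
func isWithinRenderRadius(userLocation: CLLocation,
                          modelCoordinate: CLLocationCoordinate2D) -> Bool {
    let modelLocation = CLLocation(latitude: modelCoordinate.latitude,
                                   longitude: modelCoordinate.longitude)
    // distance(from:) returns the geodesic distance in meters.
    return userLocation.distance(from: modelLocation) <= renderRadiusMeters
}
```

In the app this check would gate the same code path that currently renders a locally stored model once a plane is found.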
Right now the 3D objects render beautifully when a surface is detected; I will show you how that part works. All I need is to:
a.) Host the 3D files (.scn and textures) on Firestore.
b.) Assign geocoordinates to these files as attributes.
c.) Render these 3D files when the user enters the area around a model's geocoordinates. The 3D models should render exactly as they did when they were stored locally.
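One note on step a.): Firestore documents are capped at 1 MiB, so in practice the binary .scn files and textures would live in Firebase Cloud Storage, with Firestore holding each model's geocoordinates and a pointer to its Storage path. The sketch below shows that split under assumed names ("models" collection, "latitude"/"longitude"/"storagePath" fields) which are illustrations only, not part of the existing app.

```swift
import FirebaseFirestore
import FirebaseStorage
import SceneKit

// Assumed Firestore record shape: geocoordinates plus a Cloud Storage path.
struct ModelRecord {
    let latitude: Double
    let longitude: Double
    let storagePath: String   // e.g. a path like "models/<name>.scn"
}

// Fetch all model metadata from the (hypothetical) "models" collection.
func fetchModelRecords(completion: @escaping ([ModelRecord]) -> Void) {
    Firestore.firestore().collection("models").getDocuments { snapshot, _ in
        let records = snapshot?.documents.compactMap { doc -> ModelRecord? in
            guard let lat = doc["latitude"] as? Double,
                  let lng = doc["longitude"] as? Double,
                  let path = doc["storagePath"] as? String else { return nil }
            return ModelRecord(latitude: lat, longitude: lng, storagePath: path)
        } ?? []
        completion(records)
    }
}

// Download the .scn to a local file, then load it with SceneKit the same
// way the bundled models were loaded before.
func loadScene(for record: ModelRecord,
               completion: @escaping (SCNScene?) -> Void) {
    let localURL = FileManager.default.temporaryDirectory
        .appendingPathComponent((record.storagePath as NSString).lastPathComponent)
    Storage.storage().reference(withPath: record.storagePath)
        .write(toFile: localURL) { url, error in
            guard let url = url, error == nil else { completion(nil); return }
            completion(try? SCNScene(url: url, options: nil))
        }
}
```

Because an .scn file references its textures by relative path, the textures would need to be downloaded into the same local directory as the .scn before loading, so the scene resolves them exactly as it did in the bundle.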
I will send you my app if you agree to take this project.