Show HN: FaceLandmarks – ARKit Face Mesh Vertex Tool https://ift.tt/rxAmfPY

Hey everyone. FaceLandmarks.com is a little project I put together last weekend while working with Apple's ARKit for iOS face tracking.

For those not familiar: since iOS 11 and the A9 processor, all iPhones (i.e. iPhone 6s and newer) have supported augmented reality capabilities. When tracking a face with the front camera and the ARKit framework, a face mesh is generated from exactly 1,220 vertices mapped to specific points on the face. These vertices are accessible through ARFaceGeometry, ARFaceAnchor, and ARSCNFaceGeometry within ARKit, and they give developers a foundation for facial tracking in common use cases like social media filters, accessibility, avatars, and virtual try-on.

While ARKit's tech is impressive and the DX is smooth, the most frustrating part for me was identifying the vertex indices for specific points on the face mesh. Apple does not provide a comprehensive mapping of these vertices beyond a handful of major face landmarks. Vertex 0 is on the center of the upper lip, for example, but there is seemingly little rhyme or reason to the rest of the mapping. Devs could export the mesh, open it in 3D modeling software, and hunt down vertex indices by hand (which is what I originally did), so I decided to make a simple web app that simplifies this process.

FaceLandmarks.com uses Three.js to render a model of the face mesh with clickable vertices, so you can zoom, pan, and easily identify the index of any point. In the future, I hope to keep adding semantic labels for each vertex (there are about two dozen so far) to make them searchable. It was a fun afternoon project and I hope it's helpful to others in this niche case. Enjoy!

https://ift.tt/1n8aEq9
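To give a sense of what a looked-up index is good for, here is a minimal Swift sketch of reading one of those 1,220 vertices from a live face anchor. It assumes an ARSession already running with ARFaceTrackingConfiguration on a device that supports face tracking; the index 0 example follows the upper-lip note above, and the class and property names are just placeholders, not part of the tool.

    import ARKit

    // Minimal sketch: read a single face-mesh vertex by index each frame.
    // Assumes the session is already running with ARFaceTrackingConfiguration.
    final class VertexTracker: NSObject, ARSessionDelegate {
        // Vertex 0 is the center of the upper lip (see above); swap in any
        // index identified with the tool.
        let vertexIndex = 0

        func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
            for case let faceAnchor as ARFaceAnchor in anchors {
                // ARFaceGeometry exposes all 1,220 vertices in face-anchor space.
                let vertices = faceAnchor.geometry.vertices
                guard vertexIndex < vertices.count else { continue }

                // Convert the vertex from face-anchor space to world space.
                let local = vertices[vertexIndex]
                let world = faceAnchor.transform * simd_float4(local, 1)
                print("Vertex \(vertexIndex) world position: \(world.x), \(world.y), \(world.z)")
            }
        }
    }

To use something like this, you would set an instance as the session's delegate and keep a strong reference to it; the same lookup works per-frame for filters, try-on anchoring, and similar use cases.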
