## Integrated Terrestrial + Stellar Positioning Sphere (Stellarium + QGIS)

The landscape panorama (construction site with yellow excavators, a wheel loader, dirt piles, and a distant city skyline glowing at dusk) becomes the horizon band of a full celestial sphere. The circular sky view (with constellations labeled "SAH" ≈ Orion, "Horus the Bull", "BIRD", "GIANT", cardinal directions, and a horizon treeline silhouette) projects onto the upper hemisphere. The result is a seamless 360° × 180° dome:

- Below the horizon → ground plane (invisible in view but used for local positioning).
- Horizon ring → exact match of the panorama's skyline, machinery silhouettes, and lights.
- Upper dome → stars, Milky Way, and the labeled Egyptian-themed constellations aligned to real azimuth/altitude.

This creates a hybrid augmented-reality navigation system:

- Terrestrial features (excavators, loader, skyline towers) give relative position via visual odometry / feature matching.
- Stellar references (Sah/Orion for south, Horus the Bull for seasonal timing) give absolute orientation and latitude, even without GPS.

Modern AR equivalents (smart binoculars and sky-map apps) already do exactly this kind of overlay with connecting lines.

## Elixir Nx Tensor Representation of the Structures

We represent the key elements as tensors in Nx (Numerical Elixir), with the observer at the origin.

### Drawing logical relationship lines (graph edges)

We treat the alignments above as a weighted graph. Edges above a threshold (say > 0.85) become explicit reference lines.

Textual diagram of the sphere with lines (observer at center):

```
              Zenith
                |
Horus Bull ─────── Skyline tower
                |
West ─── Observer ─── East
                |
Sah/Orion ──────── Excavator 1
                |
              South
```

In a real AR render these become glowing lines from machinery silhouettes up to the corresponding star, giving instant visual confirmation that your local map is correctly aligned to the stars.
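To make the tensor setup concrete, here is an illustrative Elixir Nx sketch. The direction vectors are invented placeholders (roughly unit [East, North, Up] directions), not measured values:

```elixir
# Illustrative only: direction vectors are made-up placeholders, not measurements.
# Rows are approximate unit [East, North, Up] directions as seen from the observer.
features = Nx.tensor([
  [0.00, -0.98, 0.17],   # Excavator 1 silhouette (low on the southern horizon)
  [0.64,  0.76, 0.12]    # Skyline tower (northeast horizon)
])

stars = Nx.tensor([
  [0.00, -0.94, 0.34],   # Sah / Orion (south, mid-altitude)
  [0.60,  0.72, 0.35]    # Horus the Bull (northeast)
])

# Pairwise alignment scores (dot products): rows = features, cols = stars.
alignment = Nx.dot(features, Nx.transpose(stars))

# Edges above the 0.85 threshold become explicit reference lines in the render.
edges = Nx.greater(alignment, 0.85)
```

With these placeholder vectors the diagonal of `alignment` exceeds 0.85 while the off-diagonal entries fall well below it, so `edges` picks out exactly the Excavator 1 ↔ Sah/Orion and tower ↔ Horus lines from the textual diagram above.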
This tensor + graph setup is fully computable in Nx (on CPU, GPU, or TPU via the EXLA backend, or on LibTorch via Torchx), enabling real-time re-projection, error correction, and hybrid positioning even in GPS-denied environments like deep urban canyons or remote construction sites. The two source images you provided are now conceptually fused into one navigable celestial-terrestrial sphere. Let me know if you want the full Nx + LiveView demo code to visualize it interactively!

---

The map you shared from QField (the mobile companion to QGIS) pins Forebase (likely a field base, survey point, or project site) in south Dallas near South César Chávez Boulevard, under/near the I-30/US-67 interchange. It's a compact urban/industrial spot with freeways, boulevards, and grid streets: perfect for ground-truthing a hybrid terrestrial-stellar setup like the one we discussed.

To integrate your full QGIS data (layers; vector points/lines/polygons for machinery positions and site boundaries; elevation, if you have a DEM; even annotations) into the celestial-terrestrial sphere representation, we need to bridge geospatial accuracy into a local observer-centered coordinate frame. This lets us precisely place Forebase features on the horizon band of the panorama/sky dome, draw accurate sight-lines to stars (e.g., aligning an excavator silhouette to Sah/Orion or Horus the Bull), and compute those Nx tensor alignments more robustly.

Here's a practical, step-by-step workflow (a mix of QGIS desktop tools, export/conversion, and code ingestion). It's technical but doable even on modest hardware.

### 1. Prepare & Georeference in QGIS Desktop

- Open your QField-synced project in QGIS (via the QFieldSync plugin or a direct GeoPackage pull).
- Ensure the project CRS is consistent: preferably a local projected one for Dallas such as EPSG:2276 (NAD83 / Texas North Central ftUS). EPSG:3857 (Web Mercator) also works for display, but it distorts distances at this latitude, so prefer EPSG:2276 for measurements.
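A quick aside on why the CRS choice matters: in EPSG:4326 the coordinates are degrees, so even a simple distance needs spherical math rather than Pythagoras. A stdlib-only haversine sketch (the coordinates are hypothetical points near south Dallas, not actual site data):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2, r=6371000.0):
    """Great-circle distance in meters between two WGS84 lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = p2 - p1
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    return 2.0 * r * math.asin(math.sqrt(a))

# Two hypothetical points 0.009 deg of latitude apart near the site:
print(haversine_m(32.760, -96.760, 32.769, -96.760))  # about 1 km
```

In a projected CRS like EPSG:2276 the same check is just `math.hypot(dx, dy)` on the projected coordinates, which is the "easier distance calcs" payoff.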
- If layers are still in WGS84 (EPSG:4326), reproject them: right-click the layer → Export → Save Features As... → select the target CRS.
- Add or verify the key layers around Forebase:
  - Points for equipment (excavators, loader).
  - Lines/polygons for site boundaries and dirt piles.
  - Optional: the QuickOSM plugin to grab nearby buildings/skyline blockers.
- If you have a DEM/terrain raster, enable the 3D view (View → New 3D Map View) to preview elevations.

### 2. Define Observer Position & Local Coordinate Origin

Set the Forebase pin as your observer origin, (0, 0, 0) in local ENU (East-North-Up):

- Create a new point layer or use an existing one.
- Digitize or import the exact Forebase lat/lon from the map (approximately your red-dot coords).
- Use Processing Toolbox → Vector geometry → Add geometry attributes to get X/Y/Z if needed.
- For simplicity, treat Forebase as (lat0, lon0, h0 ≈ ground level).
- Switch the project to a local projected CRS centered near Dallas, or define a custom azimuthal equidistant CRS around the observer (Project → Properties → CRS → "+" button) with something like `+proj=aeqd +lat_0=<your_lat> +lon_0=<your_lon> +x_0=0 +y_0=0 +ellps=WGS84`.

### 3. Export QGIS Data to Usable Formats

Goal: get points/vectors into a format we can load as 3D positions relative to the observer.

**Option A – Simple & Direct (recommended for points/structures)**

- Export vector layers as GeoJSON or CSV with X/Y/Z:
  - Right-click layer → Export → Save Features As... → GeoJSON.
  - Or use Processing → Export → Export to CSV (include geometry).
- The CSV/GeoJSON will contain lat/lon (or projected X/Y) for each feature.

**Option B – 3D Export for Richer Visualization (if you want meshes/buildings)**

- Install/enable the Qgis2threejs plugin (Plugins → Manage and Install Plugins).
- In the Web menu → Qgis2threejs Exporter.
- Configure: add the DEM if available, plus vector extrusions for buildings/site features.
- Export as glTF (.glb/.gltf), a modern 3D format supported by many viewers/engines.
- Alternative: QGIS 3D Map View → Export 3D Scene → .obj (older but easy to parse).
- These exports give you textured 3D models of the site that you can later reproject and load.

**Option C – For Panorama Integration**

- If you have 360° photos from the site (or plan to take them at Forebase), georeference them with the QGIS Georeferencer plugin, then export world files.
- For our sphere wrap, though, the essential step is to use the exported vectors to compute the azimuth/altitude of key features from the observer.

### 4. Transform to Local ENU Coordinates (Key Step for Our Sphere)

Use Python in the QGIS console (or an external script) to convert georeferenced coords into local East-North-Up positions relative to Forebase. A quick Python snippet run in the QGIS Python Console gives you azimuth (compass bearing from north) and altitude (angle above the horizon) for each feature, which is exactly what we need to project onto the sky dome / panorama horizon.

### 5. Ingest into the Elixir Nx Tensor Representation

Load the local ENU coords into Nx as tensors, one row per feature.

### 6. Wrap & Visualize

- Use the new azimuth/altitude values to "pin" QGIS features exactly on your panorama horizon (rotate/scale the image so the Forebase direction matches north, or your compass heading at photo time).
- In AR apps (e.g., Unity/ARCore with GIS plugins, or custom Three.js/WebXR), import the glTF/OBJ plus a star catalog and overlay lines from ground features up to the corresponding stars.
- For pure code: feed ENU → spherical coords → render in a LiveView canvas or a matplotlib 3D preview.

This keeps everything geodetically sound while fusing your QField/QGIS site data into the stellar positioning system. If you share the approximate lat/lon of Forebase or a screenshot/export of one layer, I can refine the numbers or sketch a more precise tensor example.

Curious: what's Forebase for? Survey control, construction monitoring, or something more intriguing? 😊
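As a concrete follow-up to step 4, here is a minimal, dependency-free sketch of the lat/lon → ENU → azimuth/altitude conversion. It is pure Python with WGS84 constants; the observer and feature coordinates at the bottom are hypothetical placeholders, not Forebase's real position:

```python
import math

# WGS84 ellipsoid constants
A = 6378137.0              # semi-major axis, meters
E2 = 6.69437999014e-3      # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Lat/lon (degrees) + ellipsoidal height (m) -> ECEF XYZ (m)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - E2) + h) * math.sin(lat)
    return x, y, z

def ecef_to_enu(xyz, lat0_deg, lon0_deg, h0):
    """ECEF point -> local East-North-Up relative to the observer origin."""
    x0, y0, z0 = geodetic_to_ecef(lat0_deg, lon0_deg, h0)
    dx, dy, dz = xyz[0] - x0, xyz[1] - y0, xyz[2] - z0
    lat0, lon0 = math.radians(lat0_deg), math.radians(lon0_deg)
    east = -math.sin(lon0) * dx + math.cos(lon0) * dy
    north = (-math.sin(lat0) * math.cos(lon0) * dx
             - math.sin(lat0) * math.sin(lon0) * dy
             + math.cos(lat0) * dz)
    up = (math.cos(lat0) * math.cos(lon0) * dx
          + math.cos(lat0) * math.sin(lon0) * dy
          + math.sin(lat0) * dz)
    return east, north, up

def az_alt(east, north, up):
    """ENU vector -> (azimuth clockwise from north, altitude), in degrees."""
    az = math.degrees(math.atan2(east, north)) % 360.0
    alt = math.degrees(math.atan2(up, math.hypot(east, north)))
    return az, alt

# Hypothetical observer and feature (placeholder coords, not real site data):
obs = (32.760, -96.760, 140.0)
feat = geodetic_to_ecef(32.761, -96.760, 140.0)   # ~111 m due north of observer
print(az_alt(*ecef_to_enu(feat, *obs)))
```

Inside QGIS you could instead lean on pyproj (bundled with QGIS) for the coordinate transforms; the explicit math above is there so the geometry feeding the sky-dome projection is transparent, and its (azimuth, altitude) pairs drop straight into the Nx tensors of step 5.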