    • Smart glove with soft force sensors for virtual reality somatosensory equipment training

Smart machinery / novel materials · FutureTech

The smart glove is built from a soft force sensor in which multi-walled carbon nanotubes are cast into a mesh structure and combined with an interdigitated electrode. AC/DC conversion and spatial calculation are then used to establish a wireless, interactive virtual reality (VR) somatosensory system. The recipient can receive the same feedback as the demonstrator's behavior and skills in VR.
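A piezoresistive force sensor of this kind is typically read through a voltage divider: the sensor's resistance drops under load, which shifts the divider's output voltage. The sketch below illustrates that readout; the supply voltage, fixed resistor, ADC resolution, and calibration constant are all assumptions for illustration, not values from the exhibit.

```python
V_SUPPLY = 3.3      # assumed divider supply voltage (V)
R_FIXED = 10_000.0  # assumed fixed resistor in series with the sensor (ohms)
ADC_MAX = 4095      # assumed 12-bit ADC full scale

def sensor_resistance(adc_count: int) -> float:
    """Recover the sensor's resistance from a voltage-divider ADC reading."""
    v_out = V_SUPPLY * adc_count / ADC_MAX
    if v_out <= 0:
        return float("inf")  # no current through the divider: open circuit
    return R_FIXED * (V_SUPPLY - v_out) / v_out

def force_estimate(adc_count: int, k: float = 5.0e4) -> float:
    """Toy calibration: estimated force grows as resistance drops under load."""
    return k / sensor_resistance(adc_count)
```

A higher ADC reading means a lower sensor resistance, i.e. a harder press; a real glove would replace the single constant `k` with a per-sensor calibration curve.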
    • Light-field display optical engine for augmented reality

FutureTech

PetaRay’s solution, called LiFiD, is delivered as an optical module for near-eye displays. It reproduces the light rays of a scene and projects them directly onto the user’s retinas. Besides solving the vergence-accommodation conflict by projecting light rays directly onto the retina, LiFiD has continuous focal planes and allows the display content to precisely match the desired depth of field. No eye tracker or other sensors are needed for the projection.
    • Virtual reality applied to piping layout design in ship engine rooms

FutureTech

A virtual reality (VR) system for piping planning in shipbuilding has been developed to help engineers easily check their designs and discuss them with each other. The VR system includes two servers: a database server for storing piping data and another for multi-user interaction. Within the same virtual engine room, users can freely and closely view and check the auto-generated piping systems based on data accessed from the database server. They can also use any independent voice communication software to discuss with each other online.
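The abstract describes piping geometry being auto-generated from records held on a database server. A minimal sketch of that idea, using an in-process SQLite database as a stand-in (the schema, table, and column names here are assumptions, not the exhibit's actual design):

```python
import sqlite3

def load_pipe_polylines(conn: sqlite3.Connection, system_id: int):
    """Fetch ordered pipe points for one piping system and group them into
    per-pipe polylines, ready to be rendered in the shared virtual engine room.
    Schema assumed: pipe_points(system_id, pipe_id, seq, x, y, z)."""
    rows = conn.execute(
        "SELECT pipe_id, x, y, z FROM pipe_points "
        "WHERE system_id = ? ORDER BY pipe_id, seq",
        (system_id,),
    ).fetchall()
    polylines: dict[int, list[tuple[float, float, float]]] = {}
    for pipe_id, x, y, z in rows:
        polylines.setdefault(pipe_id, []).append((x, y, z))
    return polylines
```

In the described system the database would live on a separate server shared by all users, and the VR client would sweep a pipe cross-section along each polyline to produce the visible geometry.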
    • Epidemic-prevention robot agent combining virtual reality with inertial posture sensors

FutureTech

We have designed a remotely controlled robot equipped with a 360-degree full-HD camera. The user’s postures are detected by measuring the acceleration and rotation of limb movements using inertial measurement units (IMUs). The user’s motion commands are wirelessly transmitted to actuate the remote robot in accordance with the user’s posture. The 360-degree full-scene camera continuously transmits the real-time situation surrounding the robot back to the user, displayed on a VR headset.
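One common way to turn IMU readings into motion commands is to estimate the limb's tilt from the accelerometer's gravity vector and map coarse tilt angles to discrete commands. The sketch below shows that pattern; the command names and the 30-degree thresholds are illustrative assumptions, not the exhibit's actual protocol.

```python
import math

def pitch_roll_from_accel(ax: float, ay: float, az: float) -> tuple[float, float]:
    """Estimate pitch and roll (degrees) from the gravity direction
    measured by a resting accelerometer (units: g or m/s^2, any scale)."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def posture_to_command(pitch: float, roll: float) -> str:
    """Map a coarse limb posture to a command wirelessly sent to the robot.
    Thresholds are assumed values for illustration."""
    if pitch > 30:
        return "FORWARD"
    if pitch < -30:
        return "BACKWARD"
    if roll > 30:
        return "TURN_RIGHT"
    if roll < -30:
        return "TURN_LEFT"
    return "STOP"
```

A real controller would fuse the gyroscope with the accelerometer (e.g. a complementary or Kalman filter) so the estimate stays valid while the limb is moving, rather than only at rest.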
    • Intelligent 3D sensing image processing system with array-sensing LiDAR

FutureTech

The artificial-intelligence 3D sensing image processing system based on array-sensing LiDAR aims to construct high-quality, immersive 3D images for AR/VR. The developed 3D scene recording system is based on a color camera and a high-accuracy chaotic LiDAR. The chaotic LiDAR with an APD array and TOF sensors supports millimeter accuracy and interference avoidance within 100 meters in both indoor and outdoor environments. A high-performance embedded CNN processor supports high-throughput, energy-efficient, low-DRAM-bandwidth computations for various image AI applications.
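The time-of-flight (TOF) principle behind such a LiDAR is simple: range is half the round-trip travel time of light. A minimal sketch (the millimeter claim in the abstract implies timing resolution on the order of picoseconds, as the example below shows):

```python
C = 299_792_458.0  # speed of light in vacuum (m/s)

def tof_distance_m(round_trip_time_s: float) -> float:
    """Time-of-flight ranging: the pulse travels to the target and back,
    so the one-way distance is half the round-trip path."""
    return C * round_trip_time_s / 2.0

def timing_error_for_range_error(range_error_m: float) -> float:
    """Round-trip timing error that corresponds to a given range error."""
    return 2.0 * range_error_m / C
```

For a 100 m target the round trip takes about 667 ns, while a 1 mm range error corresponds to roughly 6.7 ps of timing error, which is why millimeter-accuracy TOF demands picosecond-class timing (chaotic-waveform correlation is one way such LiDARs also reject interference from other emitters, per the abstract's interference-avoidance claim).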
    • Artificial Intelligent 3D Sensing Image Processing System for Array Sensing Lidar

Smart machinery / novel materials · FutureTech

A high-accuracy 3D sensing and AI image processing system for constructing high-quality, immersive 3D images for AR/VR. The chaotic LiDAR with an APD array and TOF sensors supports millimeter accuracy and interference avoidance. A high-performance CNN processor supports high-performance, low-DRAM-bandwidth computations for various image AI applications.