M400 & M4000 Knowledge Base
Getting to Know the M400 and M4000
Overview
The Vuzix M400 and M4000 are monocular Android-based wearable computers. They are designed to let users quickly and easily leverage specialized Android applications that provide hands-free access to critical information, allowing users to complete tasks faster while reducing errors in the process.
The M400 and M4000 contain many of the same capabilities as a standard Android smartphone, in an unobtrusive head-mounted form factor. See below for a full list of hardware and interaction capabilities of the device:
- Full color display:
  - M400: 640x360 nHD resolution occluded OLED
  - M4000: 854x480 see-through waveguide optics
- 8-core 2.52 GHz Qualcomm XR1 processor
- 6GB system RAM
- 64GB internal flash storage
- Camera capable of up to 12.8 MP stills and 4K video with auto-focus and image stabilization
- Orientation sensors (gyroscope, accelerometer, magnetometer)
- User-facing speaker
- Triple noise-cancelling microphones
- 4 standard Android control buttons
- Touchpad with multi-finger support
- Voice control
The M400 and M4000 are huge leaps forward in the head-mounted wearable device market. Leveraging the power of the Qualcomm XR1 processor will allow you to push the boundaries of video processing and 3D rendering.
The M400 and M4000 also include an updated camera that improves resolution, frame rates and auto-focus time over previously available devices. These rates are documented in the Camera Knowledge Base.
Interaction Methods
The M400 and M4000 feature interaction methods that differ significantly from those of traditional touchscreen Android devices, and it is particularly important to keep these differences in mind when designing the user interface of an application intended to run on these devices.
Existing applications that rely heavily on touchscreen interactions do not translate well to these devices, because touchscreen UIs depend on taps at specific screen coordinates, which the available interaction methods cannot provide.
Voice
Voice commands are the ideal method of interacting with the device in many circumstances: they let users control the device and provide input quickly, without having to physically touch the device and interrupt their workflow.
The device includes a Speech Recognition engine. Refer to the Speech SDK section for additional details on the engine and the speech vocabulary it supports.
Applications can leverage alternate recognition engines by including them within the application itself.
Navigation Buttons
The three navigation buttons on the device include both short and long-press functionality.
The buttons generate KeyEvents which can be intercepted and handled explicitly in your application, or can be left to the system to handle. Reference Android KeyEvent documentation for details.
Short presses on the buttons will perform the following functions:
- Foremost Button – Move focus to the right within a UI or move down if no focusable objects are available to the right. Returns the KEYCODE_DPAD_RIGHT KeyEvent.
- Middle Button – Move focus to the left within a UI or move up if no focusable objects are available to the left. Returns the KEYCODE_DPAD_LEFT KeyEvent.
- Rearmost Button – Selects the UI element that currently has focus. Returns the KEYCODE_DPAD_CENTER KeyEvent.
Long presses on the buttons will perform the following functions:
- Foremost Button – Brings up a context menu for the current area of the UI, allowing users to access additional functions without crowding the UI. Returns KEYCODE_MENU.
- Middle Button – Returns to the Home screen. Returns KEYCODE_HOME.
- Rearmost Button – Moves back one step in the UI. Returns KEYCODE_BACK.
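The short- and long-press mappings above can be sketched as a plain key-code dispatcher. This is an illustrative sketch, not Vuzix SDK code: the integer values below mirror the documented android.view.KeyEvent constants, and in a real application you would override Activity.onKeyDown and reference KeyEvent.KEYCODE_* directly. The class and action names are assumptions for illustration.

```java
// Sketch: mapping M400/M4000 navigation-button KeyEvents to UI actions.
// The key-code values mirror the android.view.KeyEvent constants; in a real
// Android app you would use KeyEvent.KEYCODE_* inside Activity.onKeyDown.
public class NavButtonDispatcher {
    static final int KEYCODE_DPAD_LEFT   = 21; // middle button, short press
    static final int KEYCODE_DPAD_RIGHT  = 22; // foremost button, short press
    static final int KEYCODE_DPAD_CENTER = 23; // rearmost button, short press
    static final int KEYCODE_MENU        = 82; // foremost button, long press
    static final int KEYCODE_HOME        = 3;  // middle button, long press
    static final int KEYCODE_BACK        = 4;  // rearmost button, long press

    /** Returns an action name for a given key code, or null if unhandled. */
    public static String actionFor(int keyCode) {
        switch (keyCode) {
            case KEYCODE_DPAD_RIGHT:  return "focus-next";
            case KEYCODE_DPAD_LEFT:   return "focus-previous";
            case KEYCODE_DPAD_CENTER: return "select";
            case KEYCODE_MENU:        return "open-menu";
            case KEYCODE_HOME:        return "go-home";
            case KEYCODE_BACK:        return "go-back";
            default:                  return null; // let the system handle it
        }
    }
}
```

In onKeyDown you would act on the returned action and return true for handled codes, or false so the system's default behavior still applies.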
Touchpad
The M400 and M4000 feature a two-axis touchpad that can detect a wide variety of user gestures.
The touchpad is implemented as a trackball device, and methods such as dispatchTrackballEvent() and onTrackballEvent() can be used to capture and process the raw touchpad events.
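If you consume the raw trackball events yourself, the relative X/Y motion reported in the MotionEvent can be quantized into discrete swipe directions. The sketch below is illustrative only: the threshold value and the class and method names are assumptions, not part of the Vuzix SDK.

```java
// Sketch: quantizing raw touchpad (trackball) motion into swipe directions.
// In a real app the dx/dy values would come from MotionEvent.getX()/getY()
// inside Activity.onTrackballEvent(); the threshold here is an arbitrary
// illustrative value, not a documented Vuzix constant.
public class SwipeQuantizer {
    static final float THRESHOLD = 0.5f; // minimum motion to count as a swipe

    /** Maps a relative motion vector to a swipe direction, or null if too small. */
    public static String directionFor(float dx, float dy) {
        if (Math.abs(dx) < THRESHOLD && Math.abs(dy) < THRESHOLD) {
            return null; // motion too small to classify
        }
        if (Math.abs(dx) >= Math.abs(dy)) {
            return dx > 0 ? "right" : "left"; // dominant horizontal motion
        }
        return dy > 0 ? "down" : "up"; // dominant vertical motion
    }
}
```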
As a fallback, if you do not handle the trackball events in your application, there are predefined one-, two-, and three-finger gestures that generate key presses. These keys can be captured with standard Android methods. Refer to the Android KeyEvent documentation for details.
One finger
- Swipe back to front: KEYCODE_DPAD_RIGHT
- Swipe front to back: KEYCODE_DPAD_LEFT
- Swipe bottom to top: KEYCODE_DPAD_UP
- Swipe top to bottom: KEYCODE_DPAD_DOWN
- Tap: KEYCODE_DPAD_CENTER
- Hold: KEYCODE_MENU
Two fingers
- Swipe back to front: KEYCODE_FORWARD_DEL
- Swipe front to back: KEYCODE_DEL
- Swipe bottom to top: KEYCODE_VOLUME_UP
- Swipe top to bottom: KEYCODE_VOLUME_DOWN
- Swipe top to bottom and hold: KEYCODE_VOLUME_MUTE
- Tap: KEYCODE_BACK
- Hold: KEYCODE_HOME
Three fingers
- Tap: KEYCODE_POWER
- Hold: KEYCODE_F12
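As an example, an application that processes the two-finger volume gestures itself might maintain simple volume state like the sketch below. The key-code values mirror the android.view.KeyEvent constants; the 0–15 range and class name are illustrative assumptions, and a real app would typically delegate to AudioManager instead of tracking local state.

```java
// Sketch: reacting to the two-finger volume gestures on the touchpad.
// Key-code values mirror android.view.KeyEvent constants; the 0-15 volume
// range is illustrative, not a documented device limit.
public class VolumeGestureHandler {
    static final int KEYCODE_VOLUME_UP   = 24;  // two-finger swipe bottom to top
    static final int KEYCODE_VOLUME_DOWN = 25;  // two-finger swipe top to bottom
    static final int KEYCODE_VOLUME_MUTE = 164; // two-finger swipe down and hold

    private int volume = 8;        // illustrative starting level
    private boolean muted = false;

    public void onKey(int keyCode) {
        switch (keyCode) {
            case KEYCODE_VOLUME_UP:   volume = Math.min(15, volume + 1); muted = false; break;
            case KEYCODE_VOLUME_DOWN: volume = Math.max(0,  volume - 1); break;
            case KEYCODE_VOLUME_MUTE: muted = !muted; break;
        }
    }

    /** Volume the user actually hears: zero while muted. */
    public int effectiveVolume() {
        return muted ? 0 : volume;
    }
}
```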
Touchpad Mouse
With the release of version 2.1.0 of the M400 and M4000 OS, users can now configure their touchpad to act as a virtual mouse.
Enable this by selecting "Mouse" mode in Settings > System > Language & input > Touchpad.
Usage:
- Swipe with one finger to move the cursor.
- Tap with one finger to click the screen at the cursor.
- Swipe with two fingers to scroll the view.
- Tap with two fingers to go back.
Technical Details
The Android OS running on the M400 and M4000 is a modified version of Android 9.0, tailored to the components and capabilities of the device.
For the most part, applications intended for the M400 or M4000 can be developed by following standard Android development methodologies and leveraging existing Android APIs. The list below covers some prominent areas of Android for which the default APIs should be used:
- Camera – android.hardware.Camera or android.hardware.Camera2 may be used
- Sensors – use SensorManager
- Bluetooth – use BluetoothManager and BluetoothAdapter. Standard Bluetooth and BLE are supported
- Database – standard Android SQLite supported
- Google Cloud Messaging – use Google Play Services Client Library 9.8.0 or earlier
- Maps – use Google Play Services Client Library 9.8.0 or earlier
- Speech Recognition – use Vuzix Speech SDK
- Barcode Engine - use Vuzix Barcode SDK
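As an example of the standard-API path, accelerometer samples delivered by SensorManager can be converted into head pitch and roll with plain math. The sketch below is hedged: axis conventions depend on how the device is mounted, so the axis-to-angle mapping is an illustrative assumption, not a documented Vuzix specification.

```java
// Sketch: deriving pitch and roll (in degrees) from a gravity vector such as
// the values delivered by SensorManager for Sensor.TYPE_ACCELEROMETER.
// The axis conventions here are an illustrative assumption.
public class TiltEstimator {
    /** Pitch: rotation about the X axis; 0 when the device is level. */
    public static double pitchDegrees(double ax, double ay, double az) {
        return Math.toDegrees(Math.atan2(-ax, Math.sqrt(ay * ay + az * az)));
    }

    /** Roll: rotation about the Y axis; 0 when the device is level. */
    public static double rollDegrees(double ay, double az) {
        return Math.toDegrees(Math.atan2(ay, az));
    }
}
```

In a real app, the ax/ay/az arguments would come from SensorEvent.values inside an implementation of SensorEventListener.onSensorChanged.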
Some components of the M400 and M4000 require device-specific APIs to access; these APIs are covered in detail in other sections of the SDK documentation.