At one point in time, the only company designing both workplace and in-home tables with built-in multitouch displays was Microsoft. Its 2007 project, originally called "Surface," was rebranded as PixelSense last year when Microsoft officially partnered with Samsung. PixelSense targets industry verticals such as retail, media and entertainment, healthcare, financial services, education, and government. Considering the sheer scope of this emerging market, we now know that China's Lenovo wants in before it really takes off. A newly discovered patent application reveals Lenovo's system in detail. While it may take a few years before this technology trickles down into the general consumer space, Lenovo's entry into this segment should stir up healthy competition that translates into products coming to market sooner rather than later.
The Limitations of Multi-touch Displays
Multi-touch technology, while solving the limitations of single-touch technology, introduced a new range of issues, particularly where multiple users are involved. Current multi-touch solutions struggle to distinguish the gesture input of one user from that of another. Furthermore, as touchscreen displays grow in size, the position of the user relative to the screen introduces yet more gesture-input issues.
Dynamically Zoning a Touchscreen Display
Lenovo recognizes a need for an apparatus and method that dynamically zones touchscreen displays. Beneficially, such an apparatus and method would partition a display into user zones.
Lenovo's invention relates to an apparatus that provides a plurality of modules configured to functionally execute the necessary steps of dynamically zoning a touchscreen display. These modules in the described embodiments include an identification module detecting a number of users around a perimeter of a display, and a zoning module generating a plurality of user zones in response to the number of users detected. The apparatus also includes a positioning module orienting a gesture zone, within each of the plurality of user zones, in relation to a corresponding user.
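To make the division of labor among these modules a little more concrete, here's a minimal sketch in Python. The class names, fields, and stubbed detection logic below are our own illustration of the described architecture, not code or naming from Lenovo's filing.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class UserZone:
    """A rectangular portion of the display dedicated to one user."""
    x: float               # left edge, in display coordinates
    y: float               # top edge
    width: float
    height: float
    rotation_deg: int = 0  # orientation of the zone's gesture area toward its user

class IdentificationModule:
    """Detects the number of users around the perimeter of the display."""
    def detect_users(self) -> int:
        # A real device might rely on proximity sensors, cameras, or initial
        # touches along the bezel; here we simply return a placeholder count.
        return 2

class ZoningModule:
    """Generates one user zone per detected user."""
    def generate_zones(self, display_w: float, display_h: float, users: int) -> List[UserZone]:
        # Simplest possible rule: split the display into equal vertical strips.
        strip = display_w / users
        return [UserZone(x=i * strip, y=0.0, width=strip, height=display_h)
                for i in range(users)]

class PositioningModule:
    """Orients the gesture zone inside each user zone toward its user."""
    def orient(self, zone: UserZone, user_angle_deg: int) -> None:
        zone.rotation_deg = user_angle_deg
```

The interesting part of the claim is less the data structures than the fact that the zoning happens dynamically, in response to who is actually seated or standing around the display.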
In one embodiment, the apparatus includes an input module receiving user input, and each of the user zones includes an instance of the input module. Each instance of the input module ignores user input that corresponds to an adjacent user zone. The positioning module rotates the gesture zone in response to the position of the user relative to the display.
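The per-zone input handling could be approximated as follows. The touch-event structure and the simple bounds test are assumptions on our part, but the key behavior, each zone's input module dropping touches that belong to a neighboring zone, follows the description above.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    x: float
    y: float

@dataclass
class UserZone:          # same shape as in the previous sketch
    x: float
    y: float
    width: float
    height: float

class InputModule:
    """One instance per user zone; ignores input belonging to adjacent zones."""
    def __init__(self, zone: UserZone):
        self.zone = zone

    def accepts(self, event: TouchEvent) -> bool:
        # Handle a touch only if it lands inside this zone's bounds; anything
        # outside is left for the neighboring zone's own input module.
        return (self.zone.x <= event.x < self.zone.x + self.zone.width and
                self.zone.y <= event.y < self.zone.y + self.zone.height)
```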
In a further embodiment, the apparatus includes a characterization module visually distinguishing each of the plurality of user zones from an adjacent user zone and a virtual machine module operating a virtual machine within each of the plurality of user zones. Additionally, the apparatus may include an environment module detecting the type of user environment, and a sizing module resizing the plurality of user zones in response to the detected type of user environment.
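A hypothetical environment/sizing pair might look like the snippet below. The environment labels and the resize rule are illustrative guesses on our part; the filing only says that zones are resized in response to the detected type of user environment.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class UserZone:          # same shape as in the earlier sketches
    x: float
    y: float
    width: float
    height: float

class EnvironmentModule:
    """Detects the type of user environment (e.g. individual vs. collaborative)."""
    def detect(self, running_apps: List[str]) -> str:
        # Purely illustrative heuristic.
        return "collaborative" if "shared_document" in running_apps else "individual"

class SizingModule:
    """Resizes the user zones in response to the detected environment."""
    def resize(self, zones: List[UserZone], environment: str, display_w: float) -> None:
        if environment == "collaborative":
            # Stretch every zone across the full display width so the users
            # effectively share one canvas.
            for z in zones:
                z.x, z.width = 0.0, display_w
```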
A method is also presented. The method in the disclosed embodiments substantially includes the steps necessary to carry out the functions presented above with respect to the operation of the described apparatus and system. In one embodiment, the method includes detecting a number of users around a perimeter of a display, generating a plurality of user zones in response to the number of users detected, and orienting a gesture zone, within each of the plurality of user zones, in relation to a corresponding user.
In a further embodiment, the method includes receiving user input, ignoring user input that corresponds to an adjacent user zone, and visually distinguishing each of the plurality of user zones from an adjacent user zone. Additionally, the method may include operating a virtual machine within each of the plurality of user zones, and resizing the plurality of user zones in response to the detected type of user environment.
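Tied together, the claimed method reduces to a short pipeline: count the users, carve up the display, and point each gesture zone at its owner. The toy script below walks those steps with invented numbers and a deliberately naive layout rule.

```python
from typing import List, Tuple

Rect = Tuple[float, float, float, float]  # (x, y, width, height)

def detect_users() -> int:
    # Placeholder: a real device would use sensors, cameras, or bezel touches.
    return 4

def zone_display(width: float, height: float, users: int) -> List[Rect]:
    # Equal vertical strips, one per user.
    strip = width / users
    return [(i * strip, 0.0, strip, height) for i in range(users)]

def orient_gesture_zone(zone_index: int) -> int:
    # Assume users alternate along the two long edges of the table, so the
    # gesture zones alternate between upright (0 degrees) and flipped (180).
    return 0 if zone_index % 2 == 0 else 180

if __name__ == "__main__":
    n = detect_users()
    zones = zone_display(1920.0, 1080.0, n)
    rotations = [orient_gesture_zone(i) for i in range(n)]
    for zone, rot in zip(zones, rotations):
        print(zone, rot)
```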
Setting Boundaries for User Zones
Lenovo's patent FIG. 1 is a perspective view diagram illustrating one embodiment of a multi-zoned computing device 100 (hereinafter "device") that includes a touchscreen display 102. The touchscreen display 102, as depicted, is capable of receiving input from one or more users. Input may be in the form of hand "gestures" across the display, as illustrated by arrows 104.
The device illustrated in patent FIG. 1 above is an example of a tablet computer sitting on a table 106 with two users seated across from each other. As used herein, the term "user zone" refers to a portion of a display or screen dedicated to a user. The user zone may occupy an entire display, or alternatively, a portion of the display. A boundary line 110 may distinguish the boundary between user zones. In another embodiment, the device may utilize other visual indicators to distinguish between user zones, including but not limited to, different colored backgrounds, different levels of brightness, different orientations of the same background image, etc.
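A characterization step of that sort could be as simple as handing each zone a distinct background color. The palette and helper below are ours; the filing only lists color, brightness, and background-image orientation as examples of how adjacent zones might be told apart.

```python
from itertools import cycle
from typing import Dict, List

# Illustrative palette; the patent only requires that adjacent zones look different.
ZONE_COLORS = ["#2d6cdf", "#df8a2d", "#3faf6e", "#a84fd6"]

def characterize_zones(zone_ids: List[int]) -> Dict[int, str]:
    """Assign each user zone a distinct background color so adjacent zones
    are easy to tell apart at a glance."""
    palette = cycle(ZONE_COLORS)
    return {zone_id: next(palette) for zone_id in zone_ids}

# Example: four zones, four different colors.
print(characterize_zones([0, 1, 2, 3]))
```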
Gesture Zones
Lenovo's patent FIG. 4, shown below, is a top view diagram illustrating one embodiment of a device with a single user zone 400, while patent FIG. 5 shows a device with multiple user zones. In FIG. 4, the user zone includes a gesture zone 402 oriented toward the user, as indicated by arrow 404. The gesture zones could be distinguished with increased or decreased brightness or color, as noted above. The dotted outlines in the patent figures below are for illustrative purposes only.
Lenovo states that the gesture zone could depict a virtual keyboard and/or mouse in addition to allowing a user to use known gestures (flipping a page, sizing or resizing photos, playing a game, scrolling a webpage, etc.).
Examples of Lenovo's Zoning, Positioning & Environmental Modules
Lenovo's patent FIG. 6 is a top view diagram illustrating another embodiment of a device having multiple user zones. In the depicted embodiment, the display of the device is partitioned or zoned into four user zones 602 corresponding to four users 604. The device's zoning module 300 (shown at the bottom of this report) may zone the display into user zones of equal surface area, or alternatively, the zoning module may determine the surface area of the zones based on the shape of the display and/or the activity of the users.
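For the four-user case in FIG. 6, an equal-surface-area split could be something as plain as a 2x2 grid; the sketch below is our own simplification of whatever layout rule the zoning module actually applies, and an activity-weighted split would simply scale these rectangles instead of keeping them equal.

```python
from typing import List, Tuple

Rect = Tuple[float, float, float, float]  # (x, y, width, height)

def zone_four_users(display_w: float, display_h: float) -> List[Rect]:
    """Split the display into a 2x2 grid of four equal-area user zones."""
    half_w, half_h = display_w / 2, display_h / 2
    return [
        (0.0,    0.0,    half_w, half_h),  # zone for the user at the top-left
        (half_w, 0.0,    half_w, half_h),  # top-right
        (0.0,    half_h, half_w, half_h),  # bottom-left
        (half_w, half_h, half_w, half_h),  # bottom-right
    ]
```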
Lenovo's positioning module 304 (shown at the bottom of this report) is designed to orient the gesture zones 606 based upon the location of the user. Although the depicted embodiment illustrates a gesture zone having a surface area smaller than that of the associated user zone 602, the gesture zone 606 may occupy the entire user zone 602.
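Orienting a gesture zone "toward" a user largely comes down to picking a rotation based on which edge of the display that user occupies; the edge names and angles below are illustrative rather than taken from the filing.

```python
# Illustrative mapping from the display edge a user occupies to the rotation
# applied to that user's gesture zone (0 degrees meaning upright for a user
# seated at the bottom edge).
EDGE_TO_ROTATION_DEG = {
    "bottom": 0,
    "right": 90,
    "top": 180,
    "left": 270,
}

def gesture_zone_rotation(user_edge: str) -> int:
    """Return the rotation, in degrees, that makes a gesture zone face its user."""
    return EDGE_TO_ROTATION_DEG[user_edge]

# A user seated across the table (at the "top" edge) gets a zone flipped 180 degrees.
print(gesture_zone_rotation("top"))  # 180
```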
Lenovo's patent FIG. 7, shown above, illustrates a collaborative mode in which the environment module 310 of FIG. 3 has identified a collaboration environment, so the zoning module has generated a single user zone 702 with multiple gesture zones 704. Examples of such an environment include, but are not limited to, document editing, collaborative gaming, etc. The users 706 may simultaneously interact with the single user zone 702 through their individual gesture zones 704.
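One plausible way to model that collaborative layout is a single shared zone that carries a separate gesture zone per user. The data structure below is our guess at such a representation, not Lenovo's.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class GestureZone:
    x: float
    y: float
    width: float
    height: float
    rotation_deg: int    # faces the owning user

@dataclass
class CollaborativeZone:
    """A single user zone shared by several users, each with their own gesture zone."""
    width: float
    height: float
    gesture_zones: List[GestureZone] = field(default_factory=list)

def build_collaborative_zone(display_w: float, display_h: float, users: int) -> CollaborativeZone:
    zone = CollaborativeZone(display_w, display_h)
    strip = display_w / users
    for i in range(users):
        # Dock a small gesture area along the bottom edge in front of each user.
        zone.gesture_zones.append(
            GestureZone(x=i * strip, y=display_h * 0.8,
                        width=strip, height=display_h * 0.2, rotation_deg=0))
    return zone
```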
An Alternative Application with a Projector
In one oddball alternative application, Lenovo states that a "touchscreen-like" environment may be implemented using a camera and projector configuration. For example, the camera is configured to detect user gestures and communicate them to the device, while the projector projects what would otherwise be shown on the touchscreen display described above. The camera and the projector may be directed at a common surface, for example, a wall. In other words, a user makes gestures on the wall, the camera detects them, and the projector projects an image onto the wall.
In a further embodiment, the camera and the projector are directed toward different areas. In this case, the camera detects the motions or gestures of a user who is not in close spatial proximity to the projection surface. Stated differently, the projector may project the image or user interface onto a wall (or other surface) while the user is seated on a couch.
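The camera-and-projector variant boils down to mapping gesture positions seen by the camera into the coordinate space of the projected image. The toy function below assumes a trivially aligned camera and projector; a real setup would need proper calibration, and every name here is hypothetical.

```python
from typing import Tuple

Point = Tuple[float, float]

def camera_to_projector(pt: Point,
                        cam_size: Tuple[int, int],
                        proj_size: Tuple[int, int]) -> Point:
    """Map a gesture position reported by the camera (in camera pixels) into
    the projector's image coordinates, assuming both views cover the same
    surface and are perfectly aligned (no homography or lens correction)."""
    nx, ny = pt[0] / cam_size[0], pt[1] / cam_size[1]
    return nx * proj_size[0], ny * proj_size[1]

# Example: a tap seen at pixel (320, 240) by a 640x480 camera maps to the
# center of a 1280x800 projected image.
print(camera_to_projector((320, 240), (640, 480), (1280, 800)))  # (640.0, 400.0)
```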
Lenovo's patent application was originally filed in Q1 2011 and published by the US Patent and Trademark Office in July.