|
Post by Farmer on Oct 9, 2013 18:30:18 GMT -8
I would love to see more touch controls. The one I really want is an XY control: basically a rectangular area that accepts touch and reports back the x and y values of the touch. It would also be nice to have a similar XY control that you could draw a line on and set the playback speed; think playback from left to right. I have seen controls like these in TuioControl. It would be a simple way to do automation of motor speed or light brightness.
|
|
|
Post by SDL on Oct 9, 2013 21:14:58 GMT -8
Hi Farmer,
Great idea! To be more specific, you are asking for an XY control (with an image background) that sends the x,y coordinates from the iPad to the server. Are you thinking of streaming a set of data to the server, or just sending the last x,y coordinate (in a similar way to how a button does)?
Help us out with some more information and I'll go talk to marketing/engineering.
Best regards,
BP
|
|
|
Post by Farmer on Oct 9, 2013 21:53:07 GMT -8
Thanks for the quick response. I want the coordinates streamed. I'm thinking something like "send data on touch change event", if that makes any sense.
For the other idea, I was thinking of something like a rectangle you can draw a waveform on that represents the data to be sent. The x axis would be time and the y axis would be value. With the addition of a speed setting to scroll across the waveform and play back the data, you get a simple way to do automation of, say, a motor speed or other action. Saving and loading waveforms would be a nice added feature. I have been on the verge of coding this myself, but I saw your app and thought "why not ask for it?"
Example of a touch pad XY control: danielglyde.blogspot.com/2012/08/touchit-jquery-plugin-for-touch-events.html?m=1
Demo for a mobile device: glyde.eu/touchit-demo.htm
The other control I can't find a decent example of, so here is a quick sketch: i.imgur.com/swQWiZa.png
It needs more thought on the UI, but that was a quick example.
|
|
|
Post by SDL on Oct 10, 2013 8:14:24 GMT -8
Farmer, thanks for the examples, and the sketch; it makes more sense now.
Comments: RasPiConnect does not currently support streaming data in its current configuration. Remember that we are using web.py and not Apache for the server. However, after looking at things, there are three ways to proceed:
1) Like your touchit-demo.htm, we run the touch box in a REMOTE_WEBVIEW_UITYPE (serving the HTML code up from the RasPiConnect Server). That allows us to detect the touches. We can then add code to the HTML to send the data to the Raspberry Pi via sockets (remember, the REMOTE_WEBVIEW_UITYPE is a web browser!), for example web sockets; see this tutorial: yz.mit.edu/wp/web-sockets-tutorial-with-simple-python-server/. The REMOTE_WEBVIEW_UITYPE would send the x,y coordinates via the socket to either the RasPiConnect Server or another Python process (it seems to me it would be simpler with another Python process, but you could easily fire off a thread from the RasPiConnect Server too). This gives you immediate streaming!
2) Implement your sketch as a native RasPiConnect control, making it look really cool. Running this in a "batch" mode (i.e. define the waveform and send it) fits perfectly in the current request/response model in RasPiConnect.
3) Implement your sketch as a native RasPiConnect control and hook it up to a socket (as above, but in non-HTML code) to provide the streaming data to the Server. It would start up by establishing a connection with the server (to make sure things are still there) and would need a keep-alive function and a method for dealing with socket connection loss. We would also have to implement this such that it would always work over wifi, but the user would have to specifically allow it to connect over the cell network (think of the data charges!).
#1 can be done now, #2 is easier, but #3 is quite appealing to me. Help me build a set of use cases to present to marketing/engineering. Basically, what would you use this for?
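To make #1 concrete, here is a minimal sketch of what the receiving side on the Pi could look like. This is an assumption-heavy illustration, not RasPiConnect code: it uses a plain stream socket and a hypothetical newline-delimited "x,y" message format instead of a full web-socket handshake, just to show how a second Python process could collect the streamed coordinates.

```python
import socket

def parse_xy(line):
    # Each touch event is assumed to arrive as a text line "x,y".
    x_str, y_str = line.strip().split(",")
    return float(x_str), float(y_str)

def receive_touches(conn):
    # Collect (x, y) pairs until the sending side closes the connection.
    buf = b""
    touches = []
    while True:
        chunk = conn.recv(1024)
        if not chunk:
            break
        buf += chunk
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            touches.append(parse_xy(line.decode()))
    return touches

if __name__ == "__main__":
    # Demo using a local socket pair in place of the real wifi connection.
    server_sock, client_sock = socket.socketpair()
    client_sock.sendall(b"0.25,0.75\n0.30,0.70\n")
    client_sock.close()
    print(receive_touches(server_sock))  # [(0.25, 0.75), (0.3, 0.7)]
```

A real version would wrap this in a listening server with the keep-alive and reconnect handling mentioned in #3.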
Let's both brainstorm this. On your sketch, tell me what "Play", "Draw", "Loop", "+" and "-" do, especially in terms of the data being streamed to the server. Thanks again for the great ideas! I think I'll put a prototype of #1 together over the weekend. Maybe I can do something cool to impress the wife. Best regards, BP
|
|
|
Post by Farmer on Oct 10, 2013 17:15:07 GMT -8
Thanks again for responding. I appreciate your effort. I especially like the idea of passing the wife test: if you can impress her, you're always golden. The XY touch control doesn't have to actually stream. What I want is just to send the coordinates on each touch change event: finger down fires off a value to the server, finger move fires off a value to the server. I'm not sure how your event model works, so I'm just assuming it's like other GUI frameworks. However it gets implemented, I just want a reliable value coming out of the control with the least amount of trouble on the server side. The main reason I want this is to have a "virtual joystick". I imagine many robot builders using this to drive a robot around, or maybe to steer a turret for aiming. It could be used for other things as well; with one finger press you could set duty cycle and frequency if they were mapped to x and y. Here is a list of optional things I think would be useful:
- Background images
- Labels for the x and y axes / ruler marks
- Crosshairs on touch to show the location of the touch
- Optional self-centering, or staying at the last touch location
Here is an example of an XY touch control set up as a robot drive control: i.imgur.com/lv4JCla.png
I think it would be great to build this and connect it up to an H-bridge for forward and reverse and a servo for left and right.
|
|
|
Post by Farmer on Oct 11, 2013 1:18:54 GMT -8
OK, sorry for the delay. The second thing was the "sketch" control. This is not as obvious as the XY control, and I had to think about what you said in point number 2. You are saying sketch the waveform and send it to the server to "execute". That would be better than nothing! The downside is no feedback on the control side and no way to stop the execution. I am thinking of something like an audio program such as Audacity: you can see the waveform and play it, and as it plays you can see the line scroll across the waveform. I think drawing a waveform and playing it back would be an intuitive way to interact with many things. Say, for instance, you want to run a motor. You need it to soft-start and gradually speed up to max speed. Or you have a motor that needs to be hard-started and, after it's moving, slowed back down. It would be easy to draw the graph, but difficult for most people to work out the code to step through the sequence.
The buttons were just a quick mockup and are not fully thought out. I was just thinking it needs to have a draw mode and a playback mode, and it needs to be one-shot or looped. The "+" and "-" were speed controls: it needs a way to set the playback speed, or just the overall sketch length in time, so that if you set it to three seconds or three minutes it plays back as expected. I assume the control could have an internal timer and just fire off data on the timer event. It might be nice to set the timer resolution and the overall sketch length; server latency and overall connection issues will set a lower limit, I'm sure. It's just a thought. The last thing I think this needs is a save/load ability for sketches.
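The timer-driven playback idea could look something like the sketch below. The `send` callback, the sample format, and the timing scheme are all assumptions for illustration, not anything RasPiConnect actually provides:

```python
import time

def play_waveform(samples, sketch_seconds, loop=False, send=print):
    """Step through drawn samples over sketch_seconds, calling send() per step.

    samples: the values read off the drawn waveform, left to right.
    sketch_seconds: overall playback length ("+"/"-" would adjust this).
    loop: one-shot when False, repeat forever when True ("Loop" button).
    send: callback that delivers each value, e.g. a POST to the server.
    """
    interval = sketch_seconds / len(samples)  # the internal timer period
    while True:
        for value in samples:
            send(value)
            time.sleep(interval)
        if not loop:
            break
```

A soft-start ramp would then just be `play_waveform([0, 10, 25, 50, 100], sketch_seconds=3.0)`, with the motor driver as the `send` callback.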
I've built little demos of this years ago that were just simple pictures you draw in Paint and then read with some Python code and PIL. I read the column data and look for the drawn pixel on the lowest row; that is the data value. Step through the columns in sequence and you have what I'm talking about.
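That column-scan idea in minimal Python. To keep the sketch self-contained it works on a plain row-major pixel grid rather than a real PIL image, with the assumed convention that 0 means a drawn (black) pixel and anything else is background:

```python
def waveform_from_pixels(pixels):
    """Recover a waveform from a black-on-white sketch.

    pixels: row-major grid (list of rows), 0 = drawn pixel, 1 = background.
    For each column, the lowest drawn pixel gives the value; row 0 is the
    top of the image, so value = distance of that pixel from the bottom row.
    """
    height = len(pixels)
    width = len(pixels[0])
    values = []
    for x in range(width):
        value = 0                            # empty columns default to zero
        for y in range(height - 1, -1, -1):  # scan each column bottom-up
            if pixels[y][x] == 0:
                value = height - 1 - y
                break
        values.append(value)
    return values

if __name__ == "__main__":
    # A 3x3 "ramp" drawn as one black pixel per column.
    ramp = [
        [1, 1, 0],
        [1, 0, 1],
        [0, 1, 1],
    ]
    print(waveform_from_pixels(ramp))  # [0, 1, 2]
```

With Pillow installed, the grid could instead come from loading the drawn image and reading its pixels column by column in the same way.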
The main reason this is useful is automation of a single value: light brightness for a cool movie theatre setup, motor speed for startup and shutdown, fan speed control, or automated pan control for a camera doing stop motion.
It may be best to have the sketch just sent to the server and have it report back its state to update the UI. Whatever works best. I haven't seen this built before.
I hope some of that was coherent.
|
|
|
Post by Farmer on Dec 17, 2013 6:53:12 GMT -8
Any update on this? Did you have any success with your demo? Any possibility this will be put in RasPiConnect?
|
|
|
Post by SDL on Dec 17, 2013 12:51:04 GMT -8
Farmer,
Yes, it will be in RasPiConnect, but not until after the first of the year. Native support for the Arduino Yun has been made a higher priority.
BP
|
|
|
Post by Farmer on Dec 17, 2013 15:52:59 GMT -8
Awesome! Thanks for the response.
|
|